Dec 01 09:59:11 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 01 09:59:11 crc restorecon[4744]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:59:11 crc restorecon[4744]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 01 09:59:11 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 
09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc 
restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 09:59:12 crc restorecon[4744]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 09:59:12 crc restorecon[4744]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Dec 01 09:59:13 crc kubenswrapper[4958]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 01 09:59:13 crc kubenswrapper[4958]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Dec 01 09:59:13 crc kubenswrapper[4958]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 01 09:59:13 crc kubenswrapper[4958]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
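A note on the restorecon run that fills the log up to this point, before the kubelet's start-up output continues below. The "not reset as customized by admin" lines are restorecon declining to touch files, not failing on them: in the targeted policy shipped on RHEL-family hosts, container_file_t appears to be one of the customizable_types, and a restorecon run without -F leaves files of a customizable type alone on the assumption an admin set them deliberately; restorecon -RF would force them back to the policy defaults. The c7,c13-style suffixes are per-pod MCS category pairs assigned by the container runtime so that one pod's files are unreadable from another pod's context. The sketch below is a minimal illustration, assuming the github.com/opencontainers/selinux/go-selinux module; the paths are examples lifted from the entries above, not anything this host actually runs.

package main

// Minimal sketch: read the SELinux label stored on a file, the same value
// restorecon compares against the policy default. Assumes the
// github.com/opencontainers/selinux/go-selinux module is available.
import (
	"fmt"

	selinux "github.com/opencontainers/selinux/go-selinux"
)

func main() {
	if !selinux.GetEnabled() {
		fmt.Println("SELinux is not enabled on this host")
		return
	}
	for _, p := range []string{
		"/var/lib/kubelet/plugins",
		"/var/lib/kubelet/plugins/csi-hostpath/csi.sock",
	} {
		// FileLabel returns the full security context,
		// e.g. system_u:object_r:container_file_t:s0.
		lbl, err := selinux.FileLabel(p)
		if err != nil {
			fmt.Printf("%s: %v\n", p, err)
			continue
		}
		fmt.Printf("%s -> %s\n", p, lbl)
	}
}

Since these are skips rather than errors, the relabel pass completes normally and the kubelet start proceeds.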
Dec 01 09:59:13 crc kubenswrapper[4958]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Dec 01 09:59:13 crc kubenswrapper[4958]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.503961 4958 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507188 4958 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507215 4958 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507220 4958 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507226 4958 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507231 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507236 4958 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507241 4958 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507246 4958 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507250 4958 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507254 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507260 4958 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507267 4958 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507274 4958 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507279 4958 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507285 4958 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507290 4958 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507300 4958 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
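All six "Flag ... has been deprecated" entries above give the same advice: move the setting into the file passed via --config. Below is a minimal sketch of the config-file equivalents, assuming the current k8s.io/kubelet/config/v1beta1 types together with k8s.io/api/core/v1 and sigs.k8s.io/yaml; the values are illustrative rather than read from this node (the CRI-O socket path in particular is an assumption). Two flags have no direct field: --minimum-container-ttl-duration is superseded by the eviction settings and --pod-infra-container-image by CRI-side sandbox-image configuration, exactly as the log entries themselves say. The run of "unrecognized feature gate" warnings resumes right after; see the note following it.

package main

// Minimal sketch: build a KubeletConfiguration carrying the config-file
// equivalents of the deprecated flags, then print it as YAML. Assumes the
// k8s.io/kubelet, k8s.io/api and sigs.k8s.io/yaml modules.
import (
	"fmt"

	v1 "k8s.io/api/core/v1"
	kubeletv1beta1 "k8s.io/kubelet/config/v1beta1"
	"sigs.k8s.io/yaml"
)

func main() {
	cfg := kubeletv1beta1.KubeletConfiguration{}
	cfg.APIVersion = "kubelet.config.k8s.io/v1beta1"
	cfg.Kind = "KubeletConfiguration"

	// --container-runtime-endpoint (socket path is an assumption)
	cfg.ContainerRuntimeEndpoint = "unix:///var/run/crio/crio.sock"
	// --volume-plugin-dir
	cfg.VolumePluginDir = "/etc/kubernetes/kubelet-plugins/volume/exec"
	// --register-with-taints
	cfg.RegisterWithTaints = []v1.Taint{
		{Key: "node-role.kubernetes.io/master", Effect: v1.TaintEffectNoSchedule},
	}
	// --system-reserved
	cfg.SystemReserved = map[string]string{"cpu": "500m", "memory": "1Gi"}

	out, err := yaml.Marshal(&cfg)
	if err != nil {
		panic(err)
	}
	fmt.Print(string(out)) // paste the output into the file given to --config
}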
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507325 4958 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507332 4958 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507337 4958 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507341 4958 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507346 4958 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507351 4958 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507356 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507360 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507365 4958 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507369 4958 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507374 4958 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507379 4958 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507384 4958 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507388 4958 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507392 4958 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507397 4958 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507402 4958 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507407 4958 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507413 4958 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507417 4958 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507422 4958 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507427 4958 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507432 4958 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507437 4958 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507444 4958 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507451 4958 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507459 4958 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507467 4958 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507472 4958 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507477 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507483 4958 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507490 4958 feature_gate.go:330] unrecognized feature gate: Example Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507495 4958 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507500 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507504 4958 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507509 4958 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507513 4958 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507519 4958 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507525 4958 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507529 4958 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507534 4958 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507539 4958 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507543 4958 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507547 4958 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507552 4958 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507558 4958 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507563 4958 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507568 4958 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507587 4958 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507592 4958 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507619 4958 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507624 4958 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507629 4958 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.507633 4958 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507784 4958 flags.go:64] FLAG: --address="0.0.0.0" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507800 4958 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507809 4958 flags.go:64] FLAG: --anonymous-auth="true" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507818 4958 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507825 4958 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507832 4958 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507862 4958 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507870 4958 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507876 4958 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507886 4958 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507892 4958 flags.go:64] FLAG: 
--bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507898 4958 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507904 4958 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507909 4958 flags.go:64] FLAG: --cgroup-root="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507914 4958 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507919 4958 flags.go:64] FLAG: --client-ca-file="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507924 4958 flags.go:64] FLAG: --cloud-config="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507930 4958 flags.go:64] FLAG: --cloud-provider="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507935 4958 flags.go:64] FLAG: --cluster-dns="[]" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507943 4958 flags.go:64] FLAG: --cluster-domain="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507948 4958 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507953 4958 flags.go:64] FLAG: --config-dir="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507959 4958 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507965 4958 flags.go:64] FLAG: --container-log-max-files="5" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507972 4958 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507978 4958 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507983 4958 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507989 4958 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.507994 4958 flags.go:64] FLAG: --contention-profiling="false" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508000 4958 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508005 4958 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508011 4958 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508015 4958 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508021 4958 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508026 4958 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508030 4958 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508034 4958 flags.go:64] FLAG: --enable-load-reader="false" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508039 4958 flags.go:64] FLAG: --enable-server="true" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508044 4958 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508055 4958 flags.go:64] FLAG: --event-burst="100" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508060 4958 flags.go:64] FLAG: --event-qps="50" 
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508065 4958 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508070 4958 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508074 4958 flags.go:64] FLAG: --eviction-hard="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508080 4958 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508098 4958 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508105 4958 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508112 4958 flags.go:64] FLAG: --eviction-soft="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508118 4958 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508123 4958 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508128 4958 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508133 4958 flags.go:64] FLAG: --experimental-mounter-path="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508138 4958 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508143 4958 flags.go:64] FLAG: --fail-swap-on="true" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508148 4958 flags.go:64] FLAG: --feature-gates="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508155 4958 flags.go:64] FLAG: --file-check-frequency="20s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508160 4958 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508165 4958 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508170 4958 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508175 4958 flags.go:64] FLAG: --healthz-port="10248" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508179 4958 flags.go:64] FLAG: --help="false" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508184 4958 flags.go:64] FLAG: --hostname-override="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508189 4958 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508194 4958 flags.go:64] FLAG: --http-check-frequency="20s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508199 4958 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508203 4958 flags.go:64] FLAG: --image-credential-provider-config="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508210 4958 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508216 4958 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508220 4958 flags.go:64] FLAG: --image-service-endpoint="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508225 4958 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508229 4958 flags.go:64] FLAG: --kube-api-burst="100" Dec 01 09:59:13 crc 
kubenswrapper[4958]: I1201 09:59:13.508234 4958 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508239 4958 flags.go:64] FLAG: --kube-api-qps="50" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508246 4958 flags.go:64] FLAG: --kube-reserved="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508251 4958 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508255 4958 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508260 4958 flags.go:64] FLAG: --kubelet-cgroups="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508265 4958 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508269 4958 flags.go:64] FLAG: --lock-file="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508274 4958 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508306 4958 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508312 4958 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508320 4958 flags.go:64] FLAG: --log-json-split-stream="false" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508324 4958 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508329 4958 flags.go:64] FLAG: --log-text-split-stream="false" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508334 4958 flags.go:64] FLAG: --logging-format="text" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508338 4958 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508344 4958 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508348 4958 flags.go:64] FLAG: --manifest-url="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508353 4958 flags.go:64] FLAG: --manifest-url-header="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508374 4958 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508379 4958 flags.go:64] FLAG: --max-open-files="1000000" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508385 4958 flags.go:64] FLAG: --max-pods="110" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508390 4958 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508395 4958 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508400 4958 flags.go:64] FLAG: --memory-manager-policy="None" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508405 4958 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508410 4958 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508416 4958 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508421 4958 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 
09:59:13.508435 4958 flags.go:64] FLAG: --node-status-max-images="50" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508440 4958 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508445 4958 flags.go:64] FLAG: --oom-score-adj="-999" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508450 4958 flags.go:64] FLAG: --pod-cidr="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508455 4958 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508465 4958 flags.go:64] FLAG: --pod-manifest-path="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508470 4958 flags.go:64] FLAG: --pod-max-pids="-1" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508476 4958 flags.go:64] FLAG: --pods-per-core="0" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508481 4958 flags.go:64] FLAG: --port="10250" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508485 4958 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508490 4958 flags.go:64] FLAG: --provider-id="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508495 4958 flags.go:64] FLAG: --qos-reserved="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508500 4958 flags.go:64] FLAG: --read-only-port="10255" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508505 4958 flags.go:64] FLAG: --register-node="true" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508510 4958 flags.go:64] FLAG: --register-schedulable="true" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508514 4958 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508536 4958 flags.go:64] FLAG: --registry-burst="10" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508541 4958 flags.go:64] FLAG: --registry-qps="5" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508546 4958 flags.go:64] FLAG: --reserved-cpus="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508551 4958 flags.go:64] FLAG: --reserved-memory="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508558 4958 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508563 4958 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508568 4958 flags.go:64] FLAG: --rotate-certificates="false" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508573 4958 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508577 4958 flags.go:64] FLAG: --runonce="false" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508582 4958 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508587 4958 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508591 4958 flags.go:64] FLAG: --seccomp-default="false" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508596 4958 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508601 4958 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 01 09:59:13 crc 
kubenswrapper[4958]: I1201 09:59:13.508607 4958 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508612 4958 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508617 4958 flags.go:64] FLAG: --storage-driver-password="root" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508622 4958 flags.go:64] FLAG: --storage-driver-secure="false" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508627 4958 flags.go:64] FLAG: --storage-driver-table="stats" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508632 4958 flags.go:64] FLAG: --storage-driver-user="root" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508636 4958 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508641 4958 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508647 4958 flags.go:64] FLAG: --system-cgroups="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508652 4958 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508661 4958 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508665 4958 flags.go:64] FLAG: --tls-cert-file="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508670 4958 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508676 4958 flags.go:64] FLAG: --tls-min-version="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508681 4958 flags.go:64] FLAG: --tls-private-key-file="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508686 4958 flags.go:64] FLAG: --topology-manager-policy="none" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508691 4958 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508696 4958 flags.go:64] FLAG: --topology-manager-scope="container" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508701 4958 flags.go:64] FLAG: --v="2" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508708 4958 flags.go:64] FLAG: --version="false" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508715 4958 flags.go:64] FLAG: --vmodule="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508720 4958 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.508725 4958 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508856 4958 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508862 4958 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508866 4958 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508871 4958 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508875 4958 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508879 4958 feature_gate.go:330] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508885 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508889 4958 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508895 4958 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508900 4958 feature_gate.go:330] unrecognized feature gate: Example Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508905 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508909 4958 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508913 4958 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508917 4958 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508921 4958 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508926 4958 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508930 4958 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508935 4958 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508940 4958 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508944 4958 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508949 4958 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508953 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508958 4958 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508962 4958 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508967 4958 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508971 4958 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508975 4958 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508981 4958 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508987 4958 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508993 4958 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.508998 4958 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509003 4958 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509034 4958 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509040 4958 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509046 4958 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509059 4958 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509095 4958 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509106 4958 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509112 4958 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509118 4958 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509123 4958 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509127 4958 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509132 4958 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509137 4958 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509142 4958 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509147 4958 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509151 4958 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509156 4958 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509161 4958 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509166 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509171 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509176 4958 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509182 4958 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509187 4958 
feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509192 4958 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509197 4958 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509202 4958 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509208 4958 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509215 4958 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509220 4958 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509225 4958 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509230 4958 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509234 4958 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509239 4958 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509244 4958 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509248 4958 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509252 4958 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509258 4958 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509262 4958 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509268 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.509272 4958 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.509282 4958 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.519098 4958 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.519142 4958 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519217 4958 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519226 4958 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 
09:59:13.519230 4958 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519235 4958 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519239 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519243 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519247 4958 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519250 4958 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519255 4958 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519264 4958 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519268 4958 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519272 4958 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519275 4958 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519279 4958 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519283 4958 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519287 4958 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519290 4958 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519294 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519297 4958 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519301 4958 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519305 4958 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519308 4958 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519314 4958 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519319 4958 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519323 4958 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519327 4958 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519331 4958 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519334 4958 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519338 4958 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519342 4958 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519345 4958 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519353 4958 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519357 4958 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519361 4958 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519365 4958 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519368 4958 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519375 4958 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519380 4958 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519387 4958 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519392 4958 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519398 4958 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519404 4958 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519410 4958 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519415 4958 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519421 4958 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519425 4958 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519430 4958 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519435 4958 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519439 4958 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519443 4958 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519449 4958 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519455 4958 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519460 4958 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519465 4958 feature_gate.go:330] unrecognized feature gate: Example Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519470 4958 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519475 4958 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519481 4958 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519485 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519491 4958 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519496 4958 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519500 4958 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519504 4958 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519508 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519515 4958 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519519 4958 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519523 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519528 4958 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519532 4958 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519536 4958 
feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519540 4958 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.519544 4958 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.519551 4958 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520279 4958 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520326 4958 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520333 4958 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520339 4958 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520344 4958 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520348 4958 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520358 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520362 4958 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520366 4958 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520371 4958 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520375 4958 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520380 4958 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520384 4958 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520388 4958 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520393 4958 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520397 4958 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520401 4958 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520405 4958 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520409 4958 feature_gate.go:330] unrecognized feature gate: 
MachineAPIProviderOpenStack Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520417 4958 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520422 4958 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520427 4958 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520433 4958 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520439 4958 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520443 4958 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520448 4958 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520459 4958 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520464 4958 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520468 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520473 4958 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520477 4958 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520482 4958 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520491 4958 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520496 4958 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520500 4958 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520506 4958 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520511 4958 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520517 4958 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520523 4958 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520530 4958 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520536 4958 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520542 4958 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520548 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520554 4958 feature_gate.go:330] unrecognized feature gate: Example Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520561 4958 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520566 4958 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520570 4958 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520574 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520579 4958 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520584 4958 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520588 4958 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520593 4958 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520597 4958 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520604 4958 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520609 4958 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520613 4958 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520618 4958 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520627 4958 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 09:59:13 
crc kubenswrapper[4958]: W1201 09:59:13.520633 4958 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520639 4958 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520644 4958 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520648 4958 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520652 4958 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520657 4958 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520661 4958 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520665 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520670 4958 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520675 4958 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520680 4958 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520691 4958 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.520695 4958 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.520702 4958 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.522783 4958 server.go:940] "Client rotation is on, will bootstrap in background" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.528560 4958 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.528784 4958 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.530052 4958 server.go:997] "Starting client certificate rotation"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.530139 4958 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.530713 4958 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-16 15:44:58.581241691 +0000 UTC
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.530869 4958 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.550673 4958 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 01 09:59:13 crc kubenswrapper[4958]: E1201 09:59:13.555059 4958 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.216:6443: connect: connection refused" logger="UnhandledError"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.555312 4958 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.567763 4958 log.go:25] "Validated CRI v1 runtime API"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.598762 4958 log.go:25] "Validated CRI v1 image API"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.601088 4958 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.603813 4958 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-01-09-54-30-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.603883 4958 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.623613 4958 manager.go:217] Machine: {Timestamp:2025-12-01 09:59:13.62201985 +0000 UTC m=+1.130808897 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:dc41de36-78ed-40fc-8073-01c3beb6f3e3 BootID:150e7b6c-bfd8-4984-9394-004cd9e4353a Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42
Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:bc:69:56 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:bc:69:56 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:12:b4:4e Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:e1:44:94 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:3e:46:1d Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:60:4f:9b Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:a3:12:95 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:7a:22:f2:4e:1c:62 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:86:89:cb:03:9e:fc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] 
SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.623977 4958 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.624246 4958 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.624704 4958 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.627665 4958 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.627746 4958 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.628136 4958 topology_manager.go:138] "Creating topology manager with none policy"
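[Editor's note: the HardEvictionThresholds list inside nodeConfig above is what drives hard eviction on this node: memory.available < 100Mi, nodefs.available < 10%, imagefs.available < 15%, and inodesFree < 5% on either filesystem. A minimal sketch of how such a threshold table is evaluated; the signal names and limits are taken from the log, but the code and the observed values are illustrative, not the kubelet's implementation:]

```go
package main

import "fmt"

// threshold mirrors one entry of the HardEvictionThresholds list above:
// a signal trips when the observed value falls below the limit, expressed
// either as an absolute quantity (bytes) or a percentage of capacity.
type threshold struct {
	signal   string
	quantity int64   // absolute limit in bytes (0 if percentage-based)
	percent  float64 // fraction of capacity (0 if quantity-based)
}

func tripped(t threshold, observed, capacity int64) bool {
	limit := t.quantity
	if t.percent > 0 {
		limit = int64(float64(capacity) * t.percent)
	}
	return observed < limit
}

func main() {
	// Limits mirror the log: memory.available < 100Mi, nodefs.available < 10%.
	thresholds := []threshold{
		{signal: "memory.available", quantity: 100 * 1024 * 1024},
		{signal: "nodefs.available", percent: 0.10},
	}
	// Hypothetical observations (capacities taken from the Machine entry).
	observed := map[string][2]int64{
		"memory.available": {80 * 1024 * 1024, 33654128640}, // 80Mi free of ~32Gi
		"nodefs.available": {20 << 30, 85292941312},         // ~20Gi free of /var
	}
	for _, t := range thresholds {
		o := observed[t.signal]
		fmt.Printf("%s tripped=%v\n", t.signal, tripped(t, o[0], o[1]))
	}
}
```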
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.628156 4958 container_manager_linux.go:303] "Creating device plugin manager"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.628503 4958 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.628572 4958 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.629274 4958 state_mem.go:36] "Initialized new in-memory state store"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.629427 4958 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.630859 4958 kubelet.go:418] "Attempting to sync node with API server"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.630896 4958 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.630950 4958 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.630984 4958 kubelet.go:324] "Adding apiserver pod source"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.631014 4958 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.633376 4958 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.634051 4958 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.635627 4958 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.637135 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.637169 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.637182 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.637195 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.637213 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.637224 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.637238 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.637258 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.637270 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.637283 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.637295 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.637303 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.638412 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.216:6443: connect: connection refused
Dec 01 09:59:13 crc kubenswrapper[4958]: E1201 09:59:13.638524 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.216:6443: connect: connection refused" logger="UnhandledError"
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.638757 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.216:6443: connect: connection refused
Dec 01 09:59:13 crc kubenswrapper[4958]: E1201 09:59:13.638930 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.216:6443: connect: connection refused" logger="UnhandledError"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.642137 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.642861 4958 server.go:1280] "Started kubelet"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.644882 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.216:6443: connect: connection refused
Dec 01 09:59:13 crc systemd[1]: Started Kubernetes Kubelet.
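[Editor's note: every failure above and below (CSR post, Service/Node reflectors, CSINode lookup, lease creation) is the same symptom: connection refused from 38.129.56.216:6443, i.e. nothing is listening at api-int.crc.testing:6443 yet while the static-pod control plane comes up. A quick reachability probe, using only the endpoint from the log:]

```go
package main

import (
	"fmt"
	"net"
	"time"
)

// Probe the apiserver endpoint seen in the log. "connection refused" here
// matches the reflector errors: the TCP port is closed, not unreachable.
func main() {
	conn, err := net.DialTimeout("tcp", "api-int.crc.testing:6443", 3*time.Second)
	if err != nil {
		fmt.Println("apiserver not reachable:", err)
		return
	}
	conn.Close()
	fmt.Println("apiserver port is open")
}
```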
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.646063 4958 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.646150 4958 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.646229 4958 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 14:36:22.298219304 +0000 UTC
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.646282 4958 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1132h37m8.651941796s for next certificate rotation
Dec 01 09:59:13 crc kubenswrapper[4958]: E1201 09:59:13.654216 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.654674 4958 factory.go:55] Registering systemd factory
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.654194 4958 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.655018 4958 factory.go:221] Registration of the systemd container factory successfully
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.655409 4958 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.655549 4958 factory.go:153] Registering CRI-O factory
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.655571 4958 factory.go:221] Registration of the crio container factory successfully
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.655630 4958 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.655669 4958 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.655737 4958 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.655797 4958 factory.go:103] Registering Raw factory
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.655824 4958 manager.go:1196] Started watching for new ooms in manager
Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.676198 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.216:6443: connect: connection refused
Dec 01 09:59:13 crc kubenswrapper[4958]: E1201 09:59:13.676602 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.216:6443: connect: connection refused" logger="UnhandledError"
Dec 01 09:59:13 crc kubenswrapper[4958]: E1201 09:59:13.655550 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" interval="200ms"
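[Editor's note: compare the two certificate managers. The kubelet-serving cert's jittered rotation deadline (2026-01-17) is still ahead, so it waits ~1132h; the kube-apiserver-client-kubelet cert's deadline (2025-11-16, earlier in this log) had already passed at boot, so it rotated immediately. client-go schedules rotation at a jittered point late in the certificate's validity window; a simplified sketch of that logic, where the exact 70-90% jitter window and the one-year lifetime are assumptions for illustration:]

```go
package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline picks a jittered point late in the cert's validity window,
// roughly how client-go's certificate manager schedules rotation.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	// Assumed window: somewhere between 70% and 90% of the lifetime.
	jittered := time.Duration(float64(total) * (0.7 + 0.2*rand.Float64()))
	return notBefore.Add(jittered)
}

func main() {
	// Expiry taken from the kubelet-serving entry in the log.
	expiry := time.Date(2026, time.February, 24, 5, 53, 3, 0, time.UTC)
	notBefore := expiry.Add(-365 * 24 * time.Hour) // assumed 1y lifetime
	deadline := rotationDeadline(notBefore, expiry)
	if wait := time.Until(deadline); wait > 0 {
		fmt.Println("waiting", wait, "for next certificate rotation")
	} else {
		fmt.Println("deadline passed; rotating certificates now")
	}
}
```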
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.645758 4958 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.679223 4958 manager.go:319] Starting recovery of all containers
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.679666 4958 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 01 09:59:13 crc kubenswrapper[4958]: E1201 09:59:13.680014 4958 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.216:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d0f07718458c3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 09:59:13.642789059 +0000 UTC m=+1.151578096,LastTimestamp:2025-12-01 09:59:13.642789059 +0000 UTC m=+1.151578096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.688121 4958 server.go:460] "Adding debug handlers to kubelet server"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.695722 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696008 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696087 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696107 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696122 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696155 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696171 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696183 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696201 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696271 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696284 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696302 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696343 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696360 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696372 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696388 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696472 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696510 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696526 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696542 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696595 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696616 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696629 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696666 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696687 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696772 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696790 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696828 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696870 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696889 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696902 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696916 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696958 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696973 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.696985 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.697000 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.697899 4958 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.697934 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.697952 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698046 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698063 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698078 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698095 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698110 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698159 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698174 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698190 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698206 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698224 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698238 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698252 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698268 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698281 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698311 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698333 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698352 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698369 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698386 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698399 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698414 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698429 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698442 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698455 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698469 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698487 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698500 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698515 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698528 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698543 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698557 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698573 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698587 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698601 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698617 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698632 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698652 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698670 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698686 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698700 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698715 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698731 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698745 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698760 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698775 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698791 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698805 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698863 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698881 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698925 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698946 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698960 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698973 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.698988 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699002 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699016 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699031 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699048 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699062 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699077 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699091 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699104 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699117 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699130 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699145 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699161 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699183 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699200 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699216 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699231 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699246 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699263 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699280 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699297 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699311 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699326 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699340 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699355 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699370 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699383 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699396 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699409 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699425 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699439 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699456 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699470 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699484 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699508 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699522 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699535 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699550 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699565 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699581 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699594 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699610 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699624 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699638 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699656 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699672 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699691 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5"
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699708 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699723 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699737 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699753 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699767 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699783 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699798 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699875 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699890 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699906 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699922 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699938 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699954 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699969 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.699989 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700004 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700022 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700040 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700056 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700071 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700086 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700111 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700128 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700143 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700158 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700174 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700225 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700242 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700258 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700277 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700293 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700308 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700326 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700341 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700360 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700376 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700393 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700408 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700423 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700466 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700480 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700497 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700512 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700528 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700544 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700558 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700580 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700595 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700611 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700625 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700643 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700658 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700672 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700688 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700706 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700724 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700739 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700754 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700771 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700786 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700800 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700815 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700829 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700980 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.700998 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.701015 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.701030 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.701046 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.701061 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.701084 4958 reconstruct.go:97] "Volume reconstruction finished" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.701096 4958 reconciler.go:26] "Reconciler: start to sync state" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.707171 4958 manager.go:324] Recovery completed Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.721916 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.724864 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.724910 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.724921 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.726142 4958 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.726157 4958 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.726185 4958 state_mem.go:36] "Initialized new in-memory state store" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.739389 4958 policy_none.go:49] "None policy: Start" Dec 01 09:59:13 crc kubenswrapper[4958]: E1201 09:59:13.755087 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.778883 4958 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.778958 4958 state_mem.go:35] "Initializing new in-memory state store" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.793514 4958 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.795964 4958 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.796067 4958 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.796162 4958 kubelet.go:2335] "Starting kubelet main sync loop" Dec 01 09:59:13 crc kubenswrapper[4958]: E1201 09:59:13.796264 4958 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 01 09:59:13 crc kubenswrapper[4958]: W1201 09:59:13.797691 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.216:6443: connect: connection refused Dec 01 09:59:13 crc kubenswrapper[4958]: E1201 09:59:13.797821 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.216:6443: connect: connection refused" logger="UnhandledError" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.846373 4958 manager.go:334] "Starting Device Plugin manager" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.846556 4958 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.846585 4958 server.go:79] "Starting device plugin registration server" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.847155 4958 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.847185 4958 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.847400 4958 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.847738 4958 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.847747 4958 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 01 09:59:13 crc kubenswrapper[4958]: E1201 09:59:13.855475 4958 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 09:59:13 crc kubenswrapper[4958]: E1201 09:59:13.879171 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" interval="400ms" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.897238 4958 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.897490 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.899180 4958 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.899229 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.899243 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.899525 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.899870 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.899934 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.900749 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.900783 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.900794 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.901226 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.901245 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.901253 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.901359 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.901663 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.901704 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.902638 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.902670 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.902683 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.902743 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.902766 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.902777 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.902923 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.903039 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.903077 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.903637 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.903679 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.903695 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.903828 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.903916 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.903943 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.904437 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.904477 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.904488 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.905863 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.905905 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.905921 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.906023 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.906082 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.906102 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.906434 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.906488 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.907428 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.907460 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.907469 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.947719 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.949877 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.949929 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.949941 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:13 crc kubenswrapper[4958]: I1201 09:59:13.949968 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 09:59:13 crc kubenswrapper[4958]: E1201 09:59:13.950523 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.216:6443: connect: connection refused" node="crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.003724 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.003777 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.003814 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.003835 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.003897 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.003920 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.003979 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.004018 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.004120 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.004194 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.004225 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.004245 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.004260 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.004274 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.004292 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.105569 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.105646 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.105675 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.105710 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.105729 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.105751 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.105766 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.105804 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.105824 4958 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.105904 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.105964 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.105964 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.106008 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.106017 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.106016 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.106078 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.106128 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.106157 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.106165 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.106226 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.106251 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.105935 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.106278 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.106294 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.106327 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.106366 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.106395 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.106418 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.106446 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.106557 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.150758 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.152408 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.152496 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.152510 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.152555 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: E1201 09:59:14.153383 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.216:6443: connect: connection refused" node="crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.245034 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.266978 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: E1201 09:59:14.280502 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" interval="800ms"
Dec 01 09:59:14 crc kubenswrapper[4958]: W1201 09:59:14.282391 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-1836988b1570f3fb5aa15e456b70f3df018a8897a92d27c557abf8b23ed46327 WatchSource:0}: Error finding container 1836988b1570f3fb5aa15e456b70f3df018a8897a92d27c557abf8b23ed46327: Status 404 returned error can't find the container with id 1836988b1570f3fb5aa15e456b70f3df018a8897a92d27c557abf8b23ed46327
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.289652 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: W1201 09:59:14.291291 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-2c1e62569b87f24fd5fbf16fa569d8eabb570d3a38eee823a9bd897f8dab079c WatchSource:0}: Error finding container 2c1e62569b87f24fd5fbf16fa569d8eabb570d3a38eee823a9bd897f8dab079c: Status 404 returned error can't find the container with id 2c1e62569b87f24fd5fbf16fa569d8eabb570d3a38eee823a9bd897f8dab079c
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.306577 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: W1201 09:59:14.309631 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-7bc7adcd32add3c65a7343e6a1845e1771c7b30dbd4fc7df5437c764963cf536 WatchSource:0}: Error finding container 7bc7adcd32add3c65a7343e6a1845e1771c7b30dbd4fc7df5437c764963cf536: Status 404 returned error can't find the container with id 7bc7adcd32add3c65a7343e6a1845e1771c7b30dbd4fc7df5437c764963cf536
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.314113 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: W1201 09:59:14.334184 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-fedc6c2d995571c787cea472a1388bf5fe7cde27025f6238c3fa35df8b3bd85d WatchSource:0}: Error finding container fedc6c2d995571c787cea472a1388bf5fe7cde27025f6238c3fa35df8b3bd85d: Status 404 returned error can't find the container with id fedc6c2d995571c787cea472a1388bf5fe7cde27025f6238c3fa35df8b3bd85d
Dec 01 09:59:14 crc kubenswrapper[4958]: W1201 09:59:14.338205 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-6d0919a677cc86be888fe2be4bfd8e2c962248b6007b9168d8f3dde69c01f166 WatchSource:0}: Error finding container 6d0919a677cc86be888fe2be4bfd8e2c962248b6007b9168d8f3dde69c01f166: Status 404 returned error can't find the container with id 6d0919a677cc86be888fe2be4bfd8e2c962248b6007b9168d8f3dde69c01f166
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.553898 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.556244 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.556306 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.556319 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.556355 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: E1201 09:59:14.557121 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.216:6443: connect: connection refused" node="crc"
Dec 01 09:59:14 crc kubenswrapper[4958]: W1201 09:59:14.624794 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.216:6443: connect: connection refused
Dec 01 09:59:14 crc kubenswrapper[4958]: E1201 09:59:14.624950 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.216:6443: connect: connection refused" logger="UnhandledError"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.646318 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.216:6443: connect: connection refused
Dec 01 09:59:14 crc kubenswrapper[4958]: W1201 09:59:14.667507 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.216:6443: connect: connection refused
Dec 01 09:59:14 crc kubenswrapper[4958]: E1201 09:59:14.667626 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.216:6443: connect: connection refused" logger="UnhandledError"
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.810383 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6d0919a677cc86be888fe2be4bfd8e2c962248b6007b9168d8f3dde69c01f166"}
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.811963 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fedc6c2d995571c787cea472a1388bf5fe7cde27025f6238c3fa35df8b3bd85d"}
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.813248 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7bc7adcd32add3c65a7343e6a1845e1771c7b30dbd4fc7df5437c764963cf536"}
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.814325 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2c1e62569b87f24fd5fbf16fa569d8eabb570d3a38eee823a9bd897f8dab079c"}
Dec 01 09:59:14 crc kubenswrapper[4958]: I1201 09:59:14.815895 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1836988b1570f3fb5aa15e456b70f3df018a8897a92d27c557abf8b23ed46327"}
Dec 01 09:59:15 crc kubenswrapper[4958]: E1201 09:59:15.081584 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" interval="1.6s"
Dec 01 09:59:15 crc kubenswrapper[4958]: W1201 09:59:15.137474 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.216:6443: connect: connection refused
Dec 01 09:59:15 crc kubenswrapper[4958]: E1201 09:59:15.137629 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.216:6443: connect: connection refused" logger="UnhandledError"
Dec 01 09:59:15 crc kubenswrapper[4958]: W1201 09:59:15.175380 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.216:6443: connect: connection refused
Dec 01 09:59:15 crc kubenswrapper[4958]: E1201 09:59:15.175519 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.216:6443: connect: connection refused" logger="UnhandledError"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.357358 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.359198 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.359244 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.359256 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.359287 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 01 09:59:15 crc kubenswrapper[4958]: E1201 09:59:15.359899 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.216:6443: connect: connection refused" node="crc"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.600175 4958 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 01 09:59:15 crc kubenswrapper[4958]: E1201 09:59:15.601515 4958 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.216:6443: connect: connection refused" logger="UnhandledError"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.647821 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.216:6443: connect: connection refused
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.820416 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d1df980aead296f448181df0902e898f7a8746008704d24f784f57229888f8d7"}
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.820465 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.820338 4958 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d1df980aead296f448181df0902e898f7a8746008704d24f784f57229888f8d7" exitCode=0
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.821492 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.821524 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.821535 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.823526 4958 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="241d5a9bef99759092f4f1c042f8489f196853a0817159c8f5c47f86b9173856" exitCode=0
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.823588 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"241d5a9bef99759092f4f1c042f8489f196853a0817159c8f5c47f86b9173856"}
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.823598 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.824904 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.824946 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.824957 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.826248 4958 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840" exitCode=0
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.826285 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840"}
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.826330 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.827349 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.827379 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.827389 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.828900 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea"}
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.828939 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd"}
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.845690 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a" exitCode=0
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.845758 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a"}
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.854442 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.856388 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.856464 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.856474 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.858519 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.860456 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.860503 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:15 crc kubenswrapper[4958]: I1201 09:59:15.860517 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:16 crc kubenswrapper[4958]: W1201 09:59:16.570098 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.216:6443: connect: connection refused
Dec 01 09:59:16 crc kubenswrapper[4958]: E1201 09:59:16.570246 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.216:6443: connect: connection refused" logger="UnhandledError"
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.646242 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.216:6443: connect: connection refused
Dec 01 09:59:16 crc kubenswrapper[4958]: E1201 09:59:16.682632 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" interval="3.2s"
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.850742 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476"}
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.850858 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f"}
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.853381 4958 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="99e96b3dafdcfd2cc7b869f3ede664a70b277c37a903a34e7c05fd5e6c7f6a95" exitCode=0
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.853443 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"99e96b3dafdcfd2cc7b869f3ede664a70b277c37a903a34e7c05fd5e6c7f6a95"}
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.855761 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.855756 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"abda4376bb42f96914bc34670c8ce69ebbe304266b9d68a0965ee329021cf84a"}
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.856917 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.856949 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.856961 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.859359 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e90aae1e7541f31f237a924d640715a3fb520b6dcae23764c95c936bf7f53321"}
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.859420 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0c86d33dc457240b5c41df28d2a66575e809398a39ec20393f9fe7ac3c8bf6b9"}
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.863102 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0"}
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.863136 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e"}
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.863253 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.864598 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.864629 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.864645 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.960058 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.961513 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.961585 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.961600 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:16 crc kubenswrapper[4958]: I1201 09:59:16.961634 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 01 09:59:16 crc kubenswrapper[4958]: E1201 09:59:16.962203 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.216:6443: connect: connection refused" node="crc"
Dec 01 09:59:17 crc kubenswrapper[4958]: W1201 09:59:17.027420 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.216:6443: connect: connection refused
Dec 01 09:59:17 crc kubenswrapper[4958]: E1201 09:59:17.027607 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.216:6443: connect: connection refused" logger="UnhandledError"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.300589 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.646715 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.216:6443: connect: connection refused
Dec 01 09:59:17 crc kubenswrapper[4958]: W1201 09:59:17.711688 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.216:6443: connect: connection refused
Dec 01 09:59:17 crc kubenswrapper[4958]: E1201 09:59:17.711832 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.216:6443: connect: connection refused" logger="UnhandledError"
Dec 01 09:59:17 crc kubenswrapper[4958]: W1201 09:59:17.726868 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.216:6443: connect: connection refused
Dec 01 09:59:17 crc kubenswrapper[4958]: E1201 09:59:17.727004 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.216:6443: connect: connection refused" logger="UnhandledError"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.867293 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bbf650ba434e831572b4456cb8d9c93e348e74cb531e2b8cba926fb58527cae2"}
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.867452 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.868292 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.868312 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.868321 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.870234 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.876270 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5ec2263db2f3a9518841c82c036b91777a2276c8667eac1bdfb88d95b38025c3"}
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.876390 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53"}
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.876406 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524"}
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.876618 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.876642 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.876977 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.877018 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.877038 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.877057 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.890195 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.890289 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.890322 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.890907 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.890964 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.890982 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.898972 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.899076 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:17 crc kubenswrapper[4958]: I1201 09:59:17.899092 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.083137 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.099636 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.646690 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.216:6443: connect: connection refused
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.711034 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.876679 4958 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ce2b0ab17a6b71789f2942a9e5dceb7fa98337050ded9bc6a1348f1981609d45" exitCode=0
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.876775 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ce2b0ab17a6b71789f2942a9e5dceb7fa98337050ded9bc6a1348f1981609d45"}
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.877403 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.878522 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.879580 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.879662 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.879684 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.880877 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5ec2263db2f3a9518841c82c036b91777a2276c8667eac1bdfb88d95b38025c3" exitCode=255
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.880933 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5ec2263db2f3a9518841c82c036b91777a2276c8667eac1bdfb88d95b38025c3"}
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.881071 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.881025 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.881152 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.881276 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.882614 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.882652 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.882666 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.882665 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.882796 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.882822 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.883570 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.883615 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.883628 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:18 crc kubenswrapper[4958]: I1201 09:59:18.884174 4958 scope.go:117] "RemoveContainer" containerID="5ec2263db2f3a9518841c82c036b91777a2276c8667eac1bdfb88d95b38025c3"
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.221298 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.352769 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.586805 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.686570 4958 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.888461 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"658d3c320acc05d153b36faf3f76b7aa57d7c1e42d8040c408e600a631271997"}
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.888559 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2fd73c8838754119891285bfd8f555016ed9cc3dbcdea495da77ed623af9e664"}
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.888591 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7ca62006a519e2fea4606565a78dc20530be38d550eb25994fc499d463c322d9"}
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.892589 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.895494 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9"}
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.895598 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.895666 4958 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.895690 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.895755 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.897499 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.897552 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.897575 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.897683 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.897720 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.897722 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.897749 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.897753 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:19 crc kubenswrapper[4958]: I1201 09:59:19.897762 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.163304 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.165007 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.165169 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.165203 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.165266 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.596171 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.906467 4958 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.906567 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.907619 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.908337 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7abd8e6e1851431ef97c3788bdfb36b364c252b61bb7e2e6ff23068c0e766540"}
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.908411 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2e5acfa06456c10143d76e4a9852518fb73669a3d912f0813fdfbbdee15c5b7f"}
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.908576 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.910085 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.910140 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.910160 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.911481 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.911527 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.911546 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.912645 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.912685 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:20 crc kubenswrapper[4958]: I1201 09:59:20.912704 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:21 crc kubenswrapper[4958]: I1201 09:59:21.711352 4958 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 01 09:59:21 crc kubenswrapper[4958]: I1201 09:59:21.711480 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 01 09:59:21 crc kubenswrapper[4958]: I1201 09:59:21.908695 4958 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 01 09:59:21 crc kubenswrapper[4958]: I1201 09:59:21.908741 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:21 crc kubenswrapper[4958]: I1201 09:59:21.908715 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:21 crc kubenswrapper[4958]: I1201 09:59:21.909834 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:21 crc kubenswrapper[4958]: I1201 09:59:21.909900 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:21 crc kubenswrapper[4958]: I1201 09:59:21.909914 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:21 crc kubenswrapper[4958]: I1201 09:59:21.909907 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:21 crc kubenswrapper[4958]: I1201 09:59:21.909951 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:21 crc kubenswrapper[4958]: I1201 09:59:21.909964 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:23 crc kubenswrapper[4958]: I1201 09:59:23.719527 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Dec 01 09:59:23 crc kubenswrapper[4958]: I1201 09:59:23.719768 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:23 crc kubenswrapper[4958]: I1201 09:59:23.721309 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:23 crc kubenswrapper[4958]: I1201 09:59:23.721352 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:23 crc kubenswrapper[4958]: I1201 09:59:23.721366 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:23 crc kubenswrapper[4958]: E1201 09:59:23.855608 4958 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 01 09:59:23 crc kubenswrapper[4958]: I1201 09:59:23.865104 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 09:59:23 crc kubenswrapper[4958]: I1201 09:59:23.865398 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:23 crc kubenswrapper[4958]: I1201 09:59:23.866689 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:23 crc kubenswrapper[4958]: I1201 09:59:23.866741 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:23 crc kubenswrapper[4958]: I1201 09:59:23.866766 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:24 crc kubenswrapper[4958]: I1201 09:59:24.077683 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Dec 01 09:59:24 crc kubenswrapper[4958]: I1201 09:59:24.077985 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:24 crc kubenswrapper[4958]: I1201 09:59:24.079333 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:24 crc kubenswrapper[4958]: I1201 09:59:24.079382 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:24 crc kubenswrapper[4958]: I1201 09:59:24.079400 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:27 crc kubenswrapper[4958]: I1201 09:59:27.307281 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 01 09:59:27 crc kubenswrapper[4958]: I1201 09:59:27.307470 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:27 crc kubenswrapper[4958]: I1201 09:59:27.308745 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:27 crc kubenswrapper[4958]: I1201 09:59:27.308790 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:27 crc kubenswrapper[4958]: I1201 09:59:27.308805 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:29 crc kubenswrapper[4958]: I1201 09:59:29.222670 4958 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Dec 01 09:59:29 crc kubenswrapper[4958]: I1201 09:59:29.223426 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Dec 01 09:59:29 crc kubenswrapper[4958]: I1201 09:59:29.646980 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Dec 01 09:59:29 crc kubenswrapper[4958]: E1201 09:59:29.690419 4958 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError"
Dec 01 09:59:29 crc kubenswrapper[4958]: E1201 09:59:29.884294 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="6.4s"
Dec 01 09:59:30 crc kubenswrapper[4958]: E1201 09:59:30.167496 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc"
Dec 01 09:59:30 crc kubenswrapper[4958]: W1201 09:59:30.213809 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Dec 01 09:59:30 crc kubenswrapper[4958]: I1201 09:59:30.213977 4958 trace.go:236] Trace[1823811609]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 09:59:20.212) (total time: 10001ms):
Dec 01 09:59:30 crc kubenswrapper[4958]: Trace[1823811609]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:59:30.213)
Dec 01 09:59:30 crc kubenswrapper[4958]: Trace[1823811609]: [10.001585113s] [10.001585113s] END
Dec 01 09:59:30 crc kubenswrapper[4958]: E1201 09:59:30.214003 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Dec 01 09:59:30 crc kubenswrapper[4958]: I1201 09:59:30.596667 4958 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 01 09:59:30 crc kubenswrapper[4958]: I1201 09:59:30.596763 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 01 09:59:30 crc kubenswrapper[4958]: I1201 09:59:30.697825 4958 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 01 09:59:30 crc kubenswrapper[4958]: I1201 09:59:30.698002 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 01 09:59:31 crc kubenswrapper[4958]: I1201 09:59:31.722146 4958 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 01 09:59:31 crc kubenswrapper[4958]: I1201 09:59:31.723198 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 01 09:59:33 crc kubenswrapper[4958]: I1201 09:59:33.743551 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Dec 01 09:59:33 crc kubenswrapper[4958]: I1201 09:59:33.743775 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:33 crc kubenswrapper[4958]: I1201 09:59:33.745339 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:33 crc kubenswrapper[4958]: I1201 09:59:33.745405 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:33 crc kubenswrapper[4958]: I1201 09:59:33.745427 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:33 crc kubenswrapper[4958]: I1201 09:59:33.755685 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Dec 01 09:59:33 crc kubenswrapper[4958]: E1201 09:59:33.855994 4958 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 01 09:59:34 crc kubenswrapper[4958]: I1201 09:59:34.016101 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:34 crc kubenswrapper[4958]: I1201 09:59:34.017288 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:34 crc kubenswrapper[4958]: I1201 09:59:34.017354 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:34 crc kubenswrapper[4958]: I1201 09:59:34.017370 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.603971 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.604230 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.606345 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.606482 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.606584 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.611804 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.694121 4958 trace.go:236] Trace[956676381]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 09:59:20.984) (total time: 14709ms):
Dec 01 09:59:35 crc kubenswrapper[4958]: Trace[956676381]: ---"Objects listed" error: 14709ms (09:59:35.694)
Dec 01 09:59:35 crc kubenswrapper[4958]: Trace[956676381]: [14.709950213s] [14.709950213s] END
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.694160 4958 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.694837 4958 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.694918 4958 trace.go:236] Trace[2130994375]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 09:59:23.336) (total time: 12358ms):
Dec 01 09:59:35 crc kubenswrapper[4958]: Trace[2130994375]: ---"Objects listed" error: 12358ms (09:59:35.694)
Dec 01 09:59:35 crc kubenswrapper[4958]: Trace[2130994375]: [12.358231635s] [12.358231635s] END
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.694949 4958 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.697154 4958 trace.go:236] Trace[699378991]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 09:59:21.048) (total time: 14648ms):
Dec 01 09:59:35 crc kubenswrapper[4958]: Trace[699378991]: ---"Objects listed" error: 14648ms (09:59:35.696)
Dec 01 09:59:35 crc kubenswrapper[4958]: Trace[699378991]: [14.648549324s] [14.648549324s] END
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.697187 4958 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.737404 4958 apiserver.go:52] "Watching apiserver"
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.739954 4958 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.740326 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"]
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.740827 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.740827 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.740860 4958 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60330->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.740978 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.740981 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60330->192.168.126.11:17697: read: connection reset by peer"
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.741118 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:59:35 crc kubenswrapper[4958]: E1201 09:59:35.741197 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.741271 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:59:35 crc kubenswrapper[4958]: E1201 09:59:35.741398 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.741543 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:35 crc kubenswrapper[4958]: E1201 09:59:35.741604 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.741676 4958 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.741779 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.743731 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.743777 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.743967 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.744112 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.744190 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.744113 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.744385 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.744438 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.744393 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.755977 4958 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.768177 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.783241 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.794076 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795498 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795534 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795557 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795576 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795592 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795612 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795626 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795645 4958 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795762 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795777 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795795 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795818 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795833 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795867 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795884 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795899 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795915 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795932 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795947 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795966 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795982 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.795999 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796016 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796033 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796048 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796062 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796078 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796095 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796110 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796126 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796140 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796159 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796175 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796191 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796208 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796226 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796242 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796260 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796277 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796292 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796309 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796326 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796342 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796358 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796374 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796389 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796406 4958 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796481 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796497 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796513 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796528 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796545 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796627 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796647 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796663 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796680 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796695 4958 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796723 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796739 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796791 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796871 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796891 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796909 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796934 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796958 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796975 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 09:59:35 crc kubenswrapper[4958]: 
I1201 09:59:35.796992 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797010 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797056 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797076 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797092 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797112 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797136 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797159 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797181 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797199 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 09:59:35 crc kubenswrapper[4958]: 
I1201 09:59:35.797223 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797292 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797319 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797343 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797366 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797388 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797411 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797442 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797466 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797490 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 09:59:35 crc kubenswrapper[4958]: 
I1201 09:59:35.797512 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797537 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797560 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797584 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797607 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797630 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797651 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797675 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797698 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797722 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " 
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797744 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797768 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797789 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.798643 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.799062 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.799121 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.799165 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.799209 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.799251 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.799289 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.799327 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796991 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.796991 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797003 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.797824 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.798980 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.799182 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.799630 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.799805 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.799956 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.800130 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.943442 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:35 crc kubenswrapper[4958]: E1201 09:59:35.800270 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:59:36.300238311 +0000 UTC m=+23.809027348 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.943617 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.943658 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.943683 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.945520 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.945614 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.945649 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.945684 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.945819 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.945893 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.945933 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946097 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946126 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946151 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946176 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946196 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946225 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946396 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946421 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946444 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946464 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946484 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946502 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946523 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946542 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946562 4958 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946586 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946624 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946644 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946665 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.945616 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.950311 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.945633 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.801297 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.801468 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.801573 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.801618 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.801658 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.801832 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.802140 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.802233 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.802301 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.802662 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.802676 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.802699 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.803114 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.803165 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.803929 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.805228 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.807761 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.807803 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.808695 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.809859 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.810430 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.810731 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.811233 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.811524 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.817482 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.935430 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.935470 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.935555 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.935900 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.936006 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.936039 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.936096 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.936114 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.936180 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.936192 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.936243 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.936303 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.936467 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.936520 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.936689 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.936682 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.936871 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.936925 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.936973 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.936375 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.936389 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.936380 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.937373 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.937742 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.937753 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.937915 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.937351 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.937275 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.938264 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.938283 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.938307 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.938348 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.938391 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.938137 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.938943 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.938953 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.939094 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.939682 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.939698 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.939520 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.939783 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.940628 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.940802 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.941062 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.941243 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.941264 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.941591 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.941615 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.941922 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.942075 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.942158 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.942168 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.942305 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.943103 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.946542 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.947534 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.947536 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.948192 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.950930 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.951405 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.800419 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.947392 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.952119 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.952164 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.952193 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.952227 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.952259 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.952282 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: 
\"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.952307 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.952328 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.952369 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.952416 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.952449 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.952470 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.952718 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.953094 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.954013 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.954229 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.955238 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.955317 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.955437 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.955774 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.959709 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.959656 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.960068 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.960238 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.960598 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.972228 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:59:35 crc kubenswrapper[4958]: I1201 09:59:35.976938 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.022279 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.022695 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.022702 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.023083 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.023261 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.023443 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.023711 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.024139 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.024178 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.024526 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.024557 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.024697 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.024858 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.025305 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.025479 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.025554 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.025590 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.025620 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.025647 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.025675 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026378 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026426 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026485 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026542 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026569 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026602 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026624 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026646 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026668 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026709 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026734 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026754 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026776 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026798 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026822 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026868 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026896 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026917 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026936 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026953 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026987 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.027003 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.027057 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.027074 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.027094 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.027110 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.027127 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.027143 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.027163 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.027192 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.027219 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.027244 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.027267 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.027306 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.027334 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.025448 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.025995 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026455 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.026585 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.029144 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.029179 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.029302 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.029460 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.029499 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.029913 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.030025 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.030052 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.030082 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.030075 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.030124 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.030156 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.030194 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.030206 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.030339 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.030493 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.030564 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.035603 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.038087 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.038465 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.038840 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.038999 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.039420 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.039463 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.039545 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.039972 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.040005 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.040074 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.040169 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.040320 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.040599 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.041112 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.041423 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.041868 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.041906 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.041993 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.042574 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.042151 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.042323 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.042773 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.042817 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.042818 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.042918 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.043049 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.043091 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.043205 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.043305 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.043431 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.043582 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.043647 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.043944 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.044135 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.044255 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.044281 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.045062 4958 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.045591 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.046366 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.046787 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.046980 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.047310 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.047707 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.048221 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.048325 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.048835 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.048915 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.049437 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.052239 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.052429 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.052539 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.054369 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.054614 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.054675 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.054645 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.054789 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.054817 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.055007 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.055192 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.055471 4958 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.055648 4958 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.055708 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get 
\"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.055828 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.055836 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.055959 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.056008 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.056729 4958 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.056907 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:36.556876797 +0000 UTC m=+24.065665894 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.056996 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:36.556962129 +0000 UTC m=+24.065751186 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057126 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057123 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057197 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057372 4958 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057429 4958 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057451 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057472 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057493 4958 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057511 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057528 4958 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057545 4958 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057561 4958 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057578 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057596 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057613 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057645 4958 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057662 4958 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057680 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057612 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057700 4958 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057718 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057688 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057736 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057818 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057912 4958 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.057938 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058085 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058116 4958 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058134 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058191 4958 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058212 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058230 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058288 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058305 4958 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058323 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 
crc kubenswrapper[4958]: I1201 09:59:36.058379 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058399 4958 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058417 4958 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058473 4958 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058496 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058513 4958 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058569 4958 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058592 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058648 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058675 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058696 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058752 4958 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058771 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 
09:59:36.058787 4958 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.058826 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.059457 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060045 4958 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060100 4958 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060120 4958 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060140 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060157 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060175 4958 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060193 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060210 4958 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060227 4958 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060243 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060261 4958 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060278 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060296 4958 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060315 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060332 4958 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060350 4958 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060367 4958 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060382 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060400 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060425 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060443 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060463 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060481 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060498 4958 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060516 4958 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060541 4958 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060558 4958 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060578 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060595 4958 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060617 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060636 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060654 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060688 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060706 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060723 4958 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060739 4958 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060754 4958 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060770 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060787 4958 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060805 4958 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060820 4958 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060841 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.060978 4958 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061037 4958 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061056 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061072 4958 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061129 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061145 4958 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061165 4958 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061223 4958 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061241 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061257 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061341 4958 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061364 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061383 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061443 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061467 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061493 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061512 4958 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061531 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061550 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061619 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061637 4958 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061654 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061672 4958 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061692 4958 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061709 4958 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.061979 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.063554 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.064344 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.064812 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065407 4958 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065444 4958 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065458 4958 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065471 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065483 4958 reconciler_common.go:293] "Volume 
detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065497 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065511 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065524 4958 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065537 4958 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065549 4958 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065595 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065609 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065621 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065633 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065649 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065661 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065672 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065684 4958 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065699 4958 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065715 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.065731 4958 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.068566 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.069889 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.074533 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.076776 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.078320 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.079144 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.080044 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.080448 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.080693 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.080725 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.080744 4958 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.080831 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:36.580809823 +0000 UTC m=+24.089598900 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.081932 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.082015 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.082076 4958 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.082188 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:36.582119651 +0000 UTC m=+24.090908718 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.083733 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.084234 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.084593 4958 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.084744 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.086774 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 01 09:59:36 crc 
kubenswrapper[4958]: I1201 09:59:36.087453 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.088580 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.090694 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.090994 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.092378 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.092731 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.093766 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.094484 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.096724 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.097397 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.098164 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.101617 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.103494 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.103614 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.103603 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.104461 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.105599 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.106203 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.107750 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.108311 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.109221 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.109765 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.110435 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.111753 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.112370 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.113067 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 
09:59:36.114389 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.166995 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167059 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167160 4958 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 
09:59:36.167186 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167185 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167207 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167281 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167298 4958 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167312 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167333 4958 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167386 4958 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167398 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167409 4958 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167391 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167420 4958 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167573 4958 
reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167591 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167605 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167620 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167637 4958 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167653 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167679 4958 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167691 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167702 4958 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167729 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167752 4958 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167772 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167782 4958 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 
09:59:36.167792 4958 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167802 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167811 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167821 4958 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167830 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167860 4958 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167871 4958 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167884 4958 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167903 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167915 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167925 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167934 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167942 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167957 4958 reconciler_common.go:293] "Volume detached for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.167980 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168000 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168020 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168030 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168042 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168052 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168065 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168081 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168100 4958 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168111 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168121 4958 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168130 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168144 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168163 4958 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168175 4958 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168184 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168194 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168215 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168234 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.168253 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.359968 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.369070 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.372486 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.372709 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:59:37.372679682 +0000 UTC m=+24.881468719 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:59:36 crc kubenswrapper[4958]: W1201 09:59:36.374454 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-7de8e6d9491041e4d54ec705b732e156b4ce4b8a74bcfcd6b7e4f6d6e7c8e06f WatchSource:0}: Error finding container 7de8e6d9491041e4d54ec705b732e156b4ce4b8a74bcfcd6b7e4f6d6e7c8e06f: Status 404 returned error can't find the container with id 7de8e6d9491041e4d54ec705b732e156b4ce4b8a74bcfcd6b7e4f6d6e7c8e06f Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.376191 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:59:36 crc kubenswrapper[4958]: W1201 09:59:36.442867 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-6c7c8fe749faa6db7bb2500d8361248f1ae38fd3882f0d34af10ddaa2efa6f07 WatchSource:0}: Error finding container 6c7c8fe749faa6db7bb2500d8361248f1ae38fd3882f0d34af10ddaa2efa6f07: Status 404 returned error can't find the container with id 6c7c8fe749faa6db7bb2500d8361248f1ae38fd3882f0d34af10ddaa2efa6f07 Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.568202 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.572682 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.572750 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.572763 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.572871 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.575471 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.575637 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.575687 4958 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.575831 4958 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.575875 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:37.575820832 +0000 UTC m=+25.084609869 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.575943 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:37.575924555 +0000 UTC m=+25.084713592 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.579630 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.677154 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:59:36 crc kubenswrapper[4958]: I1201 09:59:36.677255 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.677522 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.677547 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.677563 4958 projected.go:194] Error preparing data for projected volume 
kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.677655 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:37.677623693 +0000 UTC m=+25.186412730 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.677785 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.677798 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.677818 4958 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:36 crc kubenswrapper[4958]: E1201 09:59:36.677874 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:37.67786363 +0000 UTC m=+25.186652667 (durationBeforeRetry 1s). 
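The paired "not registered" failures for kube-api-access-s2dwl and kube-api-access-cqllr (the second error's detail continues just below) concern projected service-account volumes: SetUp must resolve every projection source, so until the kubelet registers the pod's kube-root-ca.crt and openshift-service-ca.crt configmaps, the whole mount fails and is requeued. For reference, a sketch of such a volume's typical shape using the k8s.io/api core/v1 Go types; the token expiry and item paths are the usual admission-injected defaults, assumed here rather than read from this cluster:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    // kubeAPIAccessVolume builds the projected volume shape behind the
    // failing kube-api-access-* mounts: a bound token plus CA bundles.
    // Every listed source must resolve before SetUp can succeed.
    func kubeAPIAccessVolume(name string) corev1.Volume {
        expiry := int64(3607) // assumed default, as injected by admission
        return corev1.Volume{
            Name: name, // e.g. "kube-api-access-cqllr"
            VolumeSource: corev1.VolumeSource{
                Projected: &corev1.ProjectedVolumeSource{
                    Sources: []corev1.VolumeProjection{
                        {ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
                            ExpirationSeconds: &expiry,
                            Path:              "token",
                        }},
                        {ConfigMap: &corev1.ConfigMapProjection{
                            LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
                            Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
                        }},
                        {ConfigMap: &corev1.ConfigMapProjection{
                            LocalObjectReference: corev1.LocalObjectReference{Name: "openshift-service-ca.crt"},
                            Items:                []corev1.KeyToPath{{Key: "service-ca.crt", Path: "service-ca.crt"}},
                        }},
                    },
                },
            },
        }
    }

    func main() {
        v := kubeAPIAccessVolume("kube-api-access-cqllr")
        fmt.Println(v.Name, "with", len(v.VolumeSource.Projected.Sources), "projection sources")
    }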
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.059898 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.060461 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.062508 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9" exitCode=255 Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.062551 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9"} Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.062808 4958 scope.go:117] "RemoveContainer" containerID="5ec2263db2f3a9518841c82c036b91777a2276c8667eac1bdfb88d95b38025c3" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.063894 4958 scope.go:117] "RemoveContainer" containerID="60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9" Dec 01 09:59:37 crc kubenswrapper[4958]: E1201 09:59:37.064235 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.066241 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde"} Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.066324 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72"} Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.066338 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6c7c8fe749faa6db7bb2500d8361248f1ae38fd3882f0d34af10ddaa2efa6f07"} Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.067681 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8583ee40ff1ce02949f5bf3f3f7af86ca029fc9b7977a254a1b20c69f7d061b4"} Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.069212 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd"} Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.069288 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7de8e6d9491041e4d54ec705b732e156b4ce4b8a74bcfcd6b7e4f6d6e7c8e06f"} Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.143717 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.159108 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.172855 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.208606 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec2263db2f3a9518841c82c036b91777a2276c8667eac1bdfb88d95b38025c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:18Z\\\",\\\"message\\\":\\\"W1201 09:59:18.151764 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1201 
09:59:18.152433 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764583158 cert, and key in /tmp/serving-cert-2832760921/serving-signer.crt, /tmp/serving-cert-2832760921/serving-signer.key\\\\nI1201 09:59:18.396518 1 observer_polling.go:159] Starting file observer\\\\nW1201 09:59:18.406323 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1201 09:59:18.406494 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:18.407341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2832760921/tls.crt::/tmp/serving-cert-2832760921/tls.key\\\\\\\"\\\\nF1201 09:59:18.764704 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.225974 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.243885 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.255630 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.267870 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.283960 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{
\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.299966 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.320685 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.332562 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.342598 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.362294 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec2263db2f3a9518841c82c036b91777a2276c8667eac1bdfb88d95b38025c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:18Z\\\",\\\"message\\\":\\\"W1201 09:59:18.151764 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1201 
09:59:18.152433 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764583158 cert, and key in /tmp/serving-cert-2832760921/serving-signer.crt, /tmp/serving-cert-2832760921/serving-signer.key\\\\nI1201 09:59:18.396518 1 observer_polling.go:159] Starting file observer\\\\nW1201 09:59:18.406323 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1201 09:59:18.406494 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:18.407341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2832760921/tls.crt::/tmp/serving-cert-2832760921/tls.key\\\\\\\"\\\\nF1201 09:59:18.764704 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.410523 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:59:37 crc kubenswrapper[4958]: E1201 09:59:37.410675 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:59:39.410648707 +0000 UTC m=+26.919437744 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.587551 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-tsq6f"] Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.588004 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tsq6f" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.606894 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.607639 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.607996 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.623788 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.623869 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:37 crc kubenswrapper[4958]: E1201 09:59:37.623966 4958 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:59:37 crc kubenswrapper[4958]: E1201 09:59:37.624034 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:39.624014104 +0000 UTC m=+27.132803141 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:59:37 crc kubenswrapper[4958]: E1201 09:59:37.624119 4958 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:59:37 crc kubenswrapper[4958]: E1201 09:59:37.624141 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:39.624134317 +0000 UTC m=+27.132923354 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.685753 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:37Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.730310 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wmgd\" (UniqueName: \"kubernetes.io/projected/92bb0597-cb74-4cba-b6f6-e52266b1aa59-kube-api-access-2wmgd\") pod \"node-resolver-tsq6f\" (UID: \"92bb0597-cb74-4cba-b6f6-e52266b1aa59\") " pod="openshift-dns/node-resolver-tsq6f" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.730376 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.730423 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.730445 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/92bb0597-cb74-4cba-b6f6-e52266b1aa59-hosts-file\") pod \"node-resolver-tsq6f\" (UID: \"92bb0597-cb74-4cba-b6f6-e52266b1aa59\") " pod="openshift-dns/node-resolver-tsq6f" Dec 01 09:59:37 crc kubenswrapper[4958]: E1201 09:59:37.730620 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:59:37 crc kubenswrapper[4958]: E1201 09:59:37.730645 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:59:37 crc kubenswrapper[4958]: E1201 09:59:37.730659 4958 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:37 crc kubenswrapper[4958]: E1201 09:59:37.730725 4958 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:39.730704197 +0000 UTC m=+27.239493234 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:37 crc kubenswrapper[4958]: E1201 09:59:37.730750 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:59:37 crc kubenswrapper[4958]: E1201 09:59:37.730803 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:59:37 crc kubenswrapper[4958]: E1201 09:59:37.730818 4958 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:37 crc kubenswrapper[4958]: E1201 09:59:37.730926 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:39.730898733 +0000 UTC m=+27.239687770 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.796531 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.796623 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.796657 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:59:37 crc kubenswrapper[4958]: E1201 09:59:37.796696 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:59:37 crc kubenswrapper[4958]: E1201 09:59:37.796818 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:59:37 crc kubenswrapper[4958]: E1201 09:59:37.796995 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.801221 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.802178 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.803539 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.804299 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.805361 4958 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.805536 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.806254 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.807010 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.808227 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.809059 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.813074 4958 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.813723 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.815007 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.815606 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.816275 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.817444 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.818067 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.819313 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.819338 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:37Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.819950 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.824324 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.825376 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.831041 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/92bb0597-cb74-4cba-b6f6-e52266b1aa59-hosts-file\") pod \"node-resolver-tsq6f\" (UID: \"92bb0597-cb74-4cba-b6f6-e52266b1aa59\") " pod="openshift-dns/node-resolver-tsq6f" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.831104 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wmgd\" (UniqueName: \"kubernetes.io/projected/92bb0597-cb74-4cba-b6f6-e52266b1aa59-kube-api-access-2wmgd\") pod \"node-resolver-tsq6f\" (UID: \"92bb0597-cb74-4cba-b6f6-e52266b1aa59\") " pod="openshift-dns/node-resolver-tsq6f" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.831461 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/92bb0597-cb74-4cba-b6f6-e52266b1aa59-hosts-file\") pod \"node-resolver-tsq6f\" (UID: \"92bb0597-cb74-4cba-b6f6-e52266b1aa59\") " pod="openshift-dns/node-resolver-tsq6f" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.844470 4958 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.906668 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wmgd\" (UniqueName: \"kubernetes.io/projected/92bb0597-cb74-4cba-b6f6-e52266b1aa59-kube-api-access-2wmgd\") pod \"node-resolver-tsq6f\" (UID: \"92bb0597-cb74-4cba-b6f6-e52266b1aa59\") " pod="openshift-dns/node-resolver-tsq6f" Dec 01 
09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.934701 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ec2263db2f3a9518841c82c036b91777a2276c8667eac1bdfb88d95b38025c3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:18Z\\\",\\\"message\\\":\\\"W1201 09:59:18.151764 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1201 09:59:18.152433 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764583158 cert, and key in /tmp/serving-cert-2832760921/serving-signer.crt, /tmp/serving-cert-2832760921/serving-signer.key\\\\nI1201 09:59:18.396518 1 observer_polling.go:159] Starting file observer\\\\nW1201 09:59:18.406323 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1201 09:59:18.406494 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:18.407341 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2832760921/tls.crt::/tmp/serving-cert-2832760921/tls.key\\\\\\\"\\\\nF1201 09:59:18.764704 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized 
mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:37Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.967896 4958 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 01 09:59:37 crc kubenswrapper[4958]: I1201 09:59:37.977128 4958 csr.go:261] certificate signing request csr-ckm9f is approved, waiting to be issued Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.035903 4958 csr.go:257] certificate signing request csr-ckm9f is issued Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.084646 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.085155 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.087005 4958 scope.go:117] "RemoveContainer" containerID="60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9" Dec 01 09:59:38 crc kubenswrapper[4958]: E1201 09:59:38.087263 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.133046 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.163169 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.189361 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.205527 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-tsq6f" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.212358 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: W1201 09:59:38.225320 4958 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92bb0597_cb74_4cba_b6f6_e52266b1aa59.slice/crio-65de19cc1bda375d3033ca6b586cd0517f15d8d5ec65d7ff99f8e2b6d7555244 WatchSource:0}: Error finding container 65de19cc1bda375d3033ca6b586cd0517f15d8d5ec65d7ff99f8e2b6d7555244: Status 404 returned error can't find the container with id 65de19cc1bda375d3033ca6b586cd0517f15d8d5ec65d7ff99f8e2b6d7555244 Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.229345 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.262627 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.281508 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.319215 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.336675 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.350128 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.366275 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.416095 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.615052 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-7z6wb"] Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.619527 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-prmw7"] Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.620141 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-4vh77"] Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.620709 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.622028 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.622118 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.629082 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.629099 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.629508 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.629581 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.629702 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.629713 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.629736 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.629820 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.629876 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.630670 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.630879 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.631267 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.651988 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.679262 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.700320 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.725029 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.735465 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.738269 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/00cc61ff-219a-40e4-a0c3-360c456c57f9-cni-binary-copy\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.738339 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-os-release\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.738385 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/00cc61ff-219a-40e4-a0c3-360c456c57f9-cnibin\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.738497 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-etc-kubernetes\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.738567 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/46276a58-9607-4a8a-bcfc-ca41ab441ec2-cni-binary-copy\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.738597 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-host-run-k8s-cni-cncf-io\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.738697 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/09a41414-b5bf-481a-afdc-b0042f4c78b0-mcd-auth-proxy-config\") pod \"machine-config-daemon-prmw7\" (UID: \"09a41414-b5bf-481a-afdc-b0042f4c78b0\") " pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.738757 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-host-var-lib-kubelet\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.738779 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09a41414-b5bf-481a-afdc-b0042f4c78b0-proxy-tls\") pod \"machine-config-daemon-prmw7\" (UID: \"09a41414-b5bf-481a-afdc-b0042f4c78b0\") " pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.738797 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/00cc61ff-219a-40e4-a0c3-360c456c57f9-system-cni-dir\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.738812 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-multus-cni-dir\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.738857 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-host-var-lib-cni-multus\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.738895 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-hostroot\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.738932 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/00cc61ff-219a-40e4-a0c3-360c456c57f9-os-release\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.738952 4958 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-host-var-lib-cni-bin\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.738969 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvm9p\" (UniqueName: \"kubernetes.io/projected/46276a58-9607-4a8a-bcfc-ca41ab441ec2-kube-api-access-lvm9p\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.738987 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/46276a58-9607-4a8a-bcfc-ca41ab441ec2-multus-daemon-config\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.739007 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-system-cni-dir\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.739028 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/09a41414-b5bf-481a-afdc-b0042f4c78b0-rootfs\") pod \"machine-config-daemon-prmw7\" (UID: \"09a41414-b5bf-481a-afdc-b0042f4c78b0\") " pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.739046 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/00cc61ff-219a-40e4-a0c3-360c456c57f9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.739087 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr7v5\" (UniqueName: \"kubernetes.io/projected/00cc61ff-219a-40e4-a0c3-360c456c57f9-kube-api-access-nr7v5\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.739106 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-cnibin\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.739127 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-host-run-netns\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc 
kubenswrapper[4958]: I1201 09:59:38.739142 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qlxf\" (UniqueName: \"kubernetes.io/projected/09a41414-b5bf-481a-afdc-b0042f4c78b0-kube-api-access-6qlxf\") pod \"machine-config-daemon-prmw7\" (UID: \"09a41414-b5bf-481a-afdc-b0042f4c78b0\") " pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.739158 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-multus-conf-dir\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.739189 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-multus-socket-dir-parent\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.739204 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-host-run-multus-certs\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.739226 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/00cc61ff-219a-40e4-a0c3-360c456c57f9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.761309 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.793312 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.821041 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.839889 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/46276a58-9607-4a8a-bcfc-ca41ab441ec2-multus-daemon-config\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.839963 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvm9p\" (UniqueName: \"kubernetes.io/projected/46276a58-9607-4a8a-bcfc-ca41ab441ec2-kube-api-access-lvm9p\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.839988 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-system-cni-dir\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840012 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/09a41414-b5bf-481a-afdc-b0042f4c78b0-rootfs\") pod \"machine-config-daemon-prmw7\" (UID: \"09a41414-b5bf-481a-afdc-b0042f4c78b0\") " pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840032 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/00cc61ff-219a-40e4-a0c3-360c456c57f9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840047 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr7v5\" (UniqueName: \"kubernetes.io/projected/00cc61ff-219a-40e4-a0c3-360c456c57f9-kube-api-access-nr7v5\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840061 4958 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-cnibin\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840077 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-host-run-netns\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840123 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qlxf\" (UniqueName: \"kubernetes.io/projected/09a41414-b5bf-481a-afdc-b0042f4c78b0-kube-api-access-6qlxf\") pod \"machine-config-daemon-prmw7\" (UID: \"09a41414-b5bf-481a-afdc-b0042f4c78b0\") " pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840160 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-system-cni-dir\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840174 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-multus-socket-dir-parent\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840233 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-multus-conf-dir\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840253 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/00cc61ff-219a-40e4-a0c3-360c456c57f9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840269 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-host-run-multus-certs\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840285 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/00cc61ff-219a-40e4-a0c3-360c456c57f9-cni-binary-copy\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840300 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-os-release\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840329 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/00cc61ff-219a-40e4-a0c3-360c456c57f9-cnibin\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840345 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-etc-kubernetes\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840360 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-multus-socket-dir-parent\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840368 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/46276a58-9607-4a8a-bcfc-ca41ab441ec2-cni-binary-copy\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840391 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-host-run-k8s-cni-cncf-io\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840420 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/09a41414-b5bf-481a-afdc-b0042f4c78b0-mcd-auth-proxy-config\") pod \"machine-config-daemon-prmw7\" (UID: \"09a41414-b5bf-481a-afdc-b0042f4c78b0\") " pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840438 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/00cc61ff-219a-40e4-a0c3-360c456c57f9-system-cni-dir\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840484 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-multus-cni-dir\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840499 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-host-var-lib-cni-multus\") pod 
\"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840514 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-host-var-lib-kubelet\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840532 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09a41414-b5bf-481a-afdc-b0042f4c78b0-proxy-tls\") pod \"machine-config-daemon-prmw7\" (UID: \"09a41414-b5bf-481a-afdc-b0042f4c78b0\") " pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840551 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-hostroot\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840568 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/00cc61ff-219a-40e4-a0c3-360c456c57f9-os-release\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840584 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-host-var-lib-cni-bin\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840578 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/09a41414-b5bf-481a-afdc-b0042f4c78b0-rootfs\") pod \"machine-config-daemon-prmw7\" (UID: \"09a41414-b5bf-481a-afdc-b0042f4c78b0\") " pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840629 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-host-var-lib-cni-bin\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840723 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/00cc61ff-219a-40e4-a0c3-360c456c57f9-cnibin\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.840989 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-cnibin\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.841258 
4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-os-release\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.841356 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-host-run-netns\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.841386 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/00cc61ff-219a-40e4-a0c3-360c456c57f9-cni-binary-copy\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.841410 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/46276a58-9607-4a8a-bcfc-ca41ab441ec2-multus-daemon-config\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.841451 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/00cc61ff-219a-40e4-a0c3-360c456c57f9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.841464 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-host-run-multus-certs\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.841475 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-etc-kubernetes\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.841501 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-host-run-k8s-cni-cncf-io\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.841538 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/46276a58-9607-4a8a-bcfc-ca41ab441ec2-cni-binary-copy\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.841546 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-host-var-lib-kubelet\") pod 
\"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.841598 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/00cc61ff-219a-40e4-a0c3-360c456c57f9-os-release\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.841604 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-host-var-lib-cni-multus\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.841616 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-multus-conf-dir\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.841633 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-hostroot\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.841638 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/00cc61ff-219a-40e4-a0c3-360c456c57f9-system-cni-dir\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.841678 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46276a58-9607-4a8a-bcfc-ca41ab441ec2-multus-cni-dir\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.842051 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/00cc61ff-219a-40e4-a0c3-360c456c57f9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.842140 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/09a41414-b5bf-481a-afdc-b0042f4c78b0-mcd-auth-proxy-config\") pod \"machine-config-daemon-prmw7\" (UID: \"09a41414-b5bf-481a-afdc-b0042f4c78b0\") " pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.847937 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09a41414-b5bf-481a-afdc-b0042f4c78b0-proxy-tls\") pod \"machine-config-daemon-prmw7\" (UID: \"09a41414-b5bf-481a-afdc-b0042f4c78b0\") " pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 
09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.853754 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.875688 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr7v5\" (UniqueName: \"kubernetes.io/projected/00cc61ff-219a-40e4-a0c3-360c456c57f9-kube-api-access-nr7v5\") pod 
\"multus-additional-cni-plugins-4vh77\" (UID: \"00cc61ff-219a-40e4-a0c3-360c456c57f9\") " pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.894391 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvm9p\" (UniqueName: \"kubernetes.io/projected/46276a58-9607-4a8a-bcfc-ca41ab441ec2-kube-api-access-lvm9p\") pod \"multus-7z6wb\" (UID: \"46276a58-9607-4a8a-bcfc-ca41ab441ec2\") " pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.900608 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qlxf\" (UniqueName: \"kubernetes.io/projected/09a41414-b5bf-481a-afdc-b0042f4c78b0-kube-api-access-6qlxf\") pod \"machine-config-daemon-prmw7\" (UID: \"09a41414-b5bf-481a-afdc-b0042f4c78b0\") " pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.901088 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.921814 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.933397 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.939784 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.946284 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.953130 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4vh77" Dec 01 09:59:38 crc kubenswrapper[4958]: W1201 09:59:38.960308 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09a41414_b5bf_481a_afdc_b0042f4c78b0.slice/crio-0303dedcc24d385aef7140b7d3155b39b9543b3f3f1ac7cc55b855dcc2b10806 WatchSource:0}: Error finding container 0303dedcc24d385aef7140b7d3155b39b9543b3f3f1ac7cc55b855dcc2b10806: Status 404 returned error can't find the container with id 0303dedcc24d385aef7140b7d3155b39b9543b3f3f1ac7cc55b855dcc2b10806 Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.962192 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7z6wb" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.970737 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:38 crc kubenswrapper[4958]: I1201 09:59:38.986186 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.004970 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.019675 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.039332 4958 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-01 09:54:37 +0000 UTC, rotation deadline is 2026-09-05 03:20:03.341612268 +0000 UTC Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.039418 4958 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6665h20m24.302197411s for next certificate rotation Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.039662 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.041280 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-976fz"] Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.044184 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.049023 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.049053 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.049087 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.049170 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.049279 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.049328 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.049545 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.060259 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.078543 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.093740 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tsq6f" event={"ID":"92bb0597-cb74-4cba-b6f6-e52266b1aa59","Type":"ContainerStarted","Data":"8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768"} Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.093819 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tsq6f" event={"ID":"92bb0597-cb74-4cba-b6f6-e52266b1aa59","Type":"ContainerStarted","Data":"65de19cc1bda375d3033ca6b586cd0517f15d8d5ec65d7ff99f8e2b6d7555244"} Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.095136 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7z6wb" event={"ID":"46276a58-9607-4a8a-bcfc-ca41ab441ec2","Type":"ContainerStarted","Data":"d5fc2e060868232f0861d754397ec6db1378644017b4957e6a1f1e4c023eac75"} Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.095165 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.097920 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" event={"ID":"00cc61ff-219a-40e4-a0c3-360c456c57f9","Type":"ContainerStarted","Data":"3d01e367bd93ce51f0c7c58278afb64ad7bc70fb2ba41fbbfd9a703312ec659f"} Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.100825 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"0303dedcc24d385aef7140b7d3155b39b9543b3f3f1ac7cc55b855dcc2b10806"} Dec 01 09:59:39 crc kubenswrapper[4958]: E1201 09:59:39.108409 4958 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.115997 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.131618 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150479 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-run-ovn-kubernetes\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150541 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/96173cf0-4be1-4ef7-b063-4c93c1731c20-ovn-node-metrics-cert\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150559 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/96173cf0-4be1-4ef7-b063-4c93c1731c20-env-overrides\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150597 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-cni-bin\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150614 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-slash\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150630 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-node-log\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150651 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-log-socket\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150683 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-var-lib-openvswitch\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150712 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-run-ovn\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150733 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-etc-openvswitch\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150749 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/96173cf0-4be1-4ef7-b063-4c93c1731c20-ovnkube-script-lib\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150783 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-run-systemd\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150809 4958 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-run-openvswitch\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150829 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-kubelet\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150870 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-run-netns\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150888 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpxpt\" (UniqueName: \"kubernetes.io/projected/96173cf0-4be1-4ef7-b063-4c93c1731c20-kube-api-access-rpxpt\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150905 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-systemd-units\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150934 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150963 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-cni-netd\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.150988 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/96173cf0-4be1-4ef7-b063-4c93c1731c20-ovnkube-config\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.154535 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.174258 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.194769 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.217163 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.221496 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.222423 4958 scope.go:117] "RemoveContainer" containerID="60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9" Dec 01 09:59:39 crc kubenswrapper[4958]: E1201 09:59:39.222605 4958 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.239828 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252310 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/96173cf0-4be1-4ef7-b063-4c93c1731c20-env-overrides\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252382 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-cni-bin\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252407 4958 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-slash\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252431 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-node-log\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252449 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-log-socket\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252475 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-var-lib-openvswitch\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252496 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-run-ovn\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252519 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-etc-openvswitch\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252541 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/96173cf0-4be1-4ef7-b063-4c93c1731c20-ovnkube-script-lib\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252562 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-run-systemd\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252582 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-run-openvswitch\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252613 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-kubelet\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252647 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-run-netns\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252682 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpxpt\" (UniqueName: \"kubernetes.io/projected/96173cf0-4be1-4ef7-b063-4c93c1731c20-kube-api-access-rpxpt\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252713 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-systemd-units\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252738 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252766 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-cni-netd\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252786 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/96173cf0-4be1-4ef7-b063-4c93c1731c20-ovnkube-config\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252830 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-run-ovn-kubernetes\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.252946 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/96173cf0-4be1-4ef7-b063-4c93c1731c20-ovn-node-metrics-cert\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.253648 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-run-systemd\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.253786 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-systemd-units\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.253820 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-log-socket\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.253860 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-slash\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.253913 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-run-openvswitch\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.254104 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-cni-bin\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.254201 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-run-netns\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.254242 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-kubelet\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.254225 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-run-ovn-kubernetes\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.254280 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-cni-netd\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.254315 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-etc-openvswitch\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.254296 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.254343 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-run-ovn\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.254353 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-var-lib-openvswitch\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.254313 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/96173cf0-4be1-4ef7-b063-4c93c1731c20-env-overrides\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.253946 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-node-log\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.254771 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/96173cf0-4be1-4ef7-b063-4c93c1731c20-ovnkube-config\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.255246 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/96173cf0-4be1-4ef7-b063-4c93c1731c20-ovnkube-script-lib\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.259328 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/96173cf0-4be1-4ef7-b063-4c93c1731c20-ovn-node-metrics-cert\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.268424 
4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.280674 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpxpt\" (UniqueName: \"kubernetes.io/projected/96173cf0-4be1-4ef7-b063-4c93c1731c20-kube-api-access-rpxpt\") pod \"ovnkube-node-976fz\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.285130 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.310697 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.326823 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.342340 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.360494 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.367731 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.417503 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\"
,\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\
\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.450623 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.454227 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 
01 09:59:39 crc kubenswrapper[4958]: E1201 09:59:39.454949 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:59:43.454469561 +0000 UTC m=+30.963258598 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.480972 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:39Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.658042 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.658107 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:39 crc kubenswrapper[4958]: E1201 09:59:39.658216 4958 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:59:39 crc kubenswrapper[4958]: E1201 09:59:39.658276 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:43.65826176 +0000 UTC m=+31.167050797 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:59:39 crc kubenswrapper[4958]: E1201 09:59:39.658570 4958 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:59:39 crc kubenswrapper[4958]: E1201 09:59:39.658608 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:43.65860054 +0000 UTC m=+31.167389567 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.759379 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.759445 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:59:39 crc kubenswrapper[4958]: E1201 09:59:39.759602 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:59:39 crc kubenswrapper[4958]: E1201 09:59:39.759619 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:59:39 crc kubenswrapper[4958]: E1201 09:59:39.759631 4958 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:39 crc kubenswrapper[4958]: E1201 09:59:39.759676 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:43.759661718 +0000 UTC m=+31.268450745 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:39 crc kubenswrapper[4958]: E1201 09:59:39.760068 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:59:39 crc kubenswrapper[4958]: E1201 09:59:39.760079 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:59:39 crc kubenswrapper[4958]: E1201 09:59:39.760088 4958 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:39 crc kubenswrapper[4958]: E1201 09:59:39.760112 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:43.760104551 +0000 UTC m=+31.268893578 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.799180 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.799337 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:39 crc kubenswrapper[4958]: E1201 09:59:39.799461 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:59:39 crc kubenswrapper[4958]: I1201 09:59:39.799704 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:59:39 crc kubenswrapper[4958]: E1201 09:59:39.799827 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:59:39 crc kubenswrapper[4958]: E1201 09:59:39.799924 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.104809 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" event={"ID":"00cc61ff-219a-40e4-a0c3-360c456c57f9","Type":"ContainerStarted","Data":"336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46"} Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.107204 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525"} Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.107232 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214"} Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.108420 4958 generic.go:334] "Generic (PLEG): container finished" podID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerID="d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937" exitCode=0 Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.108468 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerDied","Data":"d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937"} Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.108490 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerStarted","Data":"e7ceaec1b97d30313c5fb972e729c451f24391d68e54738ccf7806c7abf1c830"} Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.110315 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7z6wb" event={"ID":"46276a58-9607-4a8a-bcfc-ca41ab441ec2","Type":"ContainerStarted","Data":"7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c"} Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.123302 4958 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.136209 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.145585 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.158738 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.175055 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.192219 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.215097 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.234161 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.251414 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secre
ts/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.340500 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.355383 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.372003 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.386072 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.405248 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.424299 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z 
is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.436224 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.447346 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.461068 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.508945 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.554023 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.569595 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.583807 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.597500 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.616062 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.631785 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:40 crc kubenswrapper[4958]: I1201 09:59:40.647414 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:40Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.222719 4958 generic.go:334] "Generic (PLEG): container finished" podID="00cc61ff-219a-40e4-a0c3-360c456c57f9" containerID="336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46" exitCode=0 Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.222800 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" event={"ID":"00cc61ff-219a-40e4-a0c3-360c456c57f9","Type":"ContainerDied","Data":"336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46"} Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.227482 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerStarted","Data":"62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b"} Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.227577 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerStarted","Data":"28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218"} Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.227598 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerStarted","Data":"1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744"} Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.227615 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerStarted","Data":"6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6"} Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.249767 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.265823 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.276987 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.294563 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.310657 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.326138 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.340253 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.352443 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.381411 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.400587 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.417162 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:41 crc 
kubenswrapper[4958]: I1201 09:59:41.446032 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"Po
dInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.462012 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountP
ath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.799486 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:59:41 crc kubenswrapper[4958]: E1201 09:59:41.799619 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.800040 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:41 crc kubenswrapper[4958]: E1201 09:59:41.800101 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.800141 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:59:41 crc kubenswrapper[4958]: E1201 09:59:41.800204 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.907138 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-htfxn"] Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.907607 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-htfxn" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.913829 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.914130 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.916434 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.916554 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.929748 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.941705 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.960513 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:41 crc 
kubenswrapper[4958]: I1201 09:59:41.981657 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"Po
dInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:41 crc kubenswrapper[4958]: I1201 09:59:41.993313 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.007406 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz266\" (UniqueName: \"kubernetes.io/projected/97f3ade3-c19a-49ac-a22b-3a4348f374f2-kube-api-access-lz266\") pod \"node-ca-htfxn\" (UID: \"97f3ade3-c19a-49ac-a22b-3a4348f374f2\") " pod="openshift-image-registry/node-ca-htfxn" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.007465 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/97f3ade3-c19a-49ac-a22b-3a4348f374f2-host\") pod \"node-ca-htfxn\" (UID: \"97f3ade3-c19a-49ac-a22b-3a4348f374f2\") " pod="openshift-image-registry/node-ca-htfxn" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.007545 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/97f3ade3-c19a-49ac-a22b-3a4348f374f2-serviceca\") pod \"node-ca-htfxn\" (UID: \"97f3ade3-c19a-49ac-a22b-3a4348f374f2\") " pod="openshift-image-registry/node-ca-htfxn" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.013058 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.025559 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.041659 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.058308 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.074564 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.089432 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.102914 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.108460 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/97f3ade3-c19a-49ac-a22b-3a4348f374f2-serviceca\") pod \"node-ca-htfxn\" (UID: \"97f3ade3-c19a-49ac-a22b-3a4348f374f2\") " pod="openshift-image-registry/node-ca-htfxn" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.108509 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz266\" (UniqueName: \"kubernetes.io/projected/97f3ade3-c19a-49ac-a22b-3a4348f374f2-kube-api-access-lz266\") pod \"node-ca-htfxn\" (UID: \"97f3ade3-c19a-49ac-a22b-3a4348f374f2\") " pod="openshift-image-registry/node-ca-htfxn" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.108526 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/97f3ade3-c19a-49ac-a22b-3a4348f374f2-host\") pod \"node-ca-htfxn\" (UID: \"97f3ade3-c19a-49ac-a22b-3a4348f374f2\") " pod="openshift-image-registry/node-ca-htfxn" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.108612 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/97f3ade3-c19a-49ac-a22b-3a4348f374f2-host\") pod \"node-ca-htfxn\" (UID: \"97f3ade3-c19a-49ac-a22b-3a4348f374f2\") " pod="openshift-image-registry/node-ca-htfxn" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.110360 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/97f3ade3-c19a-49ac-a22b-3a4348f374f2-serviceca\") pod \"node-ca-htfxn\" (UID: \"97f3ade3-c19a-49ac-a22b-3a4348f374f2\") " pod="openshift-image-registry/node-ca-htfxn" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.119343 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.137033 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.138415 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz266\" (UniqueName: \"kubernetes.io/projected/97f3ade3-c19a-49ac-a22b-3a4348f374f2-kube-api-access-lz266\") pod \"node-ca-htfxn\" (UID: \"97f3ade3-c19a-49ac-a22b-3a4348f374f2\") " pod="openshift-image-registry/node-ca-htfxn" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.235909 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" event={"ID":"00cc61ff-219a-40e4-a0c3-360c456c57f9","Type":"ContainerStarted","Data":"7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524"} Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.243628 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerStarted","Data":"a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40"} Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.254505 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.272782 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.289585 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.307014 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.321964 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.336522 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-htfxn" Dec 01 09:59:42 crc kubenswrapper[4958]: W1201 09:59:42.356231 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97f3ade3_c19a_49ac_a22b_3a4348f374f2.slice/crio-b2ef186724ba6f5b1ab885e2a5e50ece883a7adffd99cd00834d8661ca8ab14b WatchSource:0}: Error finding container b2ef186724ba6f5b1ab885e2a5e50ece883a7adffd99cd00834d8661ca8ab14b: Status 404 returned error can't find the container with id b2ef186724ba6f5b1ab885e2a5e50ece883a7adffd99cd00834d8661ca8ab14b Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.361395 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/stat
ic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and 
discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.379150 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.398294 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.417358 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waitin
g\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.440835 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z 
is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.457621 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.478941 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.509960 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:42 crc kubenswrapper[4958]: I1201 09:59:42.527373 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:42Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.249991 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-htfxn" event={"ID":"97f3ade3-c19a-49ac-a22b-3a4348f374f2","Type":"ContainerStarted","Data":"b2ef186724ba6f5b1ab885e2a5e50ece883a7adffd99cd00834d8661ca8ab14b"} Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.525063 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.525959 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:59:51.525923148 +0000 UTC m=+39.034712195 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.531557 4958 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.579860 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.590315 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.590405 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.590420 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.590611 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.599531 4958 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.599713 4958 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.600819 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.600871 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:43 crc 
kubenswrapper[4958]: I1201 09:59:43.600887 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.600906 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.600928 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:43Z","lastTransitionTime":"2025-12-01T09:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.620588 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.625503 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.625538 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.625550 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.625570 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.625581 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:43Z","lastTransitionTime":"2025-12-01T09:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.640466 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.644424 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.644496 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.644512 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.644535 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.644549 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:43Z","lastTransitionTime":"2025-12-01T09:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.659240 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.663895 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.663985 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.664004 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.664032 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.664050 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:43Z","lastTransitionTime":"2025-12-01T09:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.681230 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.694738 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.694797 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.694808 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.694830 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.694855 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:43Z","lastTransitionTime":"2025-12-01T09:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.707289 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.707665 4958 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.709586 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.709635 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.709646 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.709666 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.709675 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:43Z","lastTransitionTime":"2025-12-01T09:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.727598 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.727660 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.727792 4958 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.727868 4958 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.727899 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:51.727881893 +0000 UTC m=+39.236670920 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.727970 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:51.727944754 +0000 UTC m=+39.236733781 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.803661 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.803718 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.803663 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.803822 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.803995 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.806538 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.816801 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.816875 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.816886 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.816900 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.816919 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:43Z","lastTransitionTime":"2025-12-01T09:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.819607 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.828505 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.828583 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.828798 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.828875 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.828896 4958 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.828969 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:51.828945763 +0000 UTC m=+39.337734810 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.828811 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.829326 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.829342 4958 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:43 crc kubenswrapper[4958]: E1201 09:59:43.829381 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:59:51.829369915 +0000 UTC m=+39.338158962 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.829704 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.842082 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.862398 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.882454 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.895146 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.911093 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.920449 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.920514 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.920527 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.920548 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.920558 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:43Z","lastTransitionTime":"2025-12-01T09:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.930458 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.947217 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.961171 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:43 crc kubenswrapper[4958]: I1201 09:59:43.979929 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\
":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.000883 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:43Z 
is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.015133 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.023705 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.023756 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.023767 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.023787 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.023802 4958 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:44Z","lastTransitionTime":"2025-12-01T09:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.032806 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube
rnetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.126853 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.126901 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.126909 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.126931 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.126943 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:44Z","lastTransitionTime":"2025-12-01T09:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.230768 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.230857 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.230883 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.230912 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.230929 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:44Z","lastTransitionTime":"2025-12-01T09:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.255229 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8"} Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.261619 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerStarted","Data":"ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592"} Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.263768 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-htfxn" event={"ID":"97f3ade3-c19a-49ac-a22b-3a4348f374f2","Type":"ContainerStarted","Data":"c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6"} Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.277241 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.290134 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.303525 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.320418 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.334788 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.338634 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.338810 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.338822 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.338984 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.339013 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:44Z","lastTransitionTime":"2025-12-01T09:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.351621 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.367117 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.379457 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.393415 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.410701 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.427237 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\
":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.441813 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.441897 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.441910 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.441931 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.441944 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:44Z","lastTransitionTime":"2025-12-01T09:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.450550 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d48
45678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.462631 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.479171 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir
\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.496217 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.513515 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 
2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.526965 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.544881 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.545194 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.545219 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.545228 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.545248 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.545260 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:44Z","lastTransitionTime":"2025-12-01T09:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.560403 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.574183 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.587711 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.600573 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.618915 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.634026 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.649068 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.650126 4958 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.650171 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.650184 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.650205 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.650220 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:44Z","lastTransitionTime":"2025-12-01T09:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.701024 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.742687 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77
3257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev
/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\
\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.753414 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.753495 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.753520 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.753544 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.753558 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:44Z","lastTransitionTime":"2025-12-01T09:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.758550 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:44Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.856563 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.856627 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.856637 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.856656 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.856667 4958 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:44Z","lastTransitionTime":"2025-12-01T09:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.959472 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.959536 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.959558 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.959584 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:44 crc kubenswrapper[4958]: I1201 09:59:44.959615 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:44Z","lastTransitionTime":"2025-12-01T09:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.062208 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.062271 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.062281 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.062300 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.062312 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:45Z","lastTransitionTime":"2025-12-01T09:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.165511 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.165632 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.165658 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.165693 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.165720 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:45Z","lastTransitionTime":"2025-12-01T09:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.267482 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.267535 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.267544 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.267561 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.267572 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:45Z","lastTransitionTime":"2025-12-01T09:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.270326 4958 generic.go:334] "Generic (PLEG): container finished" podID="00cc61ff-219a-40e4-a0c3-360c456c57f9" containerID="7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524" exitCode=0 Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.270423 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" event={"ID":"00cc61ff-219a-40e4-a0c3-360c456c57f9","Type":"ContainerDied","Data":"7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524"} Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.293759 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-conf
ig\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:45Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.622014 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-ce
rt/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:45Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.626509 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.626560 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.626576 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.626595 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.626607 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:45Z","lastTransitionTime":"2025-12-01T09:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.636112 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:45Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.652201 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:45Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.667966 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:45Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.685986 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:45Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.701683 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:45Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.718033 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:45Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.728996 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.729054 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.729072 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.729093 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.729107 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:45Z","lastTransitionTime":"2025-12-01T09:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.744933 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:45Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.759654 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:45Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.775022 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:45Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.789955 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:45Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.797496 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.797643 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.797778 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:59:45 crc kubenswrapper[4958]: E1201 09:59:45.797653 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:59:45 crc kubenswrapper[4958]: E1201 09:59:45.798216 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:59:45 crc kubenswrapper[4958]: E1201 09:59:45.797907 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.807346 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:45Z 
is after 2025-08-24T17:21:41Z" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.818227 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:45Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.831585 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.831641 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.831654 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.831674 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.831684 4958 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:45Z","lastTransitionTime":"2025-12-01T09:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.937381 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.937427 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.937441 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.937461 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:45 crc kubenswrapper[4958]: I1201 09:59:45.937478 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:45Z","lastTransitionTime":"2025-12-01T09:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.041388 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.041441 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.041457 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.041478 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.041492 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:46Z","lastTransitionTime":"2025-12-01T09:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.151091 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.151207 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.151226 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.151250 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.151272 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:46Z","lastTransitionTime":"2025-12-01T09:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.317554 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.317619 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.317633 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.317653 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.317673 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:46Z","lastTransitionTime":"2025-12-01T09:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.321626 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" event={"ID":"00cc61ff-219a-40e4-a0c3-360c456c57f9","Type":"ContainerStarted","Data":"c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2"} Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.336261 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.347365 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.364026 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.392082 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:46Z 
is after 2025-08-24T17:21:41Z" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.406247 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.420186 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.420245 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.420261 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.420282 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.420298 4958 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:46Z","lastTransitionTime":"2025-12-01T09:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.423065 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\
\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.435736 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.454426 
4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.474295 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.493636 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.508022 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.523816 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.523946 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.523993 4958 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.524006 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.524025 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.524039 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:46Z","lastTransitionTime":"2025-12-01T09:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.540448 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.557305 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.626496 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.626542 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.626554 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.626572 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.626583 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:46Z","lastTransitionTime":"2025-12-01T09:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.729669 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.729756 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.729777 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.729815 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.729834 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:46Z","lastTransitionTime":"2025-12-01T09:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.832791 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.832891 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.832910 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.832938 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:59:46 crc kubenswrapper[4958]: I1201 09:59:46.832956 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:46Z","lastTransitionTime":"2025-12-01T09:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:46.935797 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:46.935877 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:46.935894 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:46.935917 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
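The records above repeat one pattern roughly every 100 ms: kubelet re-records the node conditions and re-asserts Ready=False because /etc/kubernetes/cni/net.d/ still contains no CNI configuration; nothing new fails on each pass. A minimal Python sketch of the check kubelet keeps failing, assuming it runs on the node itself; the directories are the one named in the message above plus the CNI mounts from the multus pod spec earlier in this log, so treat the list as illustrative rather than exhaustive:

    #!/usr/bin/env python3
    # Sketch: list candidate CNI network configs in the directories this
    # log names. Kubelet loads *.conf, *.conflist and *.json files.
    from pathlib import Path

    CNI_DIRS = [
        "/etc/kubernetes/cni/net.d",  # directory kubelet reports as empty
        "/etc/cni/net.d",             # "system-cni-dir" in the multus pod spec
        "/run/multus/cni/net.d",      # assumed host path behind "multus-cni-dir"
    ]

    for d in CNI_DIRS:
        p = Path(d)
        patterns = ("*.conf", "*.conflist", "*.json")
        names = sorted(f.name for pat in patterns for f in p.glob(pat)) if p.is_dir() else []
        print(f"{d}: {names or 'no CNI config found'}")

In an OVN-Kubernetes cluster, ovnkube-node (whose ContainerStarted event appears just below) is the component expected to write that configuration, at which point this loop normally ends.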
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:46.935931 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:46Z","lastTransitionTime":"2025-12-01T09:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.039269 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.039334 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.039347 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.039370 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.039386 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:47Z","lastTransitionTime":"2025-12-01T09:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.142522 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.142572 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.142585 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.142604 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.142617 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:47Z","lastTransitionTime":"2025-12-01T09:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.246227 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.246278 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.246290 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.246312 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
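Every status_manager.go:875 failure in this log shares one root cause, spelled out at the end of each record: the pod.network-node-identity.openshift.io webhook on https://127.0.0.1:9743 presents a serving certificate whose notAfter (2025-08-24T17:21:41Z) is more than three months behind the node clock (2025-12-01). A short sketch to confirm what that endpoint is actually serving, assuming Python runs on the node and the third-party cryptography package is installed:

    #!/usr/bin/env python3
    # Sketch: fetch the webhook serving certificate without verifying it
    # (verification is exactly what fails) and print its validity window.
    import socket
    import ssl

    from cryptography import x509  # third-party; assumed installed

    HOST, PORT = "127.0.0.1", 9743  # endpoint from the records above

    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE  # accept the expired certificate

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)  # DER is returned even under CERT_NONE

    cert = x509.load_der_x509_certificate(der)
    print("subject: ", cert.subject.rfc4514_string())
    print("notAfter:", cert.not_valid_after)  # expect 2025-08-24 17:21:41 per the log

If notAfter matches the log, the status patches will keep failing until that certificate is rotated; the CNI wait above is a separate and more transient condition.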
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.246322 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:47Z","lastTransitionTime":"2025-12-01T09:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.338206 4958 generic.go:334] "Generic (PLEG): container finished" podID="00cc61ff-219a-40e4-a0c3-360c456c57f9" containerID="c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2" exitCode=0
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.338289 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" event={"ID":"00cc61ff-219a-40e4-a0c3-360c456c57f9","Type":"ContainerDied","Data":"c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2"}
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.343821 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerStarted","Data":"c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819"}
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.348239 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.348328 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.348354 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.348391 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.348436 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:47Z","lastTransitionTime":"2025-12-01T09:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.355749 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.381053 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.394259 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.408123 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.424504 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.438868 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.452420 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.452491 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.452512 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.452536 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.452549 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:47Z","lastTransitionTime":"2025-12-01T09:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.452472 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.471877 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.534643 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:47Z 
is after 2025-08-24T17:21:41Z" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.548275 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.556950 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.556984 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.556996 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.557018 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.557031 4958 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:47Z","lastTransitionTime":"2025-12-01T09:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.560616 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.572668 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.590636 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.607779 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"
/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:47Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.660198 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.660239 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.660252 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.660274 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.660286 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:47Z","lastTransitionTime":"2025-12-01T09:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.764576 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.764641 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.764654 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.764678 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.764691 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:47Z","lastTransitionTime":"2025-12-01T09:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.797066 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.797119 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.797218 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:59:47 crc kubenswrapper[4958]: E1201 09:59:47.797392 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:59:47 crc kubenswrapper[4958]: E1201 09:59:47.797515 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:59:47 crc kubenswrapper[4958]: E1201 09:59:47.797864 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.868521 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.868569 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.868584 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.868602 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.868616 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:47Z","lastTransitionTime":"2025-12-01T09:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.973307 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.973366 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.973386 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.973414 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:47 crc kubenswrapper[4958]: I1201 09:59:47.973437 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:47Z","lastTransitionTime":"2025-12-01T09:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.076803 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.076868 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.076886 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.076911 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.076923 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:48Z","lastTransitionTime":"2025-12-01T09:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.179717 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.179761 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.179773 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.179791 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.179801 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:48Z","lastTransitionTime":"2025-12-01T09:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.282632 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.282681 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.282693 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.282711 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.282723 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:48Z","lastTransitionTime":"2025-12-01T09:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.354665 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" event={"ID":"00cc61ff-219a-40e4-a0c3-360c456c57f9","Type":"ContainerStarted","Data":"ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173"} Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.371760 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.387482 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.387538 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.387552 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.387590 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.387609 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:48Z","lastTransitionTime":"2025-12-01T09:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.389899 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.404647 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.429914 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.443180 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.459024 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.471088 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.489337 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.490670 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.490718 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.490736 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.490758 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.490771 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:48Z","lastTransitionTime":"2025-12-01T09:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.509596 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.531131 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:48Z 
is after 2025-08-24T17:21:41Z" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.547786 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.571887 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.588893 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.598775 4958 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.598860 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.598875 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.598898 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.598916 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:48Z","lastTransitionTime":"2025-12-01T09:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.612287 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:48Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.702096 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.702181 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.702212 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.702229 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.702242 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:48Z","lastTransitionTime":"2025-12-01T09:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.805039 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.805094 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.805109 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.805139 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.805163 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:48Z","lastTransitionTime":"2025-12-01T09:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.907605 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.907680 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.907702 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.907728 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:48 crc kubenswrapper[4958]: I1201 09:59:48.907744 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:48Z","lastTransitionTime":"2025-12-01T09:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.010005 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.010044 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.010052 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.010066 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.010076 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:49Z","lastTransitionTime":"2025-12-01T09:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.113525 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.113591 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.113642 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.113673 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.113690 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:49Z","lastTransitionTime":"2025-12-01T09:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.217065 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.217136 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.217157 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.217180 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.217197 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:49Z","lastTransitionTime":"2025-12-01T09:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.320227 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.320267 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.320277 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.320292 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.320302 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:49Z","lastTransitionTime":"2025-12-01T09:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.362957 4958 generic.go:334] "Generic (PLEG): container finished" podID="00cc61ff-219a-40e4-a0c3-360c456c57f9" containerID="ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173" exitCode=0 Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.363072 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" event={"ID":"00cc61ff-219a-40e4-a0c3-360c456c57f9","Type":"ContainerDied","Data":"ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173"} Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.418726 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerStarted","Data":"766f9cf567a3a8bf1a2819b3d2822e715a47caa77f8488202dbb307dfb2b73f8"} Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.419704 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.419783 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.419816 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.426568 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.426635 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.426649 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.426670 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.426684 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:49Z","lastTransitionTime":"2025-12-01T09:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.441180 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.454806 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.456724 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.460999 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.474997 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:
36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.491162 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc 
kubenswrapper[4958]: I1201 09:59:49.508940 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.525667 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.532477 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.532532 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.532560 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.532604 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.532632 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:49Z","lastTransitionTime":"2025-12-01T09:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.543592 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.560139 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.575660 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.595625 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.612051 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.627459 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc
15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.647318 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name
\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\
":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.652740 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.652795 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.652809 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.652944 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.652966 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:49Z","lastTransitionTime":"2025-12-01T09:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.662062 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.679793 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.697547 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.714091 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.729826 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.744694 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.756399 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.756469 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.756483 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.756507 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.756524 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:49Z","lastTransitionTime":"2025-12-01T09:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.760984 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.780799 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f9cf567a3a8bf1a2819b3d2822e715a47caa7
7f8488202dbb307dfb2b73f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.794573 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.811334 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.826647 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.843822 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.859255 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.861231 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.861241 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.861343 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:59:49 crc kubenswrapper[4958]: E1201 09:59:49.861359 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:59:49 crc kubenswrapper[4958]: E1201 09:59:49.861463 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:59:49 crc kubenswrapper[4958]: E1201 09:59:49.861545 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.863078 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.863127 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.863144 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.863372 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.863388 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:49Z","lastTransitionTime":"2025-12-01T09:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.876963 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.889288 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:49Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.968027 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.968080 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.968092 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.968107 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:49 crc kubenswrapper[4958]: I1201 09:59:49.968119 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:49Z","lastTransitionTime":"2025-12-01T09:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.071566 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.071606 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.071617 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.071636 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.071646 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:50Z","lastTransitionTime":"2025-12-01T09:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.174177 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.174306 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.174327 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.174344 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.174354 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:50Z","lastTransitionTime":"2025-12-01T09:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.278610 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.278661 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.278675 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.278690 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.278710 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:50Z","lastTransitionTime":"2025-12-01T09:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.382088 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.382163 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.382176 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.382198 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.382210 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:50Z","lastTransitionTime":"2025-12-01T09:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.427929 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" event={"ID":"00cc61ff-219a-40e4-a0c3-360c456c57f9","Type":"ContainerStarted","Data":"2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4"} Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.452478 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.466881 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.484798 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.485669 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.485738 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.485761 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.485783 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.485797 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:50Z","lastTransitionTime":"2025-12-01T09:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.512898 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f9cf567a3a8bf1a2819b3d2822e715a47caa77f8488202dbb307dfb2b73f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.529918 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.547812 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.562304 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.575772 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.590529 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.590576 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.590587 4958 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.590606 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.590616 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:50Z","lastTransitionTime":"2025-12-01T09:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.595753 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.611546 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-di
r\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.632389 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.647140 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.661948 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.676516 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.693598 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.693645 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.693655 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.693676 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.693691 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:50Z","lastTransitionTime":"2025-12-01T09:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.796775 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.796863 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.796879 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.796900 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.796911 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:50Z","lastTransitionTime":"2025-12-01T09:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.797488 4958 scope.go:117] "RemoveContainer" containerID="60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.901438 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.902119 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.902150 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.902187 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.902209 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:50Z","lastTransitionTime":"2025-12-01T09:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.990652 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk"] Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.991786 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.994090 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 09:59:50 crc kubenswrapper[4958]: I1201 09:59:50.994283 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.005935 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.005968 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.005977 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.005993 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.006004 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:51Z","lastTransitionTime":"2025-12-01T09:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.007054 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.023102 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.032783 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.046157 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.058337 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1f88031e-1c6c-4d5a-9648-a64ec5c5147f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rd8vk\" (UID: \"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.058446 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1f88031e-1c6c-4d5a-9648-a64ec5c5147f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rd8vk\" (UID: \"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.058497 4958 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1f88031e-1c6c-4d5a-9648-a64ec5c5147f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rd8vk\" (UID: \"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.058532 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbglv\" (UniqueName: \"kubernetes.io/projected/1f88031e-1c6c-4d5a-9648-a64ec5c5147f-kube-api-access-tbglv\") pod \"ovnkube-control-plane-749d76644c-rd8vk\" (UID: \"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.066140 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.109362 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.109424 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.109438 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.109461 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.109480 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:51Z","lastTransitionTime":"2025-12-01T09:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.111772 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.126630 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.137691 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.148724 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.159514 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1f88031e-1c6c-4d5a-9648-a64ec5c5147f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rd8vk\" (UID: \"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.159586 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/1f88031e-1c6c-4d5a-9648-a64ec5c5147f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rd8vk\" (UID: \"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.159618 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbglv\" (UniqueName: \"kubernetes.io/projected/1f88031e-1c6c-4d5a-9648-a64ec5c5147f-kube-api-access-tbglv\") pod \"ovnkube-control-plane-749d76644c-rd8vk\" (UID: \"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.159651 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1f88031e-1c6c-4d5a-9648-a64ec5c5147f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rd8vk\" (UID: \"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.160631 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1f88031e-1c6c-4d5a-9648-a64ec5c5147f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rd8vk\" (UID: \"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.160650 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1f88031e-1c6c-4d5a-9648-a64ec5c5147f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rd8vk\" (UID: \"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.165908 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1f88031e-1c6c-4d5a-9648-a64ec5c5147f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rd8vk\" (UID: \"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.165868 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.181081 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbglv\" (UniqueName: \"kubernetes.io/projected/1f88031e-1c6c-4d5a-9648-a64ec5c5147f-kube-api-access-tbglv\") pod \"ovnkube-control-plane-749d76644c-rd8vk\" (UID: \"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.186637 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.212736 4958 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.212791 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.212811 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.212833 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.212865 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:51Z","lastTransitionTime":"2025-12-01T09:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.213333 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.235020 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f9cf567a3a8bf1a2819b3d2822e715a47caa7
7f8488202dbb307dfb2b73f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.246591 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.257282 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.315716 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.316612 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.316687 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.316715 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.316749 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.316773 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:51Z","lastTransitionTime":"2025-12-01T09:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:51 crc kubenswrapper[4958]: W1201 09:59:51.333014 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f88031e_1c6c_4d5a_9648_a64ec5c5147f.slice/crio-1a0352a61757ec43cdabc5410a12aaa10bbf37af055d7e3a980afbca17d7de17 WatchSource:0}: Error finding container 1a0352a61757ec43cdabc5410a12aaa10bbf37af055d7e3a980afbca17d7de17: Status 404 returned error can't find the container with id 1a0352a61757ec43cdabc5410a12aaa10bbf37af055d7e3a980afbca17d7de17 Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.421916 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.421981 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.422002 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.422027 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.422039 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:51Z","lastTransitionTime":"2025-12-01T09:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.434834 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" event={"ID":"1f88031e-1c6c-4d5a-9648-a64ec5c5147f","Type":"ContainerStarted","Data":"1a0352a61757ec43cdabc5410a12aaa10bbf37af055d7e3a980afbca17d7de17"} Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.526257 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.526316 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.526326 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.526368 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.526379 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:51Z","lastTransitionTime":"2025-12-01T09:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.563206 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:59:51 crc kubenswrapper[4958]: E1201 09:59:51.563453 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:00:07.563389096 +0000 UTC m=+55.072178163 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.630151 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.630222 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.630236 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.630462 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.630474 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:51Z","lastTransitionTime":"2025-12-01T09:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.733666 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.733715 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.733726 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.733743 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.733753 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:51Z","lastTransitionTime":"2025-12-01T09:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.766328 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.766403 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:51 crc kubenswrapper[4958]: E1201 09:59:51.766527 4958 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:59:51 crc kubenswrapper[4958]: E1201 09:59:51.766599 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:00:07.766579177 +0000 UTC m=+55.275368214 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:59:51 crc kubenswrapper[4958]: E1201 09:59:51.767136 4958 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:59:51 crc kubenswrapper[4958]: E1201 09:59:51.767296 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:00:07.767266207 +0000 UTC m=+55.276055254 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.797155 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:59:51 crc kubenswrapper[4958]: E1201 09:59:51.797416 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.798256 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.798334 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:51 crc kubenswrapper[4958]: E1201 09:59:51.798493 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:59:51 crc kubenswrapper[4958]: E1201 09:59:51.798640 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.836952 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.836994 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.837006 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.837024 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.837034 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:51Z","lastTransitionTime":"2025-12-01T09:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.867186 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.867270 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:59:51 crc kubenswrapper[4958]: E1201 09:59:51.867395 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:59:51 crc kubenswrapper[4958]: E1201 09:59:51.867428 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:59:51 crc kubenswrapper[4958]: E1201 09:59:51.867445 4958 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:51 crc kubenswrapper[4958]: E1201 09:59:51.867517 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:00:07.867493802 +0000 UTC m=+55.376282839 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:51 crc kubenswrapper[4958]: E1201 09:59:51.867395 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:59:51 crc kubenswrapper[4958]: E1201 09:59:51.867542 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:59:51 crc kubenswrapper[4958]: E1201 09:59:51.867555 4958 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:51 crc kubenswrapper[4958]: E1201 09:59:51.867584 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 10:00:07.867574285 +0000 UTC m=+55.376363322 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.940167 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.940214 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.940226 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.940244 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:51 crc kubenswrapper[4958]: I1201 09:59:51.940256 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:51Z","lastTransitionTime":"2025-12-01T09:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.043733 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.043826 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.043990 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.044061 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.044090 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:52Z","lastTransitionTime":"2025-12-01T09:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.147570 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.147624 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.147635 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.147654 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.147667 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:52Z","lastTransitionTime":"2025-12-01T09:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.251019 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.251103 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.251126 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.251165 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.251187 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:52Z","lastTransitionTime":"2025-12-01T09:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.353757 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.353828 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.353869 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.353897 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.353912 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:52Z","lastTransitionTime":"2025-12-01T09:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.457066 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.457118 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.457135 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.457156 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.457169 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:52Z","lastTransitionTime":"2025-12-01T09:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.479251 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-6b9wz"] Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.479732 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 09:59:52 crc kubenswrapper[4958]: E1201 09:59:52.479815 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.493609 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:52Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.510408 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:52Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.529304 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc
15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:52Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.547492 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f9cf567a3a8bf1a2819b3d2822e715a47caa77f8488202dbb307dfb2b73f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:52Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.561385 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.561450 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.561461 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.561481 4958 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.561507 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:52Z","lastTransitionTime":"2025-12-01T09:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.564050 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:52Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.573734 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs\") pod 
\"network-metrics-daemon-6b9wz\" (UID: \"987c6a26-52be-40a5-b9cc-456d9731436f\") " pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.573773 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj978\" (UniqueName: \"kubernetes.io/projected/987c6a26-52be-40a5-b9cc-456d9731436f-kube-api-access-xj978\") pod \"network-metrics-daemon-6b9wz\" (UID: \"987c6a26-52be-40a5-b9cc-456d9731436f\") " pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.576260 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:52Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.590291 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:52Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.601458 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:52Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.615485 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:52Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.630511 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:52Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.641816 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:52Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.657042 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:52Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.664344 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.664406 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.664420 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.664440 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.664455 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:52Z","lastTransitionTime":"2025-12-01T09:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.670975 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:52Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.674631 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs\") pod \"network-metrics-daemon-6b9wz\" (UID: \"987c6a26-52be-40a5-b9cc-456d9731436f\") " pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.674671 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj978\" (UniqueName: \"kubernetes.io/projected/987c6a26-52be-40a5-b9cc-456d9731436f-kube-api-access-xj978\") pod \"network-metrics-daemon-6b9wz\" (UID: \"987c6a26-52be-40a5-b9cc-456d9731436f\") " pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 09:59:52 crc kubenswrapper[4958]: E1201 09:59:52.674797 4958 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:59:52 crc kubenswrapper[4958]: E1201 09:59:52.674913 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs podName:987c6a26-52be-40a5-b9cc-456d9731436f nodeName:}" failed. No retries permitted until 2025-12-01 09:59:53.174884289 +0000 UTC m=+40.683673326 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs") pod "network-metrics-daemon-6b9wz" (UID: "987c6a26-52be-40a5-b9cc-456d9731436f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.688816 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:52Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.692996 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj978\" (UniqueName: \"kubernetes.io/projected/987c6a26-52be-40a5-b9cc-456d9731436f-kube-api-access-xj978\") pod \"network-metrics-daemon-6b9wz\" (UID: \"987c6a26-52be-40a5-b9cc-456d9731436f\") " pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.702928 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:52Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.717897 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:52Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.767169 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.767225 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.767236 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.767261 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.767281 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:52Z","lastTransitionTime":"2025-12-01T09:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.870208 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.870261 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.870274 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.870294 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.870307 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:52Z","lastTransitionTime":"2025-12-01T09:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.973899 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.974006 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.974034 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.974068 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:52 crc kubenswrapper[4958]: I1201 09:59:52.974089 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:52Z","lastTransitionTime":"2025-12-01T09:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.077222 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.077287 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.077309 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.077338 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.077353 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:53Z","lastTransitionTime":"2025-12-01T09:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.180267 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs\") pod \"network-metrics-daemon-6b9wz\" (UID: \"987c6a26-52be-40a5-b9cc-456d9731436f\") " pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.180365 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.180437 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.180458 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.180485 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.180506 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:53Z","lastTransitionTime":"2025-12-01T09:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:53 crc kubenswrapper[4958]: E1201 09:59:53.180941 4958 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:59:53 crc kubenswrapper[4958]: E1201 09:59:53.181054 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs podName:987c6a26-52be-40a5-b9cc-456d9731436f nodeName:}" failed. No retries permitted until 2025-12-01 09:59:54.181027063 +0000 UTC m=+41.689816150 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs") pod "network-metrics-daemon-6b9wz" (UID: "987c6a26-52be-40a5-b9cc-456d9731436f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.283682 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.283743 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.283758 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.283780 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.283793 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:53Z","lastTransitionTime":"2025-12-01T09:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.387293 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.387376 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.387404 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.387438 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.387462 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:53Z","lastTransitionTime":"2025-12-01T09:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.490602 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.490634 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.490645 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.490663 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.490673 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:53Z","lastTransitionTime":"2025-12-01T09:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.593314 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.593383 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.593397 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.593418 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.593431 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:53Z","lastTransitionTime":"2025-12-01T09:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.698059 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.698120 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.698159 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.698196 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.698242 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:53Z","lastTransitionTime":"2025-12-01T09:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.800278 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 09:59:53 crc kubenswrapper[4958]: E1201 09:59:53.800410 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.800686 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:59:53 crc kubenswrapper[4958]: E1201 09:59:53.800759 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.800806 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:53 crc kubenswrapper[4958]: E1201 09:59:53.800867 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.800905 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:59:53 crc kubenswrapper[4958]: E1201 09:59:53.800945 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.823610 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.826589 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.826698 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.826729 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.826757 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.826771 4958 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:53Z","lastTransitionTime":"2025-12-01T09:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.857517 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.879571 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.900816 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f9cf567a3a8bf1a2819b3d2822e715a47caa7
7f8488202dbb307dfb2b73f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.912272 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.927090 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.930031 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.930089 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.930112 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.930141 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.930162 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:53Z","lastTransitionTime":"2025-12-01T09:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.942522 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.954944 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.954993 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.955006 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.955024 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.955039 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:53Z","lastTransitionTime":"2025-12-01T09:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.955881 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:53 crc kubenswrapper[4958]: E1201 09:59:53.967929 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.969456 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.972122 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.972159 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.972176 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.972194 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.972207 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:53Z","lastTransitionTime":"2025-12-01T09:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.984330 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:53 crc kubenswrapper[4958]: E1201 09:59:53.986566 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.990553 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.990605 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.990625 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.990649 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.990661 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:53Z","lastTransitionTime":"2025-12-01T09:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:53 crc kubenswrapper[4958]: I1201 09:59:53.997885 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: E1201 09:59:54.002701 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.006414 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.006447 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.006462 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.006483 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.006496 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:54Z","lastTransitionTime":"2025-12-01T09:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.013429 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: E1201 09:59:54.019839 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.026876 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.026937 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.026954 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.026978 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.026992 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:54Z","lastTransitionTime":"2025-12-01T09:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.031390 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.046294 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: E1201 09:59:54.046587 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: E1201 09:59:54.046856 4958 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.049315 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.049344 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.049356 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.049383 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.049400 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:54Z","lastTransitionTime":"2025-12-01T09:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.061164 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.093414 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.152897 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.152949 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.152962 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.152983 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.152996 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:54Z","lastTransitionTime":"2025-12-01T09:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.197889 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs\") pod \"network-metrics-daemon-6b9wz\" (UID: \"987c6a26-52be-40a5-b9cc-456d9731436f\") " pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 09:59:54 crc kubenswrapper[4958]: E1201 09:59:54.198146 4958 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:59:54 crc kubenswrapper[4958]: E1201 09:59:54.198229 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs podName:987c6a26-52be-40a5-b9cc-456d9731436f nodeName:}" failed. No retries permitted until 2025-12-01 09:59:56.198204602 +0000 UTC m=+43.706993639 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs") pod "network-metrics-daemon-6b9wz" (UID: "987c6a26-52be-40a5-b9cc-456d9731436f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.255772 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.255828 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.255859 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.255883 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.255905 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:54Z","lastTransitionTime":"2025-12-01T09:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.359111 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.359164 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.359181 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.359202 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.359219 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:54Z","lastTransitionTime":"2025-12-01T09:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.449262 4958 generic.go:334] "Generic (PLEG): container finished" podID="00cc61ff-219a-40e4-a0c3-360c456c57f9" containerID="2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4" exitCode=0 Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.449410 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" event={"ID":"00cc61ff-219a-40e4-a0c3-360c456c57f9","Type":"ContainerDied","Data":"2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4"} Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.452824 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" event={"ID":"1f88031e-1c6c-4d5a-9648-a64ec5c5147f","Type":"ContainerStarted","Data":"a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21"} Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.452901 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" event={"ID":"1f88031e-1c6c-4d5a-9648-a64ec5c5147f","Type":"ContainerStarted","Data":"34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182"} Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.456151 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.465948 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff"} Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.467205 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.470968 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.471023 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.471033 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.471058 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.471068 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:54Z","lastTransitionTime":"2025-12-01T09:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.486371 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.499227 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.520463 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.535381 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 
2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.549673 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.566666 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.573521 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.573562 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.573572 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.573593 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.573606 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:54Z","lastTransitionTime":"2025-12-01T09:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.584407 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.601649 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.621912 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.635193 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.650903 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.663117 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.676057 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.676111 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.676124 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.676145 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.676158 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:54Z","lastTransitionTime":"2025-12-01T09:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.682997 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.698623 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\
"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.722287 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f9cf567a3a8bf1a2819b3d2822e715a47caa77f8488202dbb307dfb2b73f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.749304 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.761152 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 
09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.771008 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.779004 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.779065 4958 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.779079 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.779104 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.779116 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:54Z","lastTransitionTime":"2025-12-01T09:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.788508 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\
\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.803231 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154e
dc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.815710 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.828415 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.842324 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.856202 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.871291 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.882389 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.882438 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.882452 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.882471 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.882484 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:54Z","lastTransitionTime":"2025-12-01T09:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.887690 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.904546 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.918575 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.933312 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.952304 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188
e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\
\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.978566 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f9cf567a3a8bf1a2819b3d2822e715a47caa77f8488202dbb307dfb2b73f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.985370 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.985402 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.985413 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.985438 4958 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.985449 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:54Z","lastTransitionTime":"2025-12-01T09:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:54 crc kubenswrapper[4958]: I1201 09:59:54.992485 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:54Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.090143 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.090200 4958 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.090212 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.090232 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.090245 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:55Z","lastTransitionTime":"2025-12-01T09:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.192932 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.193008 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.193025 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.193047 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.193060 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:55Z","lastTransitionTime":"2025-12-01T09:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.297460 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.298122 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.298176 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.298212 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.298235 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:55Z","lastTransitionTime":"2025-12-01T09:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.402060 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.402116 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.402128 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.402149 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.402163 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:55Z","lastTransitionTime":"2025-12-01T09:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.474280 4958 generic.go:334] "Generic (PLEG): container finished" podID="00cc61ff-219a-40e4-a0c3-360c456c57f9" containerID="fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd" exitCode=0 Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.474369 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" event={"ID":"00cc61ff-219a-40e4-a0c3-360c456c57f9","Type":"ContainerDied","Data":"fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd"} Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.505212 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.506385 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.506425 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.506436 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.506453 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.506464 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:55Z","lastTransitionTime":"2025-12-01T09:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.534062 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.562970 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.583898 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.601083 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.610088 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.610352 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.610370 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.610393 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.610407 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:55Z","lastTransitionTime":"2025-12-01T09:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.624939 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f9cf567a3a8bf1a2819b3d2822e715a47caa77f8488202dbb307dfb2b73f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",
\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.637050 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.653068 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.664671 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.678109 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a168
8df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"
/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.689709 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.701069 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.711813 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.713658 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.713733 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.713759 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.713802 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.713820 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:55Z","lastTransitionTime":"2025-12-01T09:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.723983 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.734865 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.744705 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.797263 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.797313 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.797351 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.797451 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:55 crc kubenswrapper[4958]: E1201 09:59:55.797492 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 09:59:55 crc kubenswrapper[4958]: E1201 09:59:55.797619 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:59:55 crc kubenswrapper[4958]: E1201 09:59:55.797700 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:59:55 crc kubenswrapper[4958]: E1201 09:59:55.797806 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.817147 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.817192 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.817202 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.817220 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.817230 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:55Z","lastTransitionTime":"2025-12-01T09:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.925025 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.925104 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.925126 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.925154 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:55 crc kubenswrapper[4958]: I1201 09:59:55.925174 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:55Z","lastTransitionTime":"2025-12-01T09:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.027956 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.028019 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.028034 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.028056 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.028069 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:56Z","lastTransitionTime":"2025-12-01T09:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.130723 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.130781 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.130794 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.130815 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.130827 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:56Z","lastTransitionTime":"2025-12-01T09:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.220954 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs\") pod \"network-metrics-daemon-6b9wz\" (UID: \"987c6a26-52be-40a5-b9cc-456d9731436f\") " pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 09:59:56 crc kubenswrapper[4958]: E1201 09:59:56.221212 4958 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:59:56 crc kubenswrapper[4958]: E1201 09:59:56.221338 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs podName:987c6a26-52be-40a5-b9cc-456d9731436f nodeName:}" failed. No retries permitted until 2025-12-01 10:00:00.221311564 +0000 UTC m=+47.730100601 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs") pod "network-metrics-daemon-6b9wz" (UID: "987c6a26-52be-40a5-b9cc-456d9731436f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.234194 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.234259 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.234273 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.234301 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.234315 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:56Z","lastTransitionTime":"2025-12-01T09:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.337534 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.337592 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.337607 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.337630 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.337646 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:56Z","lastTransitionTime":"2025-12-01T09:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.441154 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.441208 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.441225 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.441251 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.441268 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:56Z","lastTransitionTime":"2025-12-01T09:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.504770 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" event={"ID":"00cc61ff-219a-40e4-a0c3-360c456c57f9","Type":"ContainerStarted","Data":"cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d"} Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.531593 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\
\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:56Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.545126 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.545211 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.545232 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.545262 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.545281 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:56Z","lastTransitionTime":"2025-12-01T09:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.553012 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:56Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.568479 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:56Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:56 crc 
kubenswrapper[4958]: I1201 09:59:56.589975 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:56Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.612997 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:56Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.649217 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.649287 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.649302 4958 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.649330 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.649343 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:56Z","lastTransitionTime":"2025-12-01T09:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.655652 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:56Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.673954 4958 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:56Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.695869 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:56Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.712404 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:56Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.731656 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:56Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.752875 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.752924 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.752938 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.752958 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.752971 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:56Z","lastTransitionTime":"2025-12-01T09:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.756171 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:56Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.778738 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:56Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.802308 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f9cf567a3a8bf1a2819b3d2822e715a47caa77f8488202dbb307dfb2b73f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:56Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.815134 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:56Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.828633 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:56Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.841279 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:56Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.856006 4958 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.856045 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.856059 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.856079 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.856091 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:56Z","lastTransitionTime":"2025-12-01T09:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.958742 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.959149 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.959239 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.959341 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:56 crc kubenswrapper[4958]: I1201 09:59:56.959412 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:56Z","lastTransitionTime":"2025-12-01T09:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.063006 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.063048 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.063058 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.063075 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.063088 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:57Z","lastTransitionTime":"2025-12-01T09:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.165580 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.166017 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.166054 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.166078 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.166091 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:57Z","lastTransitionTime":"2025-12-01T09:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.269139 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.269182 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.269193 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.269211 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.269221 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:57Z","lastTransitionTime":"2025-12-01T09:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.371669 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.371707 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.371716 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.371737 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.371749 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:57Z","lastTransitionTime":"2025-12-01T09:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.474338 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.474370 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.474379 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.474395 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.474406 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:57Z","lastTransitionTime":"2025-12-01T09:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.577552 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.577603 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.577616 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.577635 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.577645 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:57Z","lastTransitionTime":"2025-12-01T09:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.681977 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.682767 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.682944 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.683057 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.683072 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:57Z","lastTransitionTime":"2025-12-01T09:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.786929 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.786988 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.786998 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.787017 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.787027 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:57Z","lastTransitionTime":"2025-12-01T09:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.797481 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.797481 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:59:57 crc kubenswrapper[4958]: E1201 09:59:57.797649 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.797503 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.797486 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 09:59:57 crc kubenswrapper[4958]: E1201 09:59:57.797733 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:59:57 crc kubenswrapper[4958]: E1201 09:59:57.797783 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 09:59:57 crc kubenswrapper[4958]: E1201 09:59:57.798058 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.890320 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.890366 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.890378 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.890398 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.890411 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:57Z","lastTransitionTime":"2025-12-01T09:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.994769 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.994833 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.994886 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.994940 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:57 crc kubenswrapper[4958]: I1201 09:59:57.994962 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:57Z","lastTransitionTime":"2025-12-01T09:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.099064 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.099583 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.099604 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.099631 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.099647 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:58Z","lastTransitionTime":"2025-12-01T09:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.202960 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.203022 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.203040 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.203068 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.203087 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:58Z","lastTransitionTime":"2025-12-01T09:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.306163 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.306205 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.306218 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.306240 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.306252 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:58Z","lastTransitionTime":"2025-12-01T09:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.408451 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.408521 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.408534 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.408577 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.408592 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:58Z","lastTransitionTime":"2025-12-01T09:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.512041 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.512078 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.512089 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.512105 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.512116 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:58Z","lastTransitionTime":"2025-12-01T09:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.515245 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovnkube-controller/0.log" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.517692 4958 generic.go:334] "Generic (PLEG): container finished" podID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerID="766f9cf567a3a8bf1a2819b3d2822e715a47caa77f8488202dbb307dfb2b73f8" exitCode=1 Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.517736 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerDied","Data":"766f9cf567a3a8bf1a2819b3d2822e715a47caa77f8488202dbb307dfb2b73f8"} Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.518725 4958 scope.go:117] "RemoveContainer" containerID="766f9cf567a3a8bf1a2819b3d2822e715a47caa77f8488202dbb307dfb2b73f8" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.534913 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"na
me\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.555461 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f9cf567a3a8bf1a2819b3d2822e715a47caa7
7f8488202dbb307dfb2b73f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f9cf567a3a8bf1a2819b3d2822e715a47caa77f8488202dbb307dfb2b73f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:59:57Z\\\",\\\"message\\\":\\\" 6153 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:59:57.884401 6153 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 09:59:57.884454 6153 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:59:57.884744 6153 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:59:57.885015 6153 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:59:57.885048 6153 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:59:57.885055 6153 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:59:57.885069 6153 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 09:59:57.885127 6153 factory.go:656] Stopping watch factory\\\\nI1201 09:59:57.885147 6153 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 09:59:57.885156 6153 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 09:59:57.885164 6153 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:59:57.885174 6153 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.567496 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.581871 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.596621 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.610232 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":
\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.614429 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.614485 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.614499 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.614523 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.614537 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:58Z","lastTransitionTime":"2025-12-01T09:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.622867 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.634034 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:58 crc 
kubenswrapper[4958]: I1201 09:59:58.646986 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.661712 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.674413 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.690294 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.707573 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.719468 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.719509 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.719518 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.719538 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.719552 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:58Z","lastTransitionTime":"2025-12-01T09:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.731111 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.751830 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.772636 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:58Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.822630 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.822727 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.822743 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.822766 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.822782 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:58Z","lastTransitionTime":"2025-12-01T09:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.926035 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.926111 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.926126 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.926148 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:58 crc kubenswrapper[4958]: I1201 09:59:58.926164 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:58Z","lastTransitionTime":"2025-12-01T09:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.030158 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.030212 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.030225 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.030246 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.030255 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:59Z","lastTransitionTime":"2025-12-01T09:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.133553 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.133606 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.133617 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.133636 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.133647 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:59Z","lastTransitionTime":"2025-12-01T09:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.236763 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.236820 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.236831 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.236867 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.236884 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:59Z","lastTransitionTime":"2025-12-01T09:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.339823 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.339902 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.339916 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.339939 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.339950 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:59Z","lastTransitionTime":"2025-12-01T09:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.442879 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.442931 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.442942 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.442962 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.442973 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:59Z","lastTransitionTime":"2025-12-01T09:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.525414 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovnkube-controller/0.log" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.527889 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerStarted","Data":"368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a"} Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.528471 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.541984 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":
\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.545504 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.545545 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.545558 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.545577 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.545585 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:59Z","lastTransitionTime":"2025-12-01T09:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.556405 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.569929 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:59 crc 
kubenswrapper[4958]: I1201 09:59:59.584732 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.599384 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.610816 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.625166 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.639645 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.648669 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.648723 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.648734 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.648755 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.648767 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:59Z","lastTransitionTime":"2025-12-01T09:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.654308 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.668479 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.685349 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.702080 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:59 crc 
kubenswrapper[4958]: I1201 09:59:59.722933 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446
f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f9cf567a3a8bf1a2819b3d2822e715a47caa77f8488202dbb307dfb2b73f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:59:57Z\\\",\\\"message\\\":\\\" 6153 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:59:57.884401 6153 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 09:59:57.884454 6153 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:59:57.884744 6153 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:59:57.885015 6153 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:59:57.885048 6153 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:59:57.885055 6153 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:59:57.885069 6153 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 09:59:57.885127 6153 factory.go:656] Stopping watch factory\\\\nI1201 09:59:57.885147 6153 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 09:59:57.885156 6153 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 09:59:57.885164 6153 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:59:57.885174 6153 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.734377 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.746351 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.750875 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.750914 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.750927 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.750950 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.750963 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:59Z","lastTransitionTime":"2025-12-01T09:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.761605 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:59:59Z is after 2025-08-24T17:21:41Z" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.796643 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.796643 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:59:59 crc kubenswrapper[4958]: E1201 09:59:59.796869 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.796668 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.796787 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:59:59 crc kubenswrapper[4958]: E1201 09:59:59.797069 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:59:59 crc kubenswrapper[4958]: E1201 09:59:59.797096 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 09:59:59 crc kubenswrapper[4958]: E1201 09:59:59.797137 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.854053 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.854098 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.854110 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.854129 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.854140 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:59Z","lastTransitionTime":"2025-12-01T09:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.957657 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.957704 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.957716 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.957735 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:59:59 crc kubenswrapper[4958]: I1201 09:59:59.957749 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:59:59Z","lastTransitionTime":"2025-12-01T09:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.060244 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.060283 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.060291 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.060307 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.060317 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:00Z","lastTransitionTime":"2025-12-01T10:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.163195 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.163252 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.163269 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.163288 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.163300 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:00Z","lastTransitionTime":"2025-12-01T10:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.266513 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.266571 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.266589 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.266611 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.266622 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:00Z","lastTransitionTime":"2025-12-01T10:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.297648 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs\") pod \"network-metrics-daemon-6b9wz\" (UID: \"987c6a26-52be-40a5-b9cc-456d9731436f\") " pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:00 crc kubenswrapper[4958]: E1201 10:00:00.297806 4958 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:00:00 crc kubenswrapper[4958]: E1201 10:00:00.297932 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs podName:987c6a26-52be-40a5-b9cc-456d9731436f nodeName:}" failed. No retries permitted until 2025-12-01 10:00:08.297910821 +0000 UTC m=+55.806699858 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs") pod "network-metrics-daemon-6b9wz" (UID: "987c6a26-52be-40a5-b9cc-456d9731436f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.369677 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.369732 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.369743 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.369761 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.369775 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:00Z","lastTransitionTime":"2025-12-01T10:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.472762 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.472796 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.472805 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.472823 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.472833 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:00Z","lastTransitionTime":"2025-12-01T10:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.533531 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovnkube-controller/1.log" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.534518 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovnkube-controller/0.log" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.538027 4958 generic.go:334] "Generic (PLEG): container finished" podID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerID="368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a" exitCode=1 Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.538084 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerDied","Data":"368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a"} Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.538130 4958 scope.go:117] "RemoveContainer" containerID="766f9cf567a3a8bf1a2819b3d2822e715a47caa77f8488202dbb307dfb2b73f8" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.539009 4958 scope.go:117] "RemoveContainer" containerID="368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a" Dec 01 10:00:00 crc kubenswrapper[4958]: E1201 10:00:00.539200 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\"" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.558414 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:00Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.572528 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:00Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.575672 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.576106 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.576254 4958 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.576461 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.576551 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:00Z","lastTransitionTime":"2025-12-01T10:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.587274 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:00Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.602077 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:00Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.615804 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:00Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.628008 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:00Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.640525 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:00Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.654256 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:00Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.667284 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e7
4e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:00Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.680880 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.680932 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.680946 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.680974 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.680989 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:00Z","lastTransitionTime":"2025-12-01T10:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.684643 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:00Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.704036 4958 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://766f9cf567a3a8bf1a2819b3d2822e715a47caa77f8488202dbb307dfb2b73f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:59:57Z\\\",\\\"message\\\":\\\" 6153 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:59:57.884401 6153 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 09:59:57.884454 6153 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:59:57.884744 6153 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:59:57.885015 6153 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:59:57.885048 6153 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:59:57.885055 6153 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:59:57.885069 6153 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 09:59:57.885127 6153 factory.go:656] Stopping watch factory\\\\nI1201 09:59:57.885147 6153 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 09:59:57.885156 6153 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 09:59:57.885164 6153 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:59:57.885174 6153 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:00Z\\\",\\\"message\\\":\\\"s/factory.go:140\\\\nI1201 09:59:59.819741 6444 reflector.go:311] Stopping reflector *v1.Namespace (0s) 
from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:59:59.819891 6444 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 09:59:59.820776 6444 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:59:59.820805 6444 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 09:59:59.820903 6444 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 09:59:59.820958 6444 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:59:59.820972 6444 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:59:59.821021 6444 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:59:59.821045 6444 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:59:59.821077 6444 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:59:59.821081 6444 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 09:59:59.821113 6444 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 09:59:59.821164 6444 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:59:59.821200 6444 factory.go:656] Stopping watch factory\\\\nI1201 09:59:59.821231 6444 ovnkube.go:599] Stopped ovnkube\\\\nI1201 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:00Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.717087 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:00Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.731117 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:00Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.746074 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:00Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.758357 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:00Z is after 2025-08-24T17:21:41Z" Dec 01 
10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.772042 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:00Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.784692 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.784736 4958 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.784747 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.784768 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.784777 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:00Z","lastTransitionTime":"2025-12-01T10:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.888359 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.888431 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.888446 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.888469 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.888487 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:00Z","lastTransitionTime":"2025-12-01T10:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.991322 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.991380 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.991392 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.991413 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:00 crc kubenswrapper[4958]: I1201 10:00:00.991426 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:00Z","lastTransitionTime":"2025-12-01T10:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.094946 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.095015 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.095024 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.095046 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.095056 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:01Z","lastTransitionTime":"2025-12-01T10:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.198466 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.198534 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.198552 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.198577 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.198593 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:01Z","lastTransitionTime":"2025-12-01T10:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.301635 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.301700 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.301712 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.301733 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.301746 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:01Z","lastTransitionTime":"2025-12-01T10:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.404766 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.404811 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.404826 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.404877 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.404888 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:01Z","lastTransitionTime":"2025-12-01T10:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.507149 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.507249 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.507264 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.507282 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.507293 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:01Z","lastTransitionTime":"2025-12-01T10:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.544041 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovnkube-controller/1.log" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.549025 4958 scope.go:117] "RemoveContainer" containerID="368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a" Dec 01 10:00:01 crc kubenswrapper[4958]: E1201 10:00:01.549368 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\"" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.565193 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.579905 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.594256 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.610248 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.610508 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.610644 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.610753 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.610822 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:01Z","lastTransitionTime":"2025-12-01T10:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.612882 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:00Z\\\",\\\"message\\\":\\\"s/factory.go:140\\\\nI1201 09:59:59.819741 6444 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:59:59.819891 6444 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 09:59:59.820776 6444 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:59:59.820805 6444 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 09:59:59.820903 6444 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 09:59:59.820958 6444 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:59:59.820972 6444 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:59:59.821021 6444 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:59:59.821045 6444 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:59:59.821077 6444 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:59:59.821081 6444 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 09:59:59.821113 6444 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 09:59:59.821164 6444 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:59:59.821200 6444 factory.go:656] Stopping watch factory\\\\nI1201 09:59:59.821231 6444 ovnkube.go:599] Stopped ovnkube\\\\nI1201 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.625041 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.640245 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\
\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.652894 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.666073 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.681829 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.696199 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.708994 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.714312 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 
10:00:01.714351 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.714360 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.714378 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.714388 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:01Z","lastTransitionTime":"2025-12-01T10:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.727491 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.740898 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.756361 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.769393 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.782786 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:01Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.798177 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.798270 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.798209 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.798420 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:01 crc kubenswrapper[4958]: E1201 10:00:01.798585 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:01 crc kubenswrapper[4958]: E1201 10:00:01.798707 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:01 crc kubenswrapper[4958]: E1201 10:00:01.798806 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:01 crc kubenswrapper[4958]: E1201 10:00:01.798895 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.817051 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.817101 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.817112 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.817130 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.817144 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:01Z","lastTransitionTime":"2025-12-01T10:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.919731 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.920084 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.920150 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.920230 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:01 crc kubenswrapper[4958]: I1201 10:00:01.920292 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:01Z","lastTransitionTime":"2025-12-01T10:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.023275 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.023343 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.023363 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.023390 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.023410 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:02Z","lastTransitionTime":"2025-12-01T10:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.126506 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.126549 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.126557 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.126575 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.126585 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:02Z","lastTransitionTime":"2025-12-01T10:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.230023 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.230084 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.230101 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.230126 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.230142 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:02Z","lastTransitionTime":"2025-12-01T10:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.334016 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.334082 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.334103 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.334126 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.334141 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:02Z","lastTransitionTime":"2025-12-01T10:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.437920 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.438011 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.438035 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.438073 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.438101 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:02Z","lastTransitionTime":"2025-12-01T10:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.540941 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.541011 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.541025 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.541043 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.541053 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:02Z","lastTransitionTime":"2025-12-01T10:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.643889 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.643944 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.643963 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.643988 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.644003 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:02Z","lastTransitionTime":"2025-12-01T10:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.746811 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.746967 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.746983 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.747028 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.747093 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:02Z","lastTransitionTime":"2025-12-01T10:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.850037 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.850522 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.850538 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.850560 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.850574 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:02Z","lastTransitionTime":"2025-12-01T10:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.953378 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.953420 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.953429 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.953445 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:02 crc kubenswrapper[4958]: I1201 10:00:02.953454 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:02Z","lastTransitionTime":"2025-12-01T10:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.056199 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.056276 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.056295 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.056323 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.056342 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:03Z","lastTransitionTime":"2025-12-01T10:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.159539 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.159603 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.159617 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.159638 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.159650 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:03Z","lastTransitionTime":"2025-12-01T10:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.262477 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.262515 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.262525 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.262542 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.262552 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:03Z","lastTransitionTime":"2025-12-01T10:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.366574 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.366658 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.366695 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.366728 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.366749 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:03Z","lastTransitionTime":"2025-12-01T10:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.470716 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.470782 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.470797 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.470820 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.470836 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:03Z","lastTransitionTime":"2025-12-01T10:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.573966 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.574015 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.574028 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.574049 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.574061 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:03Z","lastTransitionTime":"2025-12-01T10:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.676987 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.677031 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.677041 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.677060 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.677071 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:03Z","lastTransitionTime":"2025-12-01T10:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.780598 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.780658 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.780671 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.780691 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.780705 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:03Z","lastTransitionTime":"2025-12-01T10:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.797540 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:03 crc kubenswrapper[4958]: E1201 10:00:03.797693 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.797771 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:03 crc kubenswrapper[4958]: E1201 10:00:03.797828 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.798282 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:03 crc kubenswrapper[4958]: E1201 10:00:03.798362 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.798506 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:03 crc kubenswrapper[4958]: E1201 10:00:03.798578 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.820319 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"moun
tPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.839970 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn
kube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.854287 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.870876 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.871812 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.883021 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.883074 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.883086 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.883107 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.883120 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:03Z","lastTransitionTime":"2025-12-01T10:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.886711 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.898754 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.911395 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.926038 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.948428 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.962339 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.979163 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.985570 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.985599 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.985608 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.985627 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.985636 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:03Z","lastTransitionTime":"2025-12-01T10:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:03 crc kubenswrapper[4958]: I1201 10:00:03.998439 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:00Z\\\",\\\"message\\\":\\\"s/factory.go:140\\\\nI1201 09:59:59.819741 6444 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:59:59.819891 6444 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 09:59:59.820776 6444 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:59:59.820805 6444 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 09:59:59.820903 6444 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 09:59:59.820958 6444 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:59:59.820972 6444 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:59:59.821021 6444 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:59:59.821045 6444 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:59:59.821077 6444 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:59:59.821081 6444 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 09:59:59.821113 6444 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 09:59:59.821164 6444 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:59:59.821200 6444 factory.go:656] Stopping watch factory\\\\nI1201 09:59:59.821231 6444 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:03Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.013430 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.025919 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.037127 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.052132 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.063819 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.075882 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.087765 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.088049 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.088159 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.088225 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.088281 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:04Z","lastTransitionTime":"2025-12-01T10:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.092146 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.105436 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.117566 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.130912 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.142467 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.158745 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.178329 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:00Z\\\",\\\"message\\\":\\\"s/factory.go:140\\\\nI1201 09:59:59.819741 6444 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:59:59.819891 6444 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 09:59:59.820776 6444 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:59:59.820805 6444 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 09:59:59.820903 6444 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 09:59:59.820958 6444 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:59:59.820972 6444 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:59:59.821021 6444 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:59:59.821045 6444 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:59:59.821077 6444 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:59:59.821081 6444 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 09:59:59.821113 6444 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 09:59:59.821164 6444 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:59:59.821200 6444 factory.go:656] Stopping watch factory\\\\nI1201 09:59:59.821231 6444 ovnkube.go:599] Stopped ovnkube\\\\nI1201 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.184020 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.184055 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.184065 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.184081 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.184092 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:04Z","lastTransitionTime":"2025-12-01T10:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.199109 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: E1201 10:00:04.218271 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient 
memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\
\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\
":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.224271 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.224330 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.224343 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.224372 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.224386 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:04Z","lastTransitionTime":"2025-12-01T10:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.237908 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: E1201 10:00:04.247913 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.256023 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 
10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.258981 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.259037 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.259052 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.259082 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.259096 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:04Z","lastTransitionTime":"2025-12-01T10:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.270725 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: E1201 10:00:04.271726 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.275464 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.275622 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.275741 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.275865 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.275952 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:04Z","lastTransitionTime":"2025-12-01T10:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.285506 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: E1201 10:00:04.289603 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.293323 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.293428 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.293508 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.293592 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.293654 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:04Z","lastTransitionTime":"2025-12-01T10:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.300064 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"o
vnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: E1201 10:00:04.304401 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: E1201 10:00:04.304675 4958 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.306531 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.306658 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.306968 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.307073 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.307184 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:04Z","lastTransitionTime":"2025-12-01T10:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.311442 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:04Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.410761 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.410833 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.410894 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.410923 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.410941 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:04Z","lastTransitionTime":"2025-12-01T10:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.513929 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.513980 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.513996 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.514019 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.514031 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:04Z","lastTransitionTime":"2025-12-01T10:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.622637 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.622694 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.622704 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.622722 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.622732 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:04Z","lastTransitionTime":"2025-12-01T10:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.726526 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.726585 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.726602 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.726628 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.726647 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:04Z","lastTransitionTime":"2025-12-01T10:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.829953 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.830032 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.830064 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.830101 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.830126 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:04Z","lastTransitionTime":"2025-12-01T10:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.933120 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.933180 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.933192 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.933213 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:04 crc kubenswrapper[4958]: I1201 10:00:04.933225 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:04Z","lastTransitionTime":"2025-12-01T10:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.036578 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.036635 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.036649 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.036666 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.036680 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:05Z","lastTransitionTime":"2025-12-01T10:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.140457 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.140496 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.140507 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.140523 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.140533 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:05Z","lastTransitionTime":"2025-12-01T10:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.243471 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.243519 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.243535 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.243555 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.243567 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:05Z","lastTransitionTime":"2025-12-01T10:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.347222 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.347283 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.347300 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.347322 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.347336 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:05Z","lastTransitionTime":"2025-12-01T10:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.451006 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.451054 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.451068 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.451087 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.451099 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:05Z","lastTransitionTime":"2025-12-01T10:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.554337 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.554403 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.554429 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.554455 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.554468 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:05Z","lastTransitionTime":"2025-12-01T10:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.600804 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.611581 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.618474 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.633014 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.646755 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.657704 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.658334 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.658520 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.658629 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.658693 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:05Z","lastTransitionTime":"2025-12-01T10:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.665660 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.683535 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.698773 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.714079 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.732407 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.755436 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://368dd349bebcdf290bfbe18ee512fa940e760588
ba6ad4e0bef511cc25a6056a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:00Z\\\",\\\"message\\\":\\\"s/factory.go:140\\\\nI1201 09:59:59.819741 6444 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:59:59.819891 6444 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 09:59:59.820776 6444 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:59:59.820805 6444 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 09:59:59.820903 6444 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 09:59:59.820958 6444 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:59:59.820972 6444 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:59:59.821021 6444 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:59:59.821045 6444 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:59:59.821077 6444 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:59:59.821081 6444 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 09:59:59.821113 6444 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 09:59:59.821164 6444 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:59:59.821200 6444 factory.go:656] Stopping watch factory\\\\nI1201 09:59:59.821231 6444 ovnkube.go:599] Stopped ovnkube\\\\nI1201 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.760752 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.760968 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.761097 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.761210 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.761314 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:05Z","lastTransitionTime":"2025-12-01T10:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.768647 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.780351 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.794150 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.796438 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.796581 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.796809 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.796865 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:05 crc kubenswrapper[4958]: E1201 10:00:05.796822 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:05 crc kubenswrapper[4958]: E1201 10:00:05.796967 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:05 crc kubenswrapper[4958]: E1201 10:00:05.797038 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:05 crc kubenswrapper[4958]: E1201 10:00:05.797266 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.811524 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.823558 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.844352 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.859671 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:05Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.863889 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.864003 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.864097 4958 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.864227 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.864325 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:05Z","lastTransitionTime":"2025-12-01T10:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.968486 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.969018 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.969127 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.969328 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:05 crc kubenswrapper[4958]: I1201 10:00:05.969470 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:05Z","lastTransitionTime":"2025-12-01T10:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.072391 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.072434 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.072447 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.072473 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.072485 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:06Z","lastTransitionTime":"2025-12-01T10:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.175381 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.175494 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.175512 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.175537 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.175559 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:06Z","lastTransitionTime":"2025-12-01T10:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.278733 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.278795 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.278813 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.278886 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.278925 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:06Z","lastTransitionTime":"2025-12-01T10:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.382417 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.382925 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.383078 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.383223 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.383355 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:06Z","lastTransitionTime":"2025-12-01T10:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.486791 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.487073 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.487086 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.487108 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.487121 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:06Z","lastTransitionTime":"2025-12-01T10:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.590679 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.590720 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.590730 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.590747 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.590756 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:06Z","lastTransitionTime":"2025-12-01T10:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.693680 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.693736 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.693752 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.693770 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.693782 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:06Z","lastTransitionTime":"2025-12-01T10:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.796824 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.796935 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.796958 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.796987 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.797006 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:06Z","lastTransitionTime":"2025-12-01T10:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.900583 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.900646 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.900665 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.900691 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:06 crc kubenswrapper[4958]: I1201 10:00:06.900703 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:06Z","lastTransitionTime":"2025-12-01T10:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.003801 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.003863 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.003877 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.003896 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.003908 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:07Z","lastTransitionTime":"2025-12-01T10:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.106497 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.106553 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.106569 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.106592 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.106606 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:07Z","lastTransitionTime":"2025-12-01T10:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.210303 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.210402 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.210425 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.210453 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.210481 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:07Z","lastTransitionTime":"2025-12-01T10:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.314138 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.314189 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.314200 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.314219 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.314233 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:07Z","lastTransitionTime":"2025-12-01T10:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.417524 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.417582 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.417594 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.417612 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.417623 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:07Z","lastTransitionTime":"2025-12-01T10:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.520664 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.521051 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.521120 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.521187 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.521248 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:07Z","lastTransitionTime":"2025-12-01T10:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.578645 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:00:07 crc kubenswrapper[4958]: E1201 10:00:07.579013 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:00:39.578964425 +0000 UTC m=+87.087753472 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.623510 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.623869 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.623960 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.624051 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.624189 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:07Z","lastTransitionTime":"2025-12-01T10:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.727455 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.727515 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.727533 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.727552 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.727563 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:07Z","lastTransitionTime":"2025-12-01T10:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.781503 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.781589 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:07 crc kubenswrapper[4958]: E1201 10:00:07.781742 4958 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:00:07 crc kubenswrapper[4958]: E1201 10:00:07.781774 4958 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:00:07 crc kubenswrapper[4958]: E1201 10:00:07.781928 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:00:39.781818376 +0000 UTC m=+87.290607453 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:00:07 crc kubenswrapper[4958]: E1201 10:00:07.781977 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:00:39.78195513 +0000 UTC m=+87.290744207 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.797375 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.797436 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.797376 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.797581 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:07 crc kubenswrapper[4958]: E1201 10:00:07.797570 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:07 crc kubenswrapper[4958]: E1201 10:00:07.797681 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:07 crc kubenswrapper[4958]: E1201 10:00:07.797771 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:07 crc kubenswrapper[4958]: E1201 10:00:07.797821 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.830804 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.830882 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.830897 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.830916 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.830951 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:07Z","lastTransitionTime":"2025-12-01T10:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.882619 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.882773 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:07 crc kubenswrapper[4958]: E1201 10:00:07.882901 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:00:07 crc kubenswrapper[4958]: E1201 10:00:07.882928 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:00:07 crc kubenswrapper[4958]: E1201 10:00:07.882944 4958 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:00:07 crc kubenswrapper[4958]: E1201 10:00:07.883007 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:00:07 crc kubenswrapper[4958]: E1201 10:00:07.883042 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:00:07 crc kubenswrapper[4958]: E1201 10:00:07.883060 4958 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:00:07 crc kubenswrapper[4958]: E1201 10:00:07.883018 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:00:39.883001629 +0000 UTC m=+87.391790666 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:00:07 crc kubenswrapper[4958]: E1201 10:00:07.883172 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 10:00:39.883132753 +0000 UTC m=+87.391921810 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.934102 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.934156 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.934166 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.934228 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:07 crc kubenswrapper[4958]: I1201 10:00:07.934241 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:07Z","lastTransitionTime":"2025-12-01T10:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.038328 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.038378 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.038387 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.038405 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.038422 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:08Z","lastTransitionTime":"2025-12-01T10:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.141722 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.141774 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.141785 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.141804 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.141815 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:08Z","lastTransitionTime":"2025-12-01T10:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.246223 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.246313 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.246334 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.246362 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.246382 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:08Z","lastTransitionTime":"2025-12-01T10:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.351123 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.351203 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.351223 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.351252 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.351274 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:08Z","lastTransitionTime":"2025-12-01T10:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.389209 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs\") pod \"network-metrics-daemon-6b9wz\" (UID: \"987c6a26-52be-40a5-b9cc-456d9731436f\") " pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:08 crc kubenswrapper[4958]: E1201 10:00:08.389838 4958 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:00:08 crc kubenswrapper[4958]: E1201 10:00:08.390098 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs podName:987c6a26-52be-40a5-b9cc-456d9731436f nodeName:}" failed. No retries permitted until 2025-12-01 10:00:24.390067679 +0000 UTC m=+71.898856716 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs") pod "network-metrics-daemon-6b9wz" (UID: "987c6a26-52be-40a5-b9cc-456d9731436f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.455183 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.455234 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.455250 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.455270 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.455283 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:08Z","lastTransitionTime":"2025-12-01T10:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.558175 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.558228 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.558262 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.558281 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.558295 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:08Z","lastTransitionTime":"2025-12-01T10:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.662750 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.662818 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.662835 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.662885 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.662899 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:08Z","lastTransitionTime":"2025-12-01T10:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.766112 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.766169 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.766189 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.766217 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.766235 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:08Z","lastTransitionTime":"2025-12-01T10:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.868913 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.869017 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.869040 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.869067 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.869088 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:08Z","lastTransitionTime":"2025-12-01T10:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.972056 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.972108 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.972118 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.972138 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:08 crc kubenswrapper[4958]: I1201 10:00:08.972152 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:08Z","lastTransitionTime":"2025-12-01T10:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.076048 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.076133 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.076155 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.076183 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.076203 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:09Z","lastTransitionTime":"2025-12-01T10:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.180028 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.180089 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.180105 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.180130 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.180146 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:09Z","lastTransitionTime":"2025-12-01T10:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.282442 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.282490 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.282500 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.282518 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.282528 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:09Z","lastTransitionTime":"2025-12-01T10:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.385478 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.385552 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.385568 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.385594 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.385608 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:09Z","lastTransitionTime":"2025-12-01T10:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.489335 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.489403 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.489420 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.489448 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.489465 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:09Z","lastTransitionTime":"2025-12-01T10:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.593037 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.593113 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.593141 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.593170 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.593188 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:09Z","lastTransitionTime":"2025-12-01T10:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.696422 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.696472 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.696482 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.696499 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.696510 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:09Z","lastTransitionTime":"2025-12-01T10:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.796965 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.797045 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.797077 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:09 crc kubenswrapper[4958]: E1201 10:00:09.797196 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.797247 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:09 crc kubenswrapper[4958]: E1201 10:00:09.797305 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:09 crc kubenswrapper[4958]: E1201 10:00:09.797435 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:09 crc kubenswrapper[4958]: E1201 10:00:09.797619 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.799608 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.799795 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.800627 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.800664 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.800678 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:09Z","lastTransitionTime":"2025-12-01T10:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.904454 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.904522 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.904542 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.904568 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:09 crc kubenswrapper[4958]: I1201 10:00:09.904583 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:09Z","lastTransitionTime":"2025-12-01T10:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.008472 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.008581 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.008647 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.008682 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.008705 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:10Z","lastTransitionTime":"2025-12-01T10:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.112165 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.112247 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.112273 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.112302 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.112326 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:10Z","lastTransitionTime":"2025-12-01T10:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.215788 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.215878 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.215892 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.215912 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.215925 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:10Z","lastTransitionTime":"2025-12-01T10:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.319639 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.319712 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.319731 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.319754 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.319770 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:10Z","lastTransitionTime":"2025-12-01T10:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.423278 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.423334 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.423347 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.423367 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.423380 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:10Z","lastTransitionTime":"2025-12-01T10:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.526970 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.527041 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.527061 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.527091 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.527111 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:10Z","lastTransitionTime":"2025-12-01T10:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.630662 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.630715 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.630729 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.630749 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.630760 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:10Z","lastTransitionTime":"2025-12-01T10:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.733545 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.734073 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.734209 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.734347 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.734461 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:10Z","lastTransitionTime":"2025-12-01T10:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.837562 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.838257 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.838457 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.838619 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.838999 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:10Z","lastTransitionTime":"2025-12-01T10:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.942640 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.942707 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.942718 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.942739 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:10 crc kubenswrapper[4958]: I1201 10:00:10.942754 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:10Z","lastTransitionTime":"2025-12-01T10:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.047004 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.047092 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.047103 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.047124 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.047134 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:11Z","lastTransitionTime":"2025-12-01T10:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.150279 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.150315 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.150324 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.150340 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.150348 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:11Z","lastTransitionTime":"2025-12-01T10:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.253646 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.253694 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.253707 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.253730 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.253742 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:11Z","lastTransitionTime":"2025-12-01T10:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.357476 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.357544 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.357569 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.357600 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.357623 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:11Z","lastTransitionTime":"2025-12-01T10:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.460509 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.460593 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.460604 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.460629 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.460642 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:11Z","lastTransitionTime":"2025-12-01T10:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.564722 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.564892 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.564910 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.564940 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.564957 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:11Z","lastTransitionTime":"2025-12-01T10:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.668664 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.668809 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.668822 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.668873 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.668888 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:11Z","lastTransitionTime":"2025-12-01T10:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.771228 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.771280 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.771292 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.771316 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.771335 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:11Z","lastTransitionTime":"2025-12-01T10:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.797008 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.797059 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.797103 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.797234 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:11 crc kubenswrapper[4958]: E1201 10:00:11.797223 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:11 crc kubenswrapper[4958]: E1201 10:00:11.797359 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:11 crc kubenswrapper[4958]: E1201 10:00:11.797568 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:11 crc kubenswrapper[4958]: E1201 10:00:11.797685 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.874960 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.875032 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.875050 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.875074 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.875089 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:11Z","lastTransitionTime":"2025-12-01T10:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.979622 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.979709 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.979740 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.979771 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:11 crc kubenswrapper[4958]: I1201 10:00:11.979789 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:11Z","lastTransitionTime":"2025-12-01T10:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.083157 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.083538 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.083661 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.083784 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.083893 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:12Z","lastTransitionTime":"2025-12-01T10:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.187513 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.187587 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.187603 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.187630 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.187647 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:12Z","lastTransitionTime":"2025-12-01T10:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.290269 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.290317 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.290328 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.290350 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.290369 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:12Z","lastTransitionTime":"2025-12-01T10:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.394299 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.394364 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.394379 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.394403 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.394418 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:12Z","lastTransitionTime":"2025-12-01T10:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.497264 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.497335 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.497345 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.497363 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.497373 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:12Z","lastTransitionTime":"2025-12-01T10:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.602079 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.602127 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.602140 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.602163 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.602176 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:12Z","lastTransitionTime":"2025-12-01T10:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.705738 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.705801 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.705817 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.705872 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.705889 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:12Z","lastTransitionTime":"2025-12-01T10:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.797940 4958 scope.go:117] "RemoveContainer" containerID="368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.809758 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.809940 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.809973 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.810016 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.810043 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:12Z","lastTransitionTime":"2025-12-01T10:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.912564 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.912605 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.912617 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.912637 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:12 crc kubenswrapper[4958]: I1201 10:00:12.912649 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:12Z","lastTransitionTime":"2025-12-01T10:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.015924 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.015970 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.015981 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.016001 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.016011 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:13Z","lastTransitionTime":"2025-12-01T10:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.118925 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.119000 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.119012 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.119032 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.119043 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:13Z","lastTransitionTime":"2025-12-01T10:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.222346 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.222388 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.222404 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.222441 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.222453 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:13Z","lastTransitionTime":"2025-12-01T10:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.325562 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.325630 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.325641 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.325662 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.325676 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:13Z","lastTransitionTime":"2025-12-01T10:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.428712 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.428785 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.428803 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.428832 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.428878 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:13Z","lastTransitionTime":"2025-12-01T10:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.533161 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.533245 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.533291 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.533319 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.533336 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:13Z","lastTransitionTime":"2025-12-01T10:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.592255 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovnkube-controller/1.log" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.594793 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerStarted","Data":"362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b"} Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.595400 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.609097 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.622621 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.635980 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.636032 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.636043 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.636063 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.636075 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:13Z","lastTransitionTime":"2025-12-01T10:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.637429 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.652376 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
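The payloads these entries fail to deliver are strategic merge patches against PodStatus: the $setElementOrder/conditions key is the merge-patch directive that fixes the ordering of the conditions list, and the rest is the ordinary core/v1 status JSON, escaped once by klog and once by the journal. A minimal sketch for pretty-printing one of them, assuming you have already stripped the journal escaping (\\\" back to ") and saved the bare JSON; the file name patch.json is illustrative:

    package main

    // Decode and pretty-print a status patch extracted from the log above.

    import (
        "encoding/json"
        "fmt"
        "os"
    )

    func main() {
        raw, err := os.ReadFile("patch.json")
        if err != nil {
            panic(err)
        }
        var patch map[string]any
        if err := json.Unmarshal(raw, &patch); err != nil {
            panic(err)
        }
        out, _ := json.MarshalIndent(patch, "", "  ")
        fmt.Println(string(out))
    }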
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.674237 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.695228 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
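Alongside the webhook failures, the node keeps recording NodeNotReady: the runtime reports NetworkReady=false until a CNI configuration file appears in /etc/kubernetes/cni/net.d/, which on this cluster is written once the ovn-kubernetes and multus pods come up. A minimal sketch of the directory check, with the path taken from the NetworkPluginNotReady message; the check is an illustrative stand-in for what the container runtime does, not its actual code:

    package main

    // Report whether the CNI configuration directory named in the
    // "no CNI configuration file" message above is still empty.

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        matches, err := filepath.Glob("/etc/kubernetes/cni/net.d/*")
        if err != nil {
            panic(err)
        }
        if len(matches) == 0 {
            fmt.Println("no CNI configuration file yet; the node will keep reporting NotReady")
            return
        }
        for _, m := range matches {
            if fi, err := os.Stat(m); err == nil {
                fmt.Printf("%s (%d bytes)\n", m, fi.Size())
            }
        }
    }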
,{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc 
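The multus-additional-cni-plugins patch above carries a full core/v1 PodStatus, including six init containers that ran strictly in sequence (egress-router-binary-copy, cni-plugins, bond-cni-plugin, routeoverride-cni, whereabouts-cni-bincopy, whereabouts-cni), each terminating with reason Completed and exit code 0. A minimal sketch for tabulating that chain from a decoded patch payload; the JSON field names follow the PodStatus shown in the log, while the input file name status.json is illustrative:

    package main

    // Summarize the initContainerStatuses from a decoded status patch.

    import (
        "encoding/json"
        "fmt"
        "os"
    )

    type initStatus struct {
        Name  string `json:"name"`
        State struct {
            Terminated *struct {
                ExitCode   int    `json:"exitCode"`
                Reason     string `json:"reason"`
                FinishedAt string `json:"finishedAt"`
            } `json:"terminated"`
        } `json:"state"`
    }

    func main() {
        raw, err := os.ReadFile("status.json")
        if err != nil {
            panic(err)
        }
        var patch struct {
            Status struct {
                InitContainerStatuses []initStatus `json:"initContainerStatuses"`
            } `json:"status"`
        }
        if err := json.Unmarshal(raw, &patch); err != nil {
            panic(err)
        }
        for _, s := range patch.Status.InitContainerStatuses {
            if t := s.State.Terminated; t != nil {
                fmt.Printf("%-28s exit=%d %s at %s\n", s.Name, t.ExitCode, t.Reason, t.FinishedAt)
            }
        }
    }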
Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.719738 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:00Z\\\",\\\"message\\\":\\\"s/factory.go:140\\\\nI1201 09:59:59.819741 6444 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:59:59.819891 6444 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 09:59:59.820776 6444 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:59:59.820805 6444 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 09:59:59.820903 6444 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 09:59:59.820958 6444 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:59:59.820972 6444 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:59:59.821021 6444 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:59:59.821045 6444 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:59:59.821077 6444 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:59:59.821081 6444 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 09:59:59.821113 6444 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 09:59:59.821164 6444 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:59:59.821200 6444 factory.go:656] Stopping watch factory\\\\nI1201 09:59:59.821231 6444 ovnkube.go:599] Stopped ovnkube\\\\nI1201 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:00:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z"
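The ovnkube-node patch above also explains the PLEG ContainerStarted event earlier in this window: ovnkube-controller exited 1 at 10:00:00, is on restartCount 2, and came back up at 10:00:12, and its terminated lastState embeds the tail of the previous container log as one string with literal "\n" escapes. A minimal sketch for re-flowing such an extracted message so it reads like the original log; input via stdin is illustrative:

    package main

    // Turn the literal "\n" escapes of an extracted lastState message
    // back into real newlines for readability.

    import (
        "fmt"
        "io"
        "os"
        "strings"
    )

    func main() {
        raw, err := io.ReadAll(os.Stdin)
        if err != nil {
            panic(err)
        }
        fmt.Println(strings.ReplaceAll(string(raw), `\n`, "\n"))
    }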
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.740966 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.741035 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.741046 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.741065 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.741096 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:13Z","lastTransitionTime":"2025-12-01T10:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.751473 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"669faf62-d047-4858-aabc-919f9decc4cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c86d33dc457240b5c41df28d2a66575e809398a39ec20393f9fe7ac3c8bf6b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e90aae1e7541f31f237a924d640715a3fb520b6dcae23764c95c936bf7f53321\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf650ba434e831572b4456cb8d9c93e348e74cb531e2b8cba926fb58527cae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.766648 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
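Two distinct failure shapes show up in these lastState entries: clean exits (exit 0, Completed) for init containers, and abnormal ones like 255 (kube-apiserver-check-endpoints), 1 (ovnkube-controller), and 137 with null timestamps in the ContainerStatusUnknown records, where the container vanished before the pod was deleted. By the usual wait-status convention, codes above 128 mean 128 plus the terminating signal, so 137 corresponds to SIGKILL. A minimal sketch decoding the codes seen in this log:

    package main

    // Decode wait-status style exit codes like those recorded in the
    // lastState entries above (codes > 128 are 128 + signal number).

    import "fmt"

    func main() {
        for _, code := range []int{0, 1, 137, 255} { // values seen in this log
            if code > 128 {
                fmt.Printf("exit %d: killed by signal %d\n", code, code-128)
            } else {
                fmt.Printf("exit %d: normal exit status\n", code)
            }
        }
    }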
Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.766648 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z"
Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.780901 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z"
Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.793993 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":
\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.796946 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.797088 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:13 crc kubenswrapper[4958]: E1201 10:00:13.797105 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.797257 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:13 crc kubenswrapper[4958]: E1201 10:00:13.797287 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.796970 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:13 crc kubenswrapper[4958]: E1201 10:00:13.797438 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:13 crc kubenswrapper[4958]: E1201 10:00:13.797591 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.809542 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.822208 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.836666 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.843954 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.844017 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.844030 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.844054 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.844067 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:13Z","lastTransitionTime":"2025-12-01T10:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.852045 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.870281 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.888859 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.901647 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.913546 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.925465 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.935472 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.946288 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.946347 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.946362 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.946384 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.946399 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:13Z","lastTransitionTime":"2025-12-01T10:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.946773 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"669faf62-d047-4858-aabc-919f9decc4cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c86d33dc457240b5c41df28d2a66575e809398a39ec20393f9fe7ac3c8bf6b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e90aae1e7541f31f237a924d640715a3fb520b6dcae23764c95c936bf7f53321\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf650ba434e831572b4456cb8d9c93e348e74cb531e2b8cba926fb58527cae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.959293 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.969633 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:13 crc kubenswrapper[4958]: I1201 10:00:13.983579 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.002071 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362a7bca7d9b89f9c7abd69f4f98e1c249ee027d
d73f52ec8db297cf1152a57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:00Z\\\",\\\"message\\\":\\\"s/factory.go:140\\\\nI1201 09:59:59.819741 6444 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:59:59.819891 6444 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 09:59:59.820776 6444 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:59:59.820805 6444 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 09:59:59.820903 6444 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 09:59:59.820958 6444 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:59:59.820972 6444 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:59:59.821021 6444 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:59:59.821045 6444 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:59:59.821077 6444 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:59:59.821081 6444 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 09:59:59.821113 6444 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 09:59:59.821164 6444 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:59:59.821200 6444 factory.go:656] Stopping watch factory\\\\nI1201 09:59:59.821231 6444 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:00:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:13Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.012411 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.024357 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.037520 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.047609 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.048881 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.048927 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.048943 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.048966 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.048986 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:14Z","lastTransitionTime":"2025-12-01T10:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.059674 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.071285 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.081661 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.151889 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 
10:00:14.151943 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.151956 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.151975 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.151988 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:14Z","lastTransitionTime":"2025-12-01T10:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.254866 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.254923 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.254936 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.254955 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.255257 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:14Z","lastTransitionTime":"2025-12-01T10:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.362064 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.362598 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.362614 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.362638 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.362654 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:14Z","lastTransitionTime":"2025-12-01T10:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.364345 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.364402 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.364416 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.364439 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.364452 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:14Z","lastTransitionTime":"2025-12-01T10:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:14 crc kubenswrapper[4958]: E1201 10:00:14.376476 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.381031 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.381086 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.381100 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.381121 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.381133 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:14Z","lastTransitionTime":"2025-12-01T10:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:14 crc kubenswrapper[4958]: E1201 10:00:14.396100 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.401763 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.401789 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.401801 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.401823 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.401837 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:14Z","lastTransitionTime":"2025-12-01T10:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:14 crc kubenswrapper[4958]: E1201 10:00:14.421142 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.425961 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.426026 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.426038 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.426059 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.426070 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:14Z","lastTransitionTime":"2025-12-01T10:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:14 crc kubenswrapper[4958]: E1201 10:00:14.438610 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.442580 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.442623 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.442637 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.442659 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.442671 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:14Z","lastTransitionTime":"2025-12-01T10:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:14 crc kubenswrapper[4958]: E1201 10:00:14.456930 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: E1201 10:00:14.457128 4958 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.466336 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.466392 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.466405 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.466425 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.466442 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:14Z","lastTransitionTime":"2025-12-01T10:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.568589 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.568658 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.568671 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.568692 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.568706 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:14Z","lastTransitionTime":"2025-12-01T10:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.601161 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovnkube-controller/2.log" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.601975 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovnkube-controller/1.log" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.605587 4958 generic.go:334] "Generic (PLEG): container finished" podID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerID="362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b" exitCode=1 Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.605654 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerDied","Data":"362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b"} Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.605716 4958 scope.go:117] "RemoveContainer" containerID="368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.606558 4958 scope.go:117] "RemoveContainer" containerID="362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b" Dec 01 10:00:14 crc kubenswrapper[4958]: E1201 10:00:14.606779 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\"" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.625877 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 
10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.637324 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.651217 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.666250 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 
2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.672035 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.672091 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.672101 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.672119 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.672130 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:14Z","lastTransitionTime":"2025-12-01T10:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.681210 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.695732 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.710211 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.724601 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.738783 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.752494 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.768415 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.775381 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.775455 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.775474 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.775505 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.775520 4958 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:14Z","lastTransitionTime":"2025-12-01T10:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.787426 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.801755 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.816624 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.836753 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362a7bca7d9b89f9c7abd69f4f98e1c249ee027d
d73f52ec8db297cf1152a57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368dd349bebcdf290bfbe18ee512fa940e760588ba6ad4e0bef511cc25a6056a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:00Z\\\",\\\"message\\\":\\\"s/factory.go:140\\\\nI1201 09:59:59.819741 6444 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:59:59.819891 6444 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 09:59:59.820776 6444 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:59:59.820805 6444 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 09:59:59.820903 6444 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 09:59:59.820958 6444 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:59:59.820972 6444 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:59:59.821021 6444 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:59:59.821045 6444 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:59:59.821077 6444 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:59:59.821081 6444 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 09:59:59.821113 6444 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 09:59:59.821164 6444 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:59:59.821200 6444 factory.go:656] Stopping watch factory\\\\nI1201 09:59:59.821231 6444 ovnkube.go:599] Stopped ovnkube\\\\nI1201 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:13Z\\\",\\\"message\\\":\\\":00:13.744888 6597 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:00:13.744944 6597 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:00:13.746161 6597 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 10:00:13.746211 6597 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 10:00:13.746241 6597 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 10:00:13.746254 6597 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:00:13.746277 6597 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 10:00:13.746293 6597 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 10:00:13.746309 6597 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:00:13.746324 6597 factory.go:656] Stopping watch factory\\\\nI1201 10:00:13.746340 6597 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:00:13.746375 6597 
handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:00:13.746375 6597 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:00:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.851093 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.867540 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"669faf62-d047-4858-aabc-919f9decc4cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c86d33dc457240b5c41df28d2a66575e809398a39ec20393f9fe7ac3c8bf6b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e90aae1e7541f31f237a924d640715a3fb520b6dcae23764c95c936bf7f53321\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf650ba434e831572b4456cb8d9c93e348e74cb531e2b8cba926fb58527cae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:14Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.878714 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.878763 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.878773 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.878790 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.878802 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:14Z","lastTransitionTime":"2025-12-01T10:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.982083 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.982155 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.982174 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.982204 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:14 crc kubenswrapper[4958]: I1201 10:00:14.982222 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:14Z","lastTransitionTime":"2025-12-01T10:00:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.085549 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.085608 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.085621 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.085639 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.085652 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:15Z","lastTransitionTime":"2025-12-01T10:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.189731 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.189794 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.189812 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.189835 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.189869 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:15Z","lastTransitionTime":"2025-12-01T10:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.294048 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.294104 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.294125 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.294148 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.294164 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:15Z","lastTransitionTime":"2025-12-01T10:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.397577 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.397620 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.397630 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.397653 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.397662 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:15Z","lastTransitionTime":"2025-12-01T10:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.500409 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.500456 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.500465 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.500486 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.500496 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:15Z","lastTransitionTime":"2025-12-01T10:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.602880 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.602926 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.602936 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.602956 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.602967 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:15Z","lastTransitionTime":"2025-12-01T10:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.610488 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovnkube-controller/2.log" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.613818 4958 scope.go:117] "RemoveContainer" containerID="362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b" Dec 01 10:00:15 crc kubenswrapper[4958]: E1201 10:00:15.614088 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\"" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.625413 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.641288 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.654527 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.670114 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:15 crc 
kubenswrapper[4958]: I1201 10:00:15.686302 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.701242 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.708521 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.708563 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.708575 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.708594 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.708608 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:15Z","lastTransitionTime":"2025-12-01T10:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.715305 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.728923 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.743672 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.754828 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.772557 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.792382 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362a7bca7d9b89f9c7abd69f4f98e1c249ee027d
d73f52ec8db297cf1152a57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:13Z\\\",\\\"message\\\":\\\":00:13.744888 6597 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:00:13.744944 6597 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:00:13.746161 6597 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 10:00:13.746211 6597 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 10:00:13.746241 6597 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 10:00:13.746254 6597 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:00:13.746277 6597 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 10:00:13.746293 6597 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 10:00:13.746309 6597 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:00:13.746324 6597 factory.go:656] Stopping watch factory\\\\nI1201 10:00:13.746340 6597 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:00:13.746375 6597 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:00:13.746375 6597 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:00:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.797500 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.797559 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.797595 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:15 crc kubenswrapper[4958]: E1201 10:00:15.797689 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:15 crc kubenswrapper[4958]: E1201 10:00:15.797921 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:15 crc kubenswrapper[4958]: E1201 10:00:15.797986 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.798035 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:15 crc kubenswrapper[4958]: E1201 10:00:15.798167 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.804308 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 
10:00:15.811039 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.811122 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.811136 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.811155 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.811199 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:15Z","lastTransitionTime":"2025-12-01T10:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.815034 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"669faf62-d047-4858-aabc-919f9decc4cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c86d33dc457240b5c41df28d2a66575e809398a39ec20393f9fe7ac3c8bf6b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e90aae1e7541f31f237a924d640715a3fb520b6dcae23764c95c936bf7f53321\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf650ba434e831572b4456cb8d9c93e348e74cb531e2b8cba926fb58527cae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.824756 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.839354 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.851273 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:15Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.914616 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.914673 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.914688 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.914723 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:15 crc kubenswrapper[4958]: I1201 10:00:15.914736 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:15Z","lastTransitionTime":"2025-12-01T10:00:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.018125 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.018189 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.018204 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.018227 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.018243 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:16Z","lastTransitionTime":"2025-12-01T10:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.120714 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.120762 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.120774 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.120794 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.120808 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:16Z","lastTransitionTime":"2025-12-01T10:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.223670 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.223714 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.223724 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.223742 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.223753 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:16Z","lastTransitionTime":"2025-12-01T10:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.326708 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.326775 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.326785 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.326806 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.326817 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:16Z","lastTransitionTime":"2025-12-01T10:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.430229 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.430795 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.430811 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.430833 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.430866 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:16Z","lastTransitionTime":"2025-12-01T10:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.533707 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.533763 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.533774 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.533793 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.533804 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:16Z","lastTransitionTime":"2025-12-01T10:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.636774 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.636812 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.636825 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.636873 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.636887 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:16Z","lastTransitionTime":"2025-12-01T10:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.739478 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.739511 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.739521 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.739537 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.739547 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:16Z","lastTransitionTime":"2025-12-01T10:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.842302 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.842344 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.842355 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.842372 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.842384 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:16Z","lastTransitionTime":"2025-12-01T10:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.945258 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.945294 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.945306 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.945322 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:16 crc kubenswrapper[4958]: I1201 10:00:16.945331 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:16Z","lastTransitionTime":"2025-12-01T10:00:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.048047 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.048121 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.048132 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.048376 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.048392 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:17Z","lastTransitionTime":"2025-12-01T10:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.151251 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.151422 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.151494 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.151534 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.151563 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:17Z","lastTransitionTime":"2025-12-01T10:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.255716 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.255774 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.255791 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.255815 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.255827 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:17Z","lastTransitionTime":"2025-12-01T10:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.358389 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.358450 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.358461 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.358480 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.358493 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:17Z","lastTransitionTime":"2025-12-01T10:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.461624 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.461684 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.461697 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.461722 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.461736 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:17Z","lastTransitionTime":"2025-12-01T10:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.565258 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.565337 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.565353 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.565383 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.565400 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:17Z","lastTransitionTime":"2025-12-01T10:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.668336 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.668400 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.668411 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.668429 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.668485 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:17Z","lastTransitionTime":"2025-12-01T10:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.771333 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.771394 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.771418 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.771444 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.771459 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:17Z","lastTransitionTime":"2025-12-01T10:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.796871 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.796995 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:17 crc kubenswrapper[4958]: E1201 10:00:17.797036 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.796871 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.797153 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:17 crc kubenswrapper[4958]: E1201 10:00:17.797208 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:17 crc kubenswrapper[4958]: E1201 10:00:17.797449 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:17 crc kubenswrapper[4958]: E1201 10:00:17.797606 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.874654 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.874714 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.874734 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.874754 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.874769 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:17Z","lastTransitionTime":"2025-12-01T10:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.977524 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.977570 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.977582 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.977603 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:17 crc kubenswrapper[4958]: I1201 10:00:17.977614 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:17Z","lastTransitionTime":"2025-12-01T10:00:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.081290 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.081355 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.081369 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.081391 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.081405 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:18Z","lastTransitionTime":"2025-12-01T10:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.184812 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.184897 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.184914 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.184933 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.184945 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:18Z","lastTransitionTime":"2025-12-01T10:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.288239 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.288298 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.288309 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.288328 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.288368 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:18Z","lastTransitionTime":"2025-12-01T10:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.391456 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.391508 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.391524 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.391544 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.391554 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:18Z","lastTransitionTime":"2025-12-01T10:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.494790 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.494974 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.494999 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.495034 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.495063 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:18Z","lastTransitionTime":"2025-12-01T10:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.598721 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.598786 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.598802 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.598823 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.598855 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:18Z","lastTransitionTime":"2025-12-01T10:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.701998 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.702072 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.702089 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.702112 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.702128 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:18Z","lastTransitionTime":"2025-12-01T10:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.806031 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.806223 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.806261 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.806297 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.806319 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:18Z","lastTransitionTime":"2025-12-01T10:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.909331 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.909404 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.909414 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.909431 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:18 crc kubenswrapper[4958]: I1201 10:00:18.909441 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:18Z","lastTransitionTime":"2025-12-01T10:00:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.012649 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.012714 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.012725 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.012744 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.012759 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:19Z","lastTransitionTime":"2025-12-01T10:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.115049 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.115124 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.115140 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.115182 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.115195 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:19Z","lastTransitionTime":"2025-12-01T10:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.218438 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.218514 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.218531 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.218553 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.218563 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:19Z","lastTransitionTime":"2025-12-01T10:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.322274 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.322603 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.322620 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.322641 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.322654 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:19Z","lastTransitionTime":"2025-12-01T10:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.425283 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.425328 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.425340 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.425364 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.425375 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:19Z","lastTransitionTime":"2025-12-01T10:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.528613 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.528666 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.528680 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.528703 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.528716 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:19Z","lastTransitionTime":"2025-12-01T10:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.631674 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.631741 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.631754 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.631774 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.631787 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:19Z","lastTransitionTime":"2025-12-01T10:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.734143 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.734208 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.734227 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.734251 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.734266 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:19Z","lastTransitionTime":"2025-12-01T10:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.797604 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.797693 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.797621 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:19 crc kubenswrapper[4958]: E1201 10:00:19.797943 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:19 crc kubenswrapper[4958]: E1201 10:00:19.798058 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:19 crc kubenswrapper[4958]: E1201 10:00:19.798192 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.798444 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:19 crc kubenswrapper[4958]: E1201 10:00:19.798600 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.836970 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.837030 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.837040 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.837059 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.837072 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:19Z","lastTransitionTime":"2025-12-01T10:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.940940 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.941315 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.941465 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.941587 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:19 crc kubenswrapper[4958]: I1201 10:00:19.941734 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:19Z","lastTransitionTime":"2025-12-01T10:00:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.044813 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.044894 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.044908 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.044931 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.044948 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:20Z","lastTransitionTime":"2025-12-01T10:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.148074 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.148467 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.148560 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.148658 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.148749 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:20Z","lastTransitionTime":"2025-12-01T10:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.251635 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.251675 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.251684 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.251702 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.251714 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:20Z","lastTransitionTime":"2025-12-01T10:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.354555 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.354606 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.354624 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.354652 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.354668 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:20Z","lastTransitionTime":"2025-12-01T10:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.458007 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.458340 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.458416 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.458495 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.458613 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:20Z","lastTransitionTime":"2025-12-01T10:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.561411 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.561455 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.561465 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.561481 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.561491 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:20Z","lastTransitionTime":"2025-12-01T10:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.664943 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.664979 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.664988 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.665004 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.665013 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:20Z","lastTransitionTime":"2025-12-01T10:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.769115 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.769474 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.769559 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.769637 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.769700 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:20Z","lastTransitionTime":"2025-12-01T10:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.872931 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.872975 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.872987 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.873005 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.873015 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:20Z","lastTransitionTime":"2025-12-01T10:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.975547 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.975611 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.975629 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.975654 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:20 crc kubenswrapper[4958]: I1201 10:00:20.975674 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:20Z","lastTransitionTime":"2025-12-01T10:00:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.079006 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.079070 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.079085 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.079113 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.079129 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:21Z","lastTransitionTime":"2025-12-01T10:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.182803 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.182890 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.182904 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.182928 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.182944 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:21Z","lastTransitionTime":"2025-12-01T10:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.285760 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.286501 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.286584 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.286690 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.286759 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:21Z","lastTransitionTime":"2025-12-01T10:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.389474 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.389897 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.389983 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.390071 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.390137 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:21Z","lastTransitionTime":"2025-12-01T10:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.493435 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.493505 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.493520 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.493542 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.493556 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:21Z","lastTransitionTime":"2025-12-01T10:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.596652 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.596691 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.596699 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.596713 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.596723 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:21Z","lastTransitionTime":"2025-12-01T10:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.699988 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.700047 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.700060 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.700080 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.700094 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:21Z","lastTransitionTime":"2025-12-01T10:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.797144 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.797240 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.797312 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.797198 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:21 crc kubenswrapper[4958]: E1201 10:00:21.797445 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:21 crc kubenswrapper[4958]: E1201 10:00:21.797379 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:21 crc kubenswrapper[4958]: E1201 10:00:21.797627 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:21 crc kubenswrapper[4958]: E1201 10:00:21.797737 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.803209 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.803284 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.803299 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.803317 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.803332 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:21Z","lastTransitionTime":"2025-12-01T10:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.906764 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.907194 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.907272 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.907341 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:21 crc kubenswrapper[4958]: I1201 10:00:21.907410 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:21Z","lastTransitionTime":"2025-12-01T10:00:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.010492 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.010555 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.010579 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.010606 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.010616 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:22Z","lastTransitionTime":"2025-12-01T10:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.113454 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.113520 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.113535 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.113562 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.113572 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:22Z","lastTransitionTime":"2025-12-01T10:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.216501 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.217033 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.217177 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.217312 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.217433 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:22Z","lastTransitionTime":"2025-12-01T10:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.320150 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.320205 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.320220 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.320238 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.320251 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:22Z","lastTransitionTime":"2025-12-01T10:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.422280 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.422327 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.422342 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.422364 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.422375 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:22Z","lastTransitionTime":"2025-12-01T10:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.524982 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.525365 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.525467 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.525557 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.525640 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:22Z","lastTransitionTime":"2025-12-01T10:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.629095 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.629459 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.629554 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.629687 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.629755 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:22Z","lastTransitionTime":"2025-12-01T10:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.733025 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.733478 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.733589 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.733690 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.733838 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:22Z","lastTransitionTime":"2025-12-01T10:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.837237 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.837332 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.837344 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.837363 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.837376 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:22Z","lastTransitionTime":"2025-12-01T10:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.940367 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.940437 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.940449 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.940473 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:22 crc kubenswrapper[4958]: I1201 10:00:22.940487 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:22Z","lastTransitionTime":"2025-12-01T10:00:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.043795 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.043874 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.043888 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.043910 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.043924 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:23Z","lastTransitionTime":"2025-12-01T10:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.146775 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.146822 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.146837 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.146881 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.146893 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:23Z","lastTransitionTime":"2025-12-01T10:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.250824 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.250897 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.250908 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.250927 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.250937 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:23Z","lastTransitionTime":"2025-12-01T10:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.353710 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.353758 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.353767 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.353799 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.353809 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:23Z","lastTransitionTime":"2025-12-01T10:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.457424 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.457487 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.457500 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.457523 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.457535 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:23Z","lastTransitionTime":"2025-12-01T10:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.561159 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.561211 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.561226 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.561258 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.561269 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:23Z","lastTransitionTime":"2025-12-01T10:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.664398 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.664467 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.664483 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.664504 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.664515 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:23Z","lastTransitionTime":"2025-12-01T10:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.767912 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.767970 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.767981 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.768000 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.768013 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:23Z","lastTransitionTime":"2025-12-01T10:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.796874 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.796922 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.796921 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.796872 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:23 crc kubenswrapper[4958]: E1201 10:00:23.797047 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:23 crc kubenswrapper[4958]: E1201 10:00:23.797135 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:23 crc kubenswrapper[4958]: E1201 10:00:23.797207 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:23 crc kubenswrapper[4958]: E1201 10:00:23.797265 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.814454 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.828176 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.843862 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.858608 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.873026 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.873074 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.873085 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.873103 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.873115 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:23Z","lastTransitionTime":"2025-12-01T10:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.873565 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.894162 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.912113 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:13Z\\\",\\\"message\\\":\\\":00:13.744888 6597 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:00:13.744944 6597 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:00:13.746161 6597 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 10:00:13.746211 6597 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 10:00:13.746241 6597 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 10:00:13.746254 6597 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:00:13.746277 6597 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 10:00:13.746293 6597 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 10:00:13.746309 6597 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:00:13.746324 6597 factory.go:656] Stopping watch factory\\\\nI1201 10:00:13.746340 6597 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:00:13.746375 6597 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:00:13.746375 6597 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:00:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.924037 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.934937 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"669faf62-d047-4858-aabc-919f9decc4cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c86d33dc457240b5c41df28d2a66575e809398a39ec20393f9fe7ac3c8bf6b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e90aae1e7541f31f237a924d640715a3fb520b6dcae23764c95c936bf7f53321\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf650ba434e831572b4456cb8d9c93e348e74cb531e2b8cba926fb58527cae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440
c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.947257 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.958074 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.972080 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-i
o\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.982744 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.982794 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.982805 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.982828 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.982856 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:23Z","lastTransitionTime":"2025-12-01T10:00:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.987275 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:23 crc kubenswrapper[4958]: I1201 10:00:23.999765 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:23Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:24 crc 
kubenswrapper[4958]: I1201 10:00:24.015182 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.032033 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.042893 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.086056 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.086105 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.086115 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.086134 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.086146 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:24Z","lastTransitionTime":"2025-12-01T10:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.190074 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.193168 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.193293 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.193377 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.193476 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:24Z","lastTransitionTime":"2025-12-01T10:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.297940 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.298008 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.298022 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.298045 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.298067 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:24Z","lastTransitionTime":"2025-12-01T10:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.401331 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.401389 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.401405 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.401424 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.401440 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:24Z","lastTransitionTime":"2025-12-01T10:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.476588 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs\") pod \"network-metrics-daemon-6b9wz\" (UID: \"987c6a26-52be-40a5-b9cc-456d9731436f\") " pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:24 crc kubenswrapper[4958]: E1201 10:00:24.476803 4958 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:00:24 crc kubenswrapper[4958]: E1201 10:00:24.476917 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs podName:987c6a26-52be-40a5-b9cc-456d9731436f nodeName:}" failed. No retries permitted until 2025-12-01 10:00:56.476893194 +0000 UTC m=+103.985682241 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs") pod "network-metrics-daemon-6b9wz" (UID: "987c6a26-52be-40a5-b9cc-456d9731436f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.504772 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.504887 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.504910 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.504941 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.504953 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:24Z","lastTransitionTime":"2025-12-01T10:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.607801 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.607866 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.607877 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.607895 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.607906 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:24Z","lastTransitionTime":"2025-12-01T10:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.710893 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.710987 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.711002 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.711024 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.711037 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:24Z","lastTransitionTime":"2025-12-01T10:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.814310 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.814365 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.814377 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.814392 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.814405 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:24Z","lastTransitionTime":"2025-12-01T10:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.855035 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.855110 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.855125 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.855151 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.855169 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:24Z","lastTransitionTime":"2025-12-01T10:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:24 crc kubenswrapper[4958]: E1201 10:00:24.869044 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.875424 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.875499 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.875515 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.875671 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.875703 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:24Z","lastTransitionTime":"2025-12-01T10:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:24 crc kubenswrapper[4958]: E1201 10:00:24.890624 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.895083 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.895147 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.895158 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.895177 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.895188 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:24Z","lastTransitionTime":"2025-12-01T10:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:24 crc kubenswrapper[4958]: E1201 10:00:24.909793 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.915058 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.915111 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.915123 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.915144 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.915153 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:24Z","lastTransitionTime":"2025-12-01T10:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:24 crc kubenswrapper[4958]: E1201 10:00:24.928748 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.933388 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.933443 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.933454 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.933473 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.933488 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:24Z","lastTransitionTime":"2025-12-01T10:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:24 crc kubenswrapper[4958]: E1201 10:00:24.946262 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:24Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:24 crc kubenswrapper[4958]: E1201 10:00:24.946439 4958 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.948575 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
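The retry loop above keeps failing for a single reason: TLS verification of the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 rejects its serving certificate, whose notAfter (2025-08-24T17:21:41Z) is months before the node's clock (2025-12-01T10:00:24Z). Below is a minimal diagnostic sketch of that comparison; it is not part of the log or of the kubelet, and it assumes the third-party cryptography package is installed.

import ssl
from datetime import datetime, timezone
from cryptography import x509

HOST, PORT = "127.0.0.1", 9743  # webhook endpoint taken from the log lines above

# Fetch the serving certificate without chain verification, then compare its
# notAfter to the current clock -- the same comparison the x509 error reports.
pem = ssl.get_server_certificate((HOST, PORT))
cert = x509.load_pem_x509_certificate(pem.encode())
not_after = cert.not_valid_after_utc  # cryptography >= 42; older releases expose not_valid_after (naive UTC)
now = datetime.now(timezone.utc)
print(f"notAfter={not_after.isoformat()} now={now.isoformat()}")
print("EXPIRED" if now > not_after else "valid")

On this node the sketch would print EXPIRED, matching the "certificate has expired or is not yet valid" failure carried by every patch attempt.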
event="NodeHasSufficientMemory" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.948645 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.948657 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.948676 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:24 crc kubenswrapper[4958]: I1201 10:00:24.948690 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:24Z","lastTransitionTime":"2025-12-01T10:00:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.052114 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.052178 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.052190 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.052214 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.052227 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:25Z","lastTransitionTime":"2025-12-01T10:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.155131 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.155182 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.155197 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.155217 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.155230 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:25Z","lastTransitionTime":"2025-12-01T10:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.257954 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.258008 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.258024 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.258047 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.258060 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:25Z","lastTransitionTime":"2025-12-01T10:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.360547 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.360591 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.360602 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.360621 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.360633 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:25Z","lastTransitionTime":"2025-12-01T10:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.464143 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.464187 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.464200 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.464222 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.464235 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:25Z","lastTransitionTime":"2025-12-01T10:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.567164 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.567221 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.567234 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.567255 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.567268 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:25Z","lastTransitionTime":"2025-12-01T10:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.669374 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.669428 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.669443 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.669462 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.669478 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:25Z","lastTransitionTime":"2025-12-01T10:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.772063 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.772129 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.772142 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.772164 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.772174 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:25Z","lastTransitionTime":"2025-12-01T10:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.797292 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.797399 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.797394 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.797519 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:25 crc kubenswrapper[4958]: E1201 10:00:25.797677 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:25 crc kubenswrapper[4958]: E1201 10:00:25.797881 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:25 crc kubenswrapper[4958]: E1201 10:00:25.798029 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:25 crc kubenswrapper[4958]: E1201 10:00:25.798189 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.878820 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.878924 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.878950 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.878980 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.878995 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:25Z","lastTransitionTime":"2025-12-01T10:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.981864 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.981937 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.981947 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.981965 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:25 crc kubenswrapper[4958]: I1201 10:00:25.981976 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:25Z","lastTransitionTime":"2025-12-01T10:00:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.223909 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.223968 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.223980 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.224002 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.224014 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:27Z","lastTransitionTime":"2025-12-01T10:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.326749 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.326827 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.326838 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.326875 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.326887 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:27Z","lastTransitionTime":"2025-12-01T10:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.429805 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.429882 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.429896 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.429914 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.429926 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:27Z","lastTransitionTime":"2025-12-01T10:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.532670 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.532727 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.532737 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.532761 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.532773 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:27Z","lastTransitionTime":"2025-12-01T10:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.636046 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.636117 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.636130 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.636159 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.636172 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:27Z","lastTransitionTime":"2025-12-01T10:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.738836 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.738950 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.738961 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.738982 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.738994 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:27Z","lastTransitionTime":"2025-12-01T10:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.797103 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.797183 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.797209 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.797133 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 10:00:27 crc kubenswrapper[4958]: E1201 10:00:27.797314 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f"
Dec 01 10:00:27 crc kubenswrapper[4958]: E1201 10:00:27.797415 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 10:00:27 crc kubenswrapper[4958]: E1201 10:00:27.797531 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 10:00:27 crc kubenswrapper[4958]: E1201 10:00:27.797627 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.842377 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.842470 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.842485 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.842516 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.842530 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:27Z","lastTransitionTime":"2025-12-01T10:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
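The four "Error syncing pod" entries above are downstream symptoms: the kubelet cannot create a sandbox for network-metrics-daemon-6b9wz, network-check-source, network-check-target, or networking-console-plugin until a CNI configuration exists. A minimal sketch for confirming their state from outside the node, assuming the Python kubernetes client (pip install kubernetes) and a kubeconfig that reaches this cluster; the pod names and namespaces are copied from the log entries above:

    # Sketch: print phase and Ready condition for the pods the kubelet
    # reports as failing to sync. Assumes kubeconfig access to this cluster.
    from kubernetes import client, config

    config.load_kube_config()
    v1 = client.CoreV1Api()

    PODS = [
        ("openshift-multus", "network-metrics-daemon-6b9wz"),
        ("openshift-network-diagnostics", "network-check-source-55646444c4-trplf"),
        ("openshift-network-diagnostics", "network-check-target-xd92c"),
        ("openshift-network-console", "networking-console-plugin-85b44fc459-gdk6g"),
    ]

    for ns, name in PODS:
        pod = v1.read_namespaced_pod(name, ns)
        # conditions can be None before the pod has ever been scheduled/synced
        ready = next((c for c in (pod.status.conditions or []) if c.type == "Ready"), None)
        print(f"{ns}/{name}: phase={pod.status.phase}, "
              f"ready={ready.status if ready else 'unknown'}, "
              f"reason={ready.reason if ready else ''}")

Until the network provider writes its CNI config, each of these pods would be expected to report Ready=False; there is nothing to fix on the pods themselves.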
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.945633 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.945690 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.945702 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.945721 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:27 crc kubenswrapper[4958]: I1201 10:00:27.945733 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:27Z","lastTransitionTime":"2025-12-01T10:00:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.049067 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.049116 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.049128 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.049149 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.049161 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:28Z","lastTransitionTime":"2025-12-01T10:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.152058 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.152103 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.152114 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.152133 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.152151 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:28Z","lastTransitionTime":"2025-12-01T10:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.255164 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.255221 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.255233 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.255249 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.255258 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:28Z","lastTransitionTime":"2025-12-01T10:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.358077 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.358131 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.358143 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.358162 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.358174 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:28Z","lastTransitionTime":"2025-12-01T10:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.461698 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.461765 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.461782 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.461803 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.461816 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:28Z","lastTransitionTime":"2025-12-01T10:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.564551 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.564600 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.564611 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.564635 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.564655 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:28Z","lastTransitionTime":"2025-12-01T10:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.667715 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.667878 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.667902 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.667928 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.667943 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:28Z","lastTransitionTime":"2025-12-01T10:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.770724 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.770769 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.770780 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.770798 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.770808 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:28Z","lastTransitionTime":"2025-12-01T10:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.799273 4958 scope.go:117] "RemoveContainer" containerID="362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b"
Dec 01 10:00:28 crc kubenswrapper[4958]: E1201 10:00:28.799953 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\"" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20"
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.874060 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.874109 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.874120 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.874137 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.874149 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:28Z","lastTransitionTime":"2025-12-01T10:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.976964 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.977026 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.977044 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.977069 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:28 crc kubenswrapper[4958]: I1201 10:00:28.977083 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:28Z","lastTransitionTime":"2025-12-01T10:00:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
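The RemoveContainer / CrashLoopBackOff pair above points at the network provider itself: the ovnkube-controller container in ovnkube-node-976fz keeps exiting, so no CNI config ever gets written and every NotReady heartbeat that follows is fallout. A sketch, under the same Python-client and kubeconfig assumptions as above, for pulling the restart state and the tail of the previous (crashed) run's log, which usually shows the actual failure:

    # Sketch: inspect the crash-looping ovnkube-controller container named
    # in the log entry above. Pod/container names come from the log itself.
    from kubernetes import client, config

    config.load_kube_config()
    v1 = client.CoreV1Api()

    ns, pod_name = "openshift-ovn-kubernetes", "ovnkube-node-976fz"
    pod = v1.read_namespaced_pod(pod_name, ns)

    for cs in pod.status.container_statuses or []:
        if cs.name != "ovnkube-controller":
            continue
        waiting = cs.state.waiting          # e.g. reason=CrashLoopBackOff
        term = cs.last_state.terminated     # exit code of the previous run
        print(f"restarts={cs.restart_count}, "
              f"waiting={waiting.reason if waiting else None}, "
              f"last_exit={term.exit_code if term else None}")

    # previous=True fetches logs from the crashed invocation, not the backoff
    print(v1.read_namespaced_pod_log(
        pod_name, ns, container="ovnkube-controller",
        previous=True, tail_lines=20))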
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.079705 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.079771 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.079782 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.079801 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.079812 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:29Z","lastTransitionTime":"2025-12-01T10:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.182531 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.182948 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.182970 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.182991 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.183006 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:29Z","lastTransitionTime":"2025-12-01T10:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.286139 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.286198 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.286212 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.286251 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.286263 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:29Z","lastTransitionTime":"2025-12-01T10:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
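Every "Node became not ready" entry cites the same root cause: no CNI configuration file in /etc/kubernetes/cni/net.d/. That is checkable directly on the node (for example from an oc debug node/crc shell). The expectation that ovn-kubernetes drops a config file there once its controller stays up is an assumption based on the "Has your network provider started?" hint, not something the log states:

    # Sketch: run on the node itself; confirms whether any CNI config has
    # been written to the directory the kubelet message complains about.
    import os

    CNI_DIR = "/etc/kubernetes/cni/net.d"
    try:
        entries = sorted(os.listdir(CNI_DIR))
    except FileNotFoundError:
        entries = None

    if not entries:
        print(f"{CNI_DIR}: missing or empty -> kubelet stays NotReady")
    else:
        for name in entries:
            print(os.path.join(CNI_DIR, name))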
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.389184 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.389256 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.389270 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.389293 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.389307 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:29Z","lastTransitionTime":"2025-12-01T10:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.493404 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.493445 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.493457 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.493475 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.493486 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:29Z","lastTransitionTime":"2025-12-01T10:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.600914 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.600969 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.600983 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.601004 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.601018 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:29Z","lastTransitionTime":"2025-12-01T10:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.705049 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.705089 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.705100 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.705118 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.705130 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:29Z","lastTransitionTime":"2025-12-01T10:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.796690 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.796758 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.796715 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 10:00:29 crc kubenswrapper[4958]: E1201 10:00:29.796855 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.796716 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 10:00:29 crc kubenswrapper[4958]: E1201 10:00:29.797004 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f"
Dec 01 10:00:29 crc kubenswrapper[4958]: E1201 10:00:29.797065 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 10:00:29 crc kubenswrapper[4958]: E1201 10:00:29.797192 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.806794 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.807620 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.807698 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.807735 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.807755 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:29Z","lastTransitionTime":"2025-12-01T10:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.910659 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.910727 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.910742 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.910763 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:29 crc kubenswrapper[4958]: I1201 10:00:29.910779 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:29Z","lastTransitionTime":"2025-12-01T10:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.014389 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.014500 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.014531 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.014578 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.014610 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:30Z","lastTransitionTime":"2025-12-01T10:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.118257 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.118317 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.118329 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.118348 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.118360 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:30Z","lastTransitionTime":"2025-12-01T10:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.221981 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.222028 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.222038 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.222058 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.222069 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:30Z","lastTransitionTime":"2025-12-01T10:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.325283 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.325345 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.325363 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.325383 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.325395 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:30Z","lastTransitionTime":"2025-12-01T10:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.427699 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.427744 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.427761 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.427779 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.427789 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:30Z","lastTransitionTime":"2025-12-01T10:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.531899 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.531960 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.531973 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.531995 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.532008 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:30Z","lastTransitionTime":"2025-12-01T10:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.635345 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.635391 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.635404 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.635421 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.635431 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:30Z","lastTransitionTime":"2025-12-01T10:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.738579 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.738624 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.738633 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.738650 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.738661 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:30Z","lastTransitionTime":"2025-12-01T10:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.842285 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.842342 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.842354 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.842374 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.842386 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:30Z","lastTransitionTime":"2025-12-01T10:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.945775 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.945839 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.945877 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.945907 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:30 crc kubenswrapper[4958]: I1201 10:00:30.945930 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:30Z","lastTransitionTime":"2025-12-01T10:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.049419 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.049503 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.049519 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.049551 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.049572 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:31Z","lastTransitionTime":"2025-12-01T10:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.152429 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.152487 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.152498 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.152516 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.152528 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:31Z","lastTransitionTime":"2025-12-01T10:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.254907 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.254964 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.254980 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.255001 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.255016 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:31Z","lastTransitionTime":"2025-12-01T10:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.358320 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.358361 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.358370 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.358384 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.358394 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:31Z","lastTransitionTime":"2025-12-01T10:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.460736 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.460793 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.460804 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.460823 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.460833 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:31Z","lastTransitionTime":"2025-12-01T10:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.563874 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.563935 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.563946 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.563964 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.563974 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:31Z","lastTransitionTime":"2025-12-01T10:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.666808 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.666898 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.666920 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.666947 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.666967 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:31Z","lastTransitionTime":"2025-12-01T10:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.769836 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.769922 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.769944 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.769971 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.769991 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:31Z","lastTransitionTime":"2025-12-01T10:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.797155 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.797155 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.797184 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.797628 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:31 crc kubenswrapper[4958]: E1201 10:00:31.797640 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:31 crc kubenswrapper[4958]: E1201 10:00:31.797815 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:31 crc kubenswrapper[4958]: E1201 10:00:31.797954 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:31 crc kubenswrapper[4958]: E1201 10:00:31.798149 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.812249 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.872632 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.872728 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.872753 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.872786 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.872807 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:31Z","lastTransitionTime":"2025-12-01T10:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.975826 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.976255 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.976333 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.976409 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:31 crc kubenswrapper[4958]: I1201 10:00:31.976489 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:31Z","lastTransitionTime":"2025-12-01T10:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.080094 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.080633 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.080806 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.081014 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.081160 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:32Z","lastTransitionTime":"2025-12-01T10:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.183725 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.183771 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.183782 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.183803 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.183815 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:32Z","lastTransitionTime":"2025-12-01T10:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.287544 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.288026 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.288136 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.288264 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.288361 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:32Z","lastTransitionTime":"2025-12-01T10:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.391131 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.391504 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.391604 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.391706 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.391858 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:32Z","lastTransitionTime":"2025-12-01T10:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.494484 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.494531 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.494542 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.494559 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.494569 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:32Z","lastTransitionTime":"2025-12-01T10:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.598165 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.598251 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.598274 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.598309 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.598331 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:32Z","lastTransitionTime":"2025-12-01T10:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.701499 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.701540 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.701552 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.701575 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.701591 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:32Z","lastTransitionTime":"2025-12-01T10:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.805093 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.805149 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.805163 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.805183 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.805196 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:32Z","lastTransitionTime":"2025-12-01T10:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.908652 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.908703 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.908713 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.908732 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:32 crc kubenswrapper[4958]: I1201 10:00:32.908742 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:32Z","lastTransitionTime":"2025-12-01T10:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.011236 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.011297 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.011311 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.011333 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.011350 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:33Z","lastTransitionTime":"2025-12-01T10:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.114169 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.114217 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.114228 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.114249 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.114261 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:33Z","lastTransitionTime":"2025-12-01T10:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.216792 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.216894 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.216912 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.216936 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.216953 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:33Z","lastTransitionTime":"2025-12-01T10:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.320180 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.320253 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.320269 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.320289 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.320301 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:33Z","lastTransitionTime":"2025-12-01T10:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.423412 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.423470 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.423483 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.423501 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.423512 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:33Z","lastTransitionTime":"2025-12-01T10:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.526643 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.526712 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.526729 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.526756 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.526775 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:33Z","lastTransitionTime":"2025-12-01T10:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.629113 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.629148 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.629158 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.629174 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.629183 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:33Z","lastTransitionTime":"2025-12-01T10:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.731980 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.732036 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.732050 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.732079 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.732098 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:33Z","lastTransitionTime":"2025-12-01T10:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.798308 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:33 crc kubenswrapper[4958]: E1201 10:00:33.798530 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.798637 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.798808 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:33 crc kubenswrapper[4958]: E1201 10:00:33.798802 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:33 crc kubenswrapper[4958]: E1201 10:00:33.798890 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.798946 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:33 crc kubenswrapper[4958]: E1201 10:00:33.799026 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.813806 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.826532 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.835115 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.835162 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.835173 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.835191 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.835201 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:33Z","lastTransitionTime":"2025-12-01T10:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.845638 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.860644 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.877458 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.892908 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.905562 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.921021 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.932496 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74916556-b3e2-41dd-8290-cc3060ad0910\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abda4376bb42f96914bc34670c8ce69ebbe304266b9d68a0965ee329021cf84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241d5a9bef99759092f4f1c042f8489f196853a0817159c8f5c47f86b9173856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241d5a9bef99759092f4f1c042f8489f196853a0817159c8f5c47f86b9173856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.939500 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.939540 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.939554 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.939574 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.939589 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:33Z","lastTransitionTime":"2025-12-01T10:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.945416 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.955795 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.969248 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.990352 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362a7bca7d9b89f9c7abd69f4f98e1c249ee027d
d73f52ec8db297cf1152a57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:13Z\\\",\\\"message\\\":\\\":00:13.744888 6597 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:00:13.744944 6597 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:00:13.746161 6597 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 10:00:13.746211 6597 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 10:00:13.746241 6597 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 10:00:13.746254 6597 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:00:13.746277 6597 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 10:00:13.746293 6597 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 10:00:13.746309 6597 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:00:13.746324 6597 factory.go:656] Stopping watch factory\\\\nI1201 10:00:13.746340 6597 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:00:13.746375 6597 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:00:13.746375 6597 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:00:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:33 crc kubenswrapper[4958]: I1201 10:00:33.999600 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:33Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.009058 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"669faf62-d047-4858-aabc-919f9decc4cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c86d33dc457240b5c41df28d2a66575e809398a39ec20393f9fe7ac3c8bf6b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e90aae1e7541f31f237a924d640715a3fb520b6dcae23764c95c936bf7f53321\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf650ba434e831572b4456cb8d9c93e348e74cb531e2b8cba926fb58527cae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440
c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.019645 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 
10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.029907 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.041628 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.041683 4958 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.041696 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.041716 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.041727 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:34Z","lastTransitionTime":"2025-12-01T10:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.042536 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\
\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.143713 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.143780 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.143798 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.143824 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.143836 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:34Z","lastTransitionTime":"2025-12-01T10:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.246816 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.246897 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.246914 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.246934 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.246949 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:34Z","lastTransitionTime":"2025-12-01T10:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.350529 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.350574 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.350585 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.350606 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.350616 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:34Z","lastTransitionTime":"2025-12-01T10:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.453418 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.453451 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.453460 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.453475 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.453483 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:34Z","lastTransitionTime":"2025-12-01T10:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.555739 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.555791 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.555803 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.555864 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.555876 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:34Z","lastTransitionTime":"2025-12-01T10:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.658081 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.658152 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.658168 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.658191 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.658205 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:34Z","lastTransitionTime":"2025-12-01T10:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.679684 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7z6wb_46276a58-9607-4a8a-bcfc-ca41ab441ec2/kube-multus/0.log" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.679777 4958 generic.go:334] "Generic (PLEG): container finished" podID="46276a58-9607-4a8a-bcfc-ca41ab441ec2" containerID="7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c" exitCode=1 Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.679872 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7z6wb" event={"ID":"46276a58-9607-4a8a-bcfc-ca41ab441ec2","Type":"ContainerDied","Data":"7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c"} Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.680630 4958 scope.go:117] "RemoveContainer" containerID="7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.699035 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.711897 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.726087 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.740481 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.758594 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.760721 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.760802 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.760833 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.760949 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.761758 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:34Z","lastTransitionTime":"2025-12-01T10:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.769582 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.783116 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 
2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.800583 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e
e2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:13Z\\\",\\\"message\\\":\\\":00:13.744888 6597 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:00:13.744944 6597 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:00:13.746161 6597 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 10:00:13.746211 6597 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 10:00:13.746241 6597 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 10:00:13.746254 6597 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:00:13.746277 6597 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 10:00:13.746293 6597 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 10:00:13.746309 6597 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:00:13.746324 6597 factory.go:656] Stopping watch factory\\\\nI1201 10:00:13.746340 6597 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:00:13.746375 6597 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:00:13.746375 6597 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:00:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.810323 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.822254 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"669faf62-d047-4858-aabc-919f9decc4cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c86d33dc457240b5c41df28d2a66575e809398a39ec20393f9fe7ac3c8bf6b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e90aae1e7541f31f237a924d640715a3fb520b6dcae23764c95c936bf7f53321\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf650ba434e831572b4456cb8d9c93e348e74cb531e2b8cba926fb58527cae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440
c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.833030 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74916556-b3e2-41dd-8290-cc3060ad0910\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abda4376bb42f96914bc34670c8ce69ebbe304266b9d68a0965ee329021cf84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241d5a9bef99759092f4f1c042f8489f196853a0817159c8f5c47f86b9173856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241d5a9bef99759092f4f1c042f8489f196853a0817159c8f5c47f86b9173856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.842671 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.854543 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:33Z\\\",\\\"message\\\":\\\"2025-12-01T09:59:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dfa7b07d-3168-4b70-af00-2256ac06b521\\\\n2025-12-01T09:59:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dfa7b07d-3168-4b70-af00-2256ac06b521 to /host/opt/cni/bin/\\\\n2025-12-01T09:59:48Z [verbose] multus-daemon started\\\\n2025-12-01T09:59:48Z [verbose] Readiness Indicator file check\\\\n2025-12-01T10:00:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.867825 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.867887 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.867920 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.867956 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.867970 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:34Z","lastTransitionTime":"2025-12-01T10:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.874304 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuberne
tes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.886006 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.899690 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.910743 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.919148 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:34Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.970749 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 
10:00:34.970786 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.970815 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.970834 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:34 crc kubenswrapper[4958]: I1201 10:00:34.970854 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:34Z","lastTransitionTime":"2025-12-01T10:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.000347 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.000400 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.000418 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.000438 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.000450 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:35Z","lastTransitionTime":"2025-12-01T10:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:35 crc kubenswrapper[4958]: E1201 10:00:35.020226 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.024827 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.024876 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.024887 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.024905 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.024916 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:35Z","lastTransitionTime":"2025-12-01T10:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:35 crc kubenswrapper[4958]: E1201 10:00:35.036172 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.041057 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.041141 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.041160 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.041695 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.041728 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:35Z","lastTransitionTime":"2025-12-01T10:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:35 crc kubenswrapper[4958]: E1201 10:00:35.055083 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.061564 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.061631 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.061646 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.061672 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.061696 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:35Z","lastTransitionTime":"2025-12-01T10:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:35 crc kubenswrapper[4958]: E1201 10:00:35.074000 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.077942 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.077971 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.077981 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.077996 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.078004 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:35Z","lastTransitionTime":"2025-12-01T10:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:35 crc kubenswrapper[4958]: E1201 10:00:35.091015 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: E1201 10:00:35.091142 4958 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.092954 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.092986 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.092999 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.093020 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.093032 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:35Z","lastTransitionTime":"2025-12-01T10:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.195902 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.195935 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.195945 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.195964 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.195973 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:35Z","lastTransitionTime":"2025-12-01T10:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.299019 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.299065 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.299080 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.299108 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.299123 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:35Z","lastTransitionTime":"2025-12-01T10:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.402307 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.402355 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.402368 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.402390 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.402402 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:35Z","lastTransitionTime":"2025-12-01T10:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.505781 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.505818 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.505829 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.505869 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.505887 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:35Z","lastTransitionTime":"2025-12-01T10:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.609111 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.609499 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.609511 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.609527 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.609537 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:35Z","lastTransitionTime":"2025-12-01T10:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.685923 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7z6wb_46276a58-9607-4a8a-bcfc-ca41ab441ec2/kube-multus/0.log" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.685992 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7z6wb" event={"ID":"46276a58-9607-4a8a-bcfc-ca41ab441ec2","Type":"ContainerStarted","Data":"a6b4bf86c08b1005a58bbd3b9ef7e02e72d8650e4b71d869e9db857ac91dd20e"} Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.703221 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.717210 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.717276 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.717296 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.717322 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.717355 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:35Z","lastTransitionTime":"2025-12-01T10:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.722458 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.737004 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.750225 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.761139 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.774085 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.785619 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.797484 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.798209 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.798410 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: E1201 10:00:35.798494 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.798571 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.798603 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:35 crc kubenswrapper[4958]: E1201 10:00:35.798653 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:35 crc kubenswrapper[4958]: E1201 10:00:35.798701 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:35 crc kubenswrapper[4958]: E1201 10:00:35.798756 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.812428 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db77
08c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.820142 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.820174 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.820182 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.820198 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.820209 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:35Z","lastTransitionTime":"2025-12-01T10:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.829250 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:13Z\\\",\\\"message\\\":\\\":00:13.744888 6597 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:00:13.744944 6597 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:00:13.746161 6597 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 10:00:13.746211 6597 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 10:00:13.746241 6597 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 10:00:13.746254 6597 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:00:13.746277 6597 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 10:00:13.746293 6597 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 10:00:13.746309 6597 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:00:13.746324 6597 factory.go:656] Stopping watch factory\\\\nI1201 10:00:13.746340 6597 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:00:13.746375 6597 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:00:13.746375 6597 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:00:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.839751 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.853197 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"669faf62-d047-4858-aabc-919f9decc4cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c86d33dc457240b5c41df28d2a66575e809398a39ec20393f9fe7ac3c8bf6b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e90aae1e7541f31f237a924d640715a3fb520b6dcae23764c95c936bf7f53321\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf650ba434e831572b4456cb8d9c93e348e74cb531e2b8cba926fb58527cae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440
c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.866340 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74916556-b3e2-41dd-8290-cc3060ad0910\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abda4376bb42f96914bc34670c8ce69ebbe304266b9d68a0965ee329021cf84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241d5a9bef99759092f4f1c042f8489f196853a0817159c8f5c47f86b9173856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241d5a9bef99759092f4f1c042f8489f196853a0817159c8f5c47f86b9173856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.877273 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.887829 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.901227 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6b4bf86c08b1005a58bbd3b9ef7e02e72d8650e4b71d869e9db857ac91dd20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:33Z\\\",\\\"message\\\":\\\"2025-12-01T09:59:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dfa7b07d-3168-4b70-af00-2256ac06b521\\\\n2025-12-01T09:59:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dfa7b07d-3168-4b70-af00-2256ac06b521 to /host/opt/cni/bin/\\\\n2025-12-01T09:59:48Z [verbose] multus-daemon started\\\\n2025-12-01T09:59:48Z [verbose] Readiness Indicator file check\\\\n2025-12-01T10:00:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.911833 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 
10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.922289 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:35Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.923220 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.923270 4958 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.923283 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.923302 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:35 crc kubenswrapper[4958]: I1201 10:00:35.923314 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:35Z","lastTransitionTime":"2025-12-01T10:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.026944 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.026996 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.027011 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.027032 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.027044 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:36Z","lastTransitionTime":"2025-12-01T10:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.130155 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.130204 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.130216 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.130239 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.130252 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:36Z","lastTransitionTime":"2025-12-01T10:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.233579 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.233642 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.233654 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.233673 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.233704 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:36Z","lastTransitionTime":"2025-12-01T10:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.337230 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.337277 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.337288 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.337306 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.337319 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:36Z","lastTransitionTime":"2025-12-01T10:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.440011 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.440068 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.440086 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.440113 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.440127 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:36Z","lastTransitionTime":"2025-12-01T10:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.543475 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.543572 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.543589 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.543631 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.543645 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:36Z","lastTransitionTime":"2025-12-01T10:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.646129 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.646179 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.646203 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.646223 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.646235 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:36Z","lastTransitionTime":"2025-12-01T10:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.749584 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.749655 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.749678 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.749705 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.749724 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:36Z","lastTransitionTime":"2025-12-01T10:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.852505 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.852549 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.852560 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.852603 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.852615 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:36Z","lastTransitionTime":"2025-12-01T10:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.955646 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.955718 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.955732 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.955764 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:36 crc kubenswrapper[4958]: I1201 10:00:36.955781 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:36Z","lastTransitionTime":"2025-12-01T10:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.059429 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.059494 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.059508 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.059532 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.059547 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:37Z","lastTransitionTime":"2025-12-01T10:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.162626 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.162688 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.162702 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.162720 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.162731 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:37Z","lastTransitionTime":"2025-12-01T10:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.265712 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.265789 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.265817 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.265907 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.265942 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:37Z","lastTransitionTime":"2025-12-01T10:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.370001 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.370084 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.370108 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.370138 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.370159 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:37Z","lastTransitionTime":"2025-12-01T10:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.474333 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.474435 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.474456 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.474486 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.474514 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:37Z","lastTransitionTime":"2025-12-01T10:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.579138 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.579270 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.579292 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.579326 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.579346 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:37Z","lastTransitionTime":"2025-12-01T10:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.682350 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.682410 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.682421 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.682440 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.682454 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:37Z","lastTransitionTime":"2025-12-01T10:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.804526 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:37 crc kubenswrapper[4958]: E1201 10:00:37.804721 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.804986 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:37 crc kubenswrapper[4958]: E1201 10:00:37.805061 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.805209 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:37 crc kubenswrapper[4958]: E1201 10:00:37.805265 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.805619 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.805957 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.805991 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.806006 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:37 crc kubenswrapper[4958]: E1201 10:00:37.805968 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.806025 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.806172 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:37Z","lastTransitionTime":"2025-12-01T10:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.909220 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.909271 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.909283 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.909305 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:37 crc kubenswrapper[4958]: I1201 10:00:37.909317 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:37Z","lastTransitionTime":"2025-12-01T10:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:38 crc kubenswrapper[4958]: I1201 10:00:38.012037 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:38 crc kubenswrapper[4958]: I1201 10:00:38.012092 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:38 crc kubenswrapper[4958]: I1201 10:00:38.012102 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:38 crc kubenswrapper[4958]: I1201 10:00:38.012121 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:38 crc kubenswrapper[4958]: I1201 10:00:38.012131 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:38Z","lastTransitionTime":"2025-12-01T10:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:38 crc kubenswrapper[4958]: [kubelet_node_status.go:724 "Recording event message for node" events (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady) and the setters.go:603 "Node became not ready" KubeletNotReady condition repeated every ~100 ms from 10:00:38.114 through 10:00:39.566; entries identical to those above except for timestamps]
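The loop above is the kubelet republishing the runtime's NetworkReady=false status while /etc/kubernetes/cni/net.d/ is still empty; the condition clears once the network operator writes a CNI config there. A minimal sketch of that readiness probe, assuming only that the runtime accepts any *.conf, *.conflist, or *.json file in that directory the way libcni-based runtimes do:

#!/usr/bin/env python3
# Sketch: approximate the "is CNI configured yet?" check behind the
# NetworkReady=false condition in the log above. Assumes the runtime,
# like libcni, looks for *.conf/*.conflist/*.json in the conf dir.
import glob
import os

CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"  # path taken from the log

def cni_ready(conf_dir: str = CNI_CONF_DIR) -> bool:
    patterns = ("*.conf", "*.conflist", "*.json")
    files = [f for p in patterns for f in glob.glob(os.path.join(conf_dir, p))]
    return len(files) > 0

if __name__ == "__main__":
    if cni_ready():
        print("NetworkReady=true: CNI configuration present")
    else:
        print("NetworkReady=false: no CNI configuration file in", CNI_CONF_DIR)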
Dec 01 10:00:39 crc kubenswrapper[4958]: I1201 10:00:39.652533 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:00:39 crc kubenswrapper[4958]: E1201 10:00:39.652926 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:43.652800905 +0000 UTC m=+151.161589972 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:00:39 crc kubenswrapper[4958]: [node-status events and the "Node became not ready" condition repeated at 10:00:39.669, identical except for timestamps]
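The TearDown failure above is not a mount problem as such: the kubelet cannot find kubevirt.io.hostpath-provisioner among its registered CSI drivers, which on a node that is still starting up usually means the driver pod has not re-registered its plugin socket yet. A sketch that lists which drivers have registered, assuming the default kubelet plugin-registration layout (/var/lib/kubelet/plugins_registry is an assumption, not read from this host):

#!/usr/bin/env python3
# Sketch: list CSI drivers that have registered with the kubelet by
# inspecting the plugin-registration sockets. The TearDownAt error in
# the log means kubevirt.io.hostpath-provisioner has no socket here
# yet, so the unmount backs off (1m4s) until the driver re-registers.
import os
import stat

REGISTRY = "/var/lib/kubelet/plugins_registry"  # assumed default path

def registered_plugins(registry: str = REGISTRY) -> list[str]:
    names = []
    for entry in sorted(os.listdir(registry)):
        path = os.path.join(registry, entry)
        # Drivers register by creating a unix socket in this directory.
        if stat.S_ISSOCK(os.stat(path).st_mode):
            names.append(entry)
    return names

if __name__ == "__main__":
    for name in registered_plugins():
        print(name)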
Dec 01 10:00:39 crc kubenswrapper[4958]: [node-status events and the "Node became not ready" condition repeated at 10:00:39.772, identical except for timestamps] Dec 01 10:00:39 crc kubenswrapper[4958]: I1201 10:00:39.797335 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:39 crc kubenswrapper[4958]: E1201 10:00:39.797917 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:39 crc kubenswrapper[4958]: I1201 10:00:39.797480 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:39 crc kubenswrapper[4958]: E1201 10:00:39.798155 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:39 crc kubenswrapper[4958]: I1201 10:00:39.797448 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:39 crc kubenswrapper[4958]: E1201 10:00:39.798425 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:39 crc kubenswrapper[4958]: I1201 10:00:39.797491 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:39 crc kubenswrapper[4958]: E1201 10:00:39.798695 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:39 crc kubenswrapper[4958]: I1201 10:00:39.855719 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:39 crc kubenswrapper[4958]: I1201 10:00:39.855799 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:39 crc kubenswrapper[4958]: E1201 10:00:39.855994 4958 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:00:39 crc kubenswrapper[4958]: E1201 10:00:39.856091 4958 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:00:39 crc kubenswrapper[4958]: E1201 10:00:39.856114 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:01:43.856084493 +0000 UTC m=+151.364873550 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 10:00:39 crc kubenswrapper[4958]: E1201 10:00:39.856278 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 10:01:43.856243638 +0000 UTC m=+151.365032755 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 10:00:39 crc kubenswrapper[4958]: I1201 10:00:39.875930 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:39 crc kubenswrapper[4958]: I1201 10:00:39.875981 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:39 crc kubenswrapper[4958]: I1201 10:00:39.875992 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:39 crc kubenswrapper[4958]: I1201 10:00:39.876013 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:39 crc kubenswrapper[4958]: I1201 10:00:39.876025 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:39Z","lastTransitionTime":"2025-12-01T10:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:39 crc kubenswrapper[4958]: I1201 10:00:39.957138 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:39 crc kubenswrapper[4958]: I1201 10:00:39.957972 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:39 crc kubenswrapper[4958]: E1201 10:00:39.957445 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:00:39 crc kubenswrapper[4958]: E1201 10:00:39.958236 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:00:39 crc kubenswrapper[4958]: E1201 10:00:39.958113 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 10:00:39 crc kubenswrapper[4958]: E1201 10:00:39.958476 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 10:00:39 crc kubenswrapper[4958]: E1201 10:00:39.958524 4958 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:00:39 crc kubenswrapper[4958]: E1201 10:00:39.958655 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 10:01:43.958610919 +0000 UTC m=+151.467399996 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:00:39 crc kubenswrapper[4958]: E1201 10:00:39.958355 4958 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:00:39 crc kubenswrapper[4958]: E1201 10:00:39.958731 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 10:01:43.958717522 +0000 UTC m=+151.467506599 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 10:00:39 crc kubenswrapper[4958]: [node-status events and the "Node became not ready" condition repeated at 10:00:39.979, identical except for timestamps]
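The recurring durationBeforeRetry 1m4s above is kubelet's exponential backoff for failed volume operations: the delay doubles on each failure, and 64s is the eighth step. A sketch of that schedule under assumed defaults (500 ms initial delay, factor 2, 2m2s cap; not read from this build):

#!/usr/bin/env python3
# Sketch: reproduce the exponential backoff behind "durationBeforeRetry
# 1m4s". Assumed defaults, not read from this kubelet build: 500 ms
# initial delay, doubling per failure, capped at 2m2s.
INITIAL, FACTOR, CAP = 0.5, 2.0, 122.0  # seconds

def backoff_schedule(failures: int) -> list[float]:
    delays, d = [], INITIAL
    for _ in range(failures):
        delays.append(min(d, CAP))
        d *= FACTOR
    return delays

if __name__ == "__main__":
    for i, d in enumerate(backoff_schedule(9), start=1):
        print(f"failure {i}: retry in {d:g}s")
    # failure 8 prints 64s, i.e. the 1m4s seen in the log.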
Dec 01 10:00:40 crc kubenswrapper[4958]: [node-status events (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady) and the "Node became not ready" KubeletNotReady condition repeated every ~100 ms from 10:00:40.082 through 10:00:41.538; entries identical to those above except for timestamps]
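With the CNI config still missing, every affected pod cycles through the same pair of messages: a new sandbox is needed, then the sync is skipped because the network is not ready. A small sketch that tallies which pods are stuck in that loop by parsing journal lines in exactly the format above:

#!/usr/bin/env python3
# Sketch: tally "Error syncing pod, skipping" journal entries like the
# ones above by pod, to see which pods are stuck waiting for CNI.
# Reads journal text on stdin; the regex matches this log's format.
import re
import sys
from collections import Counter

PAT = re.compile(r'"Error syncing pod, skipping".*?pod="([^"]+)"')

def tally(stream) -> Counter:
    counts = Counter()
    for line in stream:
        for pod in PAT.findall(line):
            counts[pod] += 1
    return counts

if __name__ == "__main__":
    # Usage: journalctl -u kubelet | python3 tally_pod_sync_errors.py
    for pod, n in tally(sys.stdin).most_common():
        print(f"{n:4d}  {pod}")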
Dec 01 10:00:41 crc kubenswrapper[4958]: [node-status events and the "Node became not ready" condition repeated at 10:00:41.640 and 10:00:41.744, identical except for timestamps] Dec 01 10:00:41 crc kubenswrapper[4958]: I1201 10:00:41.797245 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:41 crc kubenswrapper[4958]: I1201 10:00:41.797387 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:41 crc kubenswrapper[4958]: E1201 10:00:41.797432 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:41 crc kubenswrapper[4958]: I1201 10:00:41.797245 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:41 crc kubenswrapper[4958]: E1201 10:00:41.797681 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:41 crc kubenswrapper[4958]: E1201 10:00:41.797736 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:41 crc kubenswrapper[4958]: I1201 10:00:41.797273 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:41 crc kubenswrapper[4958]: E1201 10:00:41.797802 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:41 crc kubenswrapper[4958]: I1201 10:00:41.847235 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:41 crc kubenswrapper[4958]: I1201 10:00:41.847317 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:41 crc kubenswrapper[4958]: I1201 10:00:41.847341 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:41 crc kubenswrapper[4958]: I1201 10:00:41.847372 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:41 crc kubenswrapper[4958]: I1201 10:00:41.847392 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:41Z","lastTransitionTime":"2025-12-01T10:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:41 crc kubenswrapper[4958]: I1201 10:00:41.950614 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:41 crc kubenswrapper[4958]: I1201 10:00:41.950771 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:41 crc kubenswrapper[4958]: I1201 10:00:41.950793 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:41 crc kubenswrapper[4958]: I1201 10:00:41.950818 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:41 crc kubenswrapper[4958]: I1201 10:00:41.950832 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:41Z","lastTransitionTime":"2025-12-01T10:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.054101 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.054173 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.054193 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.054221 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.054241 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:42Z","lastTransitionTime":"2025-12-01T10:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.157488 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.157561 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.157571 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.157592 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.157602 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:42Z","lastTransitionTime":"2025-12-01T10:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.261251 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.261309 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.261321 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.261345 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.261356 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:42Z","lastTransitionTime":"2025-12-01T10:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.365121 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.365189 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.365208 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.365277 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.365317 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:42Z","lastTransitionTime":"2025-12-01T10:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.467939 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.467985 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.467995 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.468012 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.468023 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:42Z","lastTransitionTime":"2025-12-01T10:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.572059 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.572143 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.572165 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.572194 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.572216 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:42Z","lastTransitionTime":"2025-12-01T10:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.675549 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.675637 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.675658 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.675690 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.675714 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:42Z","lastTransitionTime":"2025-12-01T10:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.778349 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.778423 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.778437 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.778459 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.778472 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:42Z","lastTransitionTime":"2025-12-01T10:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.881630 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.881696 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.881717 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.881749 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.881771 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:42Z","lastTransitionTime":"2025-12-01T10:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.984322 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.984361 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.984372 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.984389 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:42 crc kubenswrapper[4958]: I1201 10:00:42.984399 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:42Z","lastTransitionTime":"2025-12-01T10:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.087664 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.087722 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.087735 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.087754 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.087769 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:43Z","lastTransitionTime":"2025-12-01T10:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.191003 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.191055 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.191069 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.191086 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.191098 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:43Z","lastTransitionTime":"2025-12-01T10:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.293364 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.293412 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.293427 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.293455 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.293469 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:43Z","lastTransitionTime":"2025-12-01T10:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.395732 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.395785 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.395798 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.395814 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.395824 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:43Z","lastTransitionTime":"2025-12-01T10:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.498584 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.498651 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.498667 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.498695 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.498709 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:43Z","lastTransitionTime":"2025-12-01T10:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.601593 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.601653 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.601668 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.601733 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.601755 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:43Z","lastTransitionTime":"2025-12-01T10:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.704981 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.705031 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.705042 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.705059 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.705072 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:43Z","lastTransitionTime":"2025-12-01T10:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.796899 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.797092 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:43 crc kubenswrapper[4958]: E1201 10:00:43.797120 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.797227 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.797233 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:43 crc kubenswrapper[4958]: E1201 10:00:43.797353 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:43 crc kubenswrapper[4958]: E1201 10:00:43.797588 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:43 crc kubenswrapper[4958]: E1201 10:00:43.797642 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
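The "Node became not ready" entries above embed the node's Ready condition as a JSON object. As a minimal, self-contained Go sketch (illustrative only, not kubelet code; the struct mirrors just the fields visible in the logged payload), decoding one of those condition objects looks like this:

package main

import (
	"encoding/json"
	"fmt"
)

// NodeCondition mirrors only the JSON keys visible in the logged condition;
// the real k8s.io/api/core/v1.NodeCondition uses the same field names.
type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Payload copied verbatim from the 10:00:41.640907 setters.go entry above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:41Z","lastTransitionTime":"2025-12-01T10:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c NodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	// Prints: Ready=False (KubeletNotReady): container runtime network not ready: ...
	fmt.Printf("%s=%s (%s): %s\n", c.Type, c.Status, c.Reason, c.Message)
}

The kubelet keeps republishing this same condition until the network plugin writes a CNI config into /etc/kubernetes/cni/net.d/, which is why the block repeats with only the timestamps changing.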
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.798500 4958 scope.go:117] "RemoveContainer" containerID="362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.807907 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.808009 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.808027 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.808078 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.808092 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:43Z","lastTransitionTime":"2025-12-01T10:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.814476 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.830936 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.846359 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.866236 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362a7bca7d9b89f9c7abd69f4f98e1c249ee027d
d73f52ec8db297cf1152a57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:13Z\\\",\\\"message\\\":\\\":00:13.744888 6597 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:00:13.744944 6597 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:00:13.746161 6597 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 10:00:13.746211 6597 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 10:00:13.746241 6597 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 10:00:13.746254 6597 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:00:13.746277 6597 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 10:00:13.746293 6597 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 10:00:13.746309 6597 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:00:13.746324 6597 factory.go:656] Stopping watch factory\\\\nI1201 10:00:13.746340 6597 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:00:13.746375 6597 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:00:13.746375 6597 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:00:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.878399 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.893022 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"669faf62-d047-4858-aabc-919f9decc4cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c86d33dc457240b5c41df28d2a66575e809398a39ec20393f9fe7ac3c8bf6b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e90aae1e7541f31f237a924d640715a3fb520b6dcae23764c95c936bf7f53321\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf650ba434e831572b4456cb8d9c93e348e74cb531e2b8cba926fb58527cae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440
c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.906434 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74916556-b3e2-41dd-8290-cc3060ad0910\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abda4376bb42f96914bc34670c8ce69ebbe304266b9d68a0965ee329021cf84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241d5a9bef99759092f4f1c042f8489f196853a0817159c8f5c47f86b9173856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241d5a9bef99759092f4f1c042f8489f196853a0817159c8f5c47f86b9173856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.910938 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.910978 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.910991 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.911010 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.911023 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:43Z","lastTransitionTime":"2025-12-01T10:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.917749 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.930596 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6b4bf86c08b1005a58bbd3b9ef7e02e72d8650e4b71d869e9db857ac91dd20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:33Z\\\",\\\"message\\\":\\\"2025-12-01T09:59:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dfa7b07d-3168-4b70-af00-2256ac06b521\\\\n2025-12-01T09:59:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dfa7b07d-3168-4b70-af00-2256ac06b521 to /host/opt/cni/bin/\\\\n2025-12-01T09:59:48Z [verbose] multus-daemon started\\\\n2025-12-01T09:59:48Z [verbose] Readiness Indicator file check\\\\n2025-12-01T10:00:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.942227 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:43Z is after 2025-08-24T17:21:41Z" Dec 01 
10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.952542 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.965875 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.977739 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:43 crc kubenswrapper[4958]: I1201 10:00:43.990866 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:43Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.005124 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.014047 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.014088 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.014099 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.014118 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.014131 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:44Z","lastTransitionTime":"2025-12-01T10:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.017948 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.034077 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.048533 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.116655 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.116699 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.116711 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.116736 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.116750 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:44Z","lastTransitionTime":"2025-12-01T10:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.220034 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.220094 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.220110 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.220131 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.220147 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:44Z","lastTransitionTime":"2025-12-01T10:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.323480 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.323734 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.323751 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.323769 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.323781 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:44Z","lastTransitionTime":"2025-12-01T10:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.431982 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.432045 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.432059 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.432084 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.432098 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:44Z","lastTransitionTime":"2025-12-01T10:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.538324 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.538414 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.538432 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.538469 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.538486 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:44Z","lastTransitionTime":"2025-12-01T10:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.641384 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.641431 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.641445 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.641465 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.641475 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:44Z","lastTransitionTime":"2025-12-01T10:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.721877 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovnkube-controller/2.log" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.725336 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerStarted","Data":"ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e"} Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.726558 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.739976 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.744281 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.744344 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.744365 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.744394 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.744415 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:44Z","lastTransitionTime":"2025-12-01T10:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.758572 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.768909 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.783145 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.795635 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.815263 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.827900 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.838154 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.848116 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.848166 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.848183 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.848202 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.848215 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:44Z","lastTransitionTime":"2025-12-01T10:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.853526 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"669faf62-d047-4858-aabc-919f9decc4cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c86d33dc457240b5c41df28d2a66575e809398a39ec20393f9fe7ac3c8bf6b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e90aae1e7541f31f237a924d640715a3fb520b6dcae23764c95c936bf7f53321\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf650ba434e831572b4456cb8d9c93e348e74cb531e2b8cba926fb58527cae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.865642 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74916556-b3e2-41dd-8290-cc3060ad0910\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abda4376bb42f96914bc34670c8ce69ebbe304266b9d68a0965ee329021cf84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241d5a9bef99759092f4f1c04
2f8489f196853a0817159c8f5c47f86b9173856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241d5a9bef99759092f4f1c042f8489f196853a0817159c8f5c47f86b9173856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.892055 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.909008 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.925409 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.944194 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d
8ea05c94cf47a4fb2d372f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:13Z\\\",\\\"message\\\":\\\":00:13.744888 6597 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:00:13.744944 6597 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:00:13.746161 6597 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 10:00:13.746211 6597 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 10:00:13.746241 6597 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 10:00:13.746254 6597 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:00:13.746277 6597 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 10:00:13.746293 6597 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 10:00:13.746309 6597 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:00:13.746324 6597 factory.go:656] Stopping watch factory\\\\nI1201 10:00:13.746340 6597 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:00:13.746375 6597 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:00:13.746375 6597 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:00:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.951137 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.951196 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.951215 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.951236 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.951248 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:44Z","lastTransitionTime":"2025-12-01T10:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.955716 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.969955 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6b4bf86c08b1005a58bbd3b9ef7e02e72d8650e4b71d869e9db857ac91dd20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:33Z\\\",\\\"message\\\":\\\"2025-12-01T09:59:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dfa7b07d-3168-4b70-af00-2256ac06b521\\\\n2025-12-01T09:59:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dfa7b07d-3168-4b70-af00-2256ac06b521 to /host/opt/cni/bin/\\\\n2025-12-01T09:59:48Z [verbose] multus-daemon started\\\\n2025-12-01T09:59:48Z [verbose] Readiness Indicator file check\\\\n2025-12-01T10:00:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:44 crc kubenswrapper[4958]: I1201 10:00:44.985525 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 
10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.000655 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:44Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.055283 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.055559 4958 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.055573 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.055594 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.055607 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:45Z","lastTransitionTime":"2025-12-01T10:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.159601 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.159683 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.159697 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.159719 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.159731 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:45Z","lastTransitionTime":"2025-12-01T10:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.214455 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.214504 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.214514 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.214530 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.214540 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:45Z","lastTransitionTime":"2025-12-01T10:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:45 crc kubenswrapper[4958]: E1201 10:00:45.231774 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.236815 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.236889 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.236904 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.236929 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.236946 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:45Z","lastTransitionTime":"2025-12-01T10:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:45 crc kubenswrapper[4958]: E1201 10:00:45.250501 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.254944 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.255000 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.255014 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.255037 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.255048 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:45Z","lastTransitionTime":"2025-12-01T10:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:45 crc kubenswrapper[4958]: E1201 10:00:45.268800 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.273074 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.273136 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.273147 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.273174 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.273188 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:45Z","lastTransitionTime":"2025-12-01T10:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:45 crc kubenswrapper[4958]: E1201 10:00:45.284790 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.288045 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.288096 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.288108 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.288126 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.288136 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:45Z","lastTransitionTime":"2025-12-01T10:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:45 crc kubenswrapper[4958]: E1201 10:00:45.300395 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"150e7b6c-bfd8-4984-9394-004cd9e4353a\\\",\\\"systemUUID\\\":\\\"dc41de36-78ed-40fc-8073-01c3beb6f3e3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: E1201 10:00:45.300554 4958 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.302543 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
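Analysis: every node-status retry above fails for the same reason. The node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 serves a certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-01T10:00:45Z, so each Post fails TLS verification and the kubelet gives up once it exceeds its retry count. A minimal way to confirm the expiry from the node, assuming the openssl CLI is available on the host (a sketch, not part of this log):

    # fetch the certificate served on the webhook port and print its validity window
    openssl s_client -connect 127.0.0.1:9743 </dev/null 2>/dev/null | openssl x509 -noout -dates

On OpenShift Local/CRC, an expired internal certificate like this usually indicates the bundled cluster was left stopped past its certificate lifetime; the node cannot report Ready until the certificates are renewed.

Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.302543 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc"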
event="NodeHasSufficientMemory" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.302582 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.302598 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.302620 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.302633 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:45Z","lastTransitionTime":"2025-12-01T10:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.405520 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.405576 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.405586 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.405605 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.405617 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:45Z","lastTransitionTime":"2025-12-01T10:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.509104 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.509576 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.509588 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.509606 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.509618 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:45Z","lastTransitionTime":"2025-12-01T10:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.613055 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.613111 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.613122 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.613143 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.613154 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:45Z","lastTransitionTime":"2025-12-01T10:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.716544 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.716607 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.716619 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.716643 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.716658 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:45Z","lastTransitionTime":"2025-12-01T10:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.732315 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovnkube-controller/3.log" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.733407 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovnkube-controller/2.log" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.737974 4958 generic.go:334] "Generic (PLEG): container finished" podID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerID="ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e" exitCode=1 Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.738032 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerDied","Data":"ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e"} Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.738128 4958 scope.go:117] "RemoveContainer" containerID="362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.740336 4958 scope.go:117] "RemoveContainer" containerID="ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e" Dec 01 10:00:45 crc kubenswrapper[4958]: E1201 10:00:45.744256 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\"" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20"
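Analysis: the PLEG events above show why the CNI configuration is missing. The ovnkube-controller container (ID ccbd0df0bc30c155...) exited with code 1 and is now in CrashLoopBackOff with a 40s back-off; OVN-Kubernetes is the network plugin that writes the config into /etc/kubernetes/cni/net.d/, so that directory stays empty and the node stays NotReady while the container crash-loops. Two checks one could run from the host (a sketch; assumes crictl is configured against the node's CRI-O socket):

    # confirm the kubelet's complaint: the CNI config directory is empty
    ls -l /etc/kubernetes/cni/net.d/
    # read the crashed container's output; the ID comes from the ContainerDied event above
    crictl logs ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e

Given the webhook failures recorded earlier, the likely root cause is the same expired certificate rather than an independent OVN fault.

Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.757694 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 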
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74916556-b3e2-41dd-8290-cc3060ad0910\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abda4376bb42f96914bc34670c8ce69ebbe304266b9d68a0965ee329021cf84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241d5a9bef99759092f4f1c042f8489f196853a0817159c8f5c47f86b9173856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241d5a9bef99759092f4f1c042f8489f196853a0817159c8f5c47f86b9173856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.772061 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.785670 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.796768 4958 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.796807 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:45 crc kubenswrapper[4958]: E1201 10:00:45.796958 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.796986 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:45 crc kubenswrapper[4958]: E1201 10:00:45.797123 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.797148 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:45 crc kubenswrapper[4958]: E1201 10:00:45.797273 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:45 crc kubenswrapper[4958]: E1201 10:00:45.797338 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.801279 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"container
ID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 
10:00:45.819981 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.820032 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.820044 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.820063 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.820075 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:45Z","lastTransitionTime":"2025-12-01T10:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.824405 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d
8ea05c94cf47a4fb2d372f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362a7bca7d9b89f9c7abd69f4f98e1c249ee027dd73f52ec8db297cf1152a57b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:13Z\\\",\\\"message\\\":\\\":00:13.744888 6597 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:00:13.744944 6597 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1201 10:00:13.746161 6597 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 10:00:13.746211 6597 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 10:00:13.746241 6597 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 10:00:13.746254 6597 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 10:00:13.746277 6597 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 10:00:13.746293 6597 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 10:00:13.746309 6597 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 10:00:13.746324 6597 factory.go:656] Stopping watch factory\\\\nI1201 10:00:13.746340 6597 ovnkube.go:599] Stopped ovnkube\\\\nI1201 10:00:13.746375 6597 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 10:00:13.746375 6597 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:00:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"dns/node-resolver-tsq6f openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-machine-config-operator/kube-rbac-proxy-crio-crc openshift-multus/multus-7z6wb openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-6b9wz]\\\\nI1201 10:00:45.433630 6987 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1201 10:00:45.433645 6987 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z]\\\\nI1201 10:00:45.433629 6987 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:00:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"co
ntainerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.838097 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.854352 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"669faf62-d047-4858-aabc-919f9decc4cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c86d33dc457240b5c41df28d2a66575e809398a39ec20393f9fe7ac3c8bf6b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e90aae1e7541f31f237a924d640715a3fb520b6dcae23764c95c936bf7f53321\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf650ba434e831572b4456cb8d9c93e348e74cb531e2b8cba926fb58527cae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.867519 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.878711 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.891425 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6b4bf86c08b1005a58bbd3b9ef7e02e72d8650e4b71d869e9db857ac91dd20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:33Z\\\",\\\"message\\\":\\\"2025-12-01T09:59:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dfa7b07d-3168-4b70-af00-2256ac06b521\\\\n2025-12-01T09:59:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dfa7b07d-3168-4b70-af00-2256ac06b521 to /host/opt/cni/bin/\\\\n2025-12-01T09:59:48Z [verbose] multus-daemon started\\\\n2025-12-01T09:59:48Z [verbose] Readiness Indicator file check\\\\n2025-12-01T10:00:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.906305 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.919724 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.923068 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.923147 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.923198 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.923220 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.923231 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:45Z","lastTransitionTime":"2025-12-01T10:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.936038 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.950452 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.966673 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.980635 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:45 crc kubenswrapper[4958]: I1201 10:00:45.992430 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.005529 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.026676 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.026720 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.026730 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.026746 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.026759 4958 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:46Z","lastTransitionTime":"2025-12-01T10:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.129526 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.129574 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.129589 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.129611 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.129625 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:46Z","lastTransitionTime":"2025-12-01T10:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.232330 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.232427 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.232440 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.232462 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.232475 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:46Z","lastTransitionTime":"2025-12-01T10:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.335369 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.335420 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.335431 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.335450 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.335460 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:46Z","lastTransitionTime":"2025-12-01T10:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.438259 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.438323 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.438336 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.438358 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.438371 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:46Z","lastTransitionTime":"2025-12-01T10:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.541460 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.541506 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.541518 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.541570 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.541582 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:46Z","lastTransitionTime":"2025-12-01T10:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.644691 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.644796 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.644811 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.644878 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.644895 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:46Z","lastTransitionTime":"2025-12-01T10:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.743834 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovnkube-controller/3.log" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.747339 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.747404 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.747421 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.747441 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.747647 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:46Z","lastTransitionTime":"2025-12-01T10:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.752415 4958 scope.go:117] "RemoveContainer" containerID="ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e" Dec 01 10:00:46 crc kubenswrapper[4958]: E1201 10:00:46.752836 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\"" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.768075 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.786721 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e02b5affaec222dd3155dcac34346e0fbfd3a57500a2199e85248f637d3ffde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1315162e21f76b2db63b70632eea127feb9d892425c15b265392b37a5461ee72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.802284 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tsq6f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92bb0597-cb74-4cba-b6f6-e52266b1aa59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8caab4757c7d41eb35580d58573d554b7f368177431ab7f85afeb970d48ab768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wmgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tsq6f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.816880 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a19e0dac-64a6-4b41-80e7-cc90db58399d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:59:36Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:59:29.579019 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:59:29.580160 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1212236861/tls.crt::/tmp/serving-cert-1212236861/tls.key\\\\\\\"\\\\nI1201 09:59:35.703045 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:59:35.705278 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:59:35.705301 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:59:35.705329 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:59:35.705336 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:59:35.714503 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:59:35.714534 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714540 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:59:35.714545 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:59:35.714549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:59:35.714552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:59:35.714556 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:59:35.714730 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:59:35.716642 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.832288 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e9ce6b1-cb1f-4096-a1f5-bbf60cabb854\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2565295d7da7e255a57fb7260f4f718ee668795b4b62a646542657719124a1ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7b49fbb419266621d8a0eaa563b793a09862d34565232952afe8d548860a3e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3322c1dfa679a228ba863fe8b3566ca3451adc50f5892de76dcced0262432ae0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.847123 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc36d7322055df8f97111911c15c9ee274112b3013d98a4be92b1bcf8c7de0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.850985 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.851048 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.851080 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.851099 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.851109 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:46Z","lastTransitionTime":"2025-12-01T10:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.862371 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.875924 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e82da7115c4bed780a1ec282e92d95bde6b4927de33b87aab6ba22a0718006d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.889675 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-htfxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f3ade3-c19a-49ac-a22b-3a4348f374f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0c6196d3ef2a21a7b1f2afa6ad9ee96120248bd87c66255bf11a68b2a086be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz266\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-htfxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.905479 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"669faf62-d047-4858-aabc-919f9decc4cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c86d33dc457240b5c41df28d2a66575e809398a39ec20393f9fe7ac3c8bf6b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e90aae1e7541f31f237a924d640715a3fb520b6dcae23764c95c936bf7f53321\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbf650ba434e831572b4456cb8d9c93e348e74cb531e2b8cba926fb58527cae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c0c071cb1eac45faeee2f564428279f99090fed15bc44ae5ea4e7b298fcc840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.919272 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74916556-b3e2-41dd-8290-cc3060ad0910\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abda4376bb42f96914bc34670c8ce69ebbe304266b9d68a0965ee329021cf84a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://241d5a9bef99759092f4f1c042f8489f196853a0817159c8f5c47f86b9173856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://241d5a9bef99759092f4f1c042f8489f196853a0817159c8f5c47f86b9173856\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.934172 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.947381 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09a41414-b5bf-481a-afdc-b0042f4c78b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bdf39897ba6956f9e2b552ac10bec52afc4c2466a62c6dc3eaf6577c24b9525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6qlxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-prmw7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.952882 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.952921 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.952933 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.952953 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.952966 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:46Z","lastTransitionTime":"2025-12-01T10:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.964151 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4vh77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"00cc61ff-219a-40e4-a0c3-360c456c57f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc43fa2fc566f7c3daa21a331c112dbdd49b8025d8edc1e8a6ffc592efb3965d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336bd3bcd94f675d2b14a71c21c12e997ac26b1e0d127df0108e65ef84b09e46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7da76bc15b56ea1f29eacb2189c7188e1d53f79322f89ab016f811f76d304524\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7211192e27892b6e66358e967777c47871de6c411cc83bbfd8c205fb862afc2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccf58fb02e04efc6d9ddd44a74cb8d189ca55abf8b26d8dc1f44296c2384b173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a5a166f7e9a2104f1a90d4465f1e032e91dd735a912a841b2e6af8942a7b8c4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa39d702ac56573addd5e73e9c4b7dec9865151aeaa0285160678f4a1c626bbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nr7v5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4vh77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.982865 4958 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-976fz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96173cf0-4be1-4ef7-b063-4c93c1731c20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:45Z\\\",\\\"message\\\":\\\"dns/node-resolver-tsq6f openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-machine-config-operator/kube-rbac-proxy-crio-crc openshift-multus/multus-7z6wb openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-6b9wz]\\\\nI1201 10:00:45.433630 6987 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF1201 10:00:45.433645 6987 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:45Z is after 2025-08-24T17:21:41Z]\\\\nI1201 10:00:45.433629 6987 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T10:00:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:59:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rpxpt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-976fz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:46 crc kubenswrapper[4958]: I1201 10:00:46.994986 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7z6wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46276a58-9607-4a8a-bcfc-ca41ab441ec2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T10:00:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6b4bf86c08b1005a58bbd3b9ef7e02e72d8650e4b71d869e9db857ac91dd20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T10:00:33Z\\\",\\\"message\\\":\\\"2025-12-01T09:59:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dfa7b07d-3168-4b70-af00-2256ac06b521\\\\n2025-12-01T09:59:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dfa7b07d-3168-4b70-af00-2256ac06b521 to 
/host/opt/cni/bin/\\\\n2025-12-01T09:59:48Z [verbose] multus-daemon started\\\\n2025-12-01T09:59:48Z [verbose] Readiness Indicator file check\\\\n2025-12-01T10:00:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:59:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T10:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lvm9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7z6wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:46Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.007273 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f88031e-1c6c-4d5a-9648-a64ec5c5147f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34130a383e5f9344e4be10b698be6b29df160bb1cd004e81b4b4eaf7f4a24182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9fa859114d9bc3578164dd7c433bff0ffdc8a85d323df4c04f3f9e35a927a21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:59:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tbglv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rd8vk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:47Z is after 2025-08-24T17:21:41Z" Dec 01 
10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.018236 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"987c6a26-52be-40a5-b9cc-456d9731436f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:59:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj978\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:59:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6b9wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T10:00:47Z is after 2025-08-24T17:21:41Z" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.056235 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.056278 4958 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.056288 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.056305 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.056314 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:47Z","lastTransitionTime":"2025-12-01T10:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.160593 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.160707 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.160725 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.160752 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.160777 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:47Z","lastTransitionTime":"2025-12-01T10:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.264364 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.264451 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.264476 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.264524 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.264550 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:47Z","lastTransitionTime":"2025-12-01T10:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.367787 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.367884 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.367904 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.367935 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.367952 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:47Z","lastTransitionTime":"2025-12-01T10:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.472661 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.472707 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.472718 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.472736 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.472747 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:47Z","lastTransitionTime":"2025-12-01T10:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.576416 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.576487 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.576502 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.576523 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.576534 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:47Z","lastTransitionTime":"2025-12-01T10:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.678798 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.678878 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.678891 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.678912 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.678926 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:47Z","lastTransitionTime":"2025-12-01T10:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.781973 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.782027 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.782043 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.782064 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.782077 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:47Z","lastTransitionTime":"2025-12-01T10:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.797484 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.797608 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:47 crc kubenswrapper[4958]: E1201 10:00:47.797656 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.797701 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.797513 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:47 crc kubenswrapper[4958]: E1201 10:00:47.797822 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:47 crc kubenswrapper[4958]: E1201 10:00:47.798036 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:47 crc kubenswrapper[4958]: E1201 10:00:47.798061 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.884480 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.884559 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.884600 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.884643 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.884667 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:47Z","lastTransitionTime":"2025-12-01T10:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.987471 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.987535 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.987551 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.987574 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:47 crc kubenswrapper[4958]: I1201 10:00:47.987590 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:47Z","lastTransitionTime":"2025-12-01T10:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.090606 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.090682 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.090702 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.090732 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.090749 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:48Z","lastTransitionTime":"2025-12-01T10:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.193649 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.193697 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.193708 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.193726 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.193736 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:48Z","lastTransitionTime":"2025-12-01T10:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.296865 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.296942 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.296957 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.296984 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.297002 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:48Z","lastTransitionTime":"2025-12-01T10:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.399996 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.400070 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.400086 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.400111 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.400129 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:48Z","lastTransitionTime":"2025-12-01T10:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.503335 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.503380 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.503391 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.503409 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.503419 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:48Z","lastTransitionTime":"2025-12-01T10:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.606637 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.606715 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.606744 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.606777 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.606796 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:48Z","lastTransitionTime":"2025-12-01T10:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.710562 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.710612 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.710622 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.710638 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.710647 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:48Z","lastTransitionTime":"2025-12-01T10:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.814364 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.814452 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.814476 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.814513 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.814537 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:48Z","lastTransitionTime":"2025-12-01T10:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.917054 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.917114 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.917124 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.917142 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:48 crc kubenswrapper[4958]: I1201 10:00:48.917155 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:48Z","lastTransitionTime":"2025-12-01T10:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.020814 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.020934 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.020960 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.020989 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.021010 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:49Z","lastTransitionTime":"2025-12-01T10:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.123798 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.123893 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.123910 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.123934 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.123950 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:49Z","lastTransitionTime":"2025-12-01T10:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.226446 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.226490 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.226503 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.226526 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.226538 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:49Z","lastTransitionTime":"2025-12-01T10:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.330493 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.330548 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.330565 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.330618 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.330636 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:49Z","lastTransitionTime":"2025-12-01T10:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.433541 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.433601 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.433612 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.433631 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.433644 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:49Z","lastTransitionTime":"2025-12-01T10:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.537376 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.537444 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.537462 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.537490 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.537514 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:49Z","lastTransitionTime":"2025-12-01T10:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.640826 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.640889 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.640900 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.640916 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.640927 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:49Z","lastTransitionTime":"2025-12-01T10:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.745000 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.745070 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.745091 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.745115 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.745131 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:49Z","lastTransitionTime":"2025-12-01T10:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.797465 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.797617 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.797824 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.797881 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:49 crc kubenswrapper[4958]: E1201 10:00:49.797832 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:49 crc kubenswrapper[4958]: E1201 10:00:49.798019 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:49 crc kubenswrapper[4958]: E1201 10:00:49.798150 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:49 crc kubenswrapper[4958]: E1201 10:00:49.798370 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.853294 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.853372 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.853397 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.853423 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.853438 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:49Z","lastTransitionTime":"2025-12-01T10:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.957113 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.957223 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.957256 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.957295 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:49 crc kubenswrapper[4958]: I1201 10:00:49.957323 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:49Z","lastTransitionTime":"2025-12-01T10:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.060961 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.061017 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.061028 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.061048 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.061061 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:50Z","lastTransitionTime":"2025-12-01T10:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.164545 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.164610 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.164625 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.164647 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.164660 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:50Z","lastTransitionTime":"2025-12-01T10:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.268310 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.268360 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.268379 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.268399 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.268411 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:50Z","lastTransitionTime":"2025-12-01T10:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.371345 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.371398 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.371409 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.371445 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.371457 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:50Z","lastTransitionTime":"2025-12-01T10:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.474701 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.474749 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.474758 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.474779 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.474790 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:50Z","lastTransitionTime":"2025-12-01T10:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.577345 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.577415 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.577427 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.577448 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.577461 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:50Z","lastTransitionTime":"2025-12-01T10:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.680668 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.680735 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.680750 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.680777 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.680790 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:50Z","lastTransitionTime":"2025-12-01T10:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.784994 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.785064 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.785095 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.785122 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.785141 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:50Z","lastTransitionTime":"2025-12-01T10:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.888442 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.888524 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.888540 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.888561 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.888600 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:50Z","lastTransitionTime":"2025-12-01T10:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.991366 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.991416 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.991429 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.991450 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:50 crc kubenswrapper[4958]: I1201 10:00:50.991466 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:50Z","lastTransitionTime":"2025-12-01T10:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.093928 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.093995 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.094009 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.094030 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.094043 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:51Z","lastTransitionTime":"2025-12-01T10:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.196636 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.196699 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.196709 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.196729 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.196739 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:51Z","lastTransitionTime":"2025-12-01T10:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.300149 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.300221 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.300239 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.300261 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.300289 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:51Z","lastTransitionTime":"2025-12-01T10:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.403285 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.403331 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.403343 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.403362 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.403376 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:51Z","lastTransitionTime":"2025-12-01T10:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.507205 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.507261 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.507276 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.507295 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.507306 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:51Z","lastTransitionTime":"2025-12-01T10:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.609995 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.610062 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.610076 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.610098 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.610111 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:51Z","lastTransitionTime":"2025-12-01T10:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.714307 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.714378 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.714391 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.714408 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.714420 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:51Z","lastTransitionTime":"2025-12-01T10:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.797441 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:51 crc kubenswrapper[4958]: E1201 10:00:51.797680 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.797471 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.797501 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.797471 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:51 crc kubenswrapper[4958]: E1201 10:00:51.798265 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:51 crc kubenswrapper[4958]: E1201 10:00:51.798407 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:51 crc kubenswrapper[4958]: E1201 10:00:51.798527 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.816933 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.816986 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.816999 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.817021 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.817034 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:51Z","lastTransitionTime":"2025-12-01T10:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.818164 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.920526 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.920604 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.920622 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.920646 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:51 crc kubenswrapper[4958]: I1201 10:00:51.920659 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:51Z","lastTransitionTime":"2025-12-01T10:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.024270 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.024353 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.024368 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.024392 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.024406 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:52Z","lastTransitionTime":"2025-12-01T10:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.128124 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.128176 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.128187 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.128206 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.128217 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:52Z","lastTransitionTime":"2025-12-01T10:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.232132 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.232184 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.232195 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.232213 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.232225 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:52Z","lastTransitionTime":"2025-12-01T10:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.334781 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.334823 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.334839 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.334883 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.334895 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:52Z","lastTransitionTime":"2025-12-01T10:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.437342 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.437398 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.437414 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.437434 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.437446 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:52Z","lastTransitionTime":"2025-12-01T10:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.540940 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.540999 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.541011 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.541031 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.541048 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:52Z","lastTransitionTime":"2025-12-01T10:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.644141 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.644226 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.644246 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.644275 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.644293 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:52Z","lastTransitionTime":"2025-12-01T10:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.747607 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.747689 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.747711 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.747739 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.747757 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:52Z","lastTransitionTime":"2025-12-01T10:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.851030 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.851083 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.851098 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.851116 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.851126 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:52Z","lastTransitionTime":"2025-12-01T10:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.953869 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.953924 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.953939 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.953962 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:52 crc kubenswrapper[4958]: I1201 10:00:52.953975 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:52Z","lastTransitionTime":"2025-12-01T10:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.058008 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.058089 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.058112 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.058144 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.058167 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:53Z","lastTransitionTime":"2025-12-01T10:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.162650 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.162706 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.162724 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.162744 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.162756 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:53Z","lastTransitionTime":"2025-12-01T10:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.266605 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.266650 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.266662 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.266682 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.266694 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:53Z","lastTransitionTime":"2025-12-01T10:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.370099 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.370185 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.370212 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.370248 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.370273 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:53Z","lastTransitionTime":"2025-12-01T10:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.474012 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.474069 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.474090 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.474117 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.474130 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:53Z","lastTransitionTime":"2025-12-01T10:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.576706 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.576756 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.576769 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.576788 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.576801 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:53Z","lastTransitionTime":"2025-12-01T10:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.679775 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.679822 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.679831 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.679869 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.679879 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:53Z","lastTransitionTime":"2025-12-01T10:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.782010 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.782059 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.782069 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.782088 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.782102 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:53Z","lastTransitionTime":"2025-12-01T10:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.797395 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.797459 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.797524 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:53 crc kubenswrapper[4958]: E1201 10:00:53.797555 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.797773 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:53 crc kubenswrapper[4958]: E1201 10:00:53.797777 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:53 crc kubenswrapper[4958]: E1201 10:00:53.797802 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:53 crc kubenswrapper[4958]: E1201 10:00:53.798066 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.846822 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7z6wb" podStartSLOduration=76.846765462 podStartE2EDuration="1m16.846765462s" podCreationTimestamp="2025-12-01 09:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:00:53.825741866 +0000 UTC m=+101.334530913" watchObservedRunningTime="2025-12-01 10:00:53.846765462 +0000 UTC m=+101.355554539" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.870747 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rd8vk" podStartSLOduration=75.870723416 podStartE2EDuration="1m15.870723416s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:00:53.852030269 +0000 UTC m=+101.360819336" watchObservedRunningTime="2025-12-01 10:00:53.870723416 +0000 UTC m=+101.379512463" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.884475 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.884536 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.884550 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.884571 4958 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.884590 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:53Z","lastTransitionTime":"2025-12-01T10:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.936295 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tsq6f" podStartSLOduration=76.936251409 podStartE2EDuration="1m16.936251409s" podCreationTimestamp="2025-12-01 09:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:00:53.923357165 +0000 UTC m=+101.432146202" watchObservedRunningTime="2025-12-01 10:00:53.936251409 +0000 UTC m=+101.445040446" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.954730 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=77.954709359 podStartE2EDuration="1m17.954709359s" podCreationTimestamp="2025-12-01 09:59:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:00:53.954536924 +0000 UTC m=+101.463325971" watchObservedRunningTime="2025-12-01 10:00:53.954709359 +0000 UTC m=+101.463498396" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.987573 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.987623 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.987637 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.987656 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.987666 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:53Z","lastTransitionTime":"2025-12-01T10:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.991476 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=75.991454324 podStartE2EDuration="1m15.991454324s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:00:53.967933003 +0000 UTC m=+101.476722040" watchObservedRunningTime="2025-12-01 10:00:53.991454324 +0000 UTC m=+101.500243361" Dec 01 10:00:53 crc kubenswrapper[4958]: I1201 10:00:53.992094 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=2.992087743 podStartE2EDuration="2.992087743s" podCreationTimestamp="2025-12-01 10:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:00:53.991537267 +0000 UTC m=+101.500326374" watchObservedRunningTime="2025-12-01 10:00:53.992087743 +0000 UTC m=+101.500876780" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.070145 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-htfxn" podStartSLOduration=77.070118958 podStartE2EDuration="1m17.070118958s" podCreationTimestamp="2025-12-01 09:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:00:54.070108748 +0000 UTC m=+101.578897785" watchObservedRunningTime="2025-12-01 10:00:54.070118958 +0000 UTC m=+101.578907995" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.091026 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.091066 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.091078 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.091096 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.091109 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:54Z","lastTransitionTime":"2025-12-01T10:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.095237 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=49.095212646 podStartE2EDuration="49.095212646s" podCreationTimestamp="2025-12-01 10:00:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:00:54.093956069 +0000 UTC m=+101.602745116" watchObservedRunningTime="2025-12-01 10:00:54.095212646 +0000 UTC m=+101.604001683" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.110827 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=23.110803631 podStartE2EDuration="23.110803631s" podCreationTimestamp="2025-12-01 10:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:00:54.110340827 +0000 UTC m=+101.619129874" watchObservedRunningTime="2025-12-01 10:00:54.110803631 +0000 UTC m=+101.619592668" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.137133 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podStartSLOduration=77.137112705 podStartE2EDuration="1m17.137112705s" podCreationTimestamp="2025-12-01 09:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:00:54.136441755 +0000 UTC m=+101.645230792" watchObservedRunningTime="2025-12-01 10:00:54.137112705 +0000 UTC m=+101.645901732" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.153476 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4vh77" podStartSLOduration=77.153449552 podStartE2EDuration="1m17.153449552s" podCreationTimestamp="2025-12-01 09:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:00:54.153433341 +0000 UTC m=+101.662222378" watchObservedRunningTime="2025-12-01 10:00:54.153449552 +0000 UTC m=+101.662238589" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.194066 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.194125 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.194135 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.194155 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.194451 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:54Z","lastTransitionTime":"2025-12-01T10:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.296911 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.297202 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.297272 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.297386 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.297450 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:54Z","lastTransitionTime":"2025-12-01T10:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.402100 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.402548 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.402714 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.402898 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.403088 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:54Z","lastTransitionTime":"2025-12-01T10:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.506932 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.506989 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.506999 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.507018 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.507031 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:54Z","lastTransitionTime":"2025-12-01T10:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.610487 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.610560 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.610585 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.610622 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.610646 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:54Z","lastTransitionTime":"2025-12-01T10:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.713315 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.713355 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.713364 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.713379 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.713389 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:54Z","lastTransitionTime":"2025-12-01T10:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.816459 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.816527 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.816546 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.816568 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.816581 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:54Z","lastTransitionTime":"2025-12-01T10:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.919801 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.919883 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.919896 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.919914 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:54 crc kubenswrapper[4958]: I1201 10:00:54.919929 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:54Z","lastTransitionTime":"2025-12-01T10:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.023224 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.023285 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.023300 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.023331 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.023347 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:55Z","lastTransitionTime":"2025-12-01T10:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.127004 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.127088 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.127106 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.127127 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.127140 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:55Z","lastTransitionTime":"2025-12-01T10:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.230362 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.230433 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.230447 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.230468 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.230482 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:55Z","lastTransitionTime":"2025-12-01T10:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.333993 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.334050 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.334064 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.334087 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.334102 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:55Z","lastTransitionTime":"2025-12-01T10:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.397885 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.397955 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.397973 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.398001 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.398025 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T10:00:55Z","lastTransitionTime":"2025-12-01T10:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.483519 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn"] Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.484099 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.487348 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.488496 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.488693 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.488927 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.573323 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5e83d982-bd60-4a25-b51f-aef645d6df6e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-p2ssn\" (UID: \"5e83d982-bd60-4a25-b51f-aef645d6df6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.573806 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5e83d982-bd60-4a25-b51f-aef645d6df6e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-p2ssn\" (UID: \"5e83d982-bd60-4a25-b51f-aef645d6df6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.573908 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e83d982-bd60-4a25-b51f-aef645d6df6e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-p2ssn\" (UID: \"5e83d982-bd60-4a25-b51f-aef645d6df6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.573982 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e83d982-bd60-4a25-b51f-aef645d6df6e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-p2ssn\" (UID: \"5e83d982-bd60-4a25-b51f-aef645d6df6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.574129 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e83d982-bd60-4a25-b51f-aef645d6df6e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-p2ssn\" (UID: \"5e83d982-bd60-4a25-b51f-aef645d6df6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.675205 4958 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e83d982-bd60-4a25-b51f-aef645d6df6e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-p2ssn\" (UID: \"5e83d982-bd60-4a25-b51f-aef645d6df6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.675337 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5e83d982-bd60-4a25-b51f-aef645d6df6e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-p2ssn\" (UID: \"5e83d982-bd60-4a25-b51f-aef645d6df6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.675401 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5e83d982-bd60-4a25-b51f-aef645d6df6e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-p2ssn\" (UID: \"5e83d982-bd60-4a25-b51f-aef645d6df6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.675431 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e83d982-bd60-4a25-b51f-aef645d6df6e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-p2ssn\" (UID: \"5e83d982-bd60-4a25-b51f-aef645d6df6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.675462 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e83d982-bd60-4a25-b51f-aef645d6df6e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-p2ssn\" (UID: \"5e83d982-bd60-4a25-b51f-aef645d6df6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.675580 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5e83d982-bd60-4a25-b51f-aef645d6df6e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-p2ssn\" (UID: \"5e83d982-bd60-4a25-b51f-aef645d6df6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.675607 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5e83d982-bd60-4a25-b51f-aef645d6df6e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-p2ssn\" (UID: \"5e83d982-bd60-4a25-b51f-aef645d6df6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.676523 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e83d982-bd60-4a25-b51f-aef645d6df6e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-p2ssn\" (UID: \"5e83d982-bd60-4a25-b51f-aef645d6df6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.682652 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5e83d982-bd60-4a25-b51f-aef645d6df6e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-p2ssn\" (UID: \"5e83d982-bd60-4a25-b51f-aef645d6df6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.695058 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e83d982-bd60-4a25-b51f-aef645d6df6e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-p2ssn\" (UID: \"5e83d982-bd60-4a25-b51f-aef645d6df6e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.797320 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.797546 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:55 crc kubenswrapper[4958]: E1201 10:00:55.797544 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.797664 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:55 crc kubenswrapper[4958]: E1201 10:00:55.797714 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.797335 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:55 crc kubenswrapper[4958]: E1201 10:00:55.797962 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:55 crc kubenswrapper[4958]: E1201 10:00:55.798160 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:55 crc kubenswrapper[4958]: I1201 10:00:55.813552 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" Dec 01 10:00:56 crc kubenswrapper[4958]: I1201 10:00:56.484017 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs\") pod \"network-metrics-daemon-6b9wz\" (UID: \"987c6a26-52be-40a5-b9cc-456d9731436f\") " pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:56 crc kubenswrapper[4958]: E1201 10:00:56.484294 4958 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:00:56 crc kubenswrapper[4958]: E1201 10:00:56.484445 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs podName:987c6a26-52be-40a5-b9cc-456d9731436f nodeName:}" failed. No retries permitted until 2025-12-01 10:02:00.484401328 +0000 UTC m=+167.993190415 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs") pod "network-metrics-daemon-6b9wz" (UID: "987c6a26-52be-40a5-b9cc-456d9731436f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 10:00:56 crc kubenswrapper[4958]: I1201 10:00:56.788752 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" event={"ID":"5e83d982-bd60-4a25-b51f-aef645d6df6e","Type":"ContainerStarted","Data":"0e997be1100e48db1a6ed5c60c34a273337deaab0e5fd5d2a92b49ec3943092b"} Dec 01 10:00:56 crc kubenswrapper[4958]: I1201 10:00:56.788832 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" event={"ID":"5e83d982-bd60-4a25-b51f-aef645d6df6e","Type":"ContainerStarted","Data":"385f2bcba6f6dc5bc1176407f67ea628ec2f99a3672807d1f35c83796a13886e"} Dec 01 10:00:56 crc kubenswrapper[4958]: I1201 10:00:56.807882 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p2ssn" podStartSLOduration=79.807836238 podStartE2EDuration="1m19.807836238s" podCreationTimestamp="2025-12-01 09:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:00:56.807454886 +0000 UTC m=+104.316243923" watchObservedRunningTime="2025-12-01 10:00:56.807836238 +0000 UTC m=+104.316625275" Dec 01 10:00:57 crc kubenswrapper[4958]: I1201 10:00:57.797038 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:57 crc kubenswrapper[4958]: I1201 10:00:57.797107 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:57 crc kubenswrapper[4958]: I1201 10:00:57.797207 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:57 crc kubenswrapper[4958]: I1201 10:00:57.797277 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:57 crc kubenswrapper[4958]: E1201 10:00:57.797296 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:57 crc kubenswrapper[4958]: E1201 10:00:57.797332 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:57 crc kubenswrapper[4958]: E1201 10:00:57.797389 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:00:57 crc kubenswrapper[4958]: E1201 10:00:57.797437 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:59 crc kubenswrapper[4958]: I1201 10:00:59.797147 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:00:59 crc kubenswrapper[4958]: E1201 10:00:59.797433 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:00:59 crc kubenswrapper[4958]: I1201 10:00:59.797214 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:00:59 crc kubenswrapper[4958]: E1201 10:00:59.797613 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:00:59 crc kubenswrapper[4958]: I1201 10:00:59.797154 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:00:59 crc kubenswrapper[4958]: I1201 10:00:59.797193 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:00:59 crc kubenswrapper[4958]: E1201 10:00:59.797718 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:00:59 crc kubenswrapper[4958]: E1201 10:00:59.797766 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:00 crc kubenswrapper[4958]: I1201 10:01:00.798900 4958 scope.go:117] "RemoveContainer" containerID="ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e" Dec 01 10:01:00 crc kubenswrapper[4958]: E1201 10:01:00.799246 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\"" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" Dec 01 10:01:01 crc kubenswrapper[4958]: I1201 10:01:01.797376 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:01 crc kubenswrapper[4958]: I1201 10:01:01.797429 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:01 crc kubenswrapper[4958]: I1201 10:01:01.797488 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:01 crc kubenswrapper[4958]: E1201 10:01:01.797560 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:01 crc kubenswrapper[4958]: I1201 10:01:01.797603 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:01 crc kubenswrapper[4958]: E1201 10:01:01.797758 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:01 crc kubenswrapper[4958]: E1201 10:01:01.797848 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:01 crc kubenswrapper[4958]: E1201 10:01:01.798108 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:03 crc kubenswrapper[4958]: I1201 10:01:03.796871 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:03 crc kubenswrapper[4958]: I1201 10:01:03.796885 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:03 crc kubenswrapper[4958]: I1201 10:01:03.796967 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:03 crc kubenswrapper[4958]: I1201 10:01:03.798911 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:03 crc kubenswrapper[4958]: E1201 10:01:03.798935 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:03 crc kubenswrapper[4958]: E1201 10:01:03.799028 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:03 crc kubenswrapper[4958]: E1201 10:01:03.799162 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:03 crc kubenswrapper[4958]: E1201 10:01:03.799250 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:05 crc kubenswrapper[4958]: I1201 10:01:05.797087 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:05 crc kubenswrapper[4958]: I1201 10:01:05.797146 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:05 crc kubenswrapper[4958]: I1201 10:01:05.797219 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:05 crc kubenswrapper[4958]: I1201 10:01:05.797308 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:05 crc kubenswrapper[4958]: E1201 10:01:05.797887 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:05 crc kubenswrapper[4958]: E1201 10:01:05.798191 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:05 crc kubenswrapper[4958]: E1201 10:01:05.798256 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:05 crc kubenswrapper[4958]: E1201 10:01:05.798435 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:07 crc kubenswrapper[4958]: I1201 10:01:07.796689 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:07 crc kubenswrapper[4958]: I1201 10:01:07.796811 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:07 crc kubenswrapper[4958]: I1201 10:01:07.797238 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:07 crc kubenswrapper[4958]: I1201 10:01:07.797272 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:07 crc kubenswrapper[4958]: E1201 10:01:07.797520 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:07 crc kubenswrapper[4958]: E1201 10:01:07.797594 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:07 crc kubenswrapper[4958]: E1201 10:01:07.797704 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:07 crc kubenswrapper[4958]: E1201 10:01:07.797879 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:09 crc kubenswrapper[4958]: I1201 10:01:09.796553 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:09 crc kubenswrapper[4958]: I1201 10:01:09.796637 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:09 crc kubenswrapper[4958]: I1201 10:01:09.796799 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:09 crc kubenswrapper[4958]: E1201 10:01:09.797001 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:09 crc kubenswrapper[4958]: I1201 10:01:09.797042 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:09 crc kubenswrapper[4958]: E1201 10:01:09.797168 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:09 crc kubenswrapper[4958]: E1201 10:01:09.797284 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:09 crc kubenswrapper[4958]: E1201 10:01:09.797540 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:11 crc kubenswrapper[4958]: I1201 10:01:11.797354 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:11 crc kubenswrapper[4958]: I1201 10:01:11.797469 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:11 crc kubenswrapper[4958]: E1201 10:01:11.797545 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:11 crc kubenswrapper[4958]: I1201 10:01:11.797609 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:11 crc kubenswrapper[4958]: I1201 10:01:11.797504 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:11 crc kubenswrapper[4958]: E1201 10:01:11.797691 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:11 crc kubenswrapper[4958]: E1201 10:01:11.797814 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:11 crc kubenswrapper[4958]: E1201 10:01:11.797896 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:13 crc kubenswrapper[4958]: E1201 10:01:13.656831 4958 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 01 10:01:13 crc kubenswrapper[4958]: I1201 10:01:13.797126 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:13 crc kubenswrapper[4958]: I1201 10:01:13.797204 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:13 crc kubenswrapper[4958]: I1201 10:01:13.797179 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:13 crc kubenswrapper[4958]: I1201 10:01:13.800075 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:13 crc kubenswrapper[4958]: I1201 10:01:13.800423 4958 scope.go:117] "RemoveContainer" containerID="ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e" Dec 01 10:01:13 crc kubenswrapper[4958]: E1201 10:01:13.800422 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:13 crc kubenswrapper[4958]: E1201 10:01:13.800635 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:13 crc kubenswrapper[4958]: E1201 10:01:13.800830 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:13 crc kubenswrapper[4958]: E1201 10:01:13.800975 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:13 crc kubenswrapper[4958]: E1201 10:01:13.801182 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-976fz_openshift-ovn-kubernetes(96173cf0-4be1-4ef7-b063-4c93c1731c20)\"" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" Dec 01 10:01:13 crc kubenswrapper[4958]: E1201 10:01:13.888726 4958 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 10:01:15 crc kubenswrapper[4958]: I1201 10:01:15.797308 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:15 crc kubenswrapper[4958]: I1201 10:01:15.797354 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:15 crc kubenswrapper[4958]: I1201 10:01:15.797435 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:15 crc kubenswrapper[4958]: I1201 10:01:15.797506 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:15 crc kubenswrapper[4958]: E1201 10:01:15.797545 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:15 crc kubenswrapper[4958]: E1201 10:01:15.797675 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:15 crc kubenswrapper[4958]: E1201 10:01:15.797776 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:15 crc kubenswrapper[4958]: E1201 10:01:15.798123 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:17 crc kubenswrapper[4958]: I1201 10:01:17.797331 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:17 crc kubenswrapper[4958]: I1201 10:01:17.797404 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:17 crc kubenswrapper[4958]: I1201 10:01:17.797458 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:17 crc kubenswrapper[4958]: I1201 10:01:17.797522 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:17 crc kubenswrapper[4958]: E1201 10:01:17.797521 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:17 crc kubenswrapper[4958]: E1201 10:01:17.797684 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:17 crc kubenswrapper[4958]: E1201 10:01:17.797746 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:17 crc kubenswrapper[4958]: E1201 10:01:17.797784 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:18 crc kubenswrapper[4958]: E1201 10:01:18.890512 4958 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 10:01:19 crc kubenswrapper[4958]: I1201 10:01:19.797083 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:19 crc kubenswrapper[4958]: I1201 10:01:19.797113 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:19 crc kubenswrapper[4958]: I1201 10:01:19.797194 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:19 crc kubenswrapper[4958]: E1201 10:01:19.797281 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:19 crc kubenswrapper[4958]: I1201 10:01:19.797363 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:19 crc kubenswrapper[4958]: E1201 10:01:19.797450 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:19 crc kubenswrapper[4958]: E1201 10:01:19.797583 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:19 crc kubenswrapper[4958]: E1201 10:01:19.797618 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:21 crc kubenswrapper[4958]: I1201 10:01:21.796811 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:21 crc kubenswrapper[4958]: I1201 10:01:21.796943 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:21 crc kubenswrapper[4958]: E1201 10:01:21.797029 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:21 crc kubenswrapper[4958]: I1201 10:01:21.797047 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:21 crc kubenswrapper[4958]: I1201 10:01:21.797077 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:21 crc kubenswrapper[4958]: E1201 10:01:21.797194 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:21 crc kubenswrapper[4958]: E1201 10:01:21.797403 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:21 crc kubenswrapper[4958]: E1201 10:01:21.797501 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:21 crc kubenswrapper[4958]: I1201 10:01:21.887371 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7z6wb_46276a58-9607-4a8a-bcfc-ca41ab441ec2/kube-multus/1.log" Dec 01 10:01:21 crc kubenswrapper[4958]: I1201 10:01:21.887956 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7z6wb_46276a58-9607-4a8a-bcfc-ca41ab441ec2/kube-multus/0.log" Dec 01 10:01:21 crc kubenswrapper[4958]: I1201 10:01:21.888023 4958 generic.go:334] "Generic (PLEG): container finished" podID="46276a58-9607-4a8a-bcfc-ca41ab441ec2" containerID="a6b4bf86c08b1005a58bbd3b9ef7e02e72d8650e4b71d869e9db857ac91dd20e" exitCode=1 Dec 01 10:01:21 crc kubenswrapper[4958]: I1201 10:01:21.888070 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7z6wb" event={"ID":"46276a58-9607-4a8a-bcfc-ca41ab441ec2","Type":"ContainerDied","Data":"a6b4bf86c08b1005a58bbd3b9ef7e02e72d8650e4b71d869e9db857ac91dd20e"} Dec 01 10:01:21 crc kubenswrapper[4958]: I1201 10:01:21.888118 4958 scope.go:117] "RemoveContainer" containerID="7579836bed9e5eb04d7c63a5d222f49e19be64895081dd8ae4c3c5582027310c" Dec 01 10:01:21 crc kubenswrapper[4958]: I1201 10:01:21.888903 4958 scope.go:117] "RemoveContainer" containerID="a6b4bf86c08b1005a58bbd3b9ef7e02e72d8650e4b71d869e9db857ac91dd20e" Dec 01 10:01:21 crc kubenswrapper[4958]: E1201 10:01:21.890050 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-7z6wb_openshift-multus(46276a58-9607-4a8a-bcfc-ca41ab441ec2)\"" pod="openshift-multus/multus-7z6wb" podUID="46276a58-9607-4a8a-bcfc-ca41ab441ec2" Dec 01 10:01:22 crc kubenswrapper[4958]: I1201 10:01:22.893783 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7z6wb_46276a58-9607-4a8a-bcfc-ca41ab441ec2/kube-multus/1.log" Dec 01 10:01:23 crc kubenswrapper[4958]: I1201 10:01:23.797144 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:23 crc kubenswrapper[4958]: I1201 10:01:23.797191 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:23 crc kubenswrapper[4958]: I1201 10:01:23.797161 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:23 crc kubenswrapper[4958]: I1201 10:01:23.797153 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:23 crc kubenswrapper[4958]: E1201 10:01:23.798507 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:23 crc kubenswrapper[4958]: E1201 10:01:23.798608 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:23 crc kubenswrapper[4958]: E1201 10:01:23.798651 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:23 crc kubenswrapper[4958]: E1201 10:01:23.798726 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:23 crc kubenswrapper[4958]: E1201 10:01:23.891046 4958 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 10:01:25 crc kubenswrapper[4958]: I1201 10:01:25.797091 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:25 crc kubenswrapper[4958]: I1201 10:01:25.797170 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:25 crc kubenswrapper[4958]: I1201 10:01:25.797831 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:25 crc kubenswrapper[4958]: I1201 10:01:25.798344 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:25 crc kubenswrapper[4958]: E1201 10:01:25.798294 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:25 crc kubenswrapper[4958]: E1201 10:01:25.798872 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:25 crc kubenswrapper[4958]: E1201 10:01:25.799105 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:25 crc kubenswrapper[4958]: E1201 10:01:25.799243 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:27 crc kubenswrapper[4958]: I1201 10:01:27.797418 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:27 crc kubenswrapper[4958]: I1201 10:01:27.797464 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:27 crc kubenswrapper[4958]: I1201 10:01:27.797430 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:27 crc kubenswrapper[4958]: I1201 10:01:27.797418 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:27 crc kubenswrapper[4958]: E1201 10:01:27.797661 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:27 crc kubenswrapper[4958]: E1201 10:01:27.797570 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:27 crc kubenswrapper[4958]: E1201 10:01:27.797744 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:27 crc kubenswrapper[4958]: E1201 10:01:27.797868 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:28 crc kubenswrapper[4958]: I1201 10:01:28.797505 4958 scope.go:117] "RemoveContainer" containerID="ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e" Dec 01 10:01:28 crc kubenswrapper[4958]: E1201 10:01:28.893173 4958 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 10:01:28 crc kubenswrapper[4958]: I1201 10:01:28.914603 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovnkube-controller/3.log" Dec 01 10:01:28 crc kubenswrapper[4958]: I1201 10:01:28.916597 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerStarted","Data":"8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435"} Dec 01 10:01:28 crc kubenswrapper[4958]: I1201 10:01:28.917923 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 10:01:28 crc kubenswrapper[4958]: I1201 10:01:28.951172 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" podStartSLOduration=110.951134498 podStartE2EDuration="1m50.951134498s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:28.949626911 +0000 UTC m=+136.458415958" watchObservedRunningTime="2025-12-01 10:01:28.951134498 +0000 UTC m=+136.459923535" Dec 01 10:01:29 crc kubenswrapper[4958]: I1201 10:01:29.797401 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:29 crc kubenswrapper[4958]: I1201 10:01:29.797468 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:29 crc kubenswrapper[4958]: I1201 10:01:29.797545 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:29 crc kubenswrapper[4958]: E1201 10:01:29.797608 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:29 crc kubenswrapper[4958]: I1201 10:01:29.797685 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:29 crc kubenswrapper[4958]: E1201 10:01:29.797879 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:29 crc kubenswrapper[4958]: E1201 10:01:29.798063 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:29 crc kubenswrapper[4958]: E1201 10:01:29.798199 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:30 crc kubenswrapper[4958]: I1201 10:01:30.148671 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6b9wz"] Dec 01 10:01:30 crc kubenswrapper[4958]: I1201 10:01:30.148827 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:30 crc kubenswrapper[4958]: E1201 10:01:30.148945 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:31 crc kubenswrapper[4958]: I1201 10:01:31.796870 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:31 crc kubenswrapper[4958]: I1201 10:01:31.796880 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:31 crc kubenswrapper[4958]: I1201 10:01:31.796889 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:31 crc kubenswrapper[4958]: I1201 10:01:31.796907 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:31 crc kubenswrapper[4958]: E1201 10:01:31.797433 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:31 crc kubenswrapper[4958]: E1201 10:01:31.797540 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:31 crc kubenswrapper[4958]: E1201 10:01:31.797627 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:31 crc kubenswrapper[4958]: E1201 10:01:31.797693 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:32 crc kubenswrapper[4958]: I1201 10:01:32.797209 4958 scope.go:117] "RemoveContainer" containerID="a6b4bf86c08b1005a58bbd3b9ef7e02e72d8650e4b71d869e9db857ac91dd20e" Dec 01 10:01:33 crc kubenswrapper[4958]: I1201 10:01:33.796619 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:33 crc kubenswrapper[4958]: I1201 10:01:33.796661 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:33 crc kubenswrapper[4958]: I1201 10:01:33.796664 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:33 crc kubenswrapper[4958]: I1201 10:01:33.796701 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:33 crc kubenswrapper[4958]: E1201 10:01:33.798136 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:33 crc kubenswrapper[4958]: E1201 10:01:33.798209 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:33 crc kubenswrapper[4958]: E1201 10:01:33.798270 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:33 crc kubenswrapper[4958]: E1201 10:01:33.798424 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:33 crc kubenswrapper[4958]: E1201 10:01:33.894271 4958 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 10:01:33 crc kubenswrapper[4958]: I1201 10:01:33.939354 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7z6wb_46276a58-9607-4a8a-bcfc-ca41ab441ec2/kube-multus/1.log" Dec 01 10:01:33 crc kubenswrapper[4958]: I1201 10:01:33.939863 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7z6wb" event={"ID":"46276a58-9607-4a8a-bcfc-ca41ab441ec2","Type":"ContainerStarted","Data":"d698e3b7f37fc411ed5e6e7830ba2b95824e3ac78e832cb3a588b63feb7b4f6c"} Dec 01 10:01:35 crc kubenswrapper[4958]: I1201 10:01:35.797363 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:35 crc kubenswrapper[4958]: I1201 10:01:35.797382 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:35 crc kubenswrapper[4958]: I1201 10:01:35.797401 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:35 crc kubenswrapper[4958]: E1201 10:01:35.797599 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:35 crc kubenswrapper[4958]: I1201 10:01:35.797653 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:35 crc kubenswrapper[4958]: E1201 10:01:35.797838 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:35 crc kubenswrapper[4958]: E1201 10:01:35.798017 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:35 crc kubenswrapper[4958]: E1201 10:01:35.798298 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:37 crc kubenswrapper[4958]: I1201 10:01:37.797308 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:37 crc kubenswrapper[4958]: I1201 10:01:37.797363 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:37 crc kubenswrapper[4958]: I1201 10:01:37.797324 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:37 crc kubenswrapper[4958]: I1201 10:01:37.797462 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:37 crc kubenswrapper[4958]: E1201 10:01:37.797478 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 10:01:37 crc kubenswrapper[4958]: E1201 10:01:37.797603 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 10:01:37 crc kubenswrapper[4958]: E1201 10:01:37.797652 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6b9wz" podUID="987c6a26-52be-40a5-b9cc-456d9731436f" Dec 01 10:01:37 crc kubenswrapper[4958]: E1201 10:01:37.797707 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 10:01:39 crc kubenswrapper[4958]: I1201 10:01:39.680633 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 10:01:39 crc kubenswrapper[4958]: I1201 10:01:39.797342 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:39 crc kubenswrapper[4958]: I1201 10:01:39.797391 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:01:39 crc kubenswrapper[4958]: I1201 10:01:39.797350 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:39 crc kubenswrapper[4958]: I1201 10:01:39.797350 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:39 crc kubenswrapper[4958]: I1201 10:01:39.800001 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 10:01:39 crc kubenswrapper[4958]: I1201 10:01:39.800272 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 01 10:01:39 crc kubenswrapper[4958]: I1201 10:01:39.800553 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 10:01:39 crc kubenswrapper[4958]: I1201 10:01:39.800565 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 10:01:39 crc kubenswrapper[4958]: I1201 10:01:39.800645 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 10:01:39 crc kubenswrapper[4958]: I1201 10:01:39.801146 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 10:01:43 crc kubenswrapper[4958]: I1201 10:01:43.707307 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:43 crc kubenswrapper[4958]: E1201 10:01:43.707556 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:03:45.70751553 +0000 UTC m=+273.216304567 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:43 crc kubenswrapper[4958]: I1201 10:01:43.909926 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:43 crc kubenswrapper[4958]: I1201 10:01:43.910013 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:43 crc kubenswrapper[4958]: I1201 10:01:43.910819 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:43 crc kubenswrapper[4958]: I1201 10:01:43.916292 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 10:01:44 crc kubenswrapper[4958]: I1201 10:01:44.011322 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:01:44 crc kubenswrapper[4958]: I1201 10:01:44.011465 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:44 crc kubenswrapper[4958]: I1201 10:01:44.017988 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 10:01:44 crc kubenswrapper[4958]: I1201 
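The "durationBeforeRetry 2m2s" on the UnmountVolume failure above looks like the ceiling of the volume manager's exponential backoff: repeated failures double the wait until it settles at roughly two minutes between retries, which is why the next attempt is scheduled for 10:03:45. A sketch of that pattern; the 500ms initial delay and doubling factor are assumptions inferred from the ceiling visible in the log, not values read from kubelet code:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		initial = 500 * time.Millisecond   // assumed first retry delay
		ceiling = 2*time.Minute + 2*time.Second // the 2m2s cap seen in the log
	)
	delay := initial
	for failure := 1; failure <= 10; failure++ {
		fmt.Printf("failure %d: no retries permitted for %s\n", failure, delay)
		delay *= 2
		if delay > ceiling {
			delay = ceiling
		}
	}
}
```

The underlying error itself (driver kubevirt.io.hostpath-provisioner not registered) will keep recurring until that CSI driver's node plugin registers with the kubelet, so the backoff only paces the noise.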
Dec 01 10:01:44 crc kubenswrapper[4958]: I1201 10:01:44.020899 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 10:01:44 crc kubenswrapper[4958]: I1201 10:01:44.039017 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 10:01:44 crc kubenswrapper[4958]: W1201 10:01:44.236321 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-91ceee3f7d60a1794dcc87dc9ed9d7141d7b933d3200066a07cce44c0e8d0ab1 WatchSource:0}: Error finding container 91ceee3f7d60a1794dcc87dc9ed9d7141d7b933d3200066a07cce44c0e8d0ab1: Status 404 returned error can't find the container with id 91ceee3f7d60a1794dcc87dc9ed9d7141d7b933d3200066a07cce44c0e8d0ab1
Dec 01 10:01:44 crc kubenswrapper[4958]: W1201 10:01:44.267214 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-c7505ca9280fe3464cf4afb87a1fca13ac0bcb6108f8752aeaaad42985f0cf8a WatchSource:0}: Error finding container c7505ca9280fe3464cf4afb87a1fca13ac0bcb6108f8752aeaaad42985f0cf8a: Status 404 returned error can't find the container with id c7505ca9280fe3464cf4afb87a1fca13ac0bcb6108f8752aeaaad42985f0cf8a
Dec 01 10:01:44 crc kubenswrapper[4958]: I1201 10:01:44.313806 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 10:01:44 crc kubenswrapper[4958]: W1201 10:01:44.491113 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-1e1a17d125b62cf4295e9b2e0f8fc5ef1cb9487c88d774900f6e07679073764a WatchSource:0}: Error finding container 1e1a17d125b62cf4295e9b2e0f8fc5ef1cb9487c88d774900f6e07679073764a: Status 404 returned error can't find the container with id 1e1a17d125b62cf4295e9b2e0f8fc5ef1cb9487c88d774900f6e07679073764a
Dec 01 10:01:44 crc kubenswrapper[4958]: I1201 10:01:44.985506 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c7505ca9280fe3464cf4afb87a1fca13ac0bcb6108f8752aeaaad42985f0cf8a"}
Dec 01 10:01:44 crc kubenswrapper[4958]: I1201 10:01:44.987144 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"91ceee3f7d60a1794dcc87dc9ed9d7141d7b933d3200066a07cce44c0e8d0ab1"}
Dec 01 10:01:44 crc kubenswrapper[4958]: I1201 10:01:44.988323 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1e1a17d125b62cf4295e9b2e0f8fc5ef1cb9487c88d774900f6e07679073764a"}
Dec 01 10:01:45 crc kubenswrapper[4958]: I1201 10:01:45.993062 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7310a4c4e9fd39786a0967820b674b9461199684426a95a8b461c886637dcd1e"}
Dec 01 10:01:45 crc kubenswrapper[4958]: I1201 10:01:45.995366 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a6d3fd765e70ca403f23d2d19961869e384f619fc2267173a4df5469f5028724"}
Dec 01 10:01:45 crc kubenswrapper[4958]: I1201 10:01:45.995450 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 10:01:45 crc kubenswrapper[4958]: I1201 10:01:45.997385 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"edd939fa6b9669b586b77c6b66e0c2b7415935a7b488bc91eeae7535f7626f13"}
Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.186628 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.231566 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-dzllj"]
Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.232286 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-dzllj"
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.235224 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hpjfh"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.235871 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hpjfh" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.240593 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.240651 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.242921 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.242994 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-26pm4"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.243731 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-26pm4" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.247493 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.247803 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.248238 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.248347 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.248686 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.248946 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.249111 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.249313 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.247814 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.249772 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.250129 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.250701 4958 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.250769 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.251093 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jfg5l"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.251438 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-jfg5l" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.251879 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.252740 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.252821 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.252970 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.253014 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.252734 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.253473 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.269931 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.270044 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.269932 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.270113 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.270225 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.270283 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.270445 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.270616 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.271107 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.273268 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.285023 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxxsg"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.285696 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-spstc"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.286045 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-spstc" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.286541 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tp8nm"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.286867 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxxsg" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.287254 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.292413 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9g4m7"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.292888 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-swd9j"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.293211 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9g4m7" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.293286 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.293743 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nrv4l"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.294460 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nrv4l" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.301523 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.301722 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.301921 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.302016 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.303081 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mzqdq"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.303937 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mzqdq" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.304074 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qwcn5"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.304697 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwcn5" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.305379 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.305509 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.305973 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.306070 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-rxrcb"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.306453 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rxrcb" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.307009 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.307098 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.307197 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.307363 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.307472 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.307520 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.307550 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.307564 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.307648 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.307672 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.307683 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.307728 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.307738 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.309363 
4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.310825 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sv8bf"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.311285 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zppvz"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.311557 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bglks"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.311926 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.312035 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bglks" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.312223 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sv8bf" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.312317 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zppvz" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.315128 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8wnxg"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.315609 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8wnxg" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.317012 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.317141 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlnwn"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.317641 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.317860 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.318087 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9f7tt"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.318310 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.318436 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.318592 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.318715 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.319012 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q7r9c"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.320077 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.320274 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.320277 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlnwn" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.320459 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q7r9c" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.321256 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.346044 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351208 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5a8efcb2-480a-4577-af47-304c07876b28-metrics-certs\") pod \"router-default-5444994796-swd9j\" (UID: \"5a8efcb2-480a-4577-af47-304c07876b28\") " pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351276 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/416bbcbb-fc9f-489b-a1ac-eea205ce9152-audit-dir\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351309 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5a8efcb2-480a-4577-af47-304c07876b28-default-certificate\") pod \"router-default-5444994796-swd9j\" (UID: \"5a8efcb2-480a-4577-af47-304c07876b28\") " pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351344 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351372 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/70ab8e46-495c-4e0e-8896-e4eac8d855b6-signing-cabundle\") pod \"service-ca-9c57cc56f-spstc\" (UID: \"70ab8e46-495c-4e0e-8896-e4eac8d855b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-spstc" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351404 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzj94\" (UniqueName: \"kubernetes.io/projected/1532650f-e94e-4f91-b79c-a0da8e727893-kube-api-access-gzj94\") pod \"migrator-59844c95c7-nrv4l\" (UID: \"1532650f-e94e-4f91-b79c-a0da8e727893\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nrv4l" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351436 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/70ab8e46-495c-4e0e-8896-e4eac8d855b6-signing-key\") pod \"service-ca-9c57cc56f-spstc\" (UID: \"70ab8e46-495c-4e0e-8896-e4eac8d855b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-spstc" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351475 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx9mj\" (UniqueName: 
\"kubernetes.io/projected/c42a0759-cde0-4ecd-b99b-d6dcb74be0c5-kube-api-access-lx9mj\") pod \"multus-admission-controller-857f4d67dd-jfg5l\" (UID: \"c42a0759-cde0-4ecd-b99b-d6dcb74be0c5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jfg5l" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351504 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351542 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r86g8\" (UniqueName: \"kubernetes.io/projected/d27f831c-bcde-4937-b3c6-0ada6053a268-kube-api-access-r86g8\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351575 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351607 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d27f831c-bcde-4937-b3c6-0ada6053a268-config\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351633 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351659 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d27f831c-bcde-4937-b3c6-0ada6053a268-encryption-config\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351684 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351716 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d27f831c-bcde-4937-b3c6-0ada6053a268-trusted-ca-bundle\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351748 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc698dd9-e0de-4b99-ba98-03a8b1ec3c42-trusted-ca\") pod \"console-operator-58897d9998-9g4m7\" (UID: \"dc698dd9-e0de-4b99-ba98-03a8b1ec3c42\") " pod="openshift-console-operator/console-operator-58897d9998-9g4m7" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351769 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d27f831c-bcde-4937-b3c6-0ada6053a268-etcd-serving-ca\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351795 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a8efcb2-480a-4577-af47-304c07876b28-service-ca-bundle\") pod \"router-default-5444994796-swd9j\" (UID: \"5a8efcb2-480a-4577-af47-304c07876b28\") " pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351822 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351873 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk2b7\" (UniqueName: \"kubernetes.io/projected/5a8efcb2-480a-4577-af47-304c07876b28-kube-api-access-xk2b7\") pod \"router-default-5444994796-swd9j\" (UID: \"5a8efcb2-480a-4577-af47-304c07876b28\") " pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351896 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c317e220-5d51-4a28-91c9-93ef65b36e92-apiservice-cert\") pod \"packageserver-d55dfcdfc-5t5rz\" (UID: \"c317e220-5d51-4a28-91c9-93ef65b36e92\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351921 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8rxx\" (UniqueName: \"kubernetes.io/projected/416bbcbb-fc9f-489b-a1ac-eea205ce9152-kube-api-access-p8rxx\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351948 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d27f831c-bcde-4937-b3c6-0ada6053a268-node-pullsecrets\") pod 
\"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351935 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351995 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eaa4fac0-c895-4779-9eb9-11724f97ccd8-proxy-tls\") pod \"machine-config-operator-74547568cd-nzgvd\" (UID: \"eaa4fac0-c895-4779-9eb9-11724f97ccd8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352027 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e72ff91-36b3-4098-b423-ad103f61541e-config\") pod \"kube-controller-manager-operator-78b949d7b-wxxsg\" (UID: \"4e72ff91-36b3-4098-b423-ad103f61541e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxxsg" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352057 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c42a0759-cde0-4ecd-b99b-d6dcb74be0c5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jfg5l\" (UID: \"c42a0759-cde0-4ecd-b99b-d6dcb74be0c5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jfg5l" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352087 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d27f831c-bcde-4937-b3c6-0ada6053a268-audit\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352112 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-audit-policies\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352143 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx7tz\" (UniqueName: \"kubernetes.io/projected/eaa4fac0-c895-4779-9eb9-11724f97ccd8-kube-api-access-lx7tz\") pod \"machine-config-operator-74547568cd-nzgvd\" (UID: \"eaa4fac0-c895-4779-9eb9-11724f97ccd8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352174 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc698dd9-e0de-4b99-ba98-03a8b1ec3c42-config\") pod \"console-operator-58897d9998-9g4m7\" (UID: \"dc698dd9-e0de-4b99-ba98-03a8b1ec3c42\") " pod="openshift-console-operator/console-operator-58897d9998-9g4m7" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352203 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kdhnp\" (UniqueName: \"kubernetes.io/projected/7a98149e-16f6-4129-a488-393bdbf90da3-kube-api-access-kdhnp\") pod \"dns-operator-744455d44c-26pm4\" (UID: \"7a98149e-16f6-4129-a488-393bdbf90da3\") " pod="openshift-dns-operator/dns-operator-744455d44c-26pm4" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352234 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352261 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a98149e-16f6-4129-a488-393bdbf90da3-metrics-tls\") pod \"dns-operator-744455d44c-26pm4\" (UID: \"7a98149e-16f6-4129-a488-393bdbf90da3\") " pod="openshift-dns-operator/dns-operator-744455d44c-26pm4" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352292 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d27f831c-bcde-4937-b3c6-0ada6053a268-serving-cert\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352324 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c317e220-5d51-4a28-91c9-93ef65b36e92-webhook-cert\") pod \"packageserver-d55dfcdfc-5t5rz\" (UID: \"c317e220-5d51-4a28-91c9-93ef65b36e92\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352351 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352374 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eaa4fac0-c895-4779-9eb9-11724f97ccd8-images\") pod \"machine-config-operator-74547568cd-nzgvd\" (UID: \"eaa4fac0-c895-4779-9eb9-11724f97ccd8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352402 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5a8efcb2-480a-4577-af47-304c07876b28-stats-auth\") pod \"router-default-5444994796-swd9j\" (UID: \"5a8efcb2-480a-4577-af47-304c07876b28\") " pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352436 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a7b405ab-f0aa-4961-a0e4-c104e2040917-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hpjfh\" (UID: \"a7b405ab-f0aa-4961-a0e4-c104e2040917\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hpjfh" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352466 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352497 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352526 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352556 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eaa4fac0-c895-4779-9eb9-11724f97ccd8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-nzgvd\" (UID: \"eaa4fac0-c895-4779-9eb9-11724f97ccd8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352585 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c317e220-5d51-4a28-91c9-93ef65b36e92-tmpfs\") pod \"packageserver-d55dfcdfc-5t5rz\" (UID: \"c317e220-5d51-4a28-91c9-93ef65b36e92\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352612 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc698dd9-e0de-4b99-ba98-03a8b1ec3c42-serving-cert\") pod \"console-operator-58897d9998-9g4m7\" (UID: \"dc698dd9-e0de-4b99-ba98-03a8b1ec3c42\") " pod="openshift-console-operator/console-operator-58897d9998-9g4m7" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352637 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d27f831c-bcde-4937-b3c6-0ada6053a268-image-import-ca\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352666 4958 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98tdj\" (UniqueName: \"kubernetes.io/projected/70ab8e46-495c-4e0e-8896-e4eac8d855b6-kube-api-access-98tdj\") pod \"service-ca-9c57cc56f-spstc\" (UID: \"70ab8e46-495c-4e0e-8896-e4eac8d855b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-spstc" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352698 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d27f831c-bcde-4937-b3c6-0ada6053a268-etcd-client\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.352732 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7b405ab-f0aa-4961-a0e4-c104e2040917-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hpjfh\" (UID: \"a7b405ab-f0aa-4961-a0e4-c104e2040917\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hpjfh" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.353092 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6q7kk"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.351472 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.356434 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d27f831c-bcde-4937-b3c6-0ada6053a268-audit-dir\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.356509 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q6t4\" (UniqueName: \"kubernetes.io/projected/dc698dd9-e0de-4b99-ba98-03a8b1ec3c42-kube-api-access-2q6t4\") pod \"console-operator-58897d9998-9g4m7\" (UID: \"dc698dd9-e0de-4b99-ba98-03a8b1ec3c42\") " pod="openshift-console-operator/console-operator-58897d9998-9g4m7" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.356561 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e72ff91-36b3-4098-b423-ad103f61541e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wxxsg\" (UID: \"4e72ff91-36b3-4098-b423-ad103f61541e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxxsg" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.356595 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e72ff91-36b3-4098-b423-ad103f61541e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wxxsg\" (UID: \"4e72ff91-36b3-4098-b423-ad103f61541e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxxsg" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.356630 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-9bnxn\" (UniqueName: \"kubernetes.io/projected/c317e220-5d51-4a28-91c9-93ef65b36e92-kube-api-access-9bnxn\") pod \"packageserver-d55dfcdfc-5t5rz\" (UID: \"c317e220-5d51-4a28-91c9-93ef65b36e92\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.356676 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trtvn\" (UniqueName: \"kubernetes.io/projected/a7b405ab-f0aa-4961-a0e4-c104e2040917-kube-api-access-trtvn\") pod \"kube-storage-version-migrator-operator-b67b599dd-hpjfh\" (UID: \"a7b405ab-f0aa-4961-a0e4-c104e2040917\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hpjfh" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.364122 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.364141 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.365283 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.366646 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.366974 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.367349 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.367834 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.368152 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.368345 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.368456 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.368572 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.368878 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.369499 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.369660 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-txbzn"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.369780 4958 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.369870 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.369895 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.373086 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.373257 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.373384 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.373668 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.373835 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.373880 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.374027 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.376652 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.382798 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.385986 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.386861 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xkxmd"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.387198 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.387374 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.387461 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.387550 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.387636 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xkxmd" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.388328 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.388666 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.389373 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.391075 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.391409 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.391509 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-g9fjl"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.391545 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.391837 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.392032 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.392196 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.392306 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.392359 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.392518 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.392731 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.392904 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.393040 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-g9fjl" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.393050 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.393100 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.393194 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.393780 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.394968 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.395584 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.396060 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.396210 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.396383 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.398074 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.398131 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.403206 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6t9k"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.404057 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.404267 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6t9k" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.405509 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-d2tvd"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.406481 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.409818 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-64ctn"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.410973 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.411513 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.412182 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-64ctn" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.413633 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kwjfj"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.414426 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.414803 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.415918 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.417085 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.417502 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-dzllj"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.433015 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hpjfh"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.435746 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.436191 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4khgc"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.437209 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.437241 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jfg5l"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.437346 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4khgc" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.438295 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.440836 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-spstc"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.441323 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9g4m7"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.442862 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mzqdq"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.444175 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sv8bf"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.445509 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-26pm4"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.447688 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxxsg"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.450141 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8wnxg"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.451428 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xkxmd"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.457615 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9f7tt"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.458585 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk2b7\" (UniqueName: \"kubernetes.io/projected/5a8efcb2-480a-4577-af47-304c07876b28-kube-api-access-xk2b7\") pod \"router-default-5444994796-swd9j\" (UID: \"5a8efcb2-480a-4577-af47-304c07876b28\") " pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.458634 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c317e220-5d51-4a28-91c9-93ef65b36e92-apiservice-cert\") pod \"packageserver-d55dfcdfc-5t5rz\" (UID: \"c317e220-5d51-4a28-91c9-93ef65b36e92\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.458664 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8rxx\" (UniqueName: \"kubernetes.io/projected/416bbcbb-fc9f-489b-a1ac-eea205ce9152-kube-api-access-p8rxx\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.458705 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d27f831c-bcde-4937-b3c6-0ada6053a268-node-pullsecrets\") pod 
\"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.458730 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c42a0759-cde0-4ecd-b99b-d6dcb74be0c5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jfg5l\" (UID: \"c42a0759-cde0-4ecd-b99b-d6dcb74be0c5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jfg5l" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.458754 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d27f831c-bcde-4937-b3c6-0ada6053a268-audit\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.458780 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eaa4fac0-c895-4779-9eb9-11724f97ccd8-proxy-tls\") pod \"machine-config-operator-74547568cd-nzgvd\" (UID: \"eaa4fac0-c895-4779-9eb9-11724f97ccd8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.458798 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e72ff91-36b3-4098-b423-ad103f61541e-config\") pod \"kube-controller-manager-operator-78b949d7b-wxxsg\" (UID: \"4e72ff91-36b3-4098-b423-ad103f61541e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxxsg" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.458861 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea105b82-fec3-4f5d-b056-fdd566619645-client-ca\") pod \"route-controller-manager-6576b87f9c-46smf\" (UID: \"ea105b82-fec3-4f5d-b056-fdd566619645\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.458893 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-audit-policies\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.458919 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx7tz\" (UniqueName: \"kubernetes.io/projected/eaa4fac0-c895-4779-9eb9-11724f97ccd8-kube-api-access-lx7tz\") pod \"machine-config-operator-74547568cd-nzgvd\" (UID: \"eaa4fac0-c895-4779-9eb9-11724f97ccd8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.458943 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc698dd9-e0de-4b99-ba98-03a8b1ec3c42-config\") pod \"console-operator-58897d9998-9g4m7\" (UID: \"dc698dd9-e0de-4b99-ba98-03a8b1ec3c42\") " pod="openshift-console-operator/console-operator-58897d9998-9g4m7" Dec 01 10:01:46 crc kubenswrapper[4958]: 
I1201 10:01:46.458972 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdhnp\" (UniqueName: \"kubernetes.io/projected/7a98149e-16f6-4129-a488-393bdbf90da3-kube-api-access-kdhnp\") pod \"dns-operator-744455d44c-26pm4\" (UID: \"7a98149e-16f6-4129-a488-393bdbf90da3\") " pod="openshift-dns-operator/dns-operator-744455d44c-26pm4" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459006 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d27f831c-bcde-4937-b3c6-0ada6053a268-serving-cert\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459034 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c317e220-5d51-4a28-91c9-93ef65b36e92-webhook-cert\") pod \"packageserver-d55dfcdfc-5t5rz\" (UID: \"c317e220-5d51-4a28-91c9-93ef65b36e92\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459060 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459079 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459103 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a98149e-16f6-4129-a488-393bdbf90da3-metrics-tls\") pod \"dns-operator-744455d44c-26pm4\" (UID: \"7a98149e-16f6-4129-a488-393bdbf90da3\") " pod="openshift-dns-operator/dns-operator-744455d44c-26pm4" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459132 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eaa4fac0-c895-4779-9eb9-11724f97ccd8-images\") pod \"machine-config-operator-74547568cd-nzgvd\" (UID: \"eaa4fac0-c895-4779-9eb9-11724f97ccd8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459159 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5a8efcb2-480a-4577-af47-304c07876b28-stats-auth\") pod \"router-default-5444994796-swd9j\" (UID: \"5a8efcb2-480a-4577-af47-304c07876b28\") " pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459180 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7b405ab-f0aa-4961-a0e4-c104e2040917-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-hpjfh\" (UID: \"a7b405ab-f0aa-4961-a0e4-c104e2040917\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hpjfh" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459213 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459240 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eaa4fac0-c895-4779-9eb9-11724f97ccd8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-nzgvd\" (UID: \"eaa4fac0-c895-4779-9eb9-11724f97ccd8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459264 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c317e220-5d51-4a28-91c9-93ef65b36e92-tmpfs\") pod \"packageserver-d55dfcdfc-5t5rz\" (UID: \"c317e220-5d51-4a28-91c9-93ef65b36e92\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459335 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc698dd9-e0de-4b99-ba98-03a8b1ec3c42-serving-cert\") pod \"console-operator-58897d9998-9g4m7\" (UID: \"dc698dd9-e0de-4b99-ba98-03a8b1ec3c42\") " pod="openshift-console-operator/console-operator-58897d9998-9g4m7" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459418 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459442 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459471 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea105b82-fec3-4f5d-b056-fdd566619645-serving-cert\") pod \"route-controller-manager-6576b87f9c-46smf\" (UID: \"ea105b82-fec3-4f5d-b056-fdd566619645\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459498 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d27f831c-bcde-4937-b3c6-0ada6053a268-image-import-ca\") pod 
\"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459520 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98tdj\" (UniqueName: \"kubernetes.io/projected/70ab8e46-495c-4e0e-8896-e4eac8d855b6-kube-api-access-98tdj\") pod \"service-ca-9c57cc56f-spstc\" (UID: \"70ab8e46-495c-4e0e-8896-e4eac8d855b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-spstc" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459543 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d27f831c-bcde-4937-b3c6-0ada6053a268-etcd-client\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459569 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7b405ab-f0aa-4961-a0e4-c104e2040917-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hpjfh\" (UID: \"a7b405ab-f0aa-4961-a0e4-c104e2040917\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hpjfh" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459598 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d27f831c-bcde-4937-b3c6-0ada6053a268-audit-dir\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459621 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q6t4\" (UniqueName: \"kubernetes.io/projected/dc698dd9-e0de-4b99-ba98-03a8b1ec3c42-kube-api-access-2q6t4\") pod \"console-operator-58897d9998-9g4m7\" (UID: \"dc698dd9-e0de-4b99-ba98-03a8b1ec3c42\") " pod="openshift-console-operator/console-operator-58897d9998-9g4m7" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459645 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e72ff91-36b3-4098-b423-ad103f61541e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wxxsg\" (UID: \"4e72ff91-36b3-4098-b423-ad103f61541e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxxsg" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459670 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e72ff91-36b3-4098-b423-ad103f61541e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wxxsg\" (UID: \"4e72ff91-36b3-4098-b423-ad103f61541e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxxsg" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459694 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bnxn\" (UniqueName: \"kubernetes.io/projected/c317e220-5d51-4a28-91c9-93ef65b36e92-kube-api-access-9bnxn\") pod \"packageserver-d55dfcdfc-5t5rz\" (UID: \"c317e220-5d51-4a28-91c9-93ef65b36e92\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459721 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trtvn\" (UniqueName: \"kubernetes.io/projected/a7b405ab-f0aa-4961-a0e4-c104e2040917-kube-api-access-trtvn\") pod \"kube-storage-version-migrator-operator-b67b599dd-hpjfh\" (UID: \"a7b405ab-f0aa-4961-a0e4-c104e2040917\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hpjfh" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459737 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5a8efcb2-480a-4577-af47-304c07876b28-metrics-certs\") pod \"router-default-5444994796-swd9j\" (UID: \"5a8efcb2-480a-4577-af47-304c07876b28\") " pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459758 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/416bbcbb-fc9f-489b-a1ac-eea205ce9152-audit-dir\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459777 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5a8efcb2-480a-4577-af47-304c07876b28-default-certificate\") pod \"router-default-5444994796-swd9j\" (UID: \"5a8efcb2-480a-4577-af47-304c07876b28\") " pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459800 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459822 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/70ab8e46-495c-4e0e-8896-e4eac8d855b6-signing-key\") pod \"service-ca-9c57cc56f-spstc\" (UID: \"70ab8e46-495c-4e0e-8896-e4eac8d855b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-spstc" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459876 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/70ab8e46-495c-4e0e-8896-e4eac8d855b6-signing-cabundle\") pod \"service-ca-9c57cc56f-spstc\" (UID: \"70ab8e46-495c-4e0e-8896-e4eac8d855b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-spstc" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459901 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzj94\" (UniqueName: \"kubernetes.io/projected/1532650f-e94e-4f91-b79c-a0da8e727893-kube-api-access-gzj94\") pod \"migrator-59844c95c7-nrv4l\" (UID: \"1532650f-e94e-4f91-b79c-a0da8e727893\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nrv4l" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459937 4958 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-lx9mj\" (UniqueName: \"kubernetes.io/projected/c42a0759-cde0-4ecd-b99b-d6dcb74be0c5-kube-api-access-lx9mj\") pod \"multus-admission-controller-857f4d67dd-jfg5l\" (UID: \"c42a0759-cde0-4ecd-b99b-d6dcb74be0c5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jfg5l" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459966 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.459996 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r86g8\" (UniqueName: \"kubernetes.io/projected/d27f831c-bcde-4937-b3c6-0ada6053a268-kube-api-access-r86g8\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.460029 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.460063 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d27f831c-bcde-4937-b3c6-0ada6053a268-config\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.460092 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.460115 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d27f831c-bcde-4937-b3c6-0ada6053a268-encryption-config\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.460163 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.460190 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea105b82-fec3-4f5d-b056-fdd566619645-config\") pod 
\"route-controller-manager-6576b87f9c-46smf\" (UID: \"ea105b82-fec3-4f5d-b056-fdd566619645\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.460215 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkkj9\" (UniqueName: \"kubernetes.io/projected/ea105b82-fec3-4f5d-b056-fdd566619645-kube-api-access-dkkj9\") pod \"route-controller-manager-6576b87f9c-46smf\" (UID: \"ea105b82-fec3-4f5d-b056-fdd566619645\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.460243 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d27f831c-bcde-4937-b3c6-0ada6053a268-trusted-ca-bundle\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.460262 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc698dd9-e0de-4b99-ba98-03a8b1ec3c42-trusted-ca\") pod \"console-operator-58897d9998-9g4m7\" (UID: \"dc698dd9-e0de-4b99-ba98-03a8b1ec3c42\") " pod="openshift-console-operator/console-operator-58897d9998-9g4m7" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.460287 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d27f831c-bcde-4937-b3c6-0ada6053a268-etcd-serving-ca\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.460561 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a8efcb2-480a-4577-af47-304c07876b28-service-ca-bundle\") pod \"router-default-5444994796-swd9j\" (UID: \"5a8efcb2-480a-4577-af47-304c07876b28\") " pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.460588 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.460692 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eaa4fac0-c895-4779-9eb9-11724f97ccd8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-nzgvd\" (UID: \"eaa4fac0-c895-4779-9eb9-11724f97ccd8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.463777 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc698dd9-e0de-4b99-ba98-03a8b1ec3c42-config\") pod \"console-operator-58897d9998-9g4m7\" (UID: \"dc698dd9-e0de-4b99-ba98-03a8b1ec3c42\") " pod="openshift-console-operator/console-operator-58897d9998-9g4m7" Dec 01 10:01:46 crc 
kubenswrapper[4958]: I1201 10:01:46.463977 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d27f831c-bcde-4937-b3c6-0ada6053a268-node-pullsecrets\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.463968 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d27f831c-bcde-4937-b3c6-0ada6053a268-audit-dir\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.464171 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7b405ab-f0aa-4961-a0e4-c104e2040917-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hpjfh\" (UID: \"a7b405ab-f0aa-4961-a0e4-c104e2040917\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hpjfh" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.464627 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c317e220-5d51-4a28-91c9-93ef65b36e92-tmpfs\") pod \"packageserver-d55dfcdfc-5t5rz\" (UID: \"c317e220-5d51-4a28-91c9-93ef65b36e92\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.466488 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d27f831c-bcde-4937-b3c6-0ada6053a268-config\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.467556 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.467635 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e72ff91-36b3-4098-b423-ad103f61541e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wxxsg\" (UID: \"4e72ff91-36b3-4098-b423-ad103f61541e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxxsg" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.470963 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc698dd9-e0de-4b99-ba98-03a8b1ec3c42-serving-cert\") pod \"console-operator-58897d9998-9g4m7\" (UID: \"dc698dd9-e0de-4b99-ba98-03a8b1ec3c42\") " pod="openshift-console-operator/console-operator-58897d9998-9g4m7" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.472551 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d27f831c-bcde-4937-b3c6-0ada6053a268-image-import-ca\") pod \"apiserver-76f77b778f-dzllj\" (UID: 
\"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.475654 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.476254 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.478641 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/70ab8e46-495c-4e0e-8896-e4eac8d855b6-signing-cabundle\") pod \"service-ca-9c57cc56f-spstc\" (UID: \"70ab8e46-495c-4e0e-8896-e4eac8d855b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-spstc" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.479588 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.479688 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d27f831c-bcde-4937-b3c6-0ada6053a268-serving-cert\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.480432 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eaa4fac0-c895-4779-9eb9-11724f97ccd8-images\") pod \"machine-config-operator-74547568cd-nzgvd\" (UID: \"eaa4fac0-c895-4779-9eb9-11724f97ccd8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.481446 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a98149e-16f6-4129-a488-393bdbf90da3-metrics-tls\") pod \"dns-operator-744455d44c-26pm4\" (UID: \"7a98149e-16f6-4129-a488-393bdbf90da3\") " pod="openshift-dns-operator/dns-operator-744455d44c-26pm4" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.481868 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/416bbcbb-fc9f-489b-a1ac-eea205ce9152-audit-dir\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.482642 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/c317e220-5d51-4a28-91c9-93ef65b36e92-apiservice-cert\") pod \"packageserver-d55dfcdfc-5t5rz\" (UID: \"c317e220-5d51-4a28-91c9-93ef65b36e92\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.483701 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c42a0759-cde0-4ecd-b99b-d6dcb74be0c5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jfg5l\" (UID: \"c42a0759-cde0-4ecd-b99b-d6dcb74be0c5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jfg5l" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.483776 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6q7kk"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.483816 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-g9fjl"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.484182 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6t9k"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.484447 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d27f831c-bcde-4937-b3c6-0ada6053a268-trusted-ca-bundle\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.484503 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c317e220-5d51-4a28-91c9-93ef65b36e92-webhook-cert\") pod \"packageserver-d55dfcdfc-5t5rz\" (UID: \"c317e220-5d51-4a28-91c9-93ef65b36e92\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.485304 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-audit-policies\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.485473 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5a8efcb2-480a-4577-af47-304c07876b28-default-certificate\") pod \"router-default-5444994796-swd9j\" (UID: \"5a8efcb2-480a-4577-af47-304c07876b28\") " pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.485491 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e72ff91-36b3-4098-b423-ad103f61541e-config\") pod \"kube-controller-manager-operator-78b949d7b-wxxsg\" (UID: \"4e72ff91-36b3-4098-b423-ad103f61541e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxxsg" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.485537 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d27f831c-bcde-4937-b3c6-0ada6053a268-etcd-serving-ca\") pod \"apiserver-76f77b778f-dzllj\" (UID: 
\"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.486122 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.486507 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d27f831c-bcde-4937-b3c6-0ada6053a268-audit\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.486904 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a8efcb2-480a-4577-af47-304c07876b28-service-ca-bundle\") pod \"router-default-5444994796-swd9j\" (UID: \"5a8efcb2-480a-4577-af47-304c07876b28\") " pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.487089 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.487336 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc698dd9-e0de-4b99-ba98-03a8b1ec3c42-trusted-ca\") pod \"console-operator-58897d9998-9g4m7\" (UID: \"dc698dd9-e0de-4b99-ba98-03a8b1ec3c42\") " pod="openshift-console-operator/console-operator-58897d9998-9g4m7" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.487456 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.488326 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.488223 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.489114 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5a8efcb2-480a-4577-af47-304c07876b28-stats-auth\") pod \"router-default-5444994796-swd9j\" (UID: \"5a8efcb2-480a-4577-af47-304c07876b28\") " 
pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.490048 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/70ab8e46-495c-4e0e-8896-e4eac8d855b6-signing-key\") pod \"service-ca-9c57cc56f-spstc\" (UID: \"70ab8e46-495c-4e0e-8896-e4eac8d855b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-spstc" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.490625 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eaa4fac0-c895-4779-9eb9-11724f97ccd8-proxy-tls\") pod \"machine-config-operator-74547568cd-nzgvd\" (UID: \"eaa4fac0-c895-4779-9eb9-11724f97ccd8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.490632 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5a8efcb2-480a-4577-af47-304c07876b28-metrics-certs\") pod \"router-default-5444994796-swd9j\" (UID: \"5a8efcb2-480a-4577-af47-304c07876b28\") " pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.491093 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7b405ab-f0aa-4961-a0e4-c104e2040917-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hpjfh\" (UID: \"a7b405ab-f0aa-4961-a0e4-c104e2040917\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hpjfh" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.492004 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d27f831c-bcde-4937-b3c6-0ada6053a268-encryption-config\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.492064 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.493988 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d27f831c-bcde-4937-b3c6-0ada6053a268-etcd-client\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.494578 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.495376 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 
01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.505084 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.506461 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.519950 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.520192 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bglks"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.522543 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlnwn"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.526051 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.527615 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zppvz"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.528936 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tp8nm"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.530041 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nrv4l"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.531535 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qwcn5"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.532135 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rxrcb"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.533724 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rhcmn"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.534258 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.535110 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pxgcs"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.535385 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rhcmn" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.536774 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.537071 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.537111 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.538165 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.539211 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.540308 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q7r9c"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.541491 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-d2tvd"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.542656 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rhcmn"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.543941 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pxgcs"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.546200 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kwjfj"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.546644 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.547983 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-64ctn"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.549029 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4khgc"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.550129 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-txbzn"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.551316 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.552670 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-4z6qs"] Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.553544 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4z6qs" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.555392 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.561461 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea105b82-fec3-4f5d-b056-fdd566619645-config\") pod \"route-controller-manager-6576b87f9c-46smf\" (UID: \"ea105b82-fec3-4f5d-b056-fdd566619645\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.561493 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkkj9\" (UniqueName: \"kubernetes.io/projected/ea105b82-fec3-4f5d-b056-fdd566619645-kube-api-access-dkkj9\") pod \"route-controller-manager-6576b87f9c-46smf\" (UID: \"ea105b82-fec3-4f5d-b056-fdd566619645\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.561544 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea105b82-fec3-4f5d-b056-fdd566619645-client-ca\") pod \"route-controller-manager-6576b87f9c-46smf\" (UID: \"ea105b82-fec3-4f5d-b056-fdd566619645\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.561584 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea105b82-fec3-4f5d-b056-fdd566619645-serving-cert\") pod \"route-controller-manager-6576b87f9c-46smf\" (UID: \"ea105b82-fec3-4f5d-b056-fdd566619645\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.574954 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.595317 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.621411 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.638197 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.660172 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.680745 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.695125 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.715490 4958 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.734834 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.760660 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.775345 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.795759 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.815513 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.836689 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.858029 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.895336 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.915887 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.934892 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.954972 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.982538 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 01 10:01:46 crc kubenswrapper[4958]: I1201 10:01:46.996129 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.015507 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.034528 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.054358 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.075417 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.094492 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 01 10:01:47 crc 
kubenswrapper[4958]: I1201 10:01:47.114769 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.135709 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.156495 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.175170 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.195412 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.216013 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.235344 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.254933 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.274577 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.300816 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.314382 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.335009 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.355664 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.374624 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.384259 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea105b82-fec3-4f5d-b056-fdd566619645-config\") pod \"route-controller-manager-6576b87f9c-46smf\" (UID: \"ea105b82-fec3-4f5d-b056-fdd566619645\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.394924 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.413312 4958 request.go:700] Waited for 1.014713544s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/secrets?fieldSelector=metadata.name%3Dserving-cert&limit=500&resourceVersion=0 Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 
10:01:47.416024 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.429073 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea105b82-fec3-4f5d-b056-fdd566619645-serving-cert\") pod \"route-controller-manager-6576b87f9c-46smf\" (UID: \"ea105b82-fec3-4f5d-b056-fdd566619645\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.436013 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.443432 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea105b82-fec3-4f5d-b056-fdd566619645-client-ca\") pod \"route-controller-manager-6576b87f9c-46smf\" (UID: \"ea105b82-fec3-4f5d-b056-fdd566619645\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.455171 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.475157 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.495111 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.516206 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.535305 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.555688 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.574921 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.594720 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.615295 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.635613 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.654564 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.675416 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.696385 4958 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.716021 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.734489 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.755321 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.776264 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.795065 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.815916 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.835110 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.853967 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.875787 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.895582 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.915162 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.968343 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.974216 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 01 10:01:47 crc kubenswrapper[4958]: I1201 10:01:47.994910 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.014412 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.034752 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.054121 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.075155 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.094338 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" 
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.136517 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk2b7\" (UniqueName: \"kubernetes.io/projected/5a8efcb2-480a-4577-af47-304c07876b28-kube-api-access-xk2b7\") pod \"router-default-5444994796-swd9j\" (UID: \"5a8efcb2-480a-4577-af47-304c07876b28\") " pod="openshift-ingress/router-default-5444994796-swd9j"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.153240 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzj94\" (UniqueName: \"kubernetes.io/projected/1532650f-e94e-4f91-b79c-a0da8e727893-kube-api-access-gzj94\") pod \"migrator-59844c95c7-nrv4l\" (UID: \"1532650f-e94e-4f91-b79c-a0da8e727893\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nrv4l"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.174585 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e72ff91-36b3-4098-b423-ad103f61541e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wxxsg\" (UID: \"4e72ff91-36b3-4098-b423-ad103f61541e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxxsg"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.195120 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8rxx\" (UniqueName: \"kubernetes.io/projected/416bbcbb-fc9f-489b-a1ac-eea205ce9152-kube-api-access-p8rxx\") pod \"oauth-openshift-558db77b4-tp8nm\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.212696 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q6t4\" (UniqueName: \"kubernetes.io/projected/dc698dd9-e0de-4b99-ba98-03a8b1ec3c42-kube-api-access-2q6t4\") pod \"console-operator-58897d9998-9g4m7\" (UID: \"dc698dd9-e0de-4b99-ba98-03a8b1ec3c42\") " pod="openshift-console-operator/console-operator-58897d9998-9g4m7"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.235672 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx9mj\" (UniqueName: \"kubernetes.io/projected/c42a0759-cde0-4ecd-b99b-d6dcb74be0c5-kube-api-access-lx9mj\") pod \"multus-admission-controller-857f4d67dd-jfg5l\" (UID: \"c42a0759-cde0-4ecd-b99b-d6dcb74be0c5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jfg5l"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.240547 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxxsg"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.246401 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.249597 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r86g8\" (UniqueName: \"kubernetes.io/projected/d27f831c-bcde-4937-b3c6-0ada6053a268-kube-api-access-r86g8\") pod \"apiserver-76f77b778f-dzllj\" (UID: \"d27f831c-bcde-4937-b3c6-0ada6053a268\") " pod="openshift-apiserver/apiserver-76f77b778f-dzllj"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.261178 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9g4m7"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.270977 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdhnp\" (UniqueName: \"kubernetes.io/projected/7a98149e-16f6-4129-a488-393bdbf90da3-kube-api-access-kdhnp\") pod \"dns-operator-744455d44c-26pm4\" (UID: \"7a98149e-16f6-4129-a488-393bdbf90da3\") " pod="openshift-dns-operator/dns-operator-744455d44c-26pm4"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.296990 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98tdj\" (UniqueName: \"kubernetes.io/projected/70ab8e46-495c-4e0e-8896-e4eac8d855b6-kube-api-access-98tdj\") pod \"service-ca-9c57cc56f-spstc\" (UID: \"70ab8e46-495c-4e0e-8896-e4eac8d855b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-spstc"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.312643 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bnxn\" (UniqueName: \"kubernetes.io/projected/c317e220-5d51-4a28-91c9-93ef65b36e92-kube-api-access-9bnxn\") pod \"packageserver-d55dfcdfc-5t5rz\" (UID: \"c317e220-5d51-4a28-91c9-93ef65b36e92\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.331835 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-swd9j"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.332327 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trtvn\" (UniqueName: \"kubernetes.io/projected/a7b405ab-f0aa-4961-a0e4-c104e2040917-kube-api-access-trtvn\") pod \"kube-storage-version-migrator-operator-b67b599dd-hpjfh\" (UID: \"a7b405ab-f0aa-4961-a0e4-c104e2040917\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hpjfh"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.339021 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nrv4l"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.350761 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-dzllj"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.355541 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.355971 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx7tz\" (UniqueName: \"kubernetes.io/projected/eaa4fac0-c895-4779-9eb9-11724f97ccd8-kube-api-access-lx7tz\") pod \"machine-config-operator-74547568cd-nzgvd\" (UID: \"eaa4fac0-c895-4779-9eb9-11724f97ccd8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.369739 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hpjfh"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.376372 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.392180 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-26pm4"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.396316 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.411093 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-jfg5l"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.419354 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.424248 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.433083 4958 request.go:700] Waited for 1.895556058s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.435333 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.455828 4958 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.476242 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.487076 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.495947 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.515467 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.530220 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-spstc"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.537686 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.592545 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tp8nm"]
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.597273 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkkj9\" (UniqueName: \"kubernetes.io/projected/ea105b82-fec3-4f5d-b056-fdd566619645-kube-api-access-dkkj9\") pod \"route-controller-manager-6576b87f9c-46smf\" (UID: \"ea105b82-fec3-4f5d-b056-fdd566619645\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.597763 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9g4m7"]
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.679272 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxxsg"]
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.690715 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c12352eb-31e7-465d-b743-9d3ee6093b97-config\") pod \"authentication-operator-69f744f599-txbzn\" (UID: \"c12352eb-31e7-465d-b743-9d3ee6093b97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.690775 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7mv9\" (UniqueName: \"kubernetes.io/projected/c7fc63ed-b5b8-4854-8d5a-1cab817a3581-kube-api-access-z7mv9\") pod \"control-plane-machine-set-operator-78cbb6b69f-sv8bf\" (UID: \"c7fc63ed-b5b8-4854-8d5a-1cab817a3581\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sv8bf"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.690802 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c12352eb-31e7-465d-b743-9d3ee6093b97-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-txbzn\" (UID: \"c12352eb-31e7-465d-b743-9d3ee6093b97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.690833 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142b2c22-6df3-4540-8ad7-748e000f3ec9-config\") pod \"machine-api-operator-5694c8668f-64ctn\" (UID: \"142b2c22-6df3-4540-8ad7-748e000f3ec9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-64ctn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.690923 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-console-config\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.690942 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c12352eb-31e7-465d-b743-9d3ee6093b97-service-ca-bundle\") pod \"authentication-operator-69f744f599-txbzn\" (UID: \"c12352eb-31e7-465d-b743-9d3ee6093b97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.690961 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc711edd-1026-4f47-ab7f-92bd6e1fb964-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6q7kk\" (UID: \"fc711edd-1026-4f47-ab7f-92bd6e1fb964\") " pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.691355 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.691397 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3810945e-7192-4204-a89e-5fad8e22c611-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zppvz\" (UID: \"3810945e-7192-4204-a89e-5fad8e22c611\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zppvz"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.691417 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-246n5\" (UniqueName: \"kubernetes.io/projected/81ffe673-ed64-4dd2-81c3-65c922af39bc-kube-api-access-246n5\") pod \"downloads-7954f5f757-g9fjl\" (UID: \"81ffe673-ed64-4dd2-81c3-65c922af39bc\") " pod="openshift-console/downloads-7954f5f757-g9fjl"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.691598 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.691667 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-oauth-serving-cert\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb"
Dec 01 10:01:48 crc kubenswrapper[4958]: E1201 10:01:48.691861 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:49.19182607 +0000 UTC m=+156.700615107 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.692073 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-audit-policies\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.692144 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/152c0d38-b359-479b-bfd6-cb019847d7fa-serving-cert\") pod \"etcd-operator-b45778765-d2tvd\" (UID: \"152c0d38-b359-479b-bfd6-cb019847d7fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.692337 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6f17185-a290-4d91-9337-cd0954cf9c66-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-msxng\" (UID: \"b6f17185-a290-4d91-9337-cd0954cf9c66\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.692440 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f01bcd7f-0920-48a2-8b61-5e7eca60f0a8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bglks\" (UID: \"f01bcd7f-0920-48a2-8b61-5e7eca60f0a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bglks"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.692492 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xd2b\" (UniqueName: \"kubernetes.io/projected/9ee9be51-cd77-41e9-9db3-ab6a64015288-kube-api-access-4xd2b\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.692513 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tnf5\" (UniqueName: \"kubernetes.io/projected/db83732a-f1a2-4daa-8bf9-89fa7eebec3d-kube-api-access-2tnf5\") pod \"openshift-controller-manager-operator-756b6f6bc6-mzqdq\" (UID: \"db83732a-f1a2-4daa-8bf9-89fa7eebec3d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mzqdq"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.692645 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qdcl\" (UniqueName: \"kubernetes.io/projected/62f08690-7b52-4f87-a32d-e33b10590cdc-kube-api-access-5qdcl\") pod \"openshift-apiserver-operator-796bbdcf4f-tlnwn\" (UID: \"62f08690-7b52-4f87-a32d-e33b10590cdc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlnwn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.692740 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6f17185-a290-4d91-9337-cd0954cf9c66-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-msxng\" (UID: \"b6f17185-a290-4d91-9337-cd0954cf9c66\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.692799 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62f08690-7b52-4f87-a32d-e33b10590cdc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tlnwn\" (UID: \"62f08690-7b52-4f87-a32d-e33b10590cdc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlnwn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.692864 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca27d3de-fa2e-4aa7-aef8-b9373742bddb-config\") pod \"kube-apiserver-operator-766d6c64bb-p6t9k\" (UID: \"ca27d3de-fa2e-4aa7-aef8-b9373742bddb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6t9k"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.692904 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ed7b0bb-41a7-49dc-93a6-292040efb53b-serving-cert\") pod \"service-ca-operator-777779d784-8wnxg\" (UID: \"9ed7b0bb-41a7-49dc-93a6-292040efb53b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8wnxg"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.693006 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-etcd-client\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.693168 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fe0fe33-c2ce-4622-b6f4-3f587718b006-bound-sa-token\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.693222 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-serving-cert\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.693245 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc711edd-1026-4f47-ab7f-92bd6e1fb964-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6q7kk\" (UID: \"fc711edd-1026-4f47-ab7f-92bd6e1fb964\") " pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.693672 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/90fd93e3-8621-438c-9914-fe4a1be62b8d-srv-cert\") pod \"catalog-operator-68c6474976-xnmzx\" (UID: \"90fd93e3-8621-438c-9914-fe4a1be62b8d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.693703 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-service-ca\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.693726 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbsk2\" (UniqueName: \"kubernetes.io/projected/c12352eb-31e7-465d-b743-9d3ee6093b97-kube-api-access-kbsk2\") pod \"authentication-operator-69f744f599-txbzn\" (UID: \"c12352eb-31e7-465d-b743-9d3ee6093b97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.694766 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3810945e-7192-4204-a89e-5fad8e22c611-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zppvz\" (UID: \"3810945e-7192-4204-a89e-5fad8e22c611\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zppvz"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.695196 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.695220 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca27d3de-fa2e-4aa7-aef8-b9373742bddb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-p6t9k\" (UID: \"ca27d3de-fa2e-4aa7-aef8-b9373742bddb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6t9k"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.695262 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/478689fc-5c07-45bc-ab87-8c58e68c348b-client-ca\") pod \"controller-manager-879f6c89f-9f7tt\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.695284 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/478689fc-5c07-45bc-ab87-8c58e68c348b-config\") pod \"controller-manager-879f6c89f-9f7tt\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.695308 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed7b0bb-41a7-49dc-93a6-292040efb53b-config\") pod \"service-ca-operator-777779d784-8wnxg\" (UID: \"9ed7b0bb-41a7-49dc-93a6-292040efb53b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8wnxg"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.695365 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22zjt\" (UniqueName: \"kubernetes.io/projected/4fe0fe33-c2ce-4622-b6f4-3f587718b006-kube-api-access-22zjt\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.695395 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc9z9\" (UniqueName: \"kubernetes.io/projected/6723ac99-b14b-4d99-90fa-9dd395093f3b-kube-api-access-hc9z9\") pod \"package-server-manager-789f6589d5-q7r9c\" (UID: \"6723ac99-b14b-4d99-90fa-9dd395093f3b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q7r9c"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.695412 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxrzx\" (UniqueName: \"kubernetes.io/projected/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-kube-api-access-zxrzx\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.695429 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fe0fe33-c2ce-4622-b6f4-3f587718b006-trusted-ca\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.695456 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9sz2\" (UniqueName: \"kubernetes.io/projected/fc53c080-c843-4682-a5b2-dd571b28000c-kube-api-access-l9sz2\") pod \"machine-config-controller-84d6567774-qwcn5\" (UID: \"fc53c080-c843-4682-a5b2-dd571b28000c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwcn5"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.695474 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvgch\" (UniqueName: \"kubernetes.io/projected/5418dd0e-243d-4573-94c9-a1948dca9b9f-kube-api-access-jvgch\") pod \"olm-operator-6b444d44fb-fmdxw\" (UID: \"5418dd0e-243d-4573-94c9-a1948dca9b9f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.695516 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4fe0fe33-c2ce-4622-b6f4-3f587718b006-registry-certificates\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.695538 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-audit-dir\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.696097 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5418dd0e-243d-4573-94c9-a1948dca9b9f-srv-cert\") pod \"olm-operator-6b444d44fb-fmdxw\" (UID: \"5418dd0e-243d-4573-94c9-a1948dca9b9f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.696145 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5418dd0e-243d-4573-94c9-a1948dca9b9f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fmdxw\" (UID: \"5418dd0e-243d-4573-94c9-a1948dca9b9f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.697299 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/152c0d38-b359-479b-bfd6-cb019847d7fa-etcd-service-ca\") pod \"etcd-operator-b45778765-d2tvd\" (UID: \"152c0d38-b359-479b-bfd6-cb019847d7fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.697374 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c12352eb-31e7-465d-b743-9d3ee6093b97-serving-cert\") pod \"authentication-operator-69f744f599-txbzn\" (UID: \"c12352eb-31e7-465d-b743-9d3ee6093b97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.697443 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca27d3de-fa2e-4aa7-aef8-b9373742bddb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-p6t9k\" (UID: \"ca27d3de-fa2e-4aa7-aef8-b9373742bddb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6t9k"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.697469 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc53c080-c843-4682-a5b2-dd571b28000c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qwcn5\" (UID: \"fc53c080-c843-4682-a5b2-dd571b28000c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwcn5"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.698152 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/62649f01-3312-4309-9c30-05090c2655eb-machine-approver-tls\") pod \"machine-approver-56656f9798-lqpvf\" (UID: \"62649f01-3312-4309-9c30-05090c2655eb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.698186 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ee9be51-cd77-41e9-9db3-ab6a64015288-console-serving-cert\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.698222 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7z78\" (UniqueName: \"kubernetes.io/projected/478689fc-5c07-45bc-ab87-8c58e68c348b-kube-api-access-g7z78\") pod \"controller-manager-879f6c89f-9f7tt\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.698259 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/90fd93e3-8621-438c-9914-fe4a1be62b8d-profile-collector-cert\") pod \"catalog-operator-68c6474976-xnmzx\" (UID: \"90fd93e3-8621-438c-9914-fe4a1be62b8d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.698284 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6723ac99-b14b-4d99-90fa-9dd395093f3b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-q7r9c\" (UID: \"6723ac99-b14b-4d99-90fa-9dd395093f3b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q7r9c"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.698343 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/152c0d38-b359-479b-bfd6-cb019847d7fa-etcd-client\") pod \"etcd-operator-b45778765-d2tvd\" (UID: \"152c0d38-b359-479b-bfd6-cb019847d7fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.698433 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/478689fc-5c07-45bc-ab87-8c58e68c348b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9f7tt\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.698465 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/62649f01-3312-4309-9c30-05090c2655eb-auth-proxy-config\") pod \"machine-approver-56656f9798-lqpvf\" (UID: \"62649f01-3312-4309-9c30-05090c2655eb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.698507 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7fc63ed-b5b8-4854-8d5a-1cab817a3581-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-sv8bf\" (UID: \"c7fc63ed-b5b8-4854-8d5a-1cab817a3581\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sv8bf"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.698534 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62649f01-3312-4309-9c30-05090c2655eb-config\") pod \"machine-approver-56656f9798-lqpvf\" (UID: \"62649f01-3312-4309-9c30-05090c2655eb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.698596 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ee9be51-cd77-41e9-9db3-ab6a64015288-console-oauth-config\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.698631 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff847\" (UniqueName: \"kubernetes.io/projected/954a15b1-bcae-44b9-ad5b-e004ed4d79be-kube-api-access-ff847\") pod \"collect-profiles-29409720-84vv5\" (UID: \"954a15b1-bcae-44b9-ad5b-e004ed4d79be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.699172 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc2sm\" (UniqueName: \"kubernetes.io/projected/62649f01-3312-4309-9c30-05090c2655eb-kube-api-access-vc2sm\") pod \"machine-approver-56656f9798-lqpvf\" (UID: \"62649f01-3312-4309-9c30-05090c2655eb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.699468 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/152c0d38-b359-479b-bfd6-cb019847d7fa-etcd-ca\") pod \"etcd-operator-b45778765-d2tvd\" (UID: \"152c0d38-b359-479b-bfd6-cb019847d7fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.700027 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfnh4\" (UniqueName: \"kubernetes.io/projected/90fd93e3-8621-438c-9914-fe4a1be62b8d-kube-api-access-lfnh4\") pod \"catalog-operator-68c6474976-xnmzx\" (UID: \"90fd93e3-8621-438c-9914-fe4a1be62b8d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.700141 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db83732a-f1a2-4daa-8bf9-89fa7eebec3d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mzqdq\" (UID: \"db83732a-f1a2-4daa-8bf9-89fa7eebec3d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mzqdq"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.700278 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/954a15b1-bcae-44b9-ad5b-e004ed4d79be-secret-volume\") pod \"collect-profiles-29409720-84vv5\" (UID: \"954a15b1-bcae-44b9-ad5b-e004ed4d79be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.700383 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc53c080-c843-4682-a5b2-dd571b28000c-proxy-tls\") pod \"machine-config-controller-84d6567774-qwcn5\" (UID: \"fc53c080-c843-4682-a5b2-dd571b28000c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwcn5"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.700475 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fe0fe33-c2ce-4622-b6f4-3f587718b006-registry-tls\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.700519 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvqb2\" (UniqueName: \"kubernetes.io/projected/f01bcd7f-0920-48a2-8b61-5e7eca60f0a8-kube-api-access-tvqb2\") pod \"openshift-config-operator-7777fb866f-bglks\" (UID: \"f01bcd7f-0920-48a2-8b61-5e7eca60f0a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bglks"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.700595 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/142b2c22-6df3-4540-8ad7-748e000f3ec9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-64ctn\" (UID: \"142b2c22-6df3-4540-8ad7-748e000f3ec9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-64ctn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.700816 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4fe0fe33-c2ce-4622-b6f4-3f587718b006-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.701094 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk2nl\" (UniqueName: \"kubernetes.io/projected/9ed7b0bb-41a7-49dc-93a6-292040efb53b-kube-api-access-nk2nl\") pod \"service-ca-operator-777779d784-8wnxg\" (UID: \"9ed7b0bb-41a7-49dc-93a6-292040efb53b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8wnxg"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.701128 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f01bcd7f-0920-48a2-8b61-5e7eca60f0a8-serving-cert\") pod \"openshift-config-operator-7777fb866f-bglks\" (UID: \"f01bcd7f-0920-48a2-8b61-5e7eca60f0a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bglks"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.701632 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khxm4\" (UniqueName: \"kubernetes.io/projected/1fd46f3c-80b3-4881-ae13-263db886a0e6-kube-api-access-khxm4\") pod \"cluster-samples-operator-665b6dd947-xkxmd\" (UID: \"1fd46f3c-80b3-4881-ae13-263db886a0e6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xkxmd"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.701699 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6f17185-a290-4d91-9337-cd0954cf9c66-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-msxng\" (UID: \"b6f17185-a290-4d91-9337-cd0954cf9c66\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.701795 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-encryption-config\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.701820 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152c0d38-b359-479b-bfd6-cb019847d7fa-config\") pod \"etcd-operator-b45778765-d2tvd\" (UID: \"152c0d38-b359-479b-bfd6-cb019847d7fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.701882 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/478689fc-5c07-45bc-ab87-8c58e68c348b-serving-cert\") pod \"controller-manager-879f6c89f-9f7tt\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.701928 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3810945e-7192-4204-a89e-5fad8e22c611-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zppvz\" (UID: \"3810945e-7192-4204-a89e-5fad8e22c611\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zppvz"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.702110 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1fd46f3c-80b3-4881-ae13-263db886a0e6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xkxmd\" (UID: \"1fd46f3c-80b3-4881-ae13-263db886a0e6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xkxmd"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.702142 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9fnw\" (UniqueName: \"kubernetes.io/projected/fc711edd-1026-4f47-ab7f-92bd6e1fb964-kube-api-access-v9fnw\") pod \"marketplace-operator-79b997595-6q7kk\" (UID: \"fc711edd-1026-4f47-ab7f-92bd6e1fb964\") " pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.702201 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/142b2c22-6df3-4540-8ad7-748e000f3ec9-images\") pod \"machine-api-operator-5694c8668f-64ctn\" (UID: \"142b2c22-6df3-4540-8ad7-748e000f3ec9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-64ctn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.702942 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db83732a-f1a2-4daa-8bf9-89fa7eebec3d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mzqdq\" (UID: \"db83732a-f1a2-4daa-8bf9-89fa7eebec3d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mzqdq"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.703074 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/954a15b1-bcae-44b9-ad5b-e004ed4d79be-config-volume\") pod \"collect-profiles-29409720-84vv5\" (UID: \"954a15b1-bcae-44b9-ad5b-e004ed4d79be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.703120 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45mzm\" (UniqueName: \"kubernetes.io/projected/142b2c22-6df3-4540-8ad7-748e000f3ec9-kube-api-access-45mzm\") pod \"machine-api-operator-5694c8668f-64ctn\" (UID: \"142b2c22-6df3-4540-8ad7-748e000f3ec9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-64ctn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.703142 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4fe0fe33-c2ce-4622-b6f4-3f587718b006-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.703179 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f08690-7b52-4f87-a32d-e33b10590cdc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tlnwn\" (UID: \"62f08690-7b52-4f87-a32d-e33b10590cdc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlnwn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.703229 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-trusted-ca-bundle\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.703259 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djqf8\" (UniqueName: \"kubernetes.io/projected/152c0d38-b359-479b-bfd6-cb019847d7fa-kube-api-access-djqf8\") pod \"etcd-operator-b45778765-d2tvd\" (UID: \"152c0d38-b359-479b-bfd6-cb019847d7fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.703302 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ngf7\" (UniqueName: \"kubernetes.io/projected/b6f17185-a290-4d91-9337-cd0954cf9c66-kube-api-access-6ngf7\") pod \"cluster-image-registry-operator-dc59b4c8b-msxng\" (UID: \"b6f17185-a290-4d91-9337-cd0954cf9c66\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.728283 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hpjfh"]
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.804771 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.805508 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c12352eb-31e7-465d-b743-9d3ee6093b97-config\") pod \"authentication-operator-69f744f599-txbzn\" (UID: \"c12352eb-31e7-465d-b743-9d3ee6093b97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.805566 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7mv9\" (UniqueName: \"kubernetes.io/projected/c7fc63ed-b5b8-4854-8d5a-1cab817a3581-kube-api-access-z7mv9\") pod \"control-plane-machine-set-operator-78cbb6b69f-sv8bf\" (UID: \"c7fc63ed-b5b8-4854-8d5a-1cab817a3581\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sv8bf"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.805602 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c12352eb-31e7-465d-b743-9d3ee6093b97-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-txbzn\" (UID: \"c12352eb-31e7-465d-b743-9d3ee6093b97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn"
Dec 01 10:01:48 crc kubenswrapper[4958]: E1201 10:01:48.805648 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:49.305607475 +0000 UTC m=+156.814396512 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.805712 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-console-config\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.805770 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c12352eb-31e7-465d-b743-9d3ee6093b97-service-ca-bundle\") pod \"authentication-operator-69f744f599-txbzn\" (UID: \"c12352eb-31e7-465d-b743-9d3ee6093b97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.805797 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142b2c22-6df3-4540-8ad7-748e000f3ec9-config\") pod \"machine-api-operator-5694c8668f-64ctn\" (UID: \"142b2c22-6df3-4540-8ad7-748e000f3ec9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-64ctn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.805838 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.805900 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3810945e-7192-4204-a89e-5fad8e22c611-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zppvz\" (UID: \"3810945e-7192-4204-a89e-5fad8e22c611\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zppvz"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.805932 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-246n5\" (UniqueName: \"kubernetes.io/projected/81ffe673-ed64-4dd2-81c3-65c922af39bc-kube-api-access-246n5\") pod \"downloads-7954f5f757-g9fjl\" (UID: \"81ffe673-ed64-4dd2-81c3-65c922af39bc\") " pod="openshift-console/downloads-7954f5f757-g9fjl"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.805958 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc711edd-1026-4f47-ab7f-92bd6e1fb964-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6q7kk\" (UID: \"fc711edd-1026-4f47-ab7f-92bd6e1fb964\") " pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.805981 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806007 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-oauth-serving-cert\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806032 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/152c0d38-b359-479b-bfd6-cb019847d7fa-serving-cert\") pod \"etcd-operator-b45778765-d2tvd\" (UID: \"152c0d38-b359-479b-bfd6-cb019847d7fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806058 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6f17185-a290-4d91-9337-cd0954cf9c66-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-msxng\" (UID: \"b6f17185-a290-4d91-9337-cd0954cf9c66\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806091 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-audit-policies\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806116 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f01bcd7f-0920-48a2-8b61-5e7eca60f0a8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bglks\" (UID: \"f01bcd7f-0920-48a2-8b61-5e7eca60f0a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bglks"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806137 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xd2b\" (UniqueName: \"kubernetes.io/projected/9ee9be51-cd77-41e9-9db3-ab6a64015288-kube-api-access-4xd2b\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806166 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tnf5\" (UniqueName: \"kubernetes.io/projected/db83732a-f1a2-4daa-8bf9-89fa7eebec3d-kube-api-access-2tnf5\") pod \"openshift-controller-manager-operator-756b6f6bc6-mzqdq\" (UID: \"db83732a-f1a2-4daa-8bf9-89fa7eebec3d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mzqdq"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806307 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qdcl\" (UniqueName: \"kubernetes.io/projected/62f08690-7b52-4f87-a32d-e33b10590cdc-kube-api-access-5qdcl\") pod \"openshift-apiserver-operator-796bbdcf4f-tlnwn\" (UID: \"62f08690-7b52-4f87-a32d-e33b10590cdc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlnwn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806335 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6f17185-a290-4d91-9337-cd0954cf9c66-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-msxng\" (UID: \"b6f17185-a290-4d91-9337-cd0954cf9c66\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806362 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ed7b0bb-41a7-49dc-93a6-292040efb53b-serving-cert\") pod \"service-ca-operator-777779d784-8wnxg\" (UID: \"9ed7b0bb-41a7-49dc-93a6-292040efb53b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8wnxg"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806399 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/90ee9175-d058-4b89-81c0-d7619908a777-registration-dir\") pod \"csi-hostpathplugin-pxgcs\" (UID: \"90ee9175-d058-4b89-81c0-d7619908a777\") " pod="hostpath-provisioner/csi-hostpathplugin-pxgcs"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806426 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62f08690-7b52-4f87-a32d-e33b10590cdc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tlnwn\" (UID: \"62f08690-7b52-4f87-a32d-e33b10590cdc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlnwn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806450 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca27d3de-fa2e-4aa7-aef8-b9373742bddb-config\") pod \"kube-apiserver-operator-766d6c64bb-p6t9k\" (UID: \"ca27d3de-fa2e-4aa7-aef8-b9373742bddb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6t9k"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806469 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bksx\" (UniqueName: \"kubernetes.io/projected/c0b89d3d-56c3-4092-8697-c8bd7ecbad7b-kube-api-access-7bksx\") pod \"ingress-operator-5b745b69d9-bbnnf\" (UID: \"c0b89d3d-56c3-4092-8697-c8bd7ecbad7b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806635 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-etcd-client\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806669 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fe0fe33-c2ce-4622-b6f4-3f587718b006-bound-sa-token\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806692 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-serving-cert\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806716 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc711edd-1026-4f47-ab7f-92bd6e1fb964-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6q7kk\" (UID: \"fc711edd-1026-4f47-ab7f-92bd6e1fb964\") " pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806746 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/90fd93e3-8621-438c-9914-fe4a1be62b8d-srv-cert\") pod \"catalog-operator-68c6474976-xnmzx\" (UID: \"90fd93e3-8621-438c-9914-fe4a1be62b8d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806773 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbsk2\" (UniqueName: \"kubernetes.io/projected/c12352eb-31e7-465d-b743-9d3ee6093b97-kube-api-access-kbsk2\") pod \"authentication-operator-69f744f599-txbzn\" (UID: \"c12352eb-31e7-465d-b743-9d3ee6093b97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806801 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-service-ca\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806836 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c12352eb-31e7-465d-b743-9d3ee6093b97-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-txbzn\" (UID: \"c12352eb-31e7-465d-b743-9d3ee6093b97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806834 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m46t\" (UniqueName: \"kubernetes.io/projected/1dcea1f2-dae7-4542-99bd-22275ec83616-kube-api-access-9m46t\") pod \"dns-default-4khgc\" (UID: \"1dcea1f2-dae7-4542-99bd-22275ec83616\") " pod="openshift-dns/dns-default-4khgc"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806947 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3810945e-7192-4204-a89e-5fad8e22c611-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zppvz\" (UID: \"3810945e-7192-4204-a89e-5fad8e22c611\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zppvz"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806986 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/116691d6-7237-438e-8adf-2678cbcbced0-cert\") pod \"ingress-canary-rhcmn\" (UID: \"116691d6-7237-438e-8adf-2678cbcbced0\") " pod="openshift-ingress-canary/ingress-canary-rhcmn"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.807070 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.807096 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca27d3de-fa2e-4aa7-aef8-b9373742bddb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-p6t9k\" (UID: \"ca27d3de-fa2e-4aa7-aef8-b9373742bddb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6t9k"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.807117 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd44k\" (UniqueName: \"kubernetes.io/projected/2881ba7f-a7a8-454b-a6e8-071573585183-kube-api-access-bd44k\") pod \"machine-config-server-4z6qs\" (UID: \"2881ba7f-a7a8-454b-a6e8-071573585183\") " pod="openshift-machine-config-operator/machine-config-server-4z6qs"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.807143 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/478689fc-5c07-45bc-ab87-8c58e68c348b-client-ca\") pod \"controller-manager-879f6c89f-9f7tt\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.807171 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22zjt\" (UniqueName: \"kubernetes.io/projected/4fe0fe33-c2ce-4622-b6f4-3f587718b006-kube-api-access-22zjt\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.807194 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc9z9\" (UniqueName: \"kubernetes.io/projected/6723ac99-b14b-4d99-90fa-9dd395093f3b-kube-api-access-hc9z9\") pod \"package-server-manager-789f6589d5-q7r9c\" (UID: \"6723ac99-b14b-4d99-90fa-9dd395093f3b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q7r9c"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.807218 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/478689fc-5c07-45bc-ab87-8c58e68c348b-config\") pod \"controller-manager-879f6c89f-9f7tt\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt"
Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.807243 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed7b0bb-41a7-49dc-93a6-292040efb53b-config\") pod \"service-ca-operator-777779d784-8wnxg\" (UID:
\"9ed7b0bb-41a7-49dc-93a6-292040efb53b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8wnxg" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.807270 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/90ee9175-d058-4b89-81c0-d7619908a777-csi-data-dir\") pod \"csi-hostpathplugin-pxgcs\" (UID: \"90ee9175-d058-4b89-81c0-d7619908a777\") " pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.807291 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxrzx\" (UniqueName: \"kubernetes.io/projected/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-kube-api-access-zxrzx\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.807365 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2881ba7f-a7a8-454b-a6e8-071573585183-certs\") pod \"machine-config-server-4z6qs\" (UID: \"2881ba7f-a7a8-454b-a6e8-071573585183\") " pod="openshift-machine-config-operator/machine-config-server-4z6qs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.807362 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-console-config\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.807903 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-audit-policies\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.808021 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.807388 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9sz2\" (UniqueName: \"kubernetes.io/projected/fc53c080-c843-4682-a5b2-dd571b28000c-kube-api-access-l9sz2\") pod \"machine-config-controller-84d6567774-qwcn5\" (UID: \"fc53c080-c843-4682-a5b2-dd571b28000c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwcn5" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.808117 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f01bcd7f-0920-48a2-8b61-5e7eca60f0a8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bglks\" (UID: \"f01bcd7f-0920-48a2-8b61-5e7eca60f0a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bglks" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.808273 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/ca27d3de-fa2e-4aa7-aef8-b9373742bddb-config\") pod \"kube-apiserver-operator-766d6c64bb-p6t9k\" (UID: \"ca27d3de-fa2e-4aa7-aef8-b9373742bddb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6t9k" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.806682 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c12352eb-31e7-465d-b743-9d3ee6093b97-config\") pod \"authentication-operator-69f744f599-txbzn\" (UID: \"c12352eb-31e7-465d-b743-9d3ee6093b97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.808689 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c12352eb-31e7-465d-b743-9d3ee6093b97-service-ca-bundle\") pod \"authentication-operator-69f744f599-txbzn\" (UID: \"c12352eb-31e7-465d-b743-9d3ee6093b97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.808768 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fe0fe33-c2ce-4622-b6f4-3f587718b006-trusted-ca\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.808802 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4fe0fe33-c2ce-4622-b6f4-3f587718b006-registry-certificates\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.808828 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvgch\" (UniqueName: \"kubernetes.io/projected/5418dd0e-243d-4573-94c9-a1948dca9b9f-kube-api-access-jvgch\") pod \"olm-operator-6b444d44fb-fmdxw\" (UID: \"5418dd0e-243d-4573-94c9-a1948dca9b9f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.808926 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-audit-dir\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.808973 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0b89d3d-56c3-4092-8697-c8bd7ecbad7b-trusted-ca\") pod \"ingress-operator-5b745b69d9-bbnnf\" (UID: \"c0b89d3d-56c3-4092-8697-c8bd7ecbad7b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809010 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/90ee9175-d058-4b89-81c0-d7619908a777-socket-dir\") pod \"csi-hostpathplugin-pxgcs\" (UID: \"90ee9175-d058-4b89-81c0-d7619908a777\") " 
pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809034 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww8n7\" (UniqueName: \"kubernetes.io/projected/90ee9175-d058-4b89-81c0-d7619908a777-kube-api-access-ww8n7\") pod \"csi-hostpathplugin-pxgcs\" (UID: \"90ee9175-d058-4b89-81c0-d7619908a777\") " pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809058 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5418dd0e-243d-4573-94c9-a1948dca9b9f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fmdxw\" (UID: \"5418dd0e-243d-4573-94c9-a1948dca9b9f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809083 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5418dd0e-243d-4573-94c9-a1948dca9b9f-srv-cert\") pod \"olm-operator-6b444d44fb-fmdxw\" (UID: \"5418dd0e-243d-4573-94c9-a1948dca9b9f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw" Dec 01 10:01:48 crc kubenswrapper[4958]: E1201 10:01:48.809173 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:49.309153316 +0000 UTC m=+156.817942353 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809236 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/152c0d38-b359-479b-bfd6-cb019847d7fa-etcd-service-ca\") pod \"etcd-operator-b45778765-d2tvd\" (UID: \"152c0d38-b359-479b-bfd6-cb019847d7fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809259 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c12352eb-31e7-465d-b743-9d3ee6093b97-serving-cert\") pod \"authentication-operator-69f744f599-txbzn\" (UID: \"c12352eb-31e7-465d-b743-9d3ee6093b97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809283 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca27d3de-fa2e-4aa7-aef8-b9373742bddb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-p6t9k\" (UID: \"ca27d3de-fa2e-4aa7-aef8-b9373742bddb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6t9k" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809293 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809308 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc53c080-c843-4682-a5b2-dd571b28000c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qwcn5\" (UID: \"fc53c080-c843-4682-a5b2-dd571b28000c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwcn5" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809336 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7z78\" (UniqueName: \"kubernetes.io/projected/478689fc-5c07-45bc-ab87-8c58e68c348b-kube-api-access-g7z78\") pod \"controller-manager-879f6c89f-9f7tt\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809361 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/62649f01-3312-4309-9c30-05090c2655eb-machine-approver-tls\") pod \"machine-approver-56656f9798-lqpvf\" (UID: \"62649f01-3312-4309-9c30-05090c2655eb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809383 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ee9be51-cd77-41e9-9db3-ab6a64015288-console-serving-cert\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809430 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/90fd93e3-8621-438c-9914-fe4a1be62b8d-profile-collector-cert\") pod \"catalog-operator-68c6474976-xnmzx\" (UID: \"90fd93e3-8621-438c-9914-fe4a1be62b8d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809461 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6723ac99-b14b-4d99-90fa-9dd395093f3b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-q7r9c\" (UID: \"6723ac99-b14b-4d99-90fa-9dd395093f3b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q7r9c" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809489 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/152c0d38-b359-479b-bfd6-cb019847d7fa-etcd-client\") pod \"etcd-operator-b45778765-d2tvd\" (UID: \"152c0d38-b359-479b-bfd6-cb019847d7fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809531 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/478689fc-5c07-45bc-ab87-8c58e68c348b-proxy-ca-bundles\") 
pod \"controller-manager-879f6c89f-9f7tt\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809554 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0b89d3d-56c3-4092-8697-c8bd7ecbad7b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bbnnf\" (UID: \"c0b89d3d-56c3-4092-8697-c8bd7ecbad7b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809590 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/62649f01-3312-4309-9c30-05090c2655eb-auth-proxy-config\") pod \"machine-approver-56656f9798-lqpvf\" (UID: \"62649f01-3312-4309-9c30-05090c2655eb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809628 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7fc63ed-b5b8-4854-8d5a-1cab817a3581-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-sv8bf\" (UID: \"c7fc63ed-b5b8-4854-8d5a-1cab817a3581\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sv8bf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.809736 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed7b0bb-41a7-49dc-93a6-292040efb53b-config\") pod \"service-ca-operator-777779d784-8wnxg\" (UID: \"9ed7b0bb-41a7-49dc-93a6-292040efb53b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8wnxg" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.810526 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3810945e-7192-4204-a89e-5fad8e22c611-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zppvz\" (UID: \"3810945e-7192-4204-a89e-5fad8e22c611\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zppvz" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.810967 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc711edd-1026-4f47-ab7f-92bd6e1fb964-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6q7kk\" (UID: \"fc711edd-1026-4f47-ab7f-92bd6e1fb964\") " pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.811998 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-oauth-serving-cert\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.813302 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6f17185-a290-4d91-9337-cd0954cf9c66-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-msxng\" (UID: \"b6f17185-a290-4d91-9337-cd0954cf9c66\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.813443 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca27d3de-fa2e-4aa7-aef8-b9373742bddb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-p6t9k\" (UID: \"ca27d3de-fa2e-4aa7-aef8-b9373742bddb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6t9k" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.813489 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-audit-dir\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.813811 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/152c0d38-b359-479b-bfd6-cb019847d7fa-etcd-service-ca\") pod \"etcd-operator-b45778765-d2tvd\" (UID: \"152c0d38-b359-479b-bfd6-cb019847d7fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.815742 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fe0fe33-c2ce-4622-b6f4-3f587718b006-trusted-ca\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.816483 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/478689fc-5c07-45bc-ab87-8c58e68c348b-config\") pod \"controller-manager-879f6c89f-9f7tt\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.816627 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/478689fc-5c07-45bc-ab87-8c58e68c348b-client-ca\") pod \"controller-manager-879f6c89f-9f7tt\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.816920 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5418dd0e-243d-4573-94c9-a1948dca9b9f-srv-cert\") pod \"olm-operator-6b444d44fb-fmdxw\" (UID: \"5418dd0e-243d-4573-94c9-a1948dca9b9f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.816951 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4fe0fe33-c2ce-4622-b6f4-3f587718b006-registry-certificates\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.820831 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-etcd-client\") pod 
\"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.821961 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62649f01-3312-4309-9c30-05090c2655eb-config\") pod \"machine-approver-56656f9798-lqpvf\" (UID: \"62649f01-3312-4309-9c30-05090c2655eb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.822032 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ee9be51-cd77-41e9-9db3-ab6a64015288-console-oauth-config\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.822077 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff847\" (UniqueName: \"kubernetes.io/projected/954a15b1-bcae-44b9-ad5b-e004ed4d79be-kube-api-access-ff847\") pod \"collect-profiles-29409720-84vv5\" (UID: \"954a15b1-bcae-44b9-ad5b-e004ed4d79be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.822582 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62649f01-3312-4309-9c30-05090c2655eb-config\") pod \"machine-approver-56656f9798-lqpvf\" (UID: \"62649f01-3312-4309-9c30-05090c2655eb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.822609 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/62649f01-3312-4309-9c30-05090c2655eb-auth-proxy-config\") pod \"machine-approver-56656f9798-lqpvf\" (UID: \"62649f01-3312-4309-9c30-05090c2655eb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.823265 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc2sm\" (UniqueName: \"kubernetes.io/projected/62649f01-3312-4309-9c30-05090c2655eb-kube-api-access-vc2sm\") pod \"machine-approver-56656f9798-lqpvf\" (UID: \"62649f01-3312-4309-9c30-05090c2655eb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.823497 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/152c0d38-b359-479b-bfd6-cb019847d7fa-etcd-ca\") pod \"etcd-operator-b45778765-d2tvd\" (UID: \"152c0d38-b359-479b-bfd6-cb019847d7fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.823540 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfnh4\" (UniqueName: \"kubernetes.io/projected/90fd93e3-8621-438c-9914-fe4a1be62b8d-kube-api-access-lfnh4\") pod \"catalog-operator-68c6474976-xnmzx\" (UID: \"90fd93e3-8621-438c-9914-fe4a1be62b8d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.823567 4958 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db83732a-f1a2-4daa-8bf9-89fa7eebec3d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mzqdq\" (UID: \"db83732a-f1a2-4daa-8bf9-89fa7eebec3d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mzqdq" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.823693 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/954a15b1-bcae-44b9-ad5b-e004ed4d79be-secret-volume\") pod \"collect-profiles-29409720-84vv5\" (UID: \"954a15b1-bcae-44b9-ad5b-e004ed4d79be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.823742 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2881ba7f-a7a8-454b-a6e8-071573585183-node-bootstrap-token\") pod \"machine-config-server-4z6qs\" (UID: \"2881ba7f-a7a8-454b-a6e8-071573585183\") " pod="openshift-machine-config-operator/machine-config-server-4z6qs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.823814 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc53c080-c843-4682-a5b2-dd571b28000c-proxy-tls\") pod \"machine-config-controller-84d6567774-qwcn5\" (UID: \"fc53c080-c843-4682-a5b2-dd571b28000c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwcn5" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.823923 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1dcea1f2-dae7-4542-99bd-22275ec83616-metrics-tls\") pod \"dns-default-4khgc\" (UID: \"1dcea1f2-dae7-4542-99bd-22275ec83616\") " pod="openshift-dns/dns-default-4khgc" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.823960 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fe0fe33-c2ce-4622-b6f4-3f587718b006-registry-tls\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.824092 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvqb2\" (UniqueName: \"kubernetes.io/projected/f01bcd7f-0920-48a2-8b61-5e7eca60f0a8-kube-api-access-tvqb2\") pod \"openshift-config-operator-7777fb866f-bglks\" (UID: \"f01bcd7f-0920-48a2-8b61-5e7eca60f0a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bglks" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.824138 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/142b2c22-6df3-4540-8ad7-748e000f3ec9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-64ctn\" (UID: \"142b2c22-6df3-4540-8ad7-748e000f3ec9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-64ctn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.824183 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/4fe0fe33-c2ce-4622-b6f4-3f587718b006-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.825222 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/478689fc-5c07-45bc-ab87-8c58e68c348b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9f7tt\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.826820 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc711edd-1026-4f47-ab7f-92bd6e1fb964-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6q7kk\" (UID: \"fc711edd-1026-4f47-ab7f-92bd6e1fb964\") " pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.827245 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142b2c22-6df3-4540-8ad7-748e000f3ec9-config\") pod \"machine-api-operator-5694c8668f-64ctn\" (UID: \"142b2c22-6df3-4540-8ad7-748e000f3ec9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-64ctn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.827796 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5418dd0e-243d-4573-94c9-a1948dca9b9f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fmdxw\" (UID: \"5418dd0e-243d-4573-94c9-a1948dca9b9f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.828901 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fe0fe33-c2ce-4622-b6f4-3f587718b006-registry-tls\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.829121 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/152c0d38-b359-479b-bfd6-cb019847d7fa-etcd-client\") pod \"etcd-operator-b45778765-d2tvd\" (UID: \"152c0d38-b359-479b-bfd6-cb019847d7fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.829332 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62f08690-7b52-4f87-a32d-e33b10590cdc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-tlnwn\" (UID: \"62f08690-7b52-4f87-a32d-e33b10590cdc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlnwn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.829645 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db83732a-f1a2-4daa-8bf9-89fa7eebec3d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mzqdq\" (UID: \"db83732a-f1a2-4daa-8bf9-89fa7eebec3d\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mzqdq" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.829788 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7fc63ed-b5b8-4854-8d5a-1cab817a3581-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-sv8bf\" (UID: \"c7fc63ed-b5b8-4854-8d5a-1cab817a3581\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sv8bf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.830763 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/152c0d38-b359-479b-bfd6-cb019847d7fa-etcd-ca\") pod \"etcd-operator-b45778765-d2tvd\" (UID: \"152c0d38-b359-479b-bfd6-cb019847d7fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.836365 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk2nl\" (UniqueName: \"kubernetes.io/projected/9ed7b0bb-41a7-49dc-93a6-292040efb53b-kube-api-access-nk2nl\") pod \"service-ca-operator-777779d784-8wnxg\" (UID: \"9ed7b0bb-41a7-49dc-93a6-292040efb53b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8wnxg" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.837037 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r752j\" (UniqueName: \"kubernetes.io/projected/116691d6-7237-438e-8adf-2678cbcbced0-kube-api-access-r752j\") pod \"ingress-canary-rhcmn\" (UID: \"116691d6-7237-438e-8adf-2678cbcbced0\") " pod="openshift-ingress-canary/ingress-canary-rhcmn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.837106 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f01bcd7f-0920-48a2-8b61-5e7eca60f0a8-serving-cert\") pod \"openshift-config-operator-7777fb866f-bglks\" (UID: \"f01bcd7f-0920-48a2-8b61-5e7eca60f0a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bglks" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.837142 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/90ee9175-d058-4b89-81c0-d7619908a777-mountpoint-dir\") pod \"csi-hostpathplugin-pxgcs\" (UID: \"90ee9175-d058-4b89-81c0-d7619908a777\") " pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.837167 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-service-ca\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.837222 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khxm4\" (UniqueName: \"kubernetes.io/projected/1fd46f3c-80b3-4881-ae13-263db886a0e6-kube-api-access-khxm4\") pod \"cluster-samples-operator-665b6dd947-xkxmd\" (UID: \"1fd46f3c-80b3-4881-ae13-263db886a0e6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xkxmd" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.837256 
4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-encryption-config\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.837327 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152c0d38-b359-479b-bfd6-cb019847d7fa-config\") pod \"etcd-operator-b45778765-d2tvd\" (UID: \"152c0d38-b359-479b-bfd6-cb019847d7fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.837359 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/478689fc-5c07-45bc-ab87-8c58e68c348b-serving-cert\") pod \"controller-manager-879f6c89f-9f7tt\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.837383 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6f17185-a290-4d91-9337-cd0954cf9c66-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-msxng\" (UID: \"b6f17185-a290-4d91-9337-cd0954cf9c66\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.837464 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1fd46f3c-80b3-4881-ae13-263db886a0e6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xkxmd\" (UID: \"1fd46f3c-80b3-4881-ae13-263db886a0e6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xkxmd" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.837493 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9fnw\" (UniqueName: \"kubernetes.io/projected/fc711edd-1026-4f47-ab7f-92bd6e1fb964-kube-api-access-v9fnw\") pod \"marketplace-operator-79b997595-6q7kk\" (UID: \"fc711edd-1026-4f47-ab7f-92bd6e1fb964\") " pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.837665 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3810945e-7192-4204-a89e-5fad8e22c611-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zppvz\" (UID: \"3810945e-7192-4204-a89e-5fad8e22c611\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zppvz" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.837696 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/90ee9175-d058-4b89-81c0-d7619908a777-plugins-dir\") pod \"csi-hostpathplugin-pxgcs\" (UID: \"90ee9175-d058-4b89-81c0-d7619908a777\") " pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.838960 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/90fd93e3-8621-438c-9914-fe4a1be62b8d-srv-cert\") pod 
\"catalog-operator-68c6474976-xnmzx\" (UID: \"90fd93e3-8621-438c-9914-fe4a1be62b8d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.839226 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc53c080-c843-4682-a5b2-dd571b28000c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qwcn5\" (UID: \"fc53c080-c843-4682-a5b2-dd571b28000c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwcn5" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.841803 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/62649f01-3312-4309-9c30-05090c2655eb-machine-approver-tls\") pod \"machine-approver-56656f9798-lqpvf\" (UID: \"62649f01-3312-4309-9c30-05090c2655eb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.841882 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/142b2c22-6df3-4540-8ad7-748e000f3ec9-images\") pod \"machine-api-operator-5694c8668f-64ctn\" (UID: \"142b2c22-6df3-4540-8ad7-748e000f3ec9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-64ctn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.841976 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dcea1f2-dae7-4542-99bd-22275ec83616-config-volume\") pod \"dns-default-4khgc\" (UID: \"1dcea1f2-dae7-4542-99bd-22275ec83616\") " pod="openshift-dns/dns-default-4khgc" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.842032 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db83732a-f1a2-4daa-8bf9-89fa7eebec3d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mzqdq\" (UID: \"db83732a-f1a2-4daa-8bf9-89fa7eebec3d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mzqdq" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.842073 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/954a15b1-bcae-44b9-ad5b-e004ed4d79be-config-volume\") pod \"collect-profiles-29409720-84vv5\" (UID: \"954a15b1-bcae-44b9-ad5b-e004ed4d79be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.842108 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f08690-7b52-4f87-a32d-e33b10590cdc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tlnwn\" (UID: \"62f08690-7b52-4f87-a32d-e33b10590cdc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlnwn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.842137 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-trusted-ca-bundle\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb" Dec 01 10:01:48 crc 
kubenswrapper[4958]: I1201 10:01:48.842173 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djqf8\" (UniqueName: \"kubernetes.io/projected/152c0d38-b359-479b-bfd6-cb019847d7fa-kube-api-access-djqf8\") pod \"etcd-operator-b45778765-d2tvd\" (UID: \"152c0d38-b359-479b-bfd6-cb019847d7fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.842465 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/142b2c22-6df3-4540-8ad7-748e000f3ec9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-64ctn\" (UID: \"142b2c22-6df3-4540-8ad7-748e000f3ec9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-64ctn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.842616 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-serving-cert\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.842762 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f01bcd7f-0920-48a2-8b61-5e7eca60f0a8-serving-cert\") pod \"openshift-config-operator-7777fb866f-bglks\" (UID: \"f01bcd7f-0920-48a2-8b61-5e7eca60f0a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bglks" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.842962 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/142b2c22-6df3-4540-8ad7-748e000f3ec9-images\") pod \"machine-api-operator-5694c8668f-64ctn\" (UID: \"142b2c22-6df3-4540-8ad7-748e000f3ec9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-64ctn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.843090 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ee9be51-cd77-41e9-9db3-ab6a64015288-console-serving-cert\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.843202 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/152c0d38-b359-479b-bfd6-cb019847d7fa-serving-cert\") pod \"etcd-operator-b45778765-d2tvd\" (UID: \"152c0d38-b359-479b-bfd6-cb019847d7fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.843241 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ed7b0bb-41a7-49dc-93a6-292040efb53b-serving-cert\") pod \"service-ca-operator-777779d784-8wnxg\" (UID: \"9ed7b0bb-41a7-49dc-93a6-292040efb53b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8wnxg" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.843318 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ngf7\" (UniqueName: \"kubernetes.io/projected/b6f17185-a290-4d91-9337-cd0954cf9c66-kube-api-access-6ngf7\") pod \"cluster-image-registry-operator-dc59b4c8b-msxng\" (UID: 
\"b6f17185-a290-4d91-9337-cd0954cf9c66\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.843364 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45mzm\" (UniqueName: \"kubernetes.io/projected/142b2c22-6df3-4540-8ad7-748e000f3ec9-kube-api-access-45mzm\") pod \"machine-api-operator-5694c8668f-64ctn\" (UID: \"142b2c22-6df3-4540-8ad7-748e000f3ec9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-64ctn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.843463 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0b89d3d-56c3-4092-8697-c8bd7ecbad7b-metrics-tls\") pod \"ingress-operator-5b745b69d9-bbnnf\" (UID: \"c0b89d3d-56c3-4092-8697-c8bd7ecbad7b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.844622 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c12352eb-31e7-465d-b743-9d3ee6093b97-serving-cert\") pod \"authentication-operator-69f744f599-txbzn\" (UID: \"c12352eb-31e7-465d-b743-9d3ee6093b97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.844641 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ee9be51-cd77-41e9-9db3-ab6a64015288-console-oauth-config\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.844668 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6f17185-a290-4d91-9337-cd0954cf9c66-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-msxng\" (UID: \"b6f17185-a290-4d91-9337-cd0954cf9c66\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.844735 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4fe0fe33-c2ce-4622-b6f4-3f587718b006-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.844744 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/954a15b1-bcae-44b9-ad5b-e004ed4d79be-config-volume\") pod \"collect-profiles-29409720-84vv5\" (UID: \"954a15b1-bcae-44b9-ad5b-e004ed4d79be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.844858 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f08690-7b52-4f87-a32d-e33b10590cdc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-tlnwn\" (UID: \"62f08690-7b52-4f87-a32d-e33b10590cdc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlnwn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.845451 4958 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152c0d38-b359-479b-bfd6-cb019847d7fa-config\") pod \"etcd-operator-b45778765-d2tvd\" (UID: \"152c0d38-b359-479b-bfd6-cb019847d7fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.845686 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4fe0fe33-c2ce-4622-b6f4-3f587718b006-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.845980 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1fd46f3c-80b3-4881-ae13-263db886a0e6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xkxmd\" (UID: \"1fd46f3c-80b3-4881-ae13-263db886a0e6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xkxmd" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.845980 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db83732a-f1a2-4daa-8bf9-89fa7eebec3d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mzqdq\" (UID: \"db83732a-f1a2-4daa-8bf9-89fa7eebec3d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mzqdq" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.846712 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/954a15b1-bcae-44b9-ad5b-e004ed4d79be-secret-volume\") pod \"collect-profiles-29409720-84vv5\" (UID: \"954a15b1-bcae-44b9-ad5b-e004ed4d79be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.848290 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6723ac99-b14b-4d99-90fa-9dd395093f3b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-q7r9c\" (UID: \"6723ac99-b14b-4d99-90fa-9dd395093f3b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q7r9c" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.850433 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3810945e-7192-4204-a89e-5fad8e22c611-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zppvz\" (UID: \"3810945e-7192-4204-a89e-5fad8e22c611\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zppvz" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.853273 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4fe0fe33-c2ce-4622-b6f4-3f587718b006-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.854220 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/478689fc-5c07-45bc-ab87-8c58e68c348b-serving-cert\") pod \"controller-manager-879f6c89f-9f7tt\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.855008 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc53c080-c843-4682-a5b2-dd571b28000c-proxy-tls\") pod \"machine-config-controller-84d6567774-qwcn5\" (UID: \"fc53c080-c843-4682-a5b2-dd571b28000c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwcn5" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.856137 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-trusted-ca-bundle\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.857670 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-encryption-config\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.857761 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/90fd93e3-8621-438c-9914-fe4a1be62b8d-profile-collector-cert\") pod \"catalog-operator-68c6474976-xnmzx\" (UID: \"90fd93e3-8621-438c-9914-fe4a1be62b8d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.858402 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-246n5\" (UniqueName: \"kubernetes.io/projected/81ffe673-ed64-4dd2-81c3-65c922af39bc-kube-api-access-246n5\") pod \"downloads-7954f5f757-g9fjl\" (UID: \"81ffe673-ed64-4dd2-81c3-65c922af39bc\") " pod="openshift-console/downloads-7954f5f757-g9fjl" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.863641 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-g9fjl" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.873884 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.880574 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qdcl\" (UniqueName: \"kubernetes.io/projected/62f08690-7b52-4f87-a32d-e33b10590cdc-kube-api-access-5qdcl\") pod \"openshift-apiserver-operator-796bbdcf4f-tlnwn\" (UID: \"62f08690-7b52-4f87-a32d-e33b10590cdc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlnwn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.892622 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tnf5\" (UniqueName: \"kubernetes.io/projected/db83732a-f1a2-4daa-8bf9-89fa7eebec3d-kube-api-access-2tnf5\") pod \"openshift-controller-manager-operator-756b6f6bc6-mzqdq\" (UID: \"db83732a-f1a2-4daa-8bf9-89fa7eebec3d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mzqdq" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.913677 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9sz2\" (UniqueName: \"kubernetes.io/projected/fc53c080-c843-4682-a5b2-dd571b28000c-kube-api-access-l9sz2\") pod \"machine-config-controller-84d6567774-qwcn5\" (UID: \"fc53c080-c843-4682-a5b2-dd571b28000c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwcn5" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.933474 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3810945e-7192-4204-a89e-5fad8e22c611-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zppvz\" (UID: \"3810945e-7192-4204-a89e-5fad8e22c611\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zppvz" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.947334 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:48 crc kubenswrapper[4958]: E1201 10:01:48.947506 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:49.447482485 +0000 UTC m=+156.956271522 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.947562 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1dcea1f2-dae7-4542-99bd-22275ec83616-metrics-tls\") pod \"dns-default-4khgc\" (UID: \"1dcea1f2-dae7-4542-99bd-22275ec83616\") " pod="openshift-dns/dns-default-4khgc" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.947589 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2881ba7f-a7a8-454b-a6e8-071573585183-node-bootstrap-token\") pod \"machine-config-server-4z6qs\" (UID: \"2881ba7f-a7a8-454b-a6e8-071573585183\") " pod="openshift-machine-config-operator/machine-config-server-4z6qs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.947628 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r752j\" (UniqueName: \"kubernetes.io/projected/116691d6-7237-438e-8adf-2678cbcbced0-kube-api-access-r752j\") pod \"ingress-canary-rhcmn\" (UID: \"116691d6-7237-438e-8adf-2678cbcbced0\") " pod="openshift-ingress-canary/ingress-canary-rhcmn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.947645 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/90ee9175-d058-4b89-81c0-d7619908a777-mountpoint-dir\") pod \"csi-hostpathplugin-pxgcs\" (UID: \"90ee9175-d058-4b89-81c0-d7619908a777\") " pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.947676 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/90ee9175-d058-4b89-81c0-d7619908a777-plugins-dir\") pod \"csi-hostpathplugin-pxgcs\" (UID: \"90ee9175-d058-4b89-81c0-d7619908a777\") " pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.947693 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dcea1f2-dae7-4542-99bd-22275ec83616-config-volume\") pod \"dns-default-4khgc\" (UID: \"1dcea1f2-dae7-4542-99bd-22275ec83616\") " pod="openshift-dns/dns-default-4khgc" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.947736 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0b89d3d-56c3-4092-8697-c8bd7ecbad7b-metrics-tls\") pod \"ingress-operator-5b745b69d9-bbnnf\" (UID: \"c0b89d3d-56c3-4092-8697-c8bd7ecbad7b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.947779 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: 
\"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.947818 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/90ee9175-d058-4b89-81c0-d7619908a777-registration-dir\") pod \"csi-hostpathplugin-pxgcs\" (UID: \"90ee9175-d058-4b89-81c0-d7619908a777\") " pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.947857 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bksx\" (UniqueName: \"kubernetes.io/projected/c0b89d3d-56c3-4092-8697-c8bd7ecbad7b-kube-api-access-7bksx\") pod \"ingress-operator-5b745b69d9-bbnnf\" (UID: \"c0b89d3d-56c3-4092-8697-c8bd7ecbad7b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.947890 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m46t\" (UniqueName: \"kubernetes.io/projected/1dcea1f2-dae7-4542-99bd-22275ec83616-kube-api-access-9m46t\") pod \"dns-default-4khgc\" (UID: \"1dcea1f2-dae7-4542-99bd-22275ec83616\") " pod="openshift-dns/dns-default-4khgc" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.947911 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/116691d6-7237-438e-8adf-2678cbcbced0-cert\") pod \"ingress-canary-rhcmn\" (UID: \"116691d6-7237-438e-8adf-2678cbcbced0\") " pod="openshift-ingress-canary/ingress-canary-rhcmn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.947929 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd44k\" (UniqueName: \"kubernetes.io/projected/2881ba7f-a7a8-454b-a6e8-071573585183-kube-api-access-bd44k\") pod \"machine-config-server-4z6qs\" (UID: \"2881ba7f-a7a8-454b-a6e8-071573585183\") " pod="openshift-machine-config-operator/machine-config-server-4z6qs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.948102 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/90ee9175-d058-4b89-81c0-d7619908a777-csi-data-dir\") pod \"csi-hostpathplugin-pxgcs\" (UID: \"90ee9175-d058-4b89-81c0-d7619908a777\") " pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.948150 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2881ba7f-a7a8-454b-a6e8-071573585183-certs\") pod \"machine-config-server-4z6qs\" (UID: \"2881ba7f-a7a8-454b-a6e8-071573585183\") " pod="openshift-machine-config-operator/machine-config-server-4z6qs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.948193 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0b89d3d-56c3-4092-8697-c8bd7ecbad7b-trusted-ca\") pod \"ingress-operator-5b745b69d9-bbnnf\" (UID: \"c0b89d3d-56c3-4092-8697-c8bd7ecbad7b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.948222 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/90ee9175-d058-4b89-81c0-d7619908a777-socket-dir\") pod 
\"csi-hostpathplugin-pxgcs\" (UID: \"90ee9175-d058-4b89-81c0-d7619908a777\") " pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.948245 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww8n7\" (UniqueName: \"kubernetes.io/projected/90ee9175-d058-4b89-81c0-d7619908a777-kube-api-access-ww8n7\") pod \"csi-hostpathplugin-pxgcs\" (UID: \"90ee9175-d058-4b89-81c0-d7619908a777\") " pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.948307 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0b89d3d-56c3-4092-8697-c8bd7ecbad7b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bbnnf\" (UID: \"c0b89d3d-56c3-4092-8697-c8bd7ecbad7b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.950680 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/90ee9175-d058-4b89-81c0-d7619908a777-csi-data-dir\") pod \"csi-hostpathplugin-pxgcs\" (UID: \"90ee9175-d058-4b89-81c0-d7619908a777\") " pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.950722 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/90ee9175-d058-4b89-81c0-d7619908a777-mountpoint-dir\") pod \"csi-hostpathplugin-pxgcs\" (UID: \"90ee9175-d058-4b89-81c0-d7619908a777\") " pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.951031 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/90ee9175-d058-4b89-81c0-d7619908a777-plugins-dir\") pod \"csi-hostpathplugin-pxgcs\" (UID: \"90ee9175-d058-4b89-81c0-d7619908a777\") " pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.951380 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dcea1f2-dae7-4542-99bd-22275ec83616-config-volume\") pod \"dns-default-4khgc\" (UID: \"1dcea1f2-dae7-4542-99bd-22275ec83616\") " pod="openshift-dns/dns-default-4khgc" Dec 01 10:01:48 crc kubenswrapper[4958]: E1201 10:01:48.951456 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:49.451442429 +0000 UTC m=+156.960231576 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.952011 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/90ee9175-d058-4b89-81c0-d7619908a777-socket-dir\") pod \"csi-hostpathplugin-pxgcs\" (UID: \"90ee9175-d058-4b89-81c0-d7619908a777\") " pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.952141 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/90ee9175-d058-4b89-81c0-d7619908a777-registration-dir\") pod \"csi-hostpathplugin-pxgcs\" (UID: \"90ee9175-d058-4b89-81c0-d7619908a777\") " pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.952483 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1dcea1f2-dae7-4542-99bd-22275ec83616-metrics-tls\") pod \"dns-default-4khgc\" (UID: \"1dcea1f2-dae7-4542-99bd-22275ec83616\") " pod="openshift-dns/dns-default-4khgc" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.952565 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0b89d3d-56c3-4092-8697-c8bd7ecbad7b-trusted-ca\") pod \"ingress-operator-5b745b69d9-bbnnf\" (UID: \"c0b89d3d-56c3-4092-8697-c8bd7ecbad7b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.952681 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mzqdq" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.953410 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbsk2\" (UniqueName: \"kubernetes.io/projected/c12352eb-31e7-465d-b743-9d3ee6093b97-kube-api-access-kbsk2\") pod \"authentication-operator-69f744f599-txbzn\" (UID: \"c12352eb-31e7-465d-b743-9d3ee6093b97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.954516 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2881ba7f-a7a8-454b-a6e8-071573585183-node-bootstrap-token\") pod \"machine-config-server-4z6qs\" (UID: \"2881ba7f-a7a8-454b-a6e8-071573585183\") " pod="openshift-machine-config-operator/machine-config-server-4z6qs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.956393 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0b89d3d-56c3-4092-8697-c8bd7ecbad7b-metrics-tls\") pod \"ingress-operator-5b745b69d9-bbnnf\" (UID: \"c0b89d3d-56c3-4092-8697-c8bd7ecbad7b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.956459 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/116691d6-7237-438e-8adf-2678cbcbced0-cert\") pod \"ingress-canary-rhcmn\" (UID: \"116691d6-7237-438e-8adf-2678cbcbced0\") " pod="openshift-ingress-canary/ingress-canary-rhcmn" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.958715 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwcn5" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.968206 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2881ba7f-a7a8-454b-a6e8-071573585183-certs\") pod \"machine-config-server-4z6qs\" (UID: \"2881ba7f-a7a8-454b-a6e8-071573585183\") " pod="openshift-machine-config-operator/machine-config-server-4z6qs" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.974187 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd"] Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.983646 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7mv9\" (UniqueName: \"kubernetes.io/projected/c7fc63ed-b5b8-4854-8d5a-1cab817a3581-kube-api-access-z7mv9\") pod \"control-plane-machine-set-operator-78cbb6b69f-sv8bf\" (UID: \"c7fc63ed-b5b8-4854-8d5a-1cab817a3581\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sv8bf" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.991656 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-spstc"] Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.996702 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fe0fe33-c2ce-4622-b6f4-3f587718b006-bound-sa-token\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:48 crc kubenswrapper[4958]: I1201 10:01:48.997472 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sv8bf" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.001493 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zppvz" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.031260 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22zjt\" (UniqueName: \"kubernetes.io/projected/4fe0fe33-c2ce-4622-b6f4-3f587718b006-kube-api-access-22zjt\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.037067 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hpjfh" event={"ID":"a7b405ab-f0aa-4961-a0e4-c104e2040917","Type":"ContainerStarted","Data":"3733fbb490a79d0bfe2808084021d9d0c2a30f42f956c25ea46d18fd9658a2d9"} Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.037131 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hpjfh" event={"ID":"a7b405ab-f0aa-4961-a0e4-c104e2040917","Type":"ContainerStarted","Data":"c025101da831900d120a145eb32c5017c949386b0260a7967e210625841ca15f"} Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.042701 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz"] Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.049774 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:49 crc kubenswrapper[4958]: E1201 10:01:49.050533 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:49.550490595 +0000 UTC m=+157.059279632 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.055565 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlnwn" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.068004 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-26pm4"] Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.070586 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nrv4l"] Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.070719 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:49 crc kubenswrapper[4958]: E1201 10:01:49.071380 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:49.571362525 +0000 UTC m=+157.080151552 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.077214 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jfg5l"] Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.084132 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-dzllj"] Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.085679 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxrzx\" (UniqueName: \"kubernetes.io/projected/a28acca5-cb70-46d6-a5a4-49f9dbc79c91-kube-api-access-zxrzx\") pod \"apiserver-7bbb656c7d-lrtgp\" (UID: \"a28acca5-cb70-46d6-a5a4-49f9dbc79c91\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.086979 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc9z9\" (UniqueName: \"kubernetes.io/projected/6723ac99-b14b-4d99-90fa-9dd395093f3b-kube-api-access-hc9z9\") pod \"package-server-manager-789f6589d5-q7r9c\" (UID: \"6723ac99-b14b-4d99-90fa-9dd395093f3b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q7r9c" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.089794 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xd2b\" (UniqueName: \"kubernetes.io/projected/9ee9be51-cd77-41e9-9db3-ab6a64015288-kube-api-access-4xd2b\") pod \"console-f9d7485db-rxrcb\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " pod="openshift-console/console-f9d7485db-rxrcb" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.091462 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9g4m7" 
event={"ID":"dc698dd9-e0de-4b99-ba98-03a8b1ec3c42","Type":"ContainerStarted","Data":"1d8ee8f7642cc4c2cf5b385e1d46da7ab5968027bb4aad2ca015a708f3dde4a7"} Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.091511 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9g4m7" event={"ID":"dc698dd9-e0de-4b99-ba98-03a8b1ec3c42","Type":"ContainerStarted","Data":"e93562748df96706c0af0cbc8bc633760abdef4ee33ce8882e9315b0eeaba182"} Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.092142 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6f17185-a290-4d91-9337-cd0954cf9c66-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-msxng\" (UID: \"b6f17185-a290-4d91-9337-cd0954cf9c66\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.092607 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-9g4m7" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.096079 4958 patch_prober.go:28] interesting pod/console-operator-58897d9998-9g4m7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.096163 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-9g4m7" podUID="dc698dd9-e0de-4b99-ba98-03a8b1ec3c42" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.109311 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" event={"ID":"416bbcbb-fc9f-489b-a1ac-eea205ce9152","Type":"ContainerStarted","Data":"d9efc234b49de98d5c3704b4fac9d42f47cbc666b0a3d8416e5d0bf392dc10ca"} Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.110572 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn" Dec 01 10:01:49 crc kubenswrapper[4958]: W1201 10:01:49.112202 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc42a0759_cde0_4ecd_b99b_d6dcb74be0c5.slice/crio-62c0dbb9c79008b679605f87827b255f0cc670f4e9228110bf6d48f8be42b433 WatchSource:0}: Error finding container 62c0dbb9c79008b679605f87827b255f0cc670f4e9228110bf6d48f8be42b433: Status 404 returned error can't find the container with id 62c0dbb9c79008b679605f87827b255f0cc670f4e9228110bf6d48f8be42b433 Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.112930 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxxsg" event={"ID":"4e72ff91-36b3-4098-b423-ad103f61541e","Type":"ContainerStarted","Data":"516ac27ebec4a5556db3ccc2c3d4fe94f18d06bccce64fa135c7f0cc6143613b"} Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.119401 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvgch\" (UniqueName: \"kubernetes.io/projected/5418dd0e-243d-4573-94c9-a1948dca9b9f-kube-api-access-jvgch\") pod \"olm-operator-6b444d44fb-fmdxw\" (UID: \"5418dd0e-243d-4573-94c9-a1948dca9b9f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.128385 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-swd9j" event={"ID":"5a8efcb2-480a-4577-af47-304c07876b28","Type":"ContainerStarted","Data":"20dcd43b0221f0e5e22decef0056ec6cdb002803d520b910d1dfa30465262258"} Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.128452 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-swd9j" event={"ID":"5a8efcb2-480a-4577-af47-304c07876b28","Type":"ContainerStarted","Data":"a22f77f9da4d64cc36e9c71ee44b848e3bc2f68c25d798d3b37f30130a8282ea"} Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.136881 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.137690 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd" event={"ID":"eaa4fac0-c895-4779-9eb9-11724f97ccd8","Type":"ContainerStarted","Data":"15c395bfc0ef0a36bcf536ce786dae811b774c5509f48ee00e21b7899b47e616"} Dec 01 10:01:49 crc kubenswrapper[4958]: W1201 10:01:49.149143 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc317e220_5d51_4a28_91c9_93ef65b36e92.slice/crio-4660f85ec7b917f8d815009b23b7f7c9a2c6c41bbf7c89125d8f07110a52796b WatchSource:0}: Error finding container 4660f85ec7b917f8d815009b23b7f7c9a2c6c41bbf7c89125d8f07110a52796b: Status 404 returned error can't find the container with id 4660f85ec7b917f8d815009b23b7f7c9a2c6c41bbf7c89125d8f07110a52796b Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.152909 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff847\" (UniqueName: \"kubernetes.io/projected/954a15b1-bcae-44b9-ad5b-e004ed4d79be-kube-api-access-ff847\") pod \"collect-profiles-29409720-84vv5\" (UID: \"954a15b1-bcae-44b9-ad5b-e004ed4d79be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.158674 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc2sm\" (UniqueName: \"kubernetes.io/projected/62649f01-3312-4309-9c30-05090c2655eb-kube-api-access-vc2sm\") pod \"machine-approver-56656f9798-lqpvf\" (UID: \"62649f01-3312-4309-9c30-05090c2655eb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.160173 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-g9fjl"] Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.175732 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:49 crc kubenswrapper[4958]: E1201 10:01:49.177190 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:49.677164881 +0000 UTC m=+157.185953918 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.180774 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca27d3de-fa2e-4aa7-aef8-b9373742bddb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-p6t9k\" (UID: \"ca27d3de-fa2e-4aa7-aef8-b9373742bddb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6t9k" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.184430 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.196350 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6t9k" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.206047 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfnh4\" (UniqueName: \"kubernetes.io/projected/90fd93e3-8621-438c-9914-fe4a1be62b8d-kube-api-access-lfnh4\") pod \"catalog-operator-68c6474976-xnmzx\" (UID: \"90fd93e3-8621-438c-9914-fe4a1be62b8d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.208724 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.239311 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvqb2\" (UniqueName: \"kubernetes.io/projected/f01bcd7f-0920-48a2-8b61-5e7eca60f0a8-kube-api-access-tvqb2\") pod \"openshift-config-operator-7777fb866f-bglks\" (UID: \"f01bcd7f-0920-48a2-8b61-5e7eca60f0a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bglks" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.259574 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7z78\" (UniqueName: \"kubernetes.io/projected/478689fc-5c07-45bc-ab87-8c58e68c348b-kube-api-access-g7z78\") pod \"controller-manager-879f6c89f-9f7tt\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.264979 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rxrcb" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.277898 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:49 crc kubenswrapper[4958]: E1201 10:01:49.279169 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:49.779151959 +0000 UTC m=+157.287941176 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.289519 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bglks" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.290924 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk2nl\" (UniqueName: \"kubernetes.io/projected/9ed7b0bb-41a7-49dc-93a6-292040efb53b-kube-api-access-nk2nl\") pod \"service-ca-operator-777779d784-8wnxg\" (UID: \"9ed7b0bb-41a7-49dc-93a6-292040efb53b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8wnxg" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.299372 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mzqdq"] Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.321042 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8wnxg" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.321699 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khxm4\" (UniqueName: \"kubernetes.io/projected/1fd46f3c-80b3-4881-ae13-263db886a0e6-kube-api-access-khxm4\") pod \"cluster-samples-operator-665b6dd947-xkxmd\" (UID: \"1fd46f3c-80b3-4881-ae13-263db886a0e6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xkxmd" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.322438 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9fnw\" (UniqueName: \"kubernetes.io/projected/fc711edd-1026-4f47-ab7f-92bd6e1fb964-kube-api-access-v9fnw\") pod \"marketplace-operator-79b997595-6q7kk\" (UID: \"fc711edd-1026-4f47-ab7f-92bd6e1fb964\") " pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.333444 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.337890 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.348995 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:01:49 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Dec 01 10:01:49 crc kubenswrapper[4958]: [+]process-running ok Dec 01 10:01:49 crc kubenswrapper[4958]: healthz check failed Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.349419 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.362921 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ngf7\" (UniqueName: \"kubernetes.io/projected/b6f17185-a290-4d91-9337-cd0954cf9c66-kube-api-access-6ngf7\") pod \"cluster-image-registry-operator-dc59b4c8b-msxng\" (UID: \"b6f17185-a290-4d91-9337-cd0954cf9c66\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.363130 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djqf8\" (UniqueName: \"kubernetes.io/projected/152c0d38-b359-479b-bfd6-cb019847d7fa-kube-api-access-djqf8\") pod \"etcd-operator-b45778765-d2tvd\" (UID: \"152c0d38-b359-479b-bfd6-cb019847d7fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.366444 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qwcn5"] Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.369712 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q7r9c" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.376388 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.378813 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:49 crc kubenswrapper[4958]: E1201 10:01:49.379474 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:49.879440784 +0000 UTC m=+157.388229821 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.383279 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45mzm\" (UniqueName: \"kubernetes.io/projected/142b2c22-6df3-4540-8ad7-748e000f3ec9-kube-api-access-45mzm\") pod \"machine-api-operator-5694c8668f-64ctn\" (UID: \"142b2c22-6df3-4540-8ad7-748e000f3ec9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-64ctn" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.388473 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.400560 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.412281 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0b89d3d-56c3-4092-8697-c8bd7ecbad7b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bbnnf\" (UID: \"c0b89d3d-56c3-4092-8697-c8bd7ecbad7b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.419936 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf"] Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.422760 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xkxmd" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.427935 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-64ctn" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.452869 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bksx\" (UniqueName: \"kubernetes.io/projected/c0b89d3d-56c3-4092-8697-c8bd7ecbad7b-kube-api-access-7bksx\") pod \"ingress-operator-5b745b69d9-bbnnf\" (UID: \"c0b89d3d-56c3-4092-8697-c8bd7ecbad7b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.458689 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.462818 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zppvz"] Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.467833 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m46t\" (UniqueName: \"kubernetes.io/projected/1dcea1f2-dae7-4542-99bd-22275ec83616-kube-api-access-9m46t\") pod \"dns-default-4khgc\" (UID: \"1dcea1f2-dae7-4542-99bd-22275ec83616\") " pod="openshift-dns/dns-default-4khgc" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.468616 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r752j\" (UniqueName: \"kubernetes.io/projected/116691d6-7237-438e-8adf-2678cbcbced0-kube-api-access-r752j\") pod \"ingress-canary-rhcmn\" (UID: \"116691d6-7237-438e-8adf-2678cbcbced0\") " pod="openshift-ingress-canary/ingress-canary-rhcmn" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.480784 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd44k\" (UniqueName: \"kubernetes.io/projected/2881ba7f-a7a8-454b-a6e8-071573585183-kube-api-access-bd44k\") pod \"machine-config-server-4z6qs\" (UID: \"2881ba7f-a7a8-454b-a6e8-071573585183\") " pod="openshift-machine-config-operator/machine-config-server-4z6qs" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.480995 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:49 crc kubenswrapper[4958]: E1201 10:01:49.481351 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:49.981337058 +0000 UTC m=+157.490126095 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.495369 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww8n7\" (UniqueName: \"kubernetes.io/projected/90ee9175-d058-4b89-81c0-d7619908a777-kube-api-access-ww8n7\") pod \"csi-hostpathplugin-pxgcs\" (UID: \"90ee9175-d058-4b89-81c0-d7619908a777\") " pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.544396 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sv8bf"] Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.545077 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4khgc" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.549261 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.561114 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rhcmn" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.576064 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4z6qs" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.578240 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.583152 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:49 crc kubenswrapper[4958]: E1201 10:01:49.583368 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:50.083314245 +0000 UTC m=+157.592103282 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.584658 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:49 crc kubenswrapper[4958]: E1201 10:01:49.585556 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:50.085535045 +0000 UTC m=+157.594324082 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.614766 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlnwn"]
Dec 01 10:01:49 crc kubenswrapper[4958]: W1201 10:01:49.647613 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62f08690_7b52_4f87_a32d_e33b10590cdc.slice/crio-98a6285496769d378eaee3d520cde64513cbd59c858bd95b2b4153d76257f691 WatchSource:0}: Error finding container 98a6285496769d378eaee3d520cde64513cbd59c858bd95b2b4153d76257f691: Status 404 returned error can't find the container with id 98a6285496769d378eaee3d520cde64513cbd59c858bd95b2b4153d76257f691
Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.686220 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:01:49 crc kubenswrapper[4958]: E1201 10:01:49.687214 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:50.187172301 +0000 UTC m=+157.695961338 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.721354 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-txbzn"]
Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.746175 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf"
Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.788473 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6t9k"]
Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.789416 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:49 crc kubenswrapper[4958]: E1201 10:01:49.789866 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:50.28983448 +0000 UTC m=+157.798623517 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.890886 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:01:49 crc kubenswrapper[4958]: E1201 10:01:49.891425 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:50.391400084 +0000 UTC m=+157.900189141 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.920699 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp"]
Dec 01 10:01:49 crc kubenswrapper[4958]: I1201 10:01:49.991201 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-swd9j" podStartSLOduration=131.991178563 podStartE2EDuration="2m11.991178563s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:49.990268125 +0000 UTC m=+157.499057162" watchObservedRunningTime="2025-12-01 10:01:49.991178563 +0000 UTC m=+157.499967600"
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.004483 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:50 crc kubenswrapper[4958]: E1201 10:01:50.005170 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:50.505143678 +0000 UTC m=+158.013932715 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.087617 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rxrcb"]
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.105627 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:01:50 crc kubenswrapper[4958]: E1201 10:01:50.106151 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:50.606121644 +0000 UTC m=+158.114910681 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.106226 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:50 crc kubenswrapper[4958]: E1201 10:01:50.106623 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:50.606607689 +0000 UTC m=+158.115396726 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.131194 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw"]
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.178455 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nrv4l" event={"ID":"1532650f-e94e-4f91-b79c-a0da8e727893","Type":"ContainerStarted","Data":"715de14a074f2ac02bb10de5ae555950d06d03e92f81b4402ef03dc7bd886811"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.178519 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nrv4l" event={"ID":"1532650f-e94e-4f91-b79c-a0da8e727893","Type":"ContainerStarted","Data":"d1ac39b4992edac4e448db4f6422d410f88905798202e0f938427b99e8a4190c"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.202172 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-64ctn"]
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.212575 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxxsg" event={"ID":"4e72ff91-36b3-4098-b423-ad103f61541e","Type":"ContainerStarted","Data":"244371b6277664bfaf416f543361b276d614285ef0df7cc432b482a450e6f6bd"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.214184 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:01:50 crc kubenswrapper[4958]: E1201 10:01:50.214501 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:50.71448038 +0000 UTC m=+158.223269417 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.215088 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:50 crc kubenswrapper[4958]: E1201 10:01:50.215643 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:50.715616546 +0000 UTC m=+158.224405593 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.247739 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zppvz" event={"ID":"3810945e-7192-4204-a89e-5fad8e22c611","Type":"ContainerStarted","Data":"7f945e465ad96a014999679120084cc34454dd208235ee1c5ca999b329168492"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.249960 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwcn5" event={"ID":"fc53c080-c843-4682-a5b2-dd571b28000c","Type":"ContainerStarted","Data":"7c49431ad588e318278566f12176de9db741e21c0859a54f97428de0f0bc8be3"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.250521 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5"]
Dec 01 10:01:50 crc kubenswrapper[4958]: W1201 10:01:50.254120 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ee9be51_cd77_41e9_9db3_ab6a64015288.slice/crio-56d837e17e564ad76df7142bc82fce4a816f0817c1076710275c3985ad7c34e9 WatchSource:0}: Error finding container 56d837e17e564ad76df7142bc82fce4a816f0817c1076710275c3985ad7c34e9: Status 404 returned error can't find the container with id 56d837e17e564ad76df7142bc82fce4a816f0817c1076710275c3985ad7c34e9
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.255203 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" event={"ID":"416bbcbb-fc9f-489b-a1ac-eea205ce9152","Type":"ContainerStarted","Data":"62d7f822f177647eb75dea7e58a0166d02bd6e86721f68cfab45c5081445a836"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.256379 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm"
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.262251 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dzllj" event={"ID":"d27f831c-bcde-4937-b3c6-0ada6053a268","Type":"ContainerStarted","Data":"d07fd43a26fa0caaf62138e8d9ca8987c240f55963cfa35b6fe89a1e96638ce5"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.267781 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm"
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.285388 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlnwn" event={"ID":"62f08690-7b52-4f87-a32d-e33b10590cdc","Type":"ContainerStarted","Data":"98a6285496769d378eaee3d520cde64513cbd59c858bd95b2b4153d76257f691"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.315590 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-9g4m7" podStartSLOduration=133.315561579 podStartE2EDuration="2m13.315561579s" podCreationTimestamp="2025-12-01 09:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:50.314149655 +0000 UTC m=+157.822938692" watchObservedRunningTime="2025-12-01 10:01:50.315561579 +0000 UTC m=+157.824350626"
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.319612 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:01:50 crc kubenswrapper[4958]: E1201 10:01:50.320587 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:50.820565525 +0000 UTC m=+158.329354562 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.361090 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" event={"ID":"ea105b82-fec3-4f5d-b056-fdd566619645","Type":"ContainerStarted","Data":"6733dc635566afd9330b71b35c9756f01b0294c6e995d7a2bba7e6c68f0437cd"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.361738 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 01 10:01:50 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld
Dec 01 10:01:50 crc kubenswrapper[4958]: [+]process-running ok
Dec 01 10:01:50 crc kubenswrapper[4958]: healthz check failed
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.361770 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.375621 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mzqdq" event={"ID":"db83732a-f1a2-4daa-8bf9-89fa7eebec3d","Type":"ContainerStarted","Data":"37357f5257f2be65c51a3704734e2b891888c6c55dc99d1003ce6dc5241ff0a0"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.396545 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz" event={"ID":"c317e220-5d51-4a28-91c9-93ef65b36e92","Type":"ContainerStarted","Data":"ddbaf592c1e3b872aadcae0d5f7ec39c20f20ec5ae358d2a2b85087c2f94751a"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.396600 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz" event={"ID":"c317e220-5d51-4a28-91c9-93ef65b36e92","Type":"ContainerStarted","Data":"4660f85ec7b917f8d815009b23b7f7c9a2c6c41bbf7c89125d8f07110a52796b"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.403014 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz"
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.412467 4958 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5t5rz container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:5443/healthz\": dial tcp 10.217.0.7:5443: connect: connection refused" start-of-body=
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.412559 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz" podUID="c317e220-5d51-4a28-91c9-93ef65b36e92" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.7:5443/healthz\": dial tcp 10.217.0.7:5443: connect: connection refused"
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.415876 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6t9k" event={"ID":"ca27d3de-fa2e-4aa7-aef8-b9373742bddb","Type":"ContainerStarted","Data":"f44163dec18b6b891ed9570e51fbf334b35219fc8f6cf4acbf6417baeec169d6"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.430925 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:50 crc kubenswrapper[4958]: E1201 10:01:50.440666 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:50.940643196 +0000 UTC m=+158.449432234 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.455520 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" event={"ID":"a28acca5-cb70-46d6-a5a4-49f9dbc79c91","Type":"ContainerStarted","Data":"a7f6fbe73fe92034cfb20bbbc888424d25793eea994186c393ee88c677517b9f"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.515776 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-g9fjl" event={"ID":"81ffe673-ed64-4dd2-81c3-65c922af39bc","Type":"ContainerStarted","Data":"1ebc96e5aa0ddfa4c137b81d9e100ea43cf00886e916ce56acc0a0a9ab18e639"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.543991 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q7r9c"]
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.553295 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:01:50 crc kubenswrapper[4958]: E1201 10:01:50.557937 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:51.057810327 +0000 UTC m=+158.566599364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.559284 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:50 crc kubenswrapper[4958]: E1201 10:01:50.559818 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:51.059791969 +0000 UTC m=+158.568581006 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.570581 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn" event={"ID":"c12352eb-31e7-465d-b743-9d3ee6093b97","Type":"ContainerStarted","Data":"8e72c1bc0b467e6e72a94ee4f8bbcd726cede74463ce522157fc92c7a7c1ef69"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.575928 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-d2tvd"]
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.605242 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd" event={"ID":"eaa4fac0-c895-4779-9eb9-11724f97ccd8","Type":"ContainerStarted","Data":"81342464b8f0f7432b90e4d4a0e398a1971c4a8559c7e630374fe80092ccf536"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.638213 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sv8bf" event={"ID":"c7fc63ed-b5b8-4854-8d5a-1cab817a3581","Type":"ContainerStarted","Data":"595564ba76d271e59104406ee60112e98bea18a7bd5aa94af7d5df1883f08f4e"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.680387 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:01:50 crc kubenswrapper[4958]: E1201 10:01:50.681021 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:51.181001585 +0000 UTC m=+158.689790622 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.700349 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jfg5l" event={"ID":"c42a0759-cde0-4ecd-b99b-d6dcb74be0c5","Type":"ContainerStarted","Data":"d7cfb04ad6b52a0a3dedab4b6d2f5e2924878825e183a18da0cc855e33c222f1"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.700414 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jfg5l" event={"ID":"c42a0759-cde0-4ecd-b99b-d6dcb74be0c5","Type":"ContainerStarted","Data":"62c0dbb9c79008b679605f87827b255f0cc670f4e9228110bf6d48f8be42b433"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.722340 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6q7kk"]
Dec 01 10:01:50 crc kubenswrapper[4958]: W1201 10:01:50.742944 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod954a15b1_bcae_44b9_ad5b_e004ed4d79be.slice/crio-96ecb5dc151435086a7c80cdf76057000cbf33fea44791891921623e4048f3ef WatchSource:0}: Error finding container 96ecb5dc151435086a7c80cdf76057000cbf33fea44791891921623e4048f3ef: Status 404 returned error can't find the container with id 96ecb5dc151435086a7c80cdf76057000cbf33fea44791891921623e4048f3ef
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.783317 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-spstc" event={"ID":"70ab8e46-495c-4e0e-8896-e4eac8d855b6","Type":"ContainerStarted","Data":"d4f69ead1e48d6ea108b4471321a38cd549b9b221852b194340522d9c0d51e98"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.783409 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-spstc" event={"ID":"70ab8e46-495c-4e0e-8896-e4eac8d855b6","Type":"ContainerStarted","Data":"c30622e95633ebfbb546a23486fd147581662005abf589824428bf1982b38ab9"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.788781 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng"]
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.790345 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:50 crc kubenswrapper[4958]: E1201 10:01:50.790743 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:51.290729124 +0000 UTC m=+158.799518161 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.825755 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-26pm4" event={"ID":"7a98149e-16f6-4129-a488-393bdbf90da3","Type":"ContainerStarted","Data":"9296e0545318b624bb6422e34f2609e235dd731e4bf44c70fd4cbf5878b86e8c"}
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.856083 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-9g4m7"
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.863623 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hpjfh" podStartSLOduration=132.863542332 podStartE2EDuration="2m12.863542332s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:50.822125972 +0000 UTC m=+158.330915009" watchObservedRunningTime="2025-12-01 10:01:50.863542332 +0000 UTC m=+158.372331369"
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.893786 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx"]
Dec 01 10:01:50 crc kubenswrapper[4958]: W1201 10:01:50.902344 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6723ac99_b14b_4d99_90fa_9dd395093f3b.slice/crio-1bc23ac5ade5136738415ef2a2de8a4af3616ec777e6ec28bc14994235df59bf WatchSource:0}: Error finding container 1bc23ac5ade5136738415ef2a2de8a4af3616ec777e6ec28bc14994235df59bf: Status 404 returned error can't find the container with id 1bc23ac5ade5136738415ef2a2de8a4af3616ec777e6ec28bc14994235df59bf
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.902801 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:01:50 crc kubenswrapper[4958]: E1201 10:01:50.903414 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:51.403383094 +0000 UTC m=+158.912172131 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.903718 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:50 crc kubenswrapper[4958]: E1201 10:01:50.906087 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:51.406071587 +0000 UTC m=+158.914860624 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:50 crc kubenswrapper[4958]: I1201 10:01:50.994131 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xkxmd"]
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.007686 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:01:51 crc kubenswrapper[4958]: E1201 10:01:51.008453 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:51.508432877 +0000 UTC m=+159.017221914 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.055823 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-spstc" podStartSLOduration=133.055782932 podStartE2EDuration="2m13.055782932s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:51.022879487 +0000 UTC m=+158.531668524" watchObservedRunningTime="2025-12-01 10:01:51.055782932 +0000 UTC m=+158.564571979"
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.072246 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8wnxg"]
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.082757 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wxxsg" podStartSLOduration=133.082730431 podStartE2EDuration="2m13.082730431s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:51.079699397 +0000 UTC m=+158.588488444" watchObservedRunningTime="2025-12-01 10:01:51.082730431 +0000 UTC m=+158.591519468"
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.096379 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9f7tt"]
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.110530 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:51 crc kubenswrapper[4958]: E1201 10:01:51.111019 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:51.611002432 +0000 UTC m=+159.119791469 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.117015 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz" podStartSLOduration=133.116985119 podStartE2EDuration="2m13.116985119s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:51.107048659 +0000 UTC m=+158.615837716" watchObservedRunningTime="2025-12-01 10:01:51.116985119 +0000 UTC m=+158.625774156"
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.142559 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" podStartSLOduration=134.142524004 podStartE2EDuration="2m14.142524004s" podCreationTimestamp="2025-12-01 09:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:51.131753699 +0000 UTC m=+158.640542736" watchObservedRunningTime="2025-12-01 10:01:51.142524004 +0000 UTC m=+158.651313041"
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.142948 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf"]
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.187542 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bglks"]
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.213432 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:01:51 crc kubenswrapper[4958]: E1201 10:01:51.214614 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:51.714582909 +0000 UTC m=+159.223371946 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.316779 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4khgc"]
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.318599 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:51 crc kubenswrapper[4958]: E1201 10:01:51.319428 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:51.819409585 +0000 UTC m=+159.328198622 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.346105 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 01 10:01:51 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld
Dec 01 10:01:51 crc kubenswrapper[4958]: [+]process-running ok
Dec 01 10:01:51 crc kubenswrapper[4958]: healthz check failed
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.346203 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 01 10:01:51 crc kubenswrapper[4958]: W1201 10:01:51.365895 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0b89d3d_56c3_4092_8697_c8bd7ecbad7b.slice/crio-723ad5e3ad27039fe5e593fc3fb1c8dfd2b18483fa12be9e7952e7584252c35b WatchSource:0}: Error finding container 723ad5e3ad27039fe5e593fc3fb1c8dfd2b18483fa12be9e7952e7584252c35b: Status 404 returned error can't find the container with id 723ad5e3ad27039fe5e593fc3fb1c8dfd2b18483fa12be9e7952e7584252c35b
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.374495 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rhcmn"]
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.388682 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pxgcs"]
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.419828 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:01:51 crc kubenswrapper[4958]: E1201 10:01:51.419981 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:51.919953678 +0000 UTC m=+159.428742715 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.422433 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:51 crc kubenswrapper[4958]: E1201 10:01:51.423043 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:51.923029374 +0000 UTC m=+159.431818411 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:51 crc kubenswrapper[4958]: W1201 10:01:51.454712 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod116691d6_7237_438e_8adf_2678cbcbced0.slice/crio-f78483b4f36641f9c67f0446a84460b59c30f8dd7d8853f04faf75d15051ab6a WatchSource:0}: Error finding container f78483b4f36641f9c67f0446a84460b59c30f8dd7d8853f04faf75d15051ab6a: Status 404 returned error can't find the container with id f78483b4f36641f9c67f0446a84460b59c30f8dd7d8853f04faf75d15051ab6a
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.524528 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:01:51 crc kubenswrapper[4958]: E1201 10:01:51.524745 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:52.024712122 +0000 UTC m=+159.533501159 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.524898 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:51 crc kubenswrapper[4958]: E1201 10:01:51.525263 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:52.025248669 +0000 UTC m=+159.534037706 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.626110 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:01:51 crc kubenswrapper[4958]: E1201 10:01:51.626377 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:52.126335478 +0000 UTC m=+159.635124525 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.626958 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:51 crc kubenswrapper[4958]: E1201 10:01:51.627333 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:52.127317268 +0000 UTC m=+159.636106305 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.727838 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:01:51 crc kubenswrapper[4958]: E1201 10:01:51.728445 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:52.228426789 +0000 UTC m=+159.737215826 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.831180 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:51 crc kubenswrapper[4958]: E1201 10:01:51.831607 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:52.331592213 +0000 UTC m=+159.840381240 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.854794 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf" event={"ID":"62649f01-3312-4309-9c30-05090c2655eb","Type":"ContainerStarted","Data":"9a355feb6211061d8eabbb3d32e2a098a5cf4dfbd4acb8e1e4c2f1eaebbca927"}
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.876906 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zppvz" event={"ID":"3810945e-7192-4204-a89e-5fad8e22c611","Type":"ContainerStarted","Data":"820067e03d262ae0d6d37d20b0379f2f1eb5a3067d8f35dad2b319932cca2116"}
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.882020 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8wnxg" event={"ID":"9ed7b0bb-41a7-49dc-93a6-292040efb53b","Type":"ContainerStarted","Data":"91e8db4f002c28dd2436bef46b260c19b4d9b0070affeec2203509b5effc9e51"}
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.895891 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mzqdq" event={"ID":"db83732a-f1a2-4daa-8bf9-89fa7eebec3d","Type":"ContainerStarted","Data":"00818dc56369238d7e04882d03aa5d47ff112e620ba671c9d95c9a134d223ec2"}
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.900640 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" event={"ID":"90ee9175-d058-4b89-81c0-d7619908a777","Type":"ContainerStarted","Data":"1af041aca7dd74dbaaa5a5b5b06f620f464cc944904b62cec3c3b5c9edab8961"}
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.904723 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng" event={"ID":"b6f17185-a290-4d91-9337-cd0954cf9c66","Type":"ContainerStarted","Data":"47996883fee8d6dfeec493de8b4df6e882d2dc4e81d0ac71f5146bafdb593bbc"}
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.912785 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rhcmn" event={"ID":"116691d6-7237-438e-8adf-2678cbcbced0","Type":"ContainerStarted","Data":"f78483b4f36641f9c67f0446a84460b59c30f8dd7d8853f04faf75d15051ab6a"}
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.933891 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zppvz" podStartSLOduration=133.933864839 podStartE2EDuration="2m13.933864839s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:51.926066886 +0000 UTC m=+159.434855923" watchObservedRunningTime="2025-12-01 10:01:51.933864839 +0000 UTC m=+159.442653876"
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.935926 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:01:51 crc kubenswrapper[4958]: E1201 10:01:51.936010 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:52.435989295 +0000 UTC m=+159.944778332 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.938939 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:51 crc kubenswrapper[4958]: E1201 10:01:51.940377 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:52.440360232 +0000 UTC m=+159.949149269 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.940538 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd" event={"ID":"152c0d38-b359-479b-bfd6-cb019847d7fa","Type":"ContainerStarted","Data":"248336217ce32239d0b2c3ea819f76b365fcf5dace6082a1a9c4ea582c84fc11"}
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.956671 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mzqdq" podStartSLOduration=133.956643989 podStartE2EDuration="2m13.956643989s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:51.953826181 +0000 UTC m=+159.462615218" watchObservedRunningTime="2025-12-01 10:01:51.956643989 +0000 UTC m=+159.465433026"
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.971589 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" event={"ID":"fc711edd-1026-4f47-ab7f-92bd6e1fb964","Type":"ContainerStarted","Data":"19d856b604097473be43874de21bb8d3cdb41bbad7f12e995774e2eb81410177"}
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.988240 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw" event={"ID":"5418dd0e-243d-4573-94c9-a1948dca9b9f","Type":"ContainerStarted","Data":"f4da90749763627a0f0fa86730ba9dac2ad6d1b8537fe848e20577c9b8181b55"}
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.988297 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw" event={"ID":"5418dd0e-243d-4573-94c9-a1948dca9b9f","Type":"ContainerStarted","Data":"60792b5f17a4e0073596bc8a63bea84f7dfdd64688410fa73149858b6e4386e2"}
Dec 01 10:01:51 crc kubenswrapper[4958]: I1201 10:01:51.990025 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw"
Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:51.995879 4958 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-fmdxw container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body=
Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:51.995951 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw" podUID="5418dd0e-243d-4573-94c9-a1948dca9b9f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused"
Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.022370 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx" event={"ID":"90fd93e3-8621-438c-9914-fe4a1be62b8d","Type":"ContainerStarted","Data":"07c19b1bd3418a72755b4f778db48a96663036100e91606dd53e1d07ef4552bd"}
Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.036071 4958 generic.go:334] "Generic (PLEG): container finished" podID="a28acca5-cb70-46d6-a5a4-49f9dbc79c91" containerID="7226e3e13b14d053ea983164e805481c0fb681a7ce76cb977a849c5d0d3af71d" exitCode=0
Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.036175 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" event={"ID":"a28acca5-cb70-46d6-a5a4-49f9dbc79c91","Type":"ContainerDied","Data":"7226e3e13b14d053ea983164e805481c0fb681a7ce76cb977a849c5d0d3af71d"}
Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.040836 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:01:52 crc kubenswrapper[4958]: E1201 10:01:52.041099 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:52.541057469 +0000 UTC m=+160.049846516 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.041441 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:52 crc kubenswrapper[4958]: E1201 10:01:52.041864 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:52.541830683 +0000 UTC m=+160.050619730 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.072567 4958 generic.go:334] "Generic (PLEG): container finished" podID="d27f831c-bcde-4937-b3c6-0ada6053a268" containerID="9265ba7c73f9b64640272b6ca3a9c7be74001e007fbbf21dc971048258829693" exitCode=0 Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.072663 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dzllj" event={"ID":"d27f831c-bcde-4937-b3c6-0ada6053a268","Type":"ContainerDied","Data":"9265ba7c73f9b64640272b6ca3a9c7be74001e007fbbf21dc971048258829693"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.096257 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-g9fjl" event={"ID":"81ffe673-ed64-4dd2-81c3-65c922af39bc","Type":"ContainerStarted","Data":"0135f3300f4f07f811445240f4cc2c1488620fc4c24df50c244748f6d2e519f3"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.097627 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-g9fjl" Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.099821 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-g9fjl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.099894 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-g9fjl" podUID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.112246 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw" podStartSLOduration=134.112218416 podStartE2EDuration="2m14.112218416s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:52.021456798 +0000 UTC m=+159.530245855" watchObservedRunningTime="2025-12-01 10:01:52.112218416 +0000 UTC m=+159.621007453" Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.130497 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nrv4l" event={"ID":"1532650f-e94e-4f91-b79c-a0da8e727893","Type":"ContainerStarted","Data":"b525224bcfc6294b81d205d55ff3edf37ff1b01d8bb801fa5889de1cd860e704"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.132292 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf" event={"ID":"c0b89d3d-56c3-4092-8697-c8bd7ecbad7b","Type":"ContainerStarted","Data":"723ad5e3ad27039fe5e593fc3fb1c8dfd2b18483fa12be9e7952e7584252c35b"} Dec 01 10:01:52 crc 
kubenswrapper[4958]: I1201 10:01:52.142793 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:52 crc kubenswrapper[4958]: E1201 10:01:52.148689 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:52.648661511 +0000 UTC m=+160.157450558 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.162416 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" event={"ID":"478689fc-5c07-45bc-ab87-8c58e68c348b","Type":"ContainerStarted","Data":"80606ffb7572ecc1674726a52016096f571e96ee51d3a98e829c72e19d545927"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.183724 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bglks" event={"ID":"f01bcd7f-0920-48a2-8b61-5e7eca60f0a8","Type":"ContainerStarted","Data":"2b6cec057ee471dc0db03f04ec5c8ed18d9201f15ce29bfcf879b4c20c91f663"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.188327 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn" event={"ID":"c12352eb-31e7-465d-b743-9d3ee6093b97","Type":"ContainerStarted","Data":"60354850a535eae8346f61499364eddab78f54c820a8cc9a2dec72662873bd0f"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.208859 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-g9fjl" podStartSLOduration=135.208812635 podStartE2EDuration="2m15.208812635s" podCreationTimestamp="2025-12-01 09:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:52.148345571 +0000 UTC m=+159.657134608" watchObservedRunningTime="2025-12-01 10:01:52.208812635 +0000 UTC m=+159.717601672" Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.244085 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwcn5" event={"ID":"fc53c080-c843-4682-a5b2-dd571b28000c","Type":"ContainerStarted","Data":"a0ed0c6e586ae5607645e41c5d6d658c388b33184e5b295f79a759ab199f5ca0"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.246469 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: 
\"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:52 crc kubenswrapper[4958]: E1201 10:01:52.252157 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:52.752132644 +0000 UTC m=+160.260921681 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.267860 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rxrcb" event={"ID":"9ee9be51-cd77-41e9-9db3-ab6a64015288","Type":"ContainerStarted","Data":"f05e77d91455a15c998803f001fcc12b9a0f41b98637e78cbb4c08cecf0b51ac"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.267933 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rxrcb" event={"ID":"9ee9be51-cd77-41e9-9db3-ab6a64015288","Type":"ContainerStarted","Data":"56d837e17e564ad76df7142bc82fce4a816f0817c1076710275c3985ad7c34e9"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.276771 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-txbzn" podStartSLOduration=135.276742891 podStartE2EDuration="2m15.276742891s" podCreationTimestamp="2025-12-01 09:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:52.27637408 +0000 UTC m=+159.785163127" watchObservedRunningTime="2025-12-01 10:01:52.276742891 +0000 UTC m=+159.785531928" Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.289313 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-64ctn" event={"ID":"142b2c22-6df3-4540-8ad7-748e000f3ec9","Type":"ContainerStarted","Data":"5e1ae2eac3438b0fa0276b7ffb18307cd4beec8dbbe1dc619f4198e05c2490b1"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.311283 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q7r9c" event={"ID":"6723ac99-b14b-4d99-90fa-9dd395093f3b","Type":"ContainerStarted","Data":"1bc23ac5ade5136738415ef2a2de8a4af3616ec777e6ec28bc14994235df59bf"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.353276 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.353547 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Dec 01 10:01:52 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Dec 01 10:01:52 crc kubenswrapper[4958]: [+]process-running ok Dec 01 10:01:52 crc kubenswrapper[4958]: healthz check failed Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.354808 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.354859 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" event={"ID":"ea105b82-fec3-4f5d-b056-fdd566619645","Type":"ContainerStarted","Data":"5d6d33216058c33b2bfbc64185d41787097ba70ba1d8810539171d12bd3d951d"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.354899 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" Dec 01 10:01:52 crc kubenswrapper[4958]: E1201 10:01:52.354748 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:52.854730331 +0000 UTC m=+160.363519378 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.365445 4958 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-46smf container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.365539 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" podUID="ea105b82-fec3-4f5d-b056-fdd566619645" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.498160 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nrv4l" podStartSLOduration=134.498117458 podStartE2EDuration="2m14.498117458s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:52.367619802 +0000 UTC m=+159.876408839" watchObservedRunningTime="2025-12-01 10:01:52.498117458 +0000 UTC m=+160.006906495" Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.500092 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-rxrcb" 
podStartSLOduration=135.500081319 podStartE2EDuration="2m15.500081319s" podCreationTimestamp="2025-12-01 09:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:52.484671399 +0000 UTC m=+159.993460436" watchObservedRunningTime="2025-12-01 10:01:52.500081319 +0000 UTC m=+160.008870366" Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.518930 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:52 crc kubenswrapper[4958]: E1201 10:01:52.519943 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:53.019916397 +0000 UTC m=+160.528705434 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.527261 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-26pm4" event={"ID":"7a98149e-16f6-4129-a488-393bdbf90da3","Type":"ContainerStarted","Data":"6acab90fe3fd3be8c322d1c70d669fc6c1f5ded2288cf7bd0a79d86734cc2901"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.554787 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xkxmd" event={"ID":"1fd46f3c-80b3-4881-ae13-263db886a0e6","Type":"ContainerStarted","Data":"a5f60dc3d8d59a70be0eb8679e9c4b588a5abf87c693d8986ef4696946d71b27"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.584631 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4khgc" event={"ID":"1dcea1f2-dae7-4542-99bd-22275ec83616","Type":"ContainerStarted","Data":"993dc823196f88a97ebd30fa5f2e90555cf3006d75bf630e4dc3b3ec264326ea"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.620224 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.620254 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sv8bf" event={"ID":"c7fc63ed-b5b8-4854-8d5a-1cab817a3581","Type":"ContainerStarted","Data":"33bfaa112e49e36444c1dc96be1c6331994d3451ba06e7841fff3cda174709ee"} Dec 01 10:01:52 crc kubenswrapper[4958]: E1201 10:01:52.621161 4958 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:53.121138801 +0000 UTC m=+160.629927828 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.652563 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlnwn" event={"ID":"62f08690-7b52-4f87-a32d-e33b10590cdc","Type":"ContainerStarted","Data":"6fb0804e0d56a8ae20ce8cbbd6f63e6d7a2c71a4729ac049f2941eedd6f7e709"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.714011 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4z6qs" event={"ID":"2881ba7f-a7a8-454b-a6e8-071573585183","Type":"ContainerStarted","Data":"e03c05798046566589a981f4a4910e25b5b4905748dbc5f749e55de0ed6869d0"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.714091 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4z6qs" event={"ID":"2881ba7f-a7a8-454b-a6e8-071573585183","Type":"ContainerStarted","Data":"a75649f33acfaf2843b84aac5d38b9729dde581190a885db02285fe427ea4234"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.729062 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.729810 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" podStartSLOduration=134.729793526 podStartE2EDuration="2m14.729793526s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:52.714342065 +0000 UTC m=+160.223131102" watchObservedRunningTime="2025-12-01 10:01:52.729793526 +0000 UTC m=+160.238582563" Dec 01 10:01:52 crc kubenswrapper[4958]: E1201 10:01:52.729950 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:53.229935241 +0000 UTC m=+160.738724278 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.815720 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd" event={"ID":"eaa4fac0-c895-4779-9eb9-11724f97ccd8","Type":"ContainerStarted","Data":"c0948f548346f1710de333628025903b5cc5a1d57fd92a3482dd8eebb2d08f45"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.838245 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:52 crc kubenswrapper[4958]: E1201 10:01:52.838478 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:53.338441121 +0000 UTC m=+160.847230158 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.838607 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:52 crc kubenswrapper[4958]: E1201 10:01:52.840530 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:53.340510336 +0000 UTC m=+160.849299373 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.855373 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sv8bf" podStartSLOduration=134.855348338 podStartE2EDuration="2m14.855348338s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:52.854404598 +0000 UTC m=+160.363193635" watchObservedRunningTime="2025-12-01 10:01:52.855348338 +0000 UTC m=+160.364137545" Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.857563 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-tlnwn" podStartSLOduration=135.857553966 podStartE2EDuration="2m15.857553966s" podCreationTimestamp="2025-12-01 09:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:52.789270279 +0000 UTC m=+160.298059316" watchObservedRunningTime="2025-12-01 10:01:52.857553966 +0000 UTC m=+160.366343003" Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.865407 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5" event={"ID":"954a15b1-bcae-44b9-ad5b-e004ed4d79be","Type":"ContainerStarted","Data":"96ecb5dc151435086a7c80cdf76057000cbf33fea44791891921623e4048f3ef"} Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.909522 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-4z6qs" podStartSLOduration=6.909475784 podStartE2EDuration="6.909475784s" podCreationTimestamp="2025-12-01 10:01:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:52.896635884 +0000 UTC m=+160.405424921" watchObservedRunningTime="2025-12-01 10:01:52.909475784 +0000 UTC m=+160.418264851" Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.911392 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5t5rz" Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.924768 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nzgvd" podStartSLOduration=134.92474069 podStartE2EDuration="2m14.92474069s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:52.921832109 +0000 UTC m=+160.430621136" watchObservedRunningTime="2025-12-01 10:01:52.92474069 +0000 UTC m=+160.433529727" Dec 01 10:01:52 crc kubenswrapper[4958]: I1201 10:01:52.942522 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:52 crc kubenswrapper[4958]: E1201 10:01:52.944047 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:53.44402798 +0000 UTC m=+160.952817017 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:53 crc kubenswrapper[4958]: I1201 10:01:53.016769 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5" podStartSLOduration=113.016741456 podStartE2EDuration="1m53.016741456s" podCreationTimestamp="2025-12-01 10:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:52.962032081 +0000 UTC m=+160.470821118" watchObservedRunningTime="2025-12-01 10:01:53.016741456 +0000 UTC m=+160.525530493" Dec 01 10:01:53 crc kubenswrapper[4958]: I1201 10:01:53.046961 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:53 crc kubenswrapper[4958]: E1201 10:01:53.055651 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:53.555535555 +0000 UTC m=+161.064324592 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:53 crc kubenswrapper[4958]: I1201 10:01:53.148713 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:53 crc kubenswrapper[4958]: E1201 10:01:53.149454 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:53.64943619 +0000 UTC m=+161.158225227 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:53 crc kubenswrapper[4958]: I1201 10:01:53.251968 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:53 crc kubenswrapper[4958]: E1201 10:01:53.252365 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:53.752346686 +0000 UTC m=+161.261135723 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:53 crc kubenswrapper[4958]: I1201 10:01:53.347792 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:01:53 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Dec 01 10:01:53 crc kubenswrapper[4958]: [+]process-running ok Dec 01 10:01:53 crc kubenswrapper[4958]: healthz check failed Dec 01 10:01:53 crc kubenswrapper[4958]: I1201 10:01:53.347890 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:01:53 crc kubenswrapper[4958]: E1201 10:01:53.354309 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:53.854284602 +0000 UTC m=+161.363073639 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:53 crc kubenswrapper[4958]: I1201 10:01:53.354184 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:53 crc kubenswrapper[4958]: I1201 10:01:53.355263 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:53 crc kubenswrapper[4958]: E1201 10:01:53.355832 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:53.8558164 +0000 UTC m=+161.364605437 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:53 crc kubenswrapper[4958]: I1201 10:01:53.457672 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:53 crc kubenswrapper[4958]: E1201 10:01:53.458531 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:53.95851427 +0000 UTC m=+161.467303297 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:53 crc kubenswrapper[4958]: I1201 10:01:53.560098 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:53 crc kubenswrapper[4958]: E1201 10:01:53.560634 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:54.060617581 +0000 UTC m=+161.569406628 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:53 crc kubenswrapper[4958]: I1201 10:01:53.661104 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:53 crc kubenswrapper[4958]: E1201 10:01:53.661683 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:54.161662659 +0000 UTC m=+161.670451706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:53 crc kubenswrapper[4958]: I1201 10:01:53.763591 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:53 crc kubenswrapper[4958]: E1201 10:01:53.764496 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:54.264477502 +0000 UTC m=+161.773266539 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:53 crc kubenswrapper[4958]: I1201 10:01:53.874962 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:53 crc kubenswrapper[4958]: E1201 10:01:53.875880 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:54.375831831 +0000 UTC m=+161.884620868 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:53 crc kubenswrapper[4958]: E1201 10:01:53.985128 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:54.485111816 +0000 UTC m=+161.993900853 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:53 crc kubenswrapper[4958]: I1201 10:01:53.984673 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.025703 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5" event={"ID":"954a15b1-bcae-44b9-ad5b-e004ed4d79be","Type":"ContainerStarted","Data":"31900a7ee5f6e78fb0673cec346e8eee4d3a131eb91794d86e5fd79aa549e126"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.029747 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4khgc" event={"ID":"1dcea1f2-dae7-4542-99bd-22275ec83616","Type":"ContainerStarted","Data":"8e7299cfce838e213242f91c35b286f8c65c8c727ede1757495d7b48903be712"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.033095 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx" event={"ID":"90fd93e3-8621-438c-9914-fe4a1be62b8d","Type":"ContainerStarted","Data":"37834f53dfe7cfa15c0db331de0c9f34619e25f9992bf2b3da774347080ff6e1"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.034385 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.047747 4958 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xnmzx container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.047864 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx" podUID="90fd93e3-8621-438c-9914-fe4a1be62b8d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.081715 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf" event={"ID":"62649f01-3312-4309-9c30-05090c2655eb","Type":"ContainerStarted","Data":"5a962d7ef1c460aabaabe20867e794f2312c8b34d6759515174d3a468a21f2b8"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.084320 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwcn5" 
event={"ID":"fc53c080-c843-4682-a5b2-dd571b28000c","Type":"ContainerStarted","Data":"b2f8e344381c34cbaf32db2fcd64adf86d66313a2c4e826e635d18998e17df75"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.085980 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:54 crc kubenswrapper[4958]: E1201 10:01:54.086547 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:54.586527595 +0000 UTC m=+162.095316622 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.092248 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-26pm4" event={"ID":"7a98149e-16f6-4129-a488-393bdbf90da3","Type":"ContainerStarted","Data":"86be80e3b5a0d50aab5301b28607d13e4216a263c1524aaa5cfb4d59f5411718"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.099718 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" event={"ID":"fc711edd-1026-4f47-ab7f-92bd6e1fb964","Type":"ContainerStarted","Data":"70c4a51cee69f977e551d4ceb94bec24d6afb0627929b0d25855d6c7450fa721"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.100633 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.107786 4958 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6q7kk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.107892 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" podUID="fc711edd-1026-4f47-ab7f-92bd6e1fb964" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.108904 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng" event={"ID":"b6f17185-a290-4d91-9337-cd0954cf9c66","Type":"ContainerStarted","Data":"62fdd42ab0ffb11bf311551b7212274f306e4950cb82395ea2467359909876f2"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.125271 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dzllj" 
event={"ID":"d27f831c-bcde-4937-b3c6-0ada6053a268","Type":"ContainerStarted","Data":"b9c8fb096196b13ef5cfe00d6b2f9351bd4da4fd65bd3cbd2772c5fbd78ee4f8"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.139366 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xkxmd" event={"ID":"1fd46f3c-80b3-4881-ae13-263db886a0e6","Type":"ContainerStarted","Data":"6325f606c4957b30de959a894106a910816b4da1181fc665dbadd551aafd6472"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.150306 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-64ctn" event={"ID":"142b2c22-6df3-4540-8ad7-748e000f3ec9","Type":"ContainerStarted","Data":"9fe59bcd10b0276abf6ab41d2941b859530159f323399b6106431d49950ec1f5"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.150372 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-64ctn" event={"ID":"142b2c22-6df3-4540-8ad7-748e000f3ec9","Type":"ContainerStarted","Data":"1335e4383a8f80303fb261a586de4de8b577217d0c8638cc36f8a211d4a56e2c"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.191218 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" event={"ID":"a28acca5-cb70-46d6-a5a4-49f9dbc79c91","Type":"ContainerStarted","Data":"5797d0369d5e26d10993312d450fbbf03ef3ca710b903b0a497d057ae22bd3ba"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.195728 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:54 crc kubenswrapper[4958]: E1201 10:01:54.199065 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:54.699042311 +0000 UTC m=+162.207831548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.210524 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf" event={"ID":"c0b89d3d-56c3-4092-8697-c8bd7ecbad7b","Type":"ContainerStarted","Data":"84c670b4ed5134545354093991f0bfc8e786cbd6aaf6790fef5fd59b7f13dfa5"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.211233 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf" event={"ID":"c0b89d3d-56c3-4092-8697-c8bd7ecbad7b","Type":"ContainerStarted","Data":"785839024fed2c151d0c8cb0e5573acd8ab637e1fa8a2209c20a25155f6a8d11"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.229807 4958 generic.go:334] "Generic (PLEG): container finished" podID="f01bcd7f-0920-48a2-8b61-5e7eca60f0a8" containerID="0781a21f06ea6f6534a04255b0e552c77aa752e2b1bd41d0cc364cd21b15d14a" exitCode=0 Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.229924 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bglks" event={"ID":"f01bcd7f-0920-48a2-8b61-5e7eca60f0a8","Type":"ContainerDied","Data":"0781a21f06ea6f6534a04255b0e552c77aa752e2b1bd41d0cc364cd21b15d14a"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.254177 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rhcmn" event={"ID":"116691d6-7237-438e-8adf-2678cbcbced0","Type":"ContainerStarted","Data":"d59c43159129703e45538ba1ce926ddba824d3d392604ad805b5668232197cad"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.256735 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6t9k" event={"ID":"ca27d3de-fa2e-4aa7-aef8-b9373742bddb","Type":"ContainerStarted","Data":"890308e89b42ab3917685ae65cc518f8906148bec81aa14a4ef05947a126481a"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.282130 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd" event={"ID":"152c0d38-b359-479b-bfd6-cb019847d7fa","Type":"ContainerStarted","Data":"4d32b8b33b1512e138c3b2d664f5b083f008cd51776220a4b84bc1b854b0c0a3"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.309992 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:54 crc kubenswrapper[4958]: E1201 10:01:54.311448 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:54.811401032 +0000 UTC m=+162.320190069 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.342461 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8wnxg" event={"ID":"9ed7b0bb-41a7-49dc-93a6-292040efb53b","Type":"ContainerStarted","Data":"83fcd2c83619f21cffec5e62e7ef6c7e07bef71f0a850b9cd8cbb3d1567c408f"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.347178 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:01:54 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Dec 01 10:01:54 crc kubenswrapper[4958]: [+]process-running ok Dec 01 10:01:54 crc kubenswrapper[4958]: healthz check failed Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.347627 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.354502 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" event={"ID":"478689fc-5c07-45bc-ab87-8c58e68c348b","Type":"ContainerStarted","Data":"2664bb00c03ef18744a7319f577d5b9df6a805dbe151389b1366466c6496604b"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.355711 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.363110 4958 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9f7tt container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.363196 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" podUID="478689fc-5c07-45bc-ab87-8c58e68c348b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.405890 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q7r9c" event={"ID":"6723ac99-b14b-4d99-90fa-9dd395093f3b","Type":"ContainerStarted","Data":"f50327fe52dff8755061857895e0ad197d3421b92c445728e6039f8c4607082e"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.406343 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q7r9c" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.412372 
4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:54 crc kubenswrapper[4958]: E1201 10:01:54.414574 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:54.914559336 +0000 UTC m=+162.423348373 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.435239 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jfg5l" event={"ID":"c42a0759-cde0-4ecd-b99b-d6dcb74be0c5","Type":"ContainerStarted","Data":"57beb1752e0d6d2535e62ab48672570f2b0845bb3819bb586a7257dcde88d777"} Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.437499 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-g9fjl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.437736 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-g9fjl" podUID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.445674 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.452553 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fmdxw" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.513502 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:54 crc kubenswrapper[4958]: E1201 10:01:54.513760 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:55.013718785 +0000 UTC m=+162.522507832 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.514377 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:54 crc kubenswrapper[4958]: E1201 10:01:54.517011 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:55.016990627 +0000 UTC m=+162.525779664 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.574775 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.575807 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.584521 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.584736 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.613362 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.635552 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:54 crc kubenswrapper[4958]: E1201 10:01:54.637149 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:55.137090609 +0000 UTC m=+162.645879646 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.637619 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.643636 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7eb4c2ff-f340-4277-8fe6-28b87034faa4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7eb4c2ff-f340-4277-8fe6-28b87034faa4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.643719 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7eb4c2ff-f340-4277-8fe6-28b87034faa4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7eb4c2ff-f340-4277-8fe6-28b87034faa4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:01:54 crc kubenswrapper[4958]: E1201 10:01:54.692612 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:55.192584138 +0000 UTC m=+162.701373185 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.725706 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-msxng" podStartSLOduration=136.725649509 podStartE2EDuration="2m16.725649509s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:54.694296502 +0000 UTC m=+162.203085539" watchObservedRunningTime="2025-12-01 10:01:54.725649509 +0000 UTC m=+162.234438546" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.758665 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-d2tvd" podStartSLOduration=136.758640216 podStartE2EDuration="2m16.758640216s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:54.758246554 +0000 UTC m=+162.267035591" watchObservedRunningTime="2025-12-01 10:01:54.758640216 +0000 UTC m=+162.267429253" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.805554 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.806191 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7eb4c2ff-f340-4277-8fe6-28b87034faa4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7eb4c2ff-f340-4277-8fe6-28b87034faa4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.806306 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7eb4c2ff-f340-4277-8fe6-28b87034faa4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7eb4c2ff-f340-4277-8fe6-28b87034faa4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.806586 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7eb4c2ff-f340-4277-8fe6-28b87034faa4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7eb4c2ff-f340-4277-8fe6-28b87034faa4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:01:54 crc kubenswrapper[4958]: E1201 10:01:54.806753 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 10:01:55.306714834 +0000 UTC m=+162.815503871 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.858551 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7eb4c2ff-f340-4277-8fe6-28b87034faa4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7eb4c2ff-f340-4277-8fe6-28b87034faa4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.911465 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:54 crc kubenswrapper[4958]: E1201 10:01:54.911984 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:55.411960033 +0000 UTC m=+162.920749070 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.933298 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.950329 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8wnxg" podStartSLOduration=136.950290898 podStartE2EDuration="2m16.950290898s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:54.871900725 +0000 UTC m=+162.380689762" watchObservedRunningTime="2025-12-01 10:01:54.950290898 +0000 UTC m=+162.459079935" Dec 01 10:01:54 crc kubenswrapper[4958]: I1201 10:01:54.952730 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" podStartSLOduration=136.952718473 podStartE2EDuration="2m16.952718473s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:54.932672379 +0000 UTC m=+162.441461416" watchObservedRunningTime="2025-12-01 10:01:54.952718473 +0000 UTC m=+162.461507510" Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.014106 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:55 crc kubenswrapper[4958]: E1201 10:01:55.014378 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:55.514338803 +0000 UTC m=+163.023127850 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.014585 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:55 crc kubenswrapper[4958]: E1201 10:01:55.015047 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:55.515030355 +0000 UTC m=+163.023819392 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.070865 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bbnnf" podStartSLOduration=137.070810523 podStartE2EDuration="2m17.070810523s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:55.068246313 +0000 UTC m=+162.577035350" watchObservedRunningTime="2025-12-01 10:01:55.070810523 +0000 UTC m=+162.579599560" Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.116706 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:55 crc kubenswrapper[4958]: E1201 10:01:55.117061 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:55.616996601 +0000 UTC m=+163.125785638 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.117162 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:55 crc kubenswrapper[4958]: E1201 10:01:55.117751 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:55.617716374 +0000 UTC m=+163.126505591 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.137514 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xkxmd" podStartSLOduration=138.137472949 podStartE2EDuration="2m18.137472949s" podCreationTimestamp="2025-12-01 09:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:55.136647724 +0000 UTC m=+162.645436761" watchObservedRunningTime="2025-12-01 10:01:55.137472949 +0000 UTC m=+162.646261986" Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.218975 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:55 crc kubenswrapper[4958]: E1201 10:01:55.219155 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:55.719117133 +0000 UTC m=+163.227906170 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.219834 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:55 crc kubenswrapper[4958]: E1201 10:01:55.220637 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:55.720605989 +0000 UTC m=+163.229395026 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.252333 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-64ctn" podStartSLOduration=137.252295887 podStartE2EDuration="2m17.252295887s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:55.238211358 +0000 UTC m=+162.747000395" watchObservedRunningTime="2025-12-01 10:01:55.252295887 +0000 UTC m=+162.761084924" Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.318947 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-jfg5l" podStartSLOduration=137.318921472 podStartE2EDuration="2m17.318921472s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:55.318545151 +0000 UTC m=+162.827334188" watchObservedRunningTime="2025-12-01 10:01:55.318921472 +0000 UTC m=+162.827710509" Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.321505 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:55 crc kubenswrapper[4958]: E1201 10:01:55.321992 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:55.821966587 +0000 UTC m=+163.330755614 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.322105 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:55 crc kubenswrapper[4958]: E1201 10:01:55.322912 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:55.822898616 +0000 UTC m=+163.331687833 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.344328 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:01:55 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Dec 01 10:01:55 crc kubenswrapper[4958]: [+]process-running ok Dec 01 10:01:55 crc kubenswrapper[4958]: healthz check failed Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.344431 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.423722 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:55 crc kubenswrapper[4958]: E1201 10:01:55.424161 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:55.92413977 +0000 UTC m=+163.432928807 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.461860 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xkxmd" event={"ID":"1fd46f3c-80b3-4881-ae13-263db886a0e6","Type":"ContainerStarted","Data":"eb39caa3f9004958382fc9e11fedee163b01b6c8863607da83f8ae4c6c80e645"} Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.496276 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q7r9c" event={"ID":"6723ac99-b14b-4d99-90fa-9dd395093f3b","Type":"ContainerStarted","Data":"4059e35be05686a1e134a3d2a6cde7e95d9f702319931306326d76d6e33c76aa"} Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.502667 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bglks" event={"ID":"f01bcd7f-0920-48a2-8b61-5e7eca60f0a8","Type":"ContainerStarted","Data":"f5c971052a025d9f58fb5ee9da6cbb2c38f77b510eefe57e3d6b7e2ffbca6f9a"} Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.502721 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bglks" Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.512461 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4khgc" event={"ID":"1dcea1f2-dae7-4542-99bd-22275ec83616","Type":"ContainerStarted","Data":"ed170c1d262c23484132f7f9299e923d77f57a6e1dc1836fe4b161c975047764"} Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.513239 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4khgc" Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.520387 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" event={"ID":"90ee9175-d058-4b89-81c0-d7619908a777","Type":"ContainerStarted","Data":"929c6e2a46f5e7f51c58bf13f142cfa4ab4ed5e45138f54c362deac18107d316"} Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.523994 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" podStartSLOduration=137.523976311 podStartE2EDuration="2m17.523976311s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:55.499018803 +0000 UTC m=+163.007807840" watchObservedRunningTime="2025-12-01 10:01:55.523976311 +0000 UTC m=+163.032765338" Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.525107 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:55 crc kubenswrapper[4958]: E1201 10:01:55.525471 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:56.025457597 +0000 UTC m=+163.534246634 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.548775 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-dzllj" event={"ID":"d27f831c-bcde-4937-b3c6-0ada6053a268","Type":"ContainerStarted","Data":"517b5aff3a37eea23fa4d9439c53d5701ca98521ca96696ceda38249f258d67f"} Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.605762 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf" event={"ID":"62649f01-3312-4309-9c30-05090c2655eb","Type":"ContainerStarted","Data":"870884510a137fbb88e2dcea6d3f34e033d404a5008ea3b719fd890280d2e388"} Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.626566 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:55 crc kubenswrapper[4958]: E1201 10:01:55.628792 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:56.128774596 +0000 UTC m=+163.637563633 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.632982 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-g9fjl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.633058 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-g9fjl" podUID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.636403 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx" Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.654891 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.656090 4958 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6q7kk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.656184 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" podUID="fc711edd-1026-4f47-ab7f-92bd6e1fb964" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.728615 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:55 crc kubenswrapper[4958]: E1201 10:01:55.741147 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:56.241126285 +0000 UTC m=+163.749915322 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.840707 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:55 crc kubenswrapper[4958]: E1201 10:01:55.857137 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:56.357086578 +0000 UTC m=+163.865875605 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.887299 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q7r9c" podStartSLOduration=137.887257178 podStartE2EDuration="2m17.887257178s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:55.861816755 +0000 UTC m=+163.370605792" watchObservedRunningTime="2025-12-01 10:01:55.887257178 +0000 UTC m=+163.396046205" Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.927120 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rhcmn" podStartSLOduration=9.927089289 podStartE2EDuration="9.927089289s" podCreationTimestamp="2025-12-01 10:01:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:55.916332174 +0000 UTC m=+163.425121211" watchObservedRunningTime="2025-12-01 10:01:55.927089289 +0000 UTC m=+163.435878326" Dec 01 10:01:55 crc kubenswrapper[4958]: I1201 10:01:55.951048 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:55 crc kubenswrapper[4958]: E1201 10:01:55.952676 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:56.452661246 +0000 UTC m=+163.961450283 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.039773 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p6t9k" podStartSLOduration=138.039747929 podStartE2EDuration="2m18.039747929s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:56.00126435 +0000 UTC m=+163.510053387" watchObservedRunningTime="2025-12-01 10:01:56.039747929 +0000 UTC m=+163.548536966" Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.041339 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-26pm4" podStartSLOduration=138.041328758 podStartE2EDuration="2m18.041328758s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:56.037943573 +0000 UTC m=+163.546732620" watchObservedRunningTime="2025-12-01 10:01:56.041328758 +0000 UTC m=+163.550117795" Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.052908 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:56 crc kubenswrapper[4958]: E1201 10:01:56.053511 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:56.553488797 +0000 UTC m=+164.062277834 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.107508 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" podStartSLOduration=138.107486369 podStartE2EDuration="2m18.107486369s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:56.103911378 +0000 UTC m=+163.612700415" watchObservedRunningTime="2025-12-01 10:01:56.107486369 +0000 UTC m=+163.616275406" Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.138441 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.154777 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:56 crc kubenswrapper[4958]: E1201 10:01:56.155464 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:56.655446674 +0000 UTC m=+164.164235711 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.166131 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwcn5" podStartSLOduration=138.166109966 podStartE2EDuration="2m18.166109966s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:56.165353262 +0000 UTC m=+163.674142299" watchObservedRunningTime="2025-12-01 10:01:56.166109966 +0000 UTC m=+163.674899003" Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.256652 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:56 crc kubenswrapper[4958]: E1201 10:01:56.258967 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:56.758944028 +0000 UTC m=+164.267733065 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.321234 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xnmzx" podStartSLOduration=138.321203148 podStartE2EDuration="2m18.321203148s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:56.27151292 +0000 UTC m=+163.780301947" watchObservedRunningTime="2025-12-01 10:01:56.321203148 +0000 UTC m=+163.829992185" Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.342717 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:01:56 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Dec 01 10:01:56 crc kubenswrapper[4958]: [+]process-running ok Dec 01 10:01:56 crc kubenswrapper[4958]: healthz check failed Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.342814 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.364871 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bglks" podStartSLOduration=139.364823527 podStartE2EDuration="2m19.364823527s" podCreationTimestamp="2025-12-01 09:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:56.322125847 +0000 UTC m=+163.830914884" watchObservedRunningTime="2025-12-01 10:01:56.364823527 +0000 UTC m=+163.873612564" Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.365575 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:56 crc kubenswrapper[4958]: E1201 10:01:56.366081 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:56.866067426 +0000 UTC m=+164.374856463 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.424944 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-dzllj" podStartSLOduration=139.419816161 podStartE2EDuration="2m19.419816161s" podCreationTimestamp="2025-12-01 09:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:56.41817547 +0000 UTC m=+163.926964527" watchObservedRunningTime="2025-12-01 10:01:56.419816161 +0000 UTC m=+163.928605198" Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.468504 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:56 crc kubenswrapper[4958]: E1201 10:01:56.468932 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:56.968881319 +0000 UTC m=+164.477670356 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.557529 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4khgc" podStartSLOduration=10.55750415 podStartE2EDuration="10.55750415s" podCreationTimestamp="2025-12-01 10:01:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:56.523199321 +0000 UTC m=+164.031988348" watchObservedRunningTime="2025-12-01 10:01:56.55750415 +0000 UTC m=+164.066293187" Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.557707 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqpvf" podStartSLOduration=139.557703256 podStartE2EDuration="2m19.557703256s" podCreationTimestamp="2025-12-01 09:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:56.554450805 +0000 UTC m=+164.063239852" watchObservedRunningTime="2025-12-01 10:01:56.557703256 +0000 UTC m=+164.066492293" Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.570398 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:56 crc kubenswrapper[4958]: E1201 10:01:56.570924 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:57.070905748 +0000 UTC m=+164.579694785 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.620384 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7eb4c2ff-f340-4277-8fe6-28b87034faa4","Type":"ContainerStarted","Data":"108929bceb105f0cf14a86b9da1ae4bc1afc7503f823751550c211342ed75b3c"} Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.671858 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:56 crc kubenswrapper[4958]: E1201 10:01:56.675361 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:57.175324371 +0000 UTC m=+164.684113408 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.774785 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:56 crc kubenswrapper[4958]: E1201 10:01:56.775904 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:57.275877663 +0000 UTC m=+164.784666700 (durationBeforeRetry 500ms). 
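[Annotation] Each failure also arms the "No retries permitted until …" gate seen throughout: nestedpendingoperations embargoes the operation key for durationBeforeRetry (a fixed 500ms window in this log), so the reconciler's frequent passes yield roughly two attempts per second instead of a hot loop. A hedged sketch of that time-gating, an illustration rather than kubelet's implementation:

```go
// Toy per-key retry gate: after a failure, the key is embargoed for 500ms;
// callers that arrive earlier get a "no retries permitted until" error
// without running the operation, mirroring the log entries above.
package main

import (
	"errors"
	"fmt"
	"time"
)

type retryGate struct {
	notBefore map[string]time.Time
}

func (g *retryGate) try(key string, op func() error) error {
	if t, ok := g.notBefore[key]; ok && time.Now().Before(t) {
		return fmt.Errorf("operation for %q failed: no retries permitted until %s", key, t.Format(time.RFC3339Nano))
	}
	if err := op(); err != nil {
		g.notBefore[key] = time.Now().Add(500 * time.Millisecond) // durationBeforeRetry
		return err
	}
	delete(g.notBefore, key) // success clears the embargo
	return nil
}

func main() {
	g := &retryGate{notBefore: map[string]time.Time{}}
	mount := func() error { return errors.New("driver not registered") }
	for i := 0; i < 3; i++ {
		fmt.Println(g.try("pvc-657094db", mount))
		time.Sleep(200 * time.Millisecond) // reconciler period is shorter than the embargo
	}
}
```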
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.876341 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:56 crc kubenswrapper[4958]: E1201 10:01:56.877011 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:57.376987874 +0000 UTC m=+164.885776911 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.915329 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" Dec 01 10:01:56 crc kubenswrapper[4958]: I1201 10:01:56.978545 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:56 crc kubenswrapper[4958]: E1201 10:01:56.979139 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:57.479116385 +0000 UTC m=+164.987905442 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.079537 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:57 crc kubenswrapper[4958]: E1201 10:01:57.079731 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:57.579702919 +0000 UTC m=+165.088491956 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.079898 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:57 crc kubenswrapper[4958]: E1201 10:01:57.080344 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:57.580335759 +0000 UTC m=+165.089124796 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.181606 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:57 crc kubenswrapper[4958]: E1201 10:01:57.181773 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:57.681747439 +0000 UTC m=+165.190536476 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.181981 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:57 crc kubenswrapper[4958]: E1201 10:01:57.182398 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:57.682387469 +0000 UTC m=+165.191176506 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.283259 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:57 crc kubenswrapper[4958]: E1201 10:01:57.283594 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:57.783555711 +0000 UTC m=+165.292344738 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.283693 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:57 crc kubenswrapper[4958]: E1201 10:01:57.284176 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:57.784162519 +0000 UTC m=+165.292951546 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.344516 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 01 10:01:57 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld
Dec 01 10:01:57 crc kubenswrapper[4958]: [+]process-running ok
Dec 01 10:01:57 crc kubenswrapper[4958]: healthz check failed
Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.344623 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.384481 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 10:01:57 crc kubenswrapper[4958]: E1201 10:01:57.384774 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:57.884714212 +0000 UTC m=+165.393503259 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.385023 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:57 crc kubenswrapper[4958]: E1201 10:01:57.385476 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:57.885460916 +0000 UTC m=+165.394249953 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.466176 4958 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.486874 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:57 crc kubenswrapper[4958]: E1201 10:01:57.487148 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:57.987107492 +0000 UTC m=+165.495896529 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.487303 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:57 crc kubenswrapper[4958]: E1201 10:01:57.488007 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:57.987961439 +0000 UTC m=+165.496750476 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.579477 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z5z9s"] Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.581217 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z5z9s" Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.589002 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:57 crc kubenswrapper[4958]: E1201 10:01:57.589207 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:58.089172082 +0000 UTC m=+165.597961119 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.589364 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:57 crc kubenswrapper[4958]: E1201 10:01:57.589783 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:58.089775111 +0000 UTC m=+165.598564148 (durationBeforeRetry 500ms). 
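[Annotation] The resolution is already underway: at 10:01:57.466 the plugin watcher saw kubevirt.io.hostpath-provisioner-reg.sock appear under /var/lib/kubelet/plugins_registry, and the RegisterPlugin and csi_plugin.go validation entries below complete the registration the earlier mounts were missing. A simplified sketch of that discovery step, polling where the kubelet uses an inotify-based watcher; the bounded loop and print statements are illustrative assumptions:

```go
// Illustrative stand-in for the "Adding socket path ... to desired state
// cache" step: scan the registration directory for plugin sockets and
// "register" each new driver found. Real kubelet then dials the socket,
// runs the registration GetInfo handshake, validates versions, and adds
// the driver to the registry that newCsiDriverClient consults.
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
	"time"
)

func main() {
	dir := "/var/lib/kubelet/plugins_registry" // registration sockets live here
	seen := map[string]bool{}
	for i := 0; i < 3; i++ { // bounded for the example; kubelet watches forever
		entries, err := os.ReadDir(dir)
		if err != nil {
			fmt.Println("cannot read registry dir:", err) // expected off-node
			return
		}
		for _, e := range entries {
			name := e.Name()
			if seen[name] || !strings.HasSuffix(name, "-reg.sock") {
				continue
			}
			seen[name] = true
			fmt.Println("registering plugin from", filepath.Join(dir, name))
		}
		time.Sleep(500 * time.Millisecond)
	}
}
```

Everything between the socket's discovery and the "Register new plugin" entry is the window in which the mount retries above keep failing.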
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.590716 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.640214 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z5z9s"] Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.641427 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" event={"ID":"90ee9175-d058-4b89-81c0-d7619908a777","Type":"ContainerStarted","Data":"47a450262449545d6c5c613762d2475e5a217927eb80879e0e5434c8a167f3d3"} Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.641503 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" event={"ID":"90ee9175-d058-4b89-81c0-d7619908a777","Type":"ContainerStarted","Data":"df1493dbcf3e796dc0e741a9a365d99dbf51e863ebad8e735295608648e24575"} Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.644470 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7eb4c2ff-f340-4277-8fe6-28b87034faa4","Type":"ContainerStarted","Data":"77a8335defdad5390dee64d4d904a489be239ba71de5b6a65faaf677f9224b63"} Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.691536 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:57 crc kubenswrapper[4958]: E1201 10:01:57.691860 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:58.19179557 +0000 UTC m=+165.700584607 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.691956 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/144ccc0c-43b4-4146-a91a-b235916a6a0e-catalog-content\") pod \"community-operators-z5z9s\" (UID: \"144ccc0c-43b4-4146-a91a-b235916a6a0e\") " pod="openshift-marketplace/community-operators-z5z9s" Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.692140 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/144ccc0c-43b4-4146-a91a-b235916a6a0e-utilities\") pod \"community-operators-z5z9s\" (UID: \"144ccc0c-43b4-4146-a91a-b235916a6a0e\") " pod="openshift-marketplace/community-operators-z5z9s" Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.692208 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.692317 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5f8x\" (UniqueName: \"kubernetes.io/projected/144ccc0c-43b4-4146-a91a-b235916a6a0e-kube-api-access-c5f8x\") pod \"community-operators-z5z9s\" (UID: \"144ccc0c-43b4-4146-a91a-b235916a6a0e\") " pod="openshift-marketplace/community-operators-z5z9s" Dec 01 10:01:57 crc kubenswrapper[4958]: E1201 10:01:57.693169 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:58.193133431 +0000 UTC m=+165.701922468 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.794299 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.794670 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/144ccc0c-43b4-4146-a91a-b235916a6a0e-catalog-content\") pod \"community-operators-z5z9s\" (UID: \"144ccc0c-43b4-4146-a91a-b235916a6a0e\") " pod="openshift-marketplace/community-operators-z5z9s" Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.794743 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/144ccc0c-43b4-4146-a91a-b235916a6a0e-utilities\") pod \"community-operators-z5z9s\" (UID: \"144ccc0c-43b4-4146-a91a-b235916a6a0e\") " pod="openshift-marketplace/community-operators-z5z9s" Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.794818 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5f8x\" (UniqueName: \"kubernetes.io/projected/144ccc0c-43b4-4146-a91a-b235916a6a0e-kube-api-access-c5f8x\") pod \"community-operators-z5z9s\" (UID: \"144ccc0c-43b4-4146-a91a-b235916a6a0e\") " pod="openshift-marketplace/community-operators-z5z9s" Dec 01 10:01:57 crc kubenswrapper[4958]: E1201 10:01:57.795544 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:58.295524941 +0000 UTC m=+165.804313968 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.797466 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/144ccc0c-43b4-4146-a91a-b235916a6a0e-catalog-content\") pod \"community-operators-z5z9s\" (UID: \"144ccc0c-43b4-4146-a91a-b235916a6a0e\") " pod="openshift-marketplace/community-operators-z5z9s" Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.798523 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/144ccc0c-43b4-4146-a91a-b235916a6a0e-utilities\") pod \"community-operators-z5z9s\" (UID: \"144ccc0c-43b4-4146-a91a-b235916a6a0e\") " pod="openshift-marketplace/community-operators-z5z9s" Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.825746 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9k5l5"] Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.827278 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9k5l5" Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.828806 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.828775827 podStartE2EDuration="3.828775827s" podCreationTimestamp="2025-12-01 10:01:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:57.808461065 +0000 UTC m=+165.317250112" watchObservedRunningTime="2025-12-01 10:01:57.828775827 +0000 UTC m=+165.337564864" Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.836457 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5f8x\" (UniqueName: \"kubernetes.io/projected/144ccc0c-43b4-4146-a91a-b235916a6a0e-kube-api-access-c5f8x\") pod \"community-operators-z5z9s\" (UID: \"144ccc0c-43b4-4146-a91a-b235916a6a0e\") " pod="openshift-marketplace/community-operators-z5z9s" Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.852749 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.887205 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9k5l5"] Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.897077 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns986\" (UniqueName: \"kubernetes.io/projected/2b165b67-cdac-49bb-969f-aff516fa216d-kube-api-access-ns986\") pod \"certified-operators-9k5l5\" (UID: \"2b165b67-cdac-49bb-969f-aff516fa216d\") " pod="openshift-marketplace/certified-operators-9k5l5" Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.897176 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2b165b67-cdac-49bb-969f-aff516fa216d-catalog-content\") pod \"certified-operators-9k5l5\" (UID: \"2b165b67-cdac-49bb-969f-aff516fa216d\") " pod="openshift-marketplace/certified-operators-9k5l5" Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.897232 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.897263 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b165b67-cdac-49bb-969f-aff516fa216d-utilities\") pod \"certified-operators-9k5l5\" (UID: \"2b165b67-cdac-49bb-969f-aff516fa216d\") " pod="openshift-marketplace/certified-operators-9k5l5" Dec 01 10:01:57 crc kubenswrapper[4958]: E1201 10:01:57.897644 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:58.397626503 +0000 UTC m=+165.906415540 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.912614 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z5z9s" Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.971021 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nqfvf"] Dec 01 10:01:57 crc kubenswrapper[4958]: I1201 10:01:57.972207 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nqfvf" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:57.999564 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:57.999978 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b165b67-cdac-49bb-969f-aff516fa216d-catalog-content\") pod \"certified-operators-9k5l5\" (UID: \"2b165b67-cdac-49bb-969f-aff516fa216d\") " pod="openshift-marketplace/certified-operators-9k5l5" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.000127 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b165b67-cdac-49bb-969f-aff516fa216d-utilities\") pod \"certified-operators-9k5l5\" (UID: \"2b165b67-cdac-49bb-969f-aff516fa216d\") " pod="openshift-marketplace/certified-operators-9k5l5" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.000191 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns986\" (UniqueName: \"kubernetes.io/projected/2b165b67-cdac-49bb-969f-aff516fa216d-kube-api-access-ns986\") pod \"certified-operators-9k5l5\" (UID: \"2b165b67-cdac-49bb-969f-aff516fa216d\") " pod="openshift-marketplace/certified-operators-9k5l5" Dec 01 10:01:58 crc kubenswrapper[4958]: E1201 10:01:58.000312 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 10:01:58.50025609 +0000 UTC m=+166.009045277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.000934 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b165b67-cdac-49bb-969f-aff516fa216d-utilities\") pod \"certified-operators-9k5l5\" (UID: \"2b165b67-cdac-49bb-969f-aff516fa216d\") " pod="openshift-marketplace/certified-operators-9k5l5" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.000986 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b165b67-cdac-49bb-969f-aff516fa216d-catalog-content\") pod \"certified-operators-9k5l5\" (UID: \"2b165b67-cdac-49bb-969f-aff516fa216d\") " pod="openshift-marketplace/certified-operators-9k5l5" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.063054 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns986\" (UniqueName: \"kubernetes.io/projected/2b165b67-cdac-49bb-969f-aff516fa216d-kube-api-access-ns986\") pod \"certified-operators-9k5l5\" (UID: \"2b165b67-cdac-49bb-969f-aff516fa216d\") " pod="openshift-marketplace/certified-operators-9k5l5" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.093985 4958 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-01T10:01:57.466219612Z","Handler":null,"Name":""} Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.105964 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvbpx\" (UniqueName: \"kubernetes.io/projected/40c8862e-a0c9-4bdc-9625-c5384c076732-kube-api-access-pvbpx\") pod \"community-operators-nqfvf\" (UID: \"40c8862e-a0c9-4bdc-9625-c5384c076732\") " pod="openshift-marketplace/community-operators-nqfvf" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.106109 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.106209 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40c8862e-a0c9-4bdc-9625-c5384c076732-utilities\") pod \"community-operators-nqfvf\" (UID: \"40c8862e-a0c9-4bdc-9625-c5384c076732\") " pod="openshift-marketplace/community-operators-nqfvf" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.106281 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40c8862e-a0c9-4bdc-9625-c5384c076732-catalog-content\") pod \"community-operators-nqfvf\" (UID: 
\"40c8862e-a0c9-4bdc-9625-c5384c076732\") " pod="openshift-marketplace/community-operators-nqfvf" Dec 01 10:01:58 crc kubenswrapper[4958]: E1201 10:01:58.106522 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 10:01:58.60649686 +0000 UTC m=+166.115285917 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwjfj" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.128408 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nqfvf"] Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.129716 4958 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.129779 4958 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.169963 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9k5l5" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.182446 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mzbln"] Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.196353 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mzbln" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.213685 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mzbln"] Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.214376 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.214463 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.214541 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.214781 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvbpx\" (UniqueName: \"kubernetes.io/projected/40c8862e-a0c9-4bdc-9625-c5384c076732-kube-api-access-pvbpx\") pod \"community-operators-nqfvf\" (UID: \"40c8862e-a0c9-4bdc-9625-c5384c076732\") " pod="openshift-marketplace/community-operators-nqfvf" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.214940 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40c8862e-a0c9-4bdc-9625-c5384c076732-utilities\") pod \"community-operators-nqfvf\" (UID: \"40c8862e-a0c9-4bdc-9625-c5384c076732\") " pod="openshift-marketplace/community-operators-nqfvf" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.214976 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40c8862e-a0c9-4bdc-9625-c5384c076732-catalog-content\") pod \"community-operators-nqfvf\" (UID: \"40c8862e-a0c9-4bdc-9625-c5384c076732\") " pod="openshift-marketplace/community-operators-nqfvf" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.215508 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40c8862e-a0c9-4bdc-9625-c5384c076732-catalog-content\") pod \"community-operators-nqfvf\" (UID: \"40c8862e-a0c9-4bdc-9625-c5384c076732\") " pod="openshift-marketplace/community-operators-nqfvf" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.215862 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40c8862e-a0c9-4bdc-9625-c5384c076732-utilities\") pod \"community-operators-nqfvf\" (UID: \"40c8862e-a0c9-4bdc-9625-c5384c076732\") " pod="openshift-marketplace/community-operators-nqfvf" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.239298 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvbpx\" (UniqueName: 
\"kubernetes.io/projected/40c8862e-a0c9-4bdc-9625-c5384c076732-kube-api-access-pvbpx\") pod \"community-operators-nqfvf\" (UID: \"40c8862e-a0c9-4bdc-9625-c5384c076732\") " pod="openshift-marketplace/community-operators-nqfvf" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.254487 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.306319 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqfvf" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.317986 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.318483 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r55lf\" (UniqueName: \"kubernetes.io/projected/2ce95067-2a10-44bb-8179-b2663072dd42-kube-api-access-r55lf\") pod \"certified-operators-mzbln\" (UID: \"2ce95067-2a10-44bb-8179-b2663072dd42\") " pod="openshift-marketplace/certified-operators-mzbln" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.318631 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ce95067-2a10-44bb-8179-b2663072dd42-catalog-content\") pod \"certified-operators-mzbln\" (UID: \"2ce95067-2a10-44bb-8179-b2663072dd42\") " pod="openshift-marketplace/certified-operators-mzbln" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.318870 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ce95067-2a10-44bb-8179-b2663072dd42-utilities\") pod \"certified-operators-mzbln\" (UID: \"2ce95067-2a10-44bb-8179-b2663072dd42\") " pod="openshift-marketplace/certified-operators-mzbln" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.333332 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bglks" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.336788 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.357341 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.359084 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.380423 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 01 10:01:58 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld
Dec 01 10:01:58 crc kubenswrapper[4958]: [+]process-running ok
Dec 01 10:01:58 crc kubenswrapper[4958]: healthz check failed
Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.380594 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.381516 4958 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.381595 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj"
Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.421089 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ce95067-2a10-44bb-8179-b2663072dd42-utilities\") pod \"certified-operators-mzbln\" (UID: \"2ce95067-2a10-44bb-8179-b2663072dd42\") " pod="openshift-marketplace/certified-operators-mzbln"
Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.433286 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r55lf\" (UniqueName: \"kubernetes.io/projected/2ce95067-2a10-44bb-8179-b2663072dd42-kube-api-access-r55lf\") pod \"certified-operators-mzbln\" (UID: \"2ce95067-2a10-44bb-8179-b2663072dd42\") " pod="openshift-marketplace/certified-operators-mzbln"
Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.433681 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ce95067-2a10-44bb-8179-b2663072dd42-catalog-content\") pod \"certified-operators-mzbln\" (UID: \"2ce95067-2a10-44bb-8179-b2663072dd42\") " pod="openshift-marketplace/certified-operators-mzbln"
Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.424153 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ce95067-2a10-44bb-8179-b2663072dd42-utilities\") pod \"certified-operators-mzbln\" (UID: \"2ce95067-2a10-44bb-8179-b2663072dd42\") " pod="openshift-marketplace/certified-operators-mzbln"
Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.448040 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ce95067-2a10-44bb-8179-b2663072dd42-catalog-content\") pod \"certified-operators-mzbln\" (UID: \"2ce95067-2a10-44bb-8179-b2663072dd42\") " pod="openshift-marketplace/certified-operators-mzbln"
Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.505939 4958 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"kube-api-access-r55lf\" (UniqueName: \"kubernetes.io/projected/2ce95067-2a10-44bb-8179-b2663072dd42-kube-api-access-r55lf\") pod \"certified-operators-mzbln\" (UID: \"2ce95067-2a10-44bb-8179-b2663072dd42\") " pod="openshift-marketplace/certified-operators-mzbln" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.579255 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mzbln" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.709971 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" event={"ID":"90ee9175-d058-4b89-81c0-d7619908a777","Type":"ContainerStarted","Data":"29fb67e02c3a8026b8ba1e72e2b6bff30ae5b9eb6f41ce2523a9536fe4e3adf1"} Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.736625 4958 generic.go:334] "Generic (PLEG): container finished" podID="7eb4c2ff-f340-4277-8fe6-28b87034faa4" containerID="77a8335defdad5390dee64d4d904a489be239ba71de5b6a65faaf677f9224b63" exitCode=0 Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.737780 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7eb4c2ff-f340-4277-8fe6-28b87034faa4","Type":"ContainerDied","Data":"77a8335defdad5390dee64d4d904a489be239ba71de5b6a65faaf677f9224b63"} Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.758583 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-pxgcs" podStartSLOduration=12.758555406 podStartE2EDuration="12.758555406s" podCreationTimestamp="2025-12-01 10:01:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:58.756227773 +0000 UTC m=+166.265016820" watchObservedRunningTime="2025-12-01 10:01:58.758555406 +0000 UTC m=+166.267344433" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.830007 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z5z9s"] Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.855449 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwjfj\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.864221 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-g9fjl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.864278 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-g9fjl" podUID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.864549 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-g9fjl container/download-server namespace/openshift-console: Liveness probe status=failure output="Get 
\"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.864571 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-g9fjl" podUID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.930936 4958 patch_prober.go:28] interesting pod/apiserver-76f77b778f-dzllj container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 01 10:01:58 crc kubenswrapper[4958]: [+]log ok Dec 01 10:01:58 crc kubenswrapper[4958]: [+]etcd ok Dec 01 10:01:58 crc kubenswrapper[4958]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 01 10:01:58 crc kubenswrapper[4958]: [+]poststarthook/generic-apiserver-start-informers ok Dec 01 10:01:58 crc kubenswrapper[4958]: [+]poststarthook/max-in-flight-filter ok Dec 01 10:01:58 crc kubenswrapper[4958]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 01 10:01:58 crc kubenswrapper[4958]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 01 10:01:58 crc kubenswrapper[4958]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 01 10:01:58 crc kubenswrapper[4958]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 01 10:01:58 crc kubenswrapper[4958]: [+]poststarthook/project.openshift.io-projectcache ok Dec 01 10:01:58 crc kubenswrapper[4958]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 01 10:01:58 crc kubenswrapper[4958]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Dec 01 10:01:58 crc kubenswrapper[4958]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 01 10:01:58 crc kubenswrapper[4958]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 01 10:01:58 crc kubenswrapper[4958]: livez check failed Dec 01 10:01:58 crc kubenswrapper[4958]: I1201 10:01:58.931029 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-dzllj" podUID="d27f831c-bcde-4937-b3c6-0ada6053a268" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.124287 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.137763 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.145720 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.171289 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.203689 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9k5l5"] Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.266170 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-rxrcb" Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.266225 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-rxrcb" Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.271017 4958 patch_prober.go:28] interesting pod/console-f9d7485db-rxrcb container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.271069 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rxrcb" podUID="9ee9be51-cd77-41e9-9db3-ab6a64015288" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.339902 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:01:59 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Dec 01 10:01:59 crc kubenswrapper[4958]: [+]process-running ok Dec 01 10:01:59 crc kubenswrapper[4958]: healthz check failed Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.340778 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.345460 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nqfvf"] Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.460866 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mzbln"] Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.757972 4958 generic.go:334] "Generic (PLEG): container finished" podID="144ccc0c-43b4-4146-a91a-b235916a6a0e" containerID="acdb3e271b998405aa28d104560ea30abc905b4f14607379ece280a7ae3b83a1" exitCode=0 Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.758109 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z5z9s" 
event={"ID":"144ccc0c-43b4-4146-a91a-b235916a6a0e","Type":"ContainerDied","Data":"acdb3e271b998405aa28d104560ea30abc905b4f14607379ece280a7ae3b83a1"} Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.758149 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z5z9s" event={"ID":"144ccc0c-43b4-4146-a91a-b235916a6a0e","Type":"ContainerStarted","Data":"6cb109a0e73fa2e51570b5bd8fc175edb9cb9135507e6ea67e29a90e057f117f"} Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.761456 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.769132 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqfvf" event={"ID":"40c8862e-a0c9-4bdc-9625-c5384c076732","Type":"ContainerStarted","Data":"aebfc997b2bf47670bc1d9bb44d5b14f92c23155b3c30f16e349e6d15318dc72"} Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.783239 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzbln" event={"ID":"2ce95067-2a10-44bb-8179-b2663072dd42","Type":"ContainerStarted","Data":"59cc80a77a4876798df10415703953e541799a50e3b5f4ce236eace1fa2a8134"} Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.787524 4958 generic.go:334] "Generic (PLEG): container finished" podID="954a15b1-bcae-44b9-ad5b-e004ed4d79be" containerID="31900a7ee5f6e78fb0673cec346e8eee4d3a131eb91794d86e5fd79aa549e126" exitCode=0 Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.787657 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5" event={"ID":"954a15b1-bcae-44b9-ad5b-e004ed4d79be","Type":"ContainerDied","Data":"31900a7ee5f6e78fb0673cec346e8eee4d3a131eb91794d86e5fd79aa549e126"} Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.795019 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9k5l5" event={"ID":"2b165b67-cdac-49bb-969f-aff516fa216d","Type":"ContainerStarted","Data":"630a7de865e1ec749a4744d4e19b70a190cf1583a9631f9c886e1750323a982f"} Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.795077 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9k5l5" event={"ID":"2b165b67-cdac-49bb-969f-aff516fa216d","Type":"ContainerStarted","Data":"eaa57abd71f2847084b7e3da5c033662a056024569580b5040f172dac09beba3"} Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.847999 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.848978 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kwjfj"] Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.849053 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lrtgp" Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.977239 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fdr2q"] Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.979630 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fdr2q" Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.985710 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 10:01:59 crc kubenswrapper[4958]: I1201 10:01:59.994302 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fdr2q"] Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.025024 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5zch\" (UniqueName: \"kubernetes.io/projected/ba3175fa-f3c1-4377-9489-ef0fbed933bc-kube-api-access-q5zch\") pod \"redhat-marketplace-fdr2q\" (UID: \"ba3175fa-f3c1-4377-9489-ef0fbed933bc\") " pod="openshift-marketplace/redhat-marketplace-fdr2q" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.025150 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3175fa-f3c1-4377-9489-ef0fbed933bc-catalog-content\") pod \"redhat-marketplace-fdr2q\" (UID: \"ba3175fa-f3c1-4377-9489-ef0fbed933bc\") " pod="openshift-marketplace/redhat-marketplace-fdr2q" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.025228 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3175fa-f3c1-4377-9489-ef0fbed933bc-utilities\") pod \"redhat-marketplace-fdr2q\" (UID: \"ba3175fa-f3c1-4377-9489-ef0fbed933bc\") " pod="openshift-marketplace/redhat-marketplace-fdr2q" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.127600 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3175fa-f3c1-4377-9489-ef0fbed933bc-utilities\") pod \"redhat-marketplace-fdr2q\" (UID: \"ba3175fa-f3c1-4377-9489-ef0fbed933bc\") " pod="openshift-marketplace/redhat-marketplace-fdr2q" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.127695 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5zch\" (UniqueName: \"kubernetes.io/projected/ba3175fa-f3c1-4377-9489-ef0fbed933bc-kube-api-access-q5zch\") pod \"redhat-marketplace-fdr2q\" (UID: \"ba3175fa-f3c1-4377-9489-ef0fbed933bc\") " pod="openshift-marketplace/redhat-marketplace-fdr2q" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.127816 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3175fa-f3c1-4377-9489-ef0fbed933bc-catalog-content\") pod \"redhat-marketplace-fdr2q\" (UID: \"ba3175fa-f3c1-4377-9489-ef0fbed933bc\") " pod="openshift-marketplace/redhat-marketplace-fdr2q" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.128501 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3175fa-f3c1-4377-9489-ef0fbed933bc-catalog-content\") pod \"redhat-marketplace-fdr2q\" (UID: \"ba3175fa-f3c1-4377-9489-ef0fbed933bc\") " pod="openshift-marketplace/redhat-marketplace-fdr2q" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.128855 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3175fa-f3c1-4377-9489-ef0fbed933bc-utilities\") pod \"redhat-marketplace-fdr2q\" (UID: 
\"ba3175fa-f3c1-4377-9489-ef0fbed933bc\") " pod="openshift-marketplace/redhat-marketplace-fdr2q" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.182666 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5zch\" (UniqueName: \"kubernetes.io/projected/ba3175fa-f3c1-4377-9489-ef0fbed933bc-kube-api-access-q5zch\") pod \"redhat-marketplace-fdr2q\" (UID: \"ba3175fa-f3c1-4377-9489-ef0fbed933bc\") " pod="openshift-marketplace/redhat-marketplace-fdr2q" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.347463 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:02:00 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Dec 01 10:02:00 crc kubenswrapper[4958]: [+]process-running ok Dec 01 10:02:00 crc kubenswrapper[4958]: healthz check failed Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.347570 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.353264 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fdr2q" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.372305 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ztnzr"] Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.374137 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztnzr" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.393666 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztnzr"] Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.437785 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a167bce5-ac1c-4cba-b065-5458a7840cd2-catalog-content\") pod \"redhat-marketplace-ztnzr\" (UID: \"a167bce5-ac1c-4cba-b065-5458a7840cd2\") " pod="openshift-marketplace/redhat-marketplace-ztnzr" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.437837 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a167bce5-ac1c-4cba-b065-5458a7840cd2-utilities\") pod \"redhat-marketplace-ztnzr\" (UID: \"a167bce5-ac1c-4cba-b065-5458a7840cd2\") " pod="openshift-marketplace/redhat-marketplace-ztnzr" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.437920 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxkpl\" (UniqueName: \"kubernetes.io/projected/a167bce5-ac1c-4cba-b065-5458a7840cd2-kube-api-access-pxkpl\") pod \"redhat-marketplace-ztnzr\" (UID: \"a167bce5-ac1c-4cba-b065-5458a7840cd2\") " pod="openshift-marketplace/redhat-marketplace-ztnzr" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.544217 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a167bce5-ac1c-4cba-b065-5458a7840cd2-utilities\") pod \"redhat-marketplace-ztnzr\" (UID: \"a167bce5-ac1c-4cba-b065-5458a7840cd2\") " pod="openshift-marketplace/redhat-marketplace-ztnzr" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.544737 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxkpl\" (UniqueName: \"kubernetes.io/projected/a167bce5-ac1c-4cba-b065-5458a7840cd2-kube-api-access-pxkpl\") pod \"redhat-marketplace-ztnzr\" (UID: \"a167bce5-ac1c-4cba-b065-5458a7840cd2\") " pod="openshift-marketplace/redhat-marketplace-ztnzr" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.544818 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs\") pod \"network-metrics-daemon-6b9wz\" (UID: \"987c6a26-52be-40a5-b9cc-456d9731436f\") " pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.544997 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a167bce5-ac1c-4cba-b065-5458a7840cd2-catalog-content\") pod \"redhat-marketplace-ztnzr\" (UID: \"a167bce5-ac1c-4cba-b065-5458a7840cd2\") " pod="openshift-marketplace/redhat-marketplace-ztnzr" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.545424 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a167bce5-ac1c-4cba-b065-5458a7840cd2-utilities\") pod \"redhat-marketplace-ztnzr\" (UID: \"a167bce5-ac1c-4cba-b065-5458a7840cd2\") " pod="openshift-marketplace/redhat-marketplace-ztnzr" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.546061 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a167bce5-ac1c-4cba-b065-5458a7840cd2-catalog-content\") pod \"redhat-marketplace-ztnzr\" (UID: \"a167bce5-ac1c-4cba-b065-5458a7840cd2\") " pod="openshift-marketplace/redhat-marketplace-ztnzr" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.559522 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/987c6a26-52be-40a5-b9cc-456d9731436f-metrics-certs\") pod \"network-metrics-daemon-6b9wz\" (UID: \"987c6a26-52be-40a5-b9cc-456d9731436f\") " pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.580050 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxkpl\" (UniqueName: \"kubernetes.io/projected/a167bce5-ac1c-4cba-b065-5458a7840cd2-kube-api-access-pxkpl\") pod \"redhat-marketplace-ztnzr\" (UID: \"a167bce5-ac1c-4cba-b065-5458a7840cd2\") " pod="openshift-marketplace/redhat-marketplace-ztnzr" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.639659 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.645877 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7eb4c2ff-f340-4277-8fe6-28b87034faa4-kube-api-access\") pod \"7eb4c2ff-f340-4277-8fe6-28b87034faa4\" (UID: \"7eb4c2ff-f340-4277-8fe6-28b87034faa4\") " Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.646130 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7eb4c2ff-f340-4277-8fe6-28b87034faa4-kubelet-dir\") pod \"7eb4c2ff-f340-4277-8fe6-28b87034faa4\" (UID: \"7eb4c2ff-f340-4277-8fe6-28b87034faa4\") " Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.646265 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7eb4c2ff-f340-4277-8fe6-28b87034faa4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7eb4c2ff-f340-4277-8fe6-28b87034faa4" (UID: "7eb4c2ff-f340-4277-8fe6-28b87034faa4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.646476 4958 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7eb4c2ff-f340-4277-8fe6-28b87034faa4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.662135 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eb4c2ff-f340-4277-8fe6-28b87034faa4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7eb4c2ff-f340-4277-8fe6-28b87034faa4" (UID: "7eb4c2ff-f340-4277-8fe6-28b87034faa4"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.742781 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j5m4k"] Dec 01 10:02:00 crc kubenswrapper[4958]: E1201 10:02:00.743138 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb4c2ff-f340-4277-8fe6-28b87034faa4" containerName="pruner" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.743163 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb4c2ff-f340-4277-8fe6-28b87034faa4" containerName="pruner" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.743329 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eb4c2ff-f340-4277-8fe6-28b87034faa4" containerName="pruner" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.748433 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j5m4k" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.749429 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7eb4c2ff-f340-4277-8fe6-28b87034faa4-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.756554 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.784445 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztnzr" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.820231 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j5m4k"] Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.831541 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6b9wz" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.837838 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7eb4c2ff-f340-4277-8fe6-28b87034faa4","Type":"ContainerDied","Data":"108929bceb105f0cf14a86b9da1ae4bc1afc7503f823751550c211342ed75b3c"} Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.837955 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="108929bceb105f0cf14a86b9da1ae4bc1afc7503f823751550c211342ed75b3c" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.838110 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.853036 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d56eb44a-f6f7-4b29-be59-84e10ab7c34a-utilities\") pod \"redhat-operators-j5m4k\" (UID: \"d56eb44a-f6f7-4b29-be59-84e10ab7c34a\") " pod="openshift-marketplace/redhat-operators-j5m4k" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.853108 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx9b7\" (UniqueName: \"kubernetes.io/projected/d56eb44a-f6f7-4b29-be59-84e10ab7c34a-kube-api-access-bx9b7\") pod \"redhat-operators-j5m4k\" (UID: \"d56eb44a-f6f7-4b29-be59-84e10ab7c34a\") " pod="openshift-marketplace/redhat-operators-j5m4k" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.853144 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d56eb44a-f6f7-4b29-be59-84e10ab7c34a-catalog-content\") pod \"redhat-operators-j5m4k\" (UID: \"d56eb44a-f6f7-4b29-be59-84e10ab7c34a\") " pod="openshift-marketplace/redhat-operators-j5m4k" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.853487 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.854567 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.870703 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.871031 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.871339 4958 generic.go:334] "Generic (PLEG): container finished" podID="2ce95067-2a10-44bb-8179-b2663072dd42" containerID="17c58ae59eebf353bf86a92748cbf9e98fd971b5ab703f2f6037bba64af4164a" exitCode=0 Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.871471 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzbln" event={"ID":"2ce95067-2a10-44bb-8179-b2663072dd42","Type":"ContainerDied","Data":"17c58ae59eebf353bf86a92748cbf9e98fd971b5ab703f2f6037bba64af4164a"} Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.887048 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.896987 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" event={"ID":"4fe0fe33-c2ce-4622-b6f4-3f587718b006","Type":"ContainerStarted","Data":"634b5a2a302180e81e8740699930bde121939e075bd7a020b0febc1c499e9593"} Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.897060 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" event={"ID":"4fe0fe33-c2ce-4622-b6f4-3f587718b006","Type":"ContainerStarted","Data":"d0426679f2990325b31969797ca22d440a92bbf788e864992e2231ed29a65081"} Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.897998 4958 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.923507 4958 generic.go:334] "Generic (PLEG): container finished" podID="2b165b67-cdac-49bb-969f-aff516fa216d" containerID="630a7de865e1ec749a4744d4e19b70a190cf1583a9631f9c886e1750323a982f" exitCode=0 Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.923642 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9k5l5" event={"ID":"2b165b67-cdac-49bb-969f-aff516fa216d","Type":"ContainerDied","Data":"630a7de865e1ec749a4744d4e19b70a190cf1583a9631f9c886e1750323a982f"} Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.931378 4958 generic.go:334] "Generic (PLEG): container finished" podID="40c8862e-a0c9-4bdc-9625-c5384c076732" containerID="bd91dd6051e92eaec2b5b7da0b927aa8818102d7608ba10dba05f9179fcd2a20" exitCode=0 Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.932668 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqfvf" event={"ID":"40c8862e-a0c9-4bdc-9625-c5384c076732","Type":"ContainerDied","Data":"bd91dd6051e92eaec2b5b7da0b927aa8818102d7608ba10dba05f9179fcd2a20"} Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.940687 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2w7kl"] Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.953457 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2w7kl" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.968375 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d56eb44a-f6f7-4b29-be59-84e10ab7c34a-utilities\") pod \"redhat-operators-j5m4k\" (UID: \"d56eb44a-f6f7-4b29-be59-84e10ab7c34a\") " pod="openshift-marketplace/redhat-operators-j5m4k" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.968459 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx9b7\" (UniqueName: \"kubernetes.io/projected/d56eb44a-f6f7-4b29-be59-84e10ab7c34a-kube-api-access-bx9b7\") pod \"redhat-operators-j5m4k\" (UID: \"d56eb44a-f6f7-4b29-be59-84e10ab7c34a\") " pod="openshift-marketplace/redhat-operators-j5m4k" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.968507 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d56eb44a-f6f7-4b29-be59-84e10ab7c34a-catalog-content\") pod \"redhat-operators-j5m4k\" (UID: \"d56eb44a-f6f7-4b29-be59-84e10ab7c34a\") " pod="openshift-marketplace/redhat-operators-j5m4k" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.969968 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2w7kl"] Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.970485 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d56eb44a-f6f7-4b29-be59-84e10ab7c34a-utilities\") pod \"redhat-operators-j5m4k\" (UID: \"d56eb44a-f6f7-4b29-be59-84e10ab7c34a\") " pod="openshift-marketplace/redhat-operators-j5m4k" Dec 01 10:02:00 crc kubenswrapper[4958]: I1201 10:02:00.970648 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d56eb44a-f6f7-4b29-be59-84e10ab7c34a-catalog-content\") pod \"redhat-operators-j5m4k\" (UID: \"d56eb44a-f6f7-4b29-be59-84e10ab7c34a\") " pod="openshift-marketplace/redhat-operators-j5m4k" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.030393 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx9b7\" (UniqueName: \"kubernetes.io/projected/d56eb44a-f6f7-4b29-be59-84e10ab7c34a-kube-api-access-bx9b7\") pod \"redhat-operators-j5m4k\" (UID: \"d56eb44a-f6f7-4b29-be59-84e10ab7c34a\") " pod="openshift-marketplace/redhat-operators-j5m4k" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.056518 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" podStartSLOduration=143.056481041 podStartE2EDuration="2m23.056481041s" podCreationTimestamp="2025-12-01 09:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:02:01.05388146 +0000 UTC m=+168.562670497" watchObservedRunningTime="2025-12-01 10:02:01.056481041 +0000 UTC m=+168.565270078" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.071162 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15c34271-3e36-4e77-bc38-f06b06d499f6-utilities\") pod \"redhat-operators-2w7kl\" (UID: \"15c34271-3e36-4e77-bc38-f06b06d499f6\") " pod="openshift-marketplace/redhat-operators-2w7kl" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.071318 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15c34271-3e36-4e77-bc38-f06b06d499f6-catalog-content\") pod \"redhat-operators-2w7kl\" (UID: \"15c34271-3e36-4e77-bc38-f06b06d499f6\") " pod="openshift-marketplace/redhat-operators-2w7kl" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.071414 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/566da22b-642a-469f-a461-4daf10214f16-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"566da22b-642a-469f-a461-4daf10214f16\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.071540 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/566da22b-642a-469f-a461-4daf10214f16-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"566da22b-642a-469f-a461-4daf10214f16\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.071823 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfhst\" (UniqueName: \"kubernetes.io/projected/15c34271-3e36-4e77-bc38-f06b06d499f6-kube-api-access-vfhst\") pod \"redhat-operators-2w7kl\" (UID: \"15c34271-3e36-4e77-bc38-f06b06d499f6\") " pod="openshift-marketplace/redhat-operators-2w7kl" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.089930 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j5m4k" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.179503 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/566da22b-642a-469f-a461-4daf10214f16-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"566da22b-642a-469f-a461-4daf10214f16\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.179642 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfhst\" (UniqueName: \"kubernetes.io/projected/15c34271-3e36-4e77-bc38-f06b06d499f6-kube-api-access-vfhst\") pod \"redhat-operators-2w7kl\" (UID: \"15c34271-3e36-4e77-bc38-f06b06d499f6\") " pod="openshift-marketplace/redhat-operators-2w7kl" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.179703 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15c34271-3e36-4e77-bc38-f06b06d499f6-utilities\") pod \"redhat-operators-2w7kl\" (UID: \"15c34271-3e36-4e77-bc38-f06b06d499f6\") " pod="openshift-marketplace/redhat-operators-2w7kl" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.179729 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15c34271-3e36-4e77-bc38-f06b06d499f6-catalog-content\") pod \"redhat-operators-2w7kl\" (UID: \"15c34271-3e36-4e77-bc38-f06b06d499f6\") " pod="openshift-marketplace/redhat-operators-2w7kl" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.179758 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/566da22b-642a-469f-a461-4daf10214f16-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"566da22b-642a-469f-a461-4daf10214f16\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.180329 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/566da22b-642a-469f-a461-4daf10214f16-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"566da22b-642a-469f-a461-4daf10214f16\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.181004 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15c34271-3e36-4e77-bc38-f06b06d499f6-utilities\") pod \"redhat-operators-2w7kl\" (UID: \"15c34271-3e36-4e77-bc38-f06b06d499f6\") " pod="openshift-marketplace/redhat-operators-2w7kl" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.181110 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15c34271-3e36-4e77-bc38-f06b06d499f6-catalog-content\") pod \"redhat-operators-2w7kl\" (UID: \"15c34271-3e36-4e77-bc38-f06b06d499f6\") " pod="openshift-marketplace/redhat-operators-2w7kl" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.192669 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fdr2q"] Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.205790 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/566da22b-642a-469f-a461-4daf10214f16-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"566da22b-642a-469f-a461-4daf10214f16\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.206573 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfhst\" (UniqueName: \"kubernetes.io/projected/15c34271-3e36-4e77-bc38-f06b06d499f6-kube-api-access-vfhst\") pod \"redhat-operators-2w7kl\" (UID: \"15c34271-3e36-4e77-bc38-f06b06d499f6\") " pod="openshift-marketplace/redhat-operators-2w7kl" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.224913 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:02:01 crc kubenswrapper[4958]: W1201 10:02:01.268981 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba3175fa_f3c1_4377_9489_ef0fbed933bc.slice/crio-a1ea33e24bd76cab747bdf5a02afadafeccfec2c52d1a85457a5917e0cd02484 WatchSource:0}: Error finding container a1ea33e24bd76cab747bdf5a02afadafeccfec2c52d1a85457a5917e0cd02484: Status 404 returned error can't find the container with id a1ea33e24bd76cab747bdf5a02afadafeccfec2c52d1a85457a5917e0cd02484 Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.296120 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2w7kl" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.344782 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:02:01 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Dec 01 10:02:01 crc kubenswrapper[4958]: [+]process-running ok Dec 01 10:02:01 crc kubenswrapper[4958]: healthz check failed Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.344907 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.477421 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6b9wz"] Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.555464 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.565217 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztnzr"] Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.697769 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j5m4k"] Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.699416 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff847\" (UniqueName: \"kubernetes.io/projected/954a15b1-bcae-44b9-ad5b-e004ed4d79be-kube-api-access-ff847\") pod \"954a15b1-bcae-44b9-ad5b-e004ed4d79be\" (UID: \"954a15b1-bcae-44b9-ad5b-e004ed4d79be\") " Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.699551 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/954a15b1-bcae-44b9-ad5b-e004ed4d79be-config-volume\") pod \"954a15b1-bcae-44b9-ad5b-e004ed4d79be\" (UID: \"954a15b1-bcae-44b9-ad5b-e004ed4d79be\") " Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.699649 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/954a15b1-bcae-44b9-ad5b-e004ed4d79be-secret-volume\") pod \"954a15b1-bcae-44b9-ad5b-e004ed4d79be\" (UID: \"954a15b1-bcae-44b9-ad5b-e004ed4d79be\") " Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.701677 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/954a15b1-bcae-44b9-ad5b-e004ed4d79be-config-volume" (OuterVolumeSpecName: "config-volume") pod "954a15b1-bcae-44b9-ad5b-e004ed4d79be" (UID: "954a15b1-bcae-44b9-ad5b-e004ed4d79be"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.709249 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/954a15b1-bcae-44b9-ad5b-e004ed4d79be-kube-api-access-ff847" (OuterVolumeSpecName: "kube-api-access-ff847") pod "954a15b1-bcae-44b9-ad5b-e004ed4d79be" (UID: "954a15b1-bcae-44b9-ad5b-e004ed4d79be"). InnerVolumeSpecName "kube-api-access-ff847". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.710391 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/954a15b1-bcae-44b9-ad5b-e004ed4d79be-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "954a15b1-bcae-44b9-ad5b-e004ed4d79be" (UID: "954a15b1-bcae-44b9-ad5b-e004ed4d79be"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:02:01 crc kubenswrapper[4958]: W1201 10:02:01.762980 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd56eb44a_f6f7_4b29_be59_84e10ab7c34a.slice/crio-75b010d0861591344f4e3c5035f247b24cee9c76c97b2d598f5f2b7f2ea5dccd WatchSource:0}: Error finding container 75b010d0861591344f4e3c5035f247b24cee9c76c97b2d598f5f2b7f2ea5dccd: Status 404 returned error can't find the container with id 75b010d0861591344f4e3c5035f247b24cee9c76c97b2d598f5f2b7f2ea5dccd Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.763449 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.782392 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2w7kl"] Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.801366 4958 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/954a15b1-bcae-44b9-ad5b-e004ed4d79be-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.801413 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff847\" (UniqueName: \"kubernetes.io/projected/954a15b1-bcae-44b9-ad5b-e004ed4d79be-kube-api-access-ff847\") on node \"crc\" DevicePath \"\"" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.801428 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/954a15b1-bcae-44b9-ad5b-e004ed4d79be-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:02:01 crc kubenswrapper[4958]: W1201 10:02:01.842308 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod566da22b_642a_469f_a461_4daf10214f16.slice/crio-4a0c838fb20e0bcaaa4b12ce030dd32b02ff8f1019a2b0eeb4b81834f32cfd69 WatchSource:0}: Error finding container 4a0c838fb20e0bcaaa4b12ce030dd32b02ff8f1019a2b0eeb4b81834f32cfd69: Status 404 returned error can't find the container with id 4a0c838fb20e0bcaaa4b12ce030dd32b02ff8f1019a2b0eeb4b81834f32cfd69 Dec 01 10:02:01 crc kubenswrapper[4958]: W1201 10:02:01.854098 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15c34271_3e36_4e77_bc38_f06b06d499f6.slice/crio-797e10a77307e8b435bc08565d24be3647153a948e8f7ea10f850ddf8cb03ef4 WatchSource:0}: Error finding container 797e10a77307e8b435bc08565d24be3647153a948e8f7ea10f850ddf8cb03ef4: Status 404 returned error can't find the container with id 797e10a77307e8b435bc08565d24be3647153a948e8f7ea10f850ddf8cb03ef4 Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.957771 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdr2q" event={"ID":"ba3175fa-f3c1-4377-9489-ef0fbed933bc","Type":"ContainerStarted","Data":"a1ea33e24bd76cab747bdf5a02afadafeccfec2c52d1a85457a5917e0cd02484"} Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.959207 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2w7kl" event={"ID":"15c34271-3e36-4e77-bc38-f06b06d499f6","Type":"ContainerStarted","Data":"797e10a77307e8b435bc08565d24be3647153a948e8f7ea10f850ddf8cb03ef4"} Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.963463 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-j5m4k" event={"ID":"d56eb44a-f6f7-4b29-be59-84e10ab7c34a","Type":"ContainerStarted","Data":"75b010d0861591344f4e3c5035f247b24cee9c76c97b2d598f5f2b7f2ea5dccd"} Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.972358 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztnzr" event={"ID":"a167bce5-ac1c-4cba-b065-5458a7840cd2","Type":"ContainerStarted","Data":"ee318a882dd05c359663798f1b248b443b3359cbcd269e57ff400766acd45860"} Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.983670 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5" Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.983698 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5" event={"ID":"954a15b1-bcae-44b9-ad5b-e004ed4d79be","Type":"ContainerDied","Data":"96ecb5dc151435086a7c80cdf76057000cbf33fea44791891921623e4048f3ef"} Dec 01 10:02:01 crc kubenswrapper[4958]: I1201 10:02:01.984496 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96ecb5dc151435086a7c80cdf76057000cbf33fea44791891921623e4048f3ef" Dec 01 10:02:02 crc kubenswrapper[4958]: I1201 10:02:02.001046 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" event={"ID":"987c6a26-52be-40a5-b9cc-456d9731436f","Type":"ContainerStarted","Data":"34651aa9e75a0261a327ddd1b65b0c925dfa2d87dff5ed58616f1b78a600a0c6"} Dec 01 10:02:02 crc kubenswrapper[4958]: I1201 10:02:02.013933 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"566da22b-642a-469f-a461-4daf10214f16","Type":"ContainerStarted","Data":"4a0c838fb20e0bcaaa4b12ce030dd32b02ff8f1019a2b0eeb4b81834f32cfd69"} Dec 01 10:02:02 crc kubenswrapper[4958]: I1201 10:02:02.340542 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:02:02 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Dec 01 10:02:02 crc kubenswrapper[4958]: [+]process-running ok Dec 01 10:02:02 crc kubenswrapper[4958]: healthz check failed Dec 01 10:02:02 crc kubenswrapper[4958]: I1201 10:02:02.340649 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:02:03 crc kubenswrapper[4958]: I1201 10:02:03.035546 4958 generic.go:334] "Generic (PLEG): container finished" podID="d56eb44a-f6f7-4b29-be59-84e10ab7c34a" containerID="80c7c4c0f61f0bfee0b53db3c69c2cf567ca5522601eb4e56d12697dad469ba7" exitCode=0 Dec 01 10:02:03 crc kubenswrapper[4958]: I1201 10:02:03.036070 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5m4k" event={"ID":"d56eb44a-f6f7-4b29-be59-84e10ab7c34a","Type":"ContainerDied","Data":"80c7c4c0f61f0bfee0b53db3c69c2cf567ca5522601eb4e56d12697dad469ba7"} Dec 01 10:02:03 crc kubenswrapper[4958]: I1201 10:02:03.043980 4958 generic.go:334] "Generic (PLEG): container finished" podID="a167bce5-ac1c-4cba-b065-5458a7840cd2" 
containerID="d6bcb91ca71613663ac1cf51d879c32f9d428b7dc98c3d2accc5241a19f72057" exitCode=0 Dec 01 10:02:03 crc kubenswrapper[4958]: I1201 10:02:03.044081 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztnzr" event={"ID":"a167bce5-ac1c-4cba-b065-5458a7840cd2","Type":"ContainerDied","Data":"d6bcb91ca71613663ac1cf51d879c32f9d428b7dc98c3d2accc5241a19f72057"} Dec 01 10:02:03 crc kubenswrapper[4958]: I1201 10:02:03.049074 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" event={"ID":"987c6a26-52be-40a5-b9cc-456d9731436f","Type":"ContainerStarted","Data":"fafc77ad3a0c08a361ab9d4bf52793f0989d6b165d43d975e639e85f79cad130"} Dec 01 10:02:03 crc kubenswrapper[4958]: I1201 10:02:03.049127 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6b9wz" event={"ID":"987c6a26-52be-40a5-b9cc-456d9731436f","Type":"ContainerStarted","Data":"176d340dd393fa7f4fbd2445f53fc80ca80f6f6e306266a38381a847cf3e4f64"} Dec 01 10:02:03 crc kubenswrapper[4958]: I1201 10:02:03.052554 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"566da22b-642a-469f-a461-4daf10214f16","Type":"ContainerStarted","Data":"e370b74abdb5bf18729effab121aa1f0a86587ddf8541067565f870b32cbbd77"} Dec 01 10:02:03 crc kubenswrapper[4958]: I1201 10:02:03.058436 4958 generic.go:334] "Generic (PLEG): container finished" podID="ba3175fa-f3c1-4377-9489-ef0fbed933bc" containerID="33683e4659a67aec75997581589984fd2f1cbc40d6522a57863c9a9cd6aa9d21" exitCode=0 Dec 01 10:02:03 crc kubenswrapper[4958]: I1201 10:02:03.058529 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdr2q" event={"ID":"ba3175fa-f3c1-4377-9489-ef0fbed933bc","Type":"ContainerDied","Data":"33683e4659a67aec75997581589984fd2f1cbc40d6522a57863c9a9cd6aa9d21"} Dec 01 10:02:03 crc kubenswrapper[4958]: I1201 10:02:03.068957 4958 generic.go:334] "Generic (PLEG): container finished" podID="15c34271-3e36-4e77-bc38-f06b06d499f6" containerID="ef7669644e0dfa31000d4d392d3d75c47c67f277ae1e21f0f6639d71bcbd6171" exitCode=0 Dec 01 10:02:03 crc kubenswrapper[4958]: I1201 10:02:03.069036 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2w7kl" event={"ID":"15c34271-3e36-4e77-bc38-f06b06d499f6","Type":"ContainerDied","Data":"ef7669644e0dfa31000d4d392d3d75c47c67f277ae1e21f0f6639d71bcbd6171"} Dec 01 10:02:03 crc kubenswrapper[4958]: I1201 10:02:03.158571 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.158541132 podStartE2EDuration="3.158541132s" podCreationTimestamp="2025-12-01 10:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:02:03.157619873 +0000 UTC m=+170.666408920" watchObservedRunningTime="2025-12-01 10:02:03.158541132 +0000 UTC m=+170.667330169" Dec 01 10:02:03 crc kubenswrapper[4958]: I1201 10:02:03.197298 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6b9wz" podStartSLOduration=146.197260988 podStartE2EDuration="2m26.197260988s" podCreationTimestamp="2025-12-01 09:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 
10:02:03.184010745 +0000 UTC m=+170.692799782" watchObservedRunningTime="2025-12-01 10:02:03.197260988 +0000 UTC m=+170.706050025" Dec 01 10:02:03 crc kubenswrapper[4958]: I1201 10:02:03.337138 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:02:03 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Dec 01 10:02:03 crc kubenswrapper[4958]: [+]process-running ok Dec 01 10:02:03 crc kubenswrapper[4958]: healthz check failed Dec 01 10:02:03 crc kubenswrapper[4958]: I1201 10:02:03.337243 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:02:03 crc kubenswrapper[4958]: I1201 10:02:03.368470 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:02:03 crc kubenswrapper[4958]: I1201 10:02:03.378619 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-dzllj" Dec 01 10:02:04 crc kubenswrapper[4958]: I1201 10:02:04.139347 4958 generic.go:334] "Generic (PLEG): container finished" podID="566da22b-642a-469f-a461-4daf10214f16" containerID="e370b74abdb5bf18729effab121aa1f0a86587ddf8541067565f870b32cbbd77" exitCode=0 Dec 01 10:02:04 crc kubenswrapper[4958]: I1201 10:02:04.140028 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"566da22b-642a-469f-a461-4daf10214f16","Type":"ContainerDied","Data":"e370b74abdb5bf18729effab121aa1f0a86587ddf8541067565f870b32cbbd77"} Dec 01 10:02:04 crc kubenswrapper[4958]: I1201 10:02:04.338837 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:02:04 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Dec 01 10:02:04 crc kubenswrapper[4958]: [+]process-running ok Dec 01 10:02:04 crc kubenswrapper[4958]: healthz check failed Dec 01 10:02:04 crc kubenswrapper[4958]: I1201 10:02:04.338948 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:02:04 crc kubenswrapper[4958]: I1201 10:02:04.550049 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4khgc" Dec 01 10:02:05 crc kubenswrapper[4958]: I1201 10:02:05.335267 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:02:05 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Dec 01 10:02:05 crc kubenswrapper[4958]: [+]process-running ok Dec 01 10:02:05 crc kubenswrapper[4958]: healthz check failed Dec 01 10:02:05 crc kubenswrapper[4958]: I1201 10:02:05.335813 4958 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:02:05 crc kubenswrapper[4958]: I1201 10:02:05.919211 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:02:06 crc kubenswrapper[4958]: I1201 10:02:06.021657 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/566da22b-642a-469f-a461-4daf10214f16-kubelet-dir\") pod \"566da22b-642a-469f-a461-4daf10214f16\" (UID: \"566da22b-642a-469f-a461-4daf10214f16\") " Dec 01 10:02:06 crc kubenswrapper[4958]: I1201 10:02:06.021876 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/566da22b-642a-469f-a461-4daf10214f16-kube-api-access\") pod \"566da22b-642a-469f-a461-4daf10214f16\" (UID: \"566da22b-642a-469f-a461-4daf10214f16\") " Dec 01 10:02:06 crc kubenswrapper[4958]: I1201 10:02:06.021868 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/566da22b-642a-469f-a461-4daf10214f16-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "566da22b-642a-469f-a461-4daf10214f16" (UID: "566da22b-642a-469f-a461-4daf10214f16"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:02:06 crc kubenswrapper[4958]: I1201 10:02:06.022304 4958 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/566da22b-642a-469f-a461-4daf10214f16-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:02:06 crc kubenswrapper[4958]: I1201 10:02:06.090959 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/566da22b-642a-469f-a461-4daf10214f16-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "566da22b-642a-469f-a461-4daf10214f16" (UID: "566da22b-642a-469f-a461-4daf10214f16"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:02:06 crc kubenswrapper[4958]: I1201 10:02:06.124042 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/566da22b-642a-469f-a461-4daf10214f16-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 10:02:06 crc kubenswrapper[4958]: I1201 10:02:06.358112 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:02:06 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Dec 01 10:02:06 crc kubenswrapper[4958]: [+]process-running ok Dec 01 10:02:06 crc kubenswrapper[4958]: healthz check failed Dec 01 10:02:06 crc kubenswrapper[4958]: I1201 10:02:06.358208 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:02:06 crc kubenswrapper[4958]: I1201 10:02:06.363317 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"566da22b-642a-469f-a461-4daf10214f16","Type":"ContainerDied","Data":"4a0c838fb20e0bcaaa4b12ce030dd32b02ff8f1019a2b0eeb4b81834f32cfd69"} Dec 01 10:02:06 crc kubenswrapper[4958]: I1201 10:02:06.363365 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a0c838fb20e0bcaaa4b12ce030dd32b02ff8f1019a2b0eeb4b81834f32cfd69" Dec 01 10:02:06 crc kubenswrapper[4958]: I1201 10:02:06.363445 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 10:02:07 crc kubenswrapper[4958]: I1201 10:02:07.371242 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:02:07 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Dec 01 10:02:07 crc kubenswrapper[4958]: [+]process-running ok Dec 01 10:02:07 crc kubenswrapper[4958]: healthz check failed Dec 01 10:02:07 crc kubenswrapper[4958]: I1201 10:02:07.371351 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:02:08 crc kubenswrapper[4958]: I1201 10:02:08.481645 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:02:08 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Dec 01 10:02:08 crc kubenswrapper[4958]: [+]process-running ok Dec 01 10:02:08 crc kubenswrapper[4958]: healthz check failed Dec 01 10:02:08 crc kubenswrapper[4958]: I1201 10:02:08.482142 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:02:08 crc kubenswrapper[4958]: I1201 10:02:08.865751 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-g9fjl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 10:02:08 crc kubenswrapper[4958]: I1201 10:02:08.865905 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-g9fjl" podUID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 10:02:08 crc kubenswrapper[4958]: I1201 10:02:08.866883 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-g9fjl container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 10:02:08 crc kubenswrapper[4958]: I1201 10:02:08.867010 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-g9fjl" podUID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 10:02:09 crc kubenswrapper[4958]: I1201 10:02:09.309495 4958 patch_prober.go:28] interesting pod/console-f9d7485db-rxrcb container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Dec 01 10:02:09 crc kubenswrapper[4958]: I1201 10:02:09.309583 4958 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rxrcb" podUID="9ee9be51-cd77-41e9-9db3-ab6a64015288" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" Dec 01 10:02:09 crc kubenswrapper[4958]: I1201 10:02:09.874934 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:02:09 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Dec 01 10:02:09 crc kubenswrapper[4958]: [+]process-running ok Dec 01 10:02:09 crc kubenswrapper[4958]: healthz check failed Dec 01 10:02:09 crc kubenswrapper[4958]: I1201 10:02:09.875167 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:02:10 crc kubenswrapper[4958]: I1201 10:02:10.400434 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 10:02:10 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Dec 01 10:02:10 crc kubenswrapper[4958]: [+]process-running ok Dec 01 10:02:10 crc kubenswrapper[4958]: healthz check failed Dec 01 10:02:10 crc kubenswrapper[4958]: I1201 10:02:10.400516 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:02:11 crc kubenswrapper[4958]: I1201 10:02:11.336495 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:02:11 crc kubenswrapper[4958]: I1201 10:02:11.342185 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-swd9j" Dec 01 10:02:18 crc kubenswrapper[4958]: I1201 10:02:18.865223 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-g9fjl container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 10:02:18 crc kubenswrapper[4958]: I1201 10:02:18.866030 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-g9fjl" podUID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 10:02:18 crc kubenswrapper[4958]: I1201 10:02:18.866110 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-g9fjl" Dec 01 10:02:18 crc kubenswrapper[4958]: I1201 10:02:18.865364 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-g9fjl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 10:02:18 
crc kubenswrapper[4958]: I1201 10:02:18.866798 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-g9fjl" podUID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 10:02:18 crc kubenswrapper[4958]: I1201 10:02:18.867008 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"0135f3300f4f07f811445240f4cc2c1488620fc4c24df50c244748f6d2e519f3"} pod="openshift-console/downloads-7954f5f757-g9fjl" containerMessage="Container download-server failed liveness probe, will be restarted" Dec 01 10:02:18 crc kubenswrapper[4958]: I1201 10:02:18.867150 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-g9fjl" podUID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerName="download-server" containerID="cri-o://0135f3300f4f07f811445240f4cc2c1488620fc4c24df50c244748f6d2e519f3" gracePeriod=2 Dec 01 10:02:18 crc kubenswrapper[4958]: I1201 10:02:18.867162 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-g9fjl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 10:02:18 crc kubenswrapper[4958]: I1201 10:02:18.867194 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-g9fjl" podUID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 10:02:19 crc kubenswrapper[4958]: I1201 10:02:19.494695 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:02:19 crc kubenswrapper[4958]: I1201 10:02:19.636442 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-rxrcb" Dec 01 10:02:19 crc kubenswrapper[4958]: I1201 10:02:19.644416 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-rxrcb" Dec 01 10:02:20 crc kubenswrapper[4958]: I1201 10:02:20.661822 4958 generic.go:334] "Generic (PLEG): container finished" podID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerID="0135f3300f4f07f811445240f4cc2c1488620fc4c24df50c244748f6d2e519f3" exitCode=0 Dec 01 10:02:20 crc kubenswrapper[4958]: I1201 10:02:20.661907 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-g9fjl" event={"ID":"81ffe673-ed64-4dd2-81c3-65c922af39bc","Type":"ContainerDied","Data":"0135f3300f4f07f811445240f4cc2c1488620fc4c24df50c244748f6d2e519f3"} Dec 01 10:02:24 crc kubenswrapper[4958]: I1201 10:02:24.837268 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 10:02:28 crc kubenswrapper[4958]: I1201 10:02:28.210511 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:02:28 crc 
kubenswrapper[4958]: I1201 10:02:28.210942 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:02:28 crc kubenswrapper[4958]: I1201 10:02:28.866410 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-g9fjl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 10:02:28 crc kubenswrapper[4958]: I1201 10:02:28.866497 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-g9fjl" podUID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 10:02:29 crc kubenswrapper[4958]: I1201 10:02:29.375904 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q7r9c" Dec 01 10:02:36 crc kubenswrapper[4958]: I1201 10:02:36.175019 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 10:02:36 crc kubenswrapper[4958]: E1201 10:02:36.176235 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="566da22b-642a-469f-a461-4daf10214f16" containerName="pruner" Dec 01 10:02:36 crc kubenswrapper[4958]: I1201 10:02:36.176254 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="566da22b-642a-469f-a461-4daf10214f16" containerName="pruner" Dec 01 10:02:36 crc kubenswrapper[4958]: E1201 10:02:36.176276 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="954a15b1-bcae-44b9-ad5b-e004ed4d79be" containerName="collect-profiles" Dec 01 10:02:36 crc kubenswrapper[4958]: I1201 10:02:36.176285 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="954a15b1-bcae-44b9-ad5b-e004ed4d79be" containerName="collect-profiles" Dec 01 10:02:36 crc kubenswrapper[4958]: I1201 10:02:36.176437 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="566da22b-642a-469f-a461-4daf10214f16" containerName="pruner" Dec 01 10:02:36 crc kubenswrapper[4958]: I1201 10:02:36.176454 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="954a15b1-bcae-44b9-ad5b-e004ed4d79be" containerName="collect-profiles" Dec 01 10:02:36 crc kubenswrapper[4958]: I1201 10:02:36.177486 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:02:36 crc kubenswrapper[4958]: I1201 10:02:36.183809 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 10:02:36 crc kubenswrapper[4958]: I1201 10:02:36.220076 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 10:02:36 crc kubenswrapper[4958]: I1201 10:02:36.220397 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 10:02:36 crc kubenswrapper[4958]: I1201 10:02:36.222075 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/009f7729-a27a-415b-a008-cc328985af59-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"009f7729-a27a-415b-a008-cc328985af59\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:02:36 crc kubenswrapper[4958]: I1201 10:02:36.222180 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/009f7729-a27a-415b-a008-cc328985af59-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"009f7729-a27a-415b-a008-cc328985af59\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:02:36 crc kubenswrapper[4958]: I1201 10:02:36.324823 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/009f7729-a27a-415b-a008-cc328985af59-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"009f7729-a27a-415b-a008-cc328985af59\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:02:36 crc kubenswrapper[4958]: I1201 10:02:36.324956 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/009f7729-a27a-415b-a008-cc328985af59-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"009f7729-a27a-415b-a008-cc328985af59\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:02:36 crc kubenswrapper[4958]: I1201 10:02:36.324954 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/009f7729-a27a-415b-a008-cc328985af59-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"009f7729-a27a-415b-a008-cc328985af59\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:02:36 crc kubenswrapper[4958]: I1201 10:02:36.395353 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/009f7729-a27a-415b-a008-cc328985af59-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"009f7729-a27a-415b-a008-cc328985af59\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:02:36 crc kubenswrapper[4958]: I1201 10:02:36.551576 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:02:38 crc kubenswrapper[4958]: I1201 10:02:38.866895 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-g9fjl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 10:02:38 crc kubenswrapper[4958]: I1201 10:02:38.867372 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-g9fjl" podUID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 10:02:39 crc kubenswrapper[4958]: E1201 10:02:39.854721 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 01 10:02:39 crc kubenswrapper[4958]: E1201 10:02:39.855014 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ns986,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9k5l5_openshift-marketplace(2b165b67-cdac-49bb-969f-aff516fa216d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 10:02:39 crc kubenswrapper[4958]: E1201 10:02:39.856107 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9k5l5" podUID="2b165b67-cdac-49bb-969f-aff516fa216d" Dec 01 10:02:40 crc kubenswrapper[4958]: E1201 10:02:40.973543 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9k5l5" podUID="2b165b67-cdac-49bb-969f-aff516fa216d" Dec 01 10:02:41 crc kubenswrapper[4958]: E1201 10:02:41.065111 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 01 10:02:41 crc kubenswrapper[4958]: E1201 10:02:41.065400 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q5zch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-fdr2q_openshift-marketplace(ba3175fa-f3c1-4377-9489-ef0fbed933bc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 10:02:41 crc kubenswrapper[4958]: E1201 10:02:41.066674 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-fdr2q" podUID="ba3175fa-f3c1-4377-9489-ef0fbed933bc" Dec 01 10:02:41 crc kubenswrapper[4958]: I1201 10:02:41.571132 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 10:02:41 crc kubenswrapper[4958]: I1201 10:02:41.572708 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:02:41 crc kubenswrapper[4958]: I1201 10:02:41.579113 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 10:02:41 crc kubenswrapper[4958]: I1201 10:02:41.726491 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5867f258-1174-4cc1-b035-a68f57724d88-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5867f258-1174-4cc1-b035-a68f57724d88\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:02:41 crc kubenswrapper[4958]: I1201 10:02:41.726580 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5867f258-1174-4cc1-b035-a68f57724d88-kube-api-access\") pod \"installer-9-crc\" (UID: \"5867f258-1174-4cc1-b035-a68f57724d88\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:02:41 crc kubenswrapper[4958]: I1201 10:02:41.726813 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5867f258-1174-4cc1-b035-a68f57724d88-var-lock\") pod \"installer-9-crc\" (UID: \"5867f258-1174-4cc1-b035-a68f57724d88\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:02:41 crc kubenswrapper[4958]: I1201 10:02:41.828413 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5867f258-1174-4cc1-b035-a68f57724d88-var-lock\") pod \"installer-9-crc\" (UID: \"5867f258-1174-4cc1-b035-a68f57724d88\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:02:41 crc kubenswrapper[4958]: I1201 10:02:41.828565 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5867f258-1174-4cc1-b035-a68f57724d88-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5867f258-1174-4cc1-b035-a68f57724d88\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:02:41 crc kubenswrapper[4958]: I1201 10:02:41.828613 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5867f258-1174-4cc1-b035-a68f57724d88-kube-api-access\") pod \"installer-9-crc\" (UID: \"5867f258-1174-4cc1-b035-a68f57724d88\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:02:41 crc kubenswrapper[4958]: I1201 10:02:41.829174 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5867f258-1174-4cc1-b035-a68f57724d88-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5867f258-1174-4cc1-b035-a68f57724d88\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:02:41 crc kubenswrapper[4958]: I1201 10:02:41.829174 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5867f258-1174-4cc1-b035-a68f57724d88-var-lock\") pod \"installer-9-crc\" (UID: \"5867f258-1174-4cc1-b035-a68f57724d88\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:02:41 crc kubenswrapper[4958]: I1201 10:02:41.852266 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5867f258-1174-4cc1-b035-a68f57724d88-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"5867f258-1174-4cc1-b035-a68f57724d88\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:02:41 crc kubenswrapper[4958]: I1201 10:02:41.902431 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:02:43 crc kubenswrapper[4958]: E1201 10:02:43.775752 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-fdr2q" podUID="ba3175fa-f3c1-4377-9489-ef0fbed933bc" Dec 01 10:02:43 crc kubenswrapper[4958]: E1201 10:02:43.876982 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 10:02:43 crc kubenswrapper[4958]: E1201 10:02:43.877393 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pvbpx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-nqfvf_openshift-marketplace(40c8862e-a0c9-4bdc-9625-c5384c076732): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 10:02:43 crc kubenswrapper[4958]: E1201 10:02:43.878699 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-nqfvf" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" Dec 01 10:02:48 crc kubenswrapper[4958]: E1201 10:02:48.497267 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-nqfvf" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" Dec 01 10:02:48 crc kubenswrapper[4958]: E1201 10:02:48.592140 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 01 10:02:48 crc kubenswrapper[4958]: E1201 10:02:48.592564 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vfhst,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-2w7kl_openshift-marketplace(15c34271-3e36-4e77-bc38-f06b06d499f6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 10:02:48 crc kubenswrapper[4958]: E1201 10:02:48.594672 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-2w7kl" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" Dec 01 10:02:48 crc kubenswrapper[4958]: E1201 10:02:48.725076 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 01 10:02:48 crc kubenswrapper[4958]: E1201 10:02:48.726066 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bx9b7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-j5m4k_openshift-marketplace(d56eb44a-f6f7-4b29-be59-84e10ab7c34a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 10:02:48 crc kubenswrapper[4958]: E1201 10:02:48.727719 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-j5m4k" podUID="d56eb44a-f6f7-4b29-be59-84e10ab7c34a" Dec 01 10:02:48 crc kubenswrapper[4958]: E1201 10:02:48.738338 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 10:02:48 crc kubenswrapper[4958]: E1201 10:02:48.738551 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c5f8x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-z5z9s_openshift-marketplace(144ccc0c-43b4-4146-a91a-b235916a6a0e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 10:02:48 crc kubenswrapper[4958]: E1201 10:02:48.739840 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-z5z9s" podUID="144ccc0c-43b4-4146-a91a-b235916a6a0e" Dec 01 10:02:48 crc kubenswrapper[4958]: E1201 10:02:48.813163 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 01 10:02:48 crc kubenswrapper[4958]: E1201 10:02:48.813401 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pxkpl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-ztnzr_openshift-marketplace(a167bce5-ac1c-4cba-b065-5458a7840cd2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 10:02:48 crc kubenswrapper[4958]: E1201 10:02:48.814740 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-ztnzr" podUID="a167bce5-ac1c-4cba-b065-5458a7840cd2" Dec 01 10:02:48 crc kubenswrapper[4958]: I1201 10:02:48.865250 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-g9fjl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 10:02:48 crc kubenswrapper[4958]: I1201 10:02:48.865749 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-g9fjl" podUID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 10:02:48 crc kubenswrapper[4958]: I1201 10:02:48.992779 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzbln" event={"ID":"2ce95067-2a10-44bb-8179-b2663072dd42","Type":"ContainerStarted","Data":"88067881ad5f2d1a6e14d8908f6a5cc4f4564bf37c0517b873709e7fce64d75c"} Dec 01 10:02:48 crc kubenswrapper[4958]: I1201 10:02:48.995473 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-g9fjl" event={"ID":"81ffe673-ed64-4dd2-81c3-65c922af39bc","Type":"ContainerStarted","Data":"ccb761f33ce922e406074ffe73528a764a53059d314b9a6f920ceabff9526270"} Dec 01 10:02:48 crc kubenswrapper[4958]: I1201 10:02:48.996074 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-g9fjl" Dec 01 10:02:48 crc kubenswrapper[4958]: I1201 
10:02:48.997415 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-g9fjl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 10:02:48 crc kubenswrapper[4958]: I1201 10:02:48.997454 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-g9fjl" podUID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 10:02:48 crc kubenswrapper[4958]: E1201 10:02:48.997923 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-j5m4k" podUID="d56eb44a-f6f7-4b29-be59-84e10ab7c34a" Dec 01 10:02:48 crc kubenswrapper[4958]: E1201 10:02:48.998175 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-ztnzr" podUID="a167bce5-ac1c-4cba-b065-5458a7840cd2" Dec 01 10:02:48 crc kubenswrapper[4958]: E1201 10:02:48.998229 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-2w7kl" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" Dec 01 10:02:49 crc kubenswrapper[4958]: E1201 10:02:49.003304 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-z5z9s" podUID="144ccc0c-43b4-4146-a91a-b235916a6a0e" Dec 01 10:02:49 crc kubenswrapper[4958]: I1201 10:02:49.080698 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 10:02:49 crc kubenswrapper[4958]: W1201 10:02:49.098295 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod009f7729_a27a_415b_a008_cc328985af59.slice/crio-07de6649c9516789aa646ec1e7796b9f7cf8073db152702dbaf649dbcf0f8b7d WatchSource:0}: Error finding container 07de6649c9516789aa646ec1e7796b9f7cf8073db152702dbaf649dbcf0f8b7d: Status 404 returned error can't find the container with id 07de6649c9516789aa646ec1e7796b9f7cf8073db152702dbaf649dbcf0f8b7d Dec 01 10:02:49 crc kubenswrapper[4958]: I1201 10:02:49.163297 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 10:02:50 crc kubenswrapper[4958]: I1201 10:02:50.003661 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"009f7729-a27a-415b-a008-cc328985af59","Type":"ContainerStarted","Data":"4c360ae1fdf3272585b0f90604f9daeaea1b15169a1ad4b48d0517fcd1fc0952"} Dec 01 10:02:50 crc kubenswrapper[4958]: I1201 10:02:50.004722 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"009f7729-a27a-415b-a008-cc328985af59","Type":"ContainerStarted","Data":"07de6649c9516789aa646ec1e7796b9f7cf8073db152702dbaf649dbcf0f8b7d"} Dec 01 10:02:50 crc kubenswrapper[4958]: I1201 10:02:50.005586 4958 generic.go:334] "Generic (PLEG): container finished" podID="2ce95067-2a10-44bb-8179-b2663072dd42" containerID="88067881ad5f2d1a6e14d8908f6a5cc4f4564bf37c0517b873709e7fce64d75c" exitCode=0 Dec 01 10:02:50 crc kubenswrapper[4958]: I1201 10:02:50.005671 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzbln" event={"ID":"2ce95067-2a10-44bb-8179-b2663072dd42","Type":"ContainerDied","Data":"88067881ad5f2d1a6e14d8908f6a5cc4f4564bf37c0517b873709e7fce64d75c"} Dec 01 10:02:50 crc kubenswrapper[4958]: I1201 10:02:50.007120 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5867f258-1174-4cc1-b035-a68f57724d88","Type":"ContainerStarted","Data":"602990710fc705034fe1a384369a94c7b603e2e345135fc88c3757f40732cab0"} Dec 01 10:02:50 crc kubenswrapper[4958]: I1201 10:02:50.007158 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5867f258-1174-4cc1-b035-a68f57724d88","Type":"ContainerStarted","Data":"495b4e54bd384ff76d7386a96a24682b7b362ab4d6498c1ba48ff75eec0fd01a"} Dec 01 10:02:50 crc kubenswrapper[4958]: I1201 10:02:50.007963 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-g9fjl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Dec 01 10:02:50 crc kubenswrapper[4958]: I1201 10:02:50.008064 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-g9fjl" podUID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Dec 01 10:02:50 crc kubenswrapper[4958]: I1201 10:02:50.030053 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=14.030013291 podStartE2EDuration="14.030013291s" podCreationTimestamp="2025-12-01 10:02:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:02:50.023065066 +0000 UTC m=+217.531854103" watchObservedRunningTime="2025-12-01 10:02:50.030013291 +0000 UTC m=+217.538802328" Dec 01 10:02:50 crc kubenswrapper[4958]: I1201 10:02:50.069649 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=9.069619368 podStartE2EDuration="9.069619368s" podCreationTimestamp="2025-12-01 10:02:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:02:50.068987209 +0000 UTC m=+217.577776256" watchObservedRunningTime="2025-12-01 10:02:50.069619368 +0000 UTC m=+217.578408405" Dec 01 10:02:51 crc kubenswrapper[4958]: I1201 10:02:51.015164 4958 generic.go:334] "Generic (PLEG): container finished" podID="009f7729-a27a-415b-a008-cc328985af59" containerID="4c360ae1fdf3272585b0f90604f9daeaea1b15169a1ad4b48d0517fcd1fc0952" exitCode=0 Dec 01 10:02:51 crc kubenswrapper[4958]: I1201 10:02:51.015268 4958 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"009f7729-a27a-415b-a008-cc328985af59","Type":"ContainerDied","Data":"4c360ae1fdf3272585b0f90604f9daeaea1b15169a1ad4b48d0517fcd1fc0952"} Dec 01 10:02:52 crc kubenswrapper[4958]: I1201 10:02:52.574867 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:02:52 crc kubenswrapper[4958]: I1201 10:02:52.746717 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/009f7729-a27a-415b-a008-cc328985af59-kubelet-dir\") pod \"009f7729-a27a-415b-a008-cc328985af59\" (UID: \"009f7729-a27a-415b-a008-cc328985af59\") " Dec 01 10:02:52 crc kubenswrapper[4958]: I1201 10:02:52.747031 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/009f7729-a27a-415b-a008-cc328985af59-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "009f7729-a27a-415b-a008-cc328985af59" (UID: "009f7729-a27a-415b-a008-cc328985af59"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:02:52 crc kubenswrapper[4958]: I1201 10:02:52.747265 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/009f7729-a27a-415b-a008-cc328985af59-kube-api-access\") pod \"009f7729-a27a-415b-a008-cc328985af59\" (UID: \"009f7729-a27a-415b-a008-cc328985af59\") " Dec 01 10:02:52 crc kubenswrapper[4958]: I1201 10:02:52.747656 4958 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/009f7729-a27a-415b-a008-cc328985af59-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:02:52 crc kubenswrapper[4958]: I1201 10:02:52.756134 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/009f7729-a27a-415b-a008-cc328985af59-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "009f7729-a27a-415b-a008-cc328985af59" (UID: "009f7729-a27a-415b-a008-cc328985af59"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:02:52 crc kubenswrapper[4958]: I1201 10:02:52.885737 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/009f7729-a27a-415b-a008-cc328985af59-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 10:02:53 crc kubenswrapper[4958]: I1201 10:02:53.033819 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"009f7729-a27a-415b-a008-cc328985af59","Type":"ContainerDied","Data":"07de6649c9516789aa646ec1e7796b9f7cf8073db152702dbaf649dbcf0f8b7d"} Dec 01 10:02:53 crc kubenswrapper[4958]: I1201 10:02:53.033893 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07de6649c9516789aa646ec1e7796b9f7cf8073db152702dbaf649dbcf0f8b7d" Dec 01 10:02:53 crc kubenswrapper[4958]: I1201 10:02:53.033941 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 10:02:54 crc kubenswrapper[4958]: I1201 10:02:54.056250 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzbln" event={"ID":"2ce95067-2a10-44bb-8179-b2663072dd42","Type":"ContainerStarted","Data":"dca97a26a1bed69b598626ac5cda44fa6126e749d36a6fcdfb5680eff492d4be"} Dec 01 10:02:54 crc kubenswrapper[4958]: I1201 10:02:54.083090 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mzbln" podStartSLOduration=4.287636836 podStartE2EDuration="56.083051207s" podCreationTimestamp="2025-12-01 10:01:58 +0000 UTC" firstStartedPulling="2025-12-01 10:02:00.897262159 +0000 UTC m=+168.406051196" lastFinishedPulling="2025-12-01 10:02:52.69267653 +0000 UTC m=+220.201465567" observedRunningTime="2025-12-01 10:02:54.079819327 +0000 UTC m=+221.588608364" watchObservedRunningTime="2025-12-01 10:02:54.083051207 +0000 UTC m=+221.591840244" Dec 01 10:02:57 crc kubenswrapper[4958]: I1201 10:02:57.877829 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tp8nm"] Dec 01 10:02:58 crc kubenswrapper[4958]: I1201 10:02:58.132085 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9k5l5" event={"ID":"2b165b67-cdac-49bb-969f-aff516fa216d","Type":"ContainerStarted","Data":"9dcf1ffd900ccf29bbca9eef654ea7ea54069b2f89d3b53faa7ac37743247f58"} Dec 01 10:02:58 crc kubenswrapper[4958]: I1201 10:02:58.222411 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:02:58 crc kubenswrapper[4958]: I1201 10:02:58.222604 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:02:58 crc kubenswrapper[4958]: I1201 10:02:58.222679 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 10:02:58 crc kubenswrapper[4958]: I1201 10:02:58.223642 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:02:58 crc kubenswrapper[4958]: I1201 10:02:58.223748 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214" gracePeriod=600 Dec 01 10:02:58 crc kubenswrapper[4958]: I1201 10:02:58.581479 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mzbln" Dec 01 10:02:58 crc kubenswrapper[4958]: 
Dec 01 10:02:58 crc kubenswrapper[4958]: I1201 10:02:58.864957 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-g9fjl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Dec 01 10:02:58 crc kubenswrapper[4958]: I1201 10:02:58.865038 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-g9fjl" podUID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Dec 01 10:02:58 crc kubenswrapper[4958]: I1201 10:02:58.864958 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-g9fjl container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Dec 01 10:02:58 crc kubenswrapper[4958]: I1201 10:02:58.865449 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-g9fjl" podUID="81ffe673-ed64-4dd2-81c3-65c922af39bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Dec 01 10:02:59 crc kubenswrapper[4958]: I1201 10:02:59.140410 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214" exitCode=0
Dec 01 10:02:59 crc kubenswrapper[4958]: I1201 10:02:59.140469 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214"}
Dec 01 10:03:00 crc kubenswrapper[4958]: I1201 10:03:00.149138 4958 generic.go:334] "Generic (PLEG): container finished" podID="2b165b67-cdac-49bb-969f-aff516fa216d" containerID="9dcf1ffd900ccf29bbca9eef654ea7ea54069b2f89d3b53faa7ac37743247f58" exitCode=0
Dec 01 10:03:00 crc kubenswrapper[4958]: I1201 10:03:00.149249 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9k5l5" event={"ID":"2b165b67-cdac-49bb-969f-aff516fa216d","Type":"ContainerDied","Data":"9dcf1ffd900ccf29bbca9eef654ea7ea54069b2f89d3b53faa7ac37743247f58"}
Dec 01 10:03:00 crc kubenswrapper[4958]: I1201 10:03:00.154353 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"05ba5424dacdb14134224ea91b05efb8f315f1c84935a65ba19d25bc0a734c9b"}
Dec 01 10:03:00 crc kubenswrapper[4958]: I1201 10:03:00.519071 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-mzbln" podUID="2ce95067-2a10-44bb-8179-b2663072dd42" containerName="registry-server" probeResult="failure" output=<
Dec 01 10:03:00 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s
Dec 01 10:03:00 crc kubenswrapper[4958]: >
Dec 01 10:03:03 crc kubenswrapper[4958]: I1201 10:03:03.174355 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdr2q" event={"ID":"ba3175fa-f3c1-4377-9489-ef0fbed933bc","Type":"ContainerStarted","Data":"55960e88dcf9709402ee1413e6ade89b7f5e5ce053f4a70242da2942f4fd11d3"}
Dec 01 10:03:03 crc kubenswrapper[4958]: I1201 10:03:03.176668 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2w7kl" event={"ID":"15c34271-3e36-4e77-bc38-f06b06d499f6","Type":"ContainerStarted","Data":"b949e2826afc0f021e861409a564bf800db15eb9579104b9afe50e01b4c1dd5c"}
Dec 01 10:03:03 crc kubenswrapper[4958]: I1201 10:03:03.178197 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9k5l5" event={"ID":"2b165b67-cdac-49bb-969f-aff516fa216d","Type":"ContainerStarted","Data":"9ec9967a380f01ef0c04a2d3f653ff836502bec2c28bb610cd54f4b06ec17497"}
Dec 01 10:03:04 crc kubenswrapper[4958]: I1201 10:03:04.477329 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztnzr" event={"ID":"a167bce5-ac1c-4cba-b065-5458a7840cd2","Type":"ContainerStarted","Data":"10b182e6d1d346eb70feb51a8470e8cd696f5098ff823be29d2b4eb71a9fcda3"}
Dec 01 10:03:04 crc kubenswrapper[4958]: I1201 10:03:04.481795 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z5z9s" event={"ID":"144ccc0c-43b4-4146-a91a-b235916a6a0e","Type":"ContainerStarted","Data":"1ed0145328afa0d9d7c60acb79124a94d6022b2adc54c601380ec8cc5c048653"}
Dec 01 10:03:05 crc kubenswrapper[4958]: I1201 10:03:05.516463 4958 generic.go:334] "Generic (PLEG): container finished" podID="a167bce5-ac1c-4cba-b065-5458a7840cd2" containerID="10b182e6d1d346eb70feb51a8470e8cd696f5098ff823be29d2b4eb71a9fcda3" exitCode=0
Dec 01 10:03:05 crc kubenswrapper[4958]: I1201 10:03:05.516581 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztnzr" event={"ID":"a167bce5-ac1c-4cba-b065-5458a7840cd2","Type":"ContainerDied","Data":"10b182e6d1d346eb70feb51a8470e8cd696f5098ff823be29d2b4eb71a9fcda3"}
Dec 01 10:03:05 crc kubenswrapper[4958]: I1201 10:03:05.519352 4958 generic.go:334] "Generic (PLEG): container finished" podID="ba3175fa-f3c1-4377-9489-ef0fbed933bc" containerID="55960e88dcf9709402ee1413e6ade89b7f5e5ce053f4a70242da2942f4fd11d3" exitCode=0
Dec 01 10:03:05 crc kubenswrapper[4958]: I1201 10:03:05.520727 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdr2q" event={"ID":"ba3175fa-f3c1-4377-9489-ef0fbed933bc","Type":"ContainerDied","Data":"55960e88dcf9709402ee1413e6ade89b7f5e5ce053f4a70242da2942f4fd11d3"}
Dec 01 10:03:05 crc kubenswrapper[4958]: I1201 10:03:05.544163 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9k5l5" podStartSLOduration=6.25728823 podStartE2EDuration="1m8.544135957s" podCreationTimestamp="2025-12-01 10:01:57 +0000 UTC" firstStartedPulling="2025-12-01 10:01:59.801792449 +0000 UTC m=+167.310581486" lastFinishedPulling="2025-12-01 10:03:02.088640176 +0000 UTC m=+229.597429213" observedRunningTime="2025-12-01 10:03:04.537587421 +0000 UTC m=+232.046376458" watchObservedRunningTime="2025-12-01 10:03:05.544135957 +0000 UTC m=+233.052924994"
Dec 01 10:03:07 crc kubenswrapper[4958]: I1201 10:03:07.540200 4958 generic.go:334] "Generic (PLEG): container finished" podID="144ccc0c-43b4-4146-a91a-b235916a6a0e" containerID="1ed0145328afa0d9d7c60acb79124a94d6022b2adc54c601380ec8cc5c048653" exitCode=0
Dec 01 10:03:07 crc kubenswrapper[4958]: I1201 10:03:07.540289 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z5z9s" event={"ID":"144ccc0c-43b4-4146-a91a-b235916a6a0e","Type":"ContainerDied","Data":"1ed0145328afa0d9d7c60acb79124a94d6022b2adc54c601380ec8cc5c048653"}
Dec 01 10:03:08 crc kubenswrapper[4958]: I1201 10:03:08.170160 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9k5l5"
Dec 01 10:03:08 crc kubenswrapper[4958]: I1201 10:03:08.170259 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9k5l5"
Dec 01 10:03:08 crc kubenswrapper[4958]: I1201 10:03:08.233577 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9k5l5"
Dec 01 10:03:08 crc kubenswrapper[4958]: I1201 10:03:08.549077 4958 generic.go:334] "Generic (PLEG): container finished" podID="15c34271-3e36-4e77-bc38-f06b06d499f6" containerID="b949e2826afc0f021e861409a564bf800db15eb9579104b9afe50e01b4c1dd5c" exitCode=0
Dec 01 10:03:08 crc kubenswrapper[4958]: I1201 10:03:08.549157 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2w7kl" event={"ID":"15c34271-3e36-4e77-bc38-f06b06d499f6","Type":"ContainerDied","Data":"b949e2826afc0f021e861409a564bf800db15eb9579104b9afe50e01b4c1dd5c"}
Dec 01 10:03:08 crc kubenswrapper[4958]: I1201 10:03:08.552857 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5m4k" event={"ID":"d56eb44a-f6f7-4b29-be59-84e10ab7c34a","Type":"ContainerStarted","Data":"61102d693a1a106dbe9b99faeabb85f2f476706b5a783b050c9e05027c237469"}
Dec 01 10:03:08 crc kubenswrapper[4958]: I1201 10:03:08.605592 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9k5l5"
Dec 01 10:03:08 crc kubenswrapper[4958]: I1201 10:03:08.627928 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mzbln"
Dec 01 10:03:08 crc kubenswrapper[4958]: I1201 10:03:08.678535 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mzbln"
Dec 01 10:03:08 crc kubenswrapper[4958]: I1201 10:03:08.877297 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-g9fjl"
Dec 01 10:03:09 crc kubenswrapper[4958]: I1201 10:03:09.563211 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztnzr" event={"ID":"a167bce5-ac1c-4cba-b065-5458a7840cd2","Type":"ContainerStarted","Data":"0f1c30e892fe63830e2f48cec88d6ed37482fe7eb96661f0cc4e8c6ec31f4ab6"}
Dec 01 10:03:09 crc kubenswrapper[4958]: I1201 10:03:09.635338 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ztnzr" podStartSLOduration=5.102906565 podStartE2EDuration="1m9.635299884s" podCreationTimestamp="2025-12-01 10:02:00 +0000 UTC" firstStartedPulling="2025-12-01 10:02:03.046772679 +0000 UTC m=+170.555561716" lastFinishedPulling="2025-12-01 10:03:07.579165998 +0000 UTC m=+235.087955035" observedRunningTime="2025-12-01 10:03:09.632789516 +0000 UTC m=+237.141578553" watchObservedRunningTime="2025-12-01 10:03:09.635299884 +0000 UTC m=+237.144088921"
Dec 01 10:03:10 crc kubenswrapper[4958]: I1201 10:03:10.785298 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ztnzr"
Dec 01 10:03:10 crc kubenswrapper[4958]: I1201 10:03:10.785649 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ztnzr"
Dec 01 10:03:11 crc kubenswrapper[4958]: I1201 10:03:11.631404 4958 generic.go:334] "Generic (PLEG): container finished" podID="d56eb44a-f6f7-4b29-be59-84e10ab7c34a" containerID="61102d693a1a106dbe9b99faeabb85f2f476706b5a783b050c9e05027c237469" exitCode=0
Dec 01 10:03:11 crc kubenswrapper[4958]: I1201 10:03:11.631473 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5m4k" event={"ID":"d56eb44a-f6f7-4b29-be59-84e10ab7c34a","Type":"ContainerDied","Data":"61102d693a1a106dbe9b99faeabb85f2f476706b5a783b050c9e05027c237469"}
Dec 01 10:03:11 crc kubenswrapper[4958]: I1201 10:03:11.935546 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-ztnzr" podUID="a167bce5-ac1c-4cba-b065-5458a7840cd2" containerName="registry-server" probeResult="failure" output=<
Dec 01 10:03:11 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s
Dec 01 10:03:11 crc kubenswrapper[4958]: >
Dec 01 10:03:12 crc kubenswrapper[4958]: I1201 10:03:12.440037 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mzbln"]
Dec 01 10:03:12 crc kubenswrapper[4958]: I1201 10:03:12.441200 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mzbln" podUID="2ce95067-2a10-44bb-8179-b2663072dd42" containerName="registry-server" containerID="cri-o://dca97a26a1bed69b598626ac5cda44fa6126e749d36a6fcdfb5680eff492d4be" gracePeriod=2
Dec 01 10:03:12 crc kubenswrapper[4958]: I1201 10:03:12.640902 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqfvf" event={"ID":"40c8862e-a0c9-4bdc-9625-c5384c076732","Type":"ContainerStarted","Data":"114c5b48a72bea0c8ad6d721a2e1c7cb920f78ff4b02f12d03317d872c324e7b"}
Dec 01 10:03:12 crc kubenswrapper[4958]: I1201 10:03:12.643763 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2w7kl" event={"ID":"15c34271-3e36-4e77-bc38-f06b06d499f6","Type":"ContainerStarted","Data":"bdfb142a917171a592dc332e3483a6b02fe7724bde4a82bb642477bf0ec809e8"}
Dec 01 10:03:12 crc kubenswrapper[4958]: I1201 10:03:12.645876 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z5z9s" event={"ID":"144ccc0c-43b4-4146-a91a-b235916a6a0e","Type":"ContainerStarted","Data":"bdc6300112db424166d36fa4dd3752e7f900cc6dac0def383c1a9af828a052e1"}
Dec 01 10:03:12 crc kubenswrapper[4958]: I1201 10:03:12.648668 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdr2q" event={"ID":"ba3175fa-f3c1-4377-9489-ef0fbed933bc","Type":"ContainerStarted","Data":"b12639309786fed55ebfc0e78a845544721d11f42202f2c78ff26599fdd21145"}
Dec 01 10:03:12 crc kubenswrapper[4958]: I1201 10:03:12.723675 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fdr2q" podStartSLOduration=5.574537828 podStartE2EDuration="1m13.723644091s" podCreationTimestamp="2025-12-01 10:01:59 +0000 UTC" firstStartedPulling="2025-12-01 10:02:03.062618553 +0000 UTC m=+170.571407590" lastFinishedPulling="2025-12-01 10:03:11.211724816 +0000 UTC m=+238.720513853" observedRunningTime="2025-12-01 10:03:12.71588233 +0000 UTC m=+240.224671367" watchObservedRunningTime="2025-12-01 10:03:12.723644091 +0000 UTC m=+240.232433128"
Dec 01 10:03:12 crc kubenswrapper[4958]: I1201 10:03:12.753726 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z5z9s" podStartSLOduration=4.29735079 podStartE2EDuration="1m15.753693212s" podCreationTimestamp="2025-12-01 10:01:57 +0000 UTC" firstStartedPulling="2025-12-01 10:01:59.760941326 +0000 UTC m=+167.269730363" lastFinishedPulling="2025-12-01 10:03:11.217283748 +0000 UTC m=+238.726072785" observedRunningTime="2025-12-01 10:03:12.746753197 +0000 UTC m=+240.255542254" watchObservedRunningTime="2025-12-01 10:03:12.753693212 +0000 UTC m=+240.262482249"
Dec 01 10:03:12 crc kubenswrapper[4958]: I1201 10:03:12.775424 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2w7kl" podStartSLOduration=4.490448733 podStartE2EDuration="1m12.775372513s" podCreationTimestamp="2025-12-01 10:02:00 +0000 UTC" firstStartedPulling="2025-12-01 10:02:03.071872251 +0000 UTC m=+170.580661288" lastFinishedPulling="2025-12-01 10:03:11.356796031 +0000 UTC m=+238.865585068" observedRunningTime="2025-12-01 10:03:12.771295297 +0000 UTC m=+240.280084334" watchObservedRunningTime="2025-12-01 10:03:12.775372513 +0000 UTC m=+240.284161550"
Dec 01 10:03:13 crc kubenswrapper[4958]: I1201 10:03:13.656738 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5m4k" event={"ID":"d56eb44a-f6f7-4b29-be59-84e10ab7c34a","Type":"ContainerStarted","Data":"956a417111f579456b8caae3b09aebc7baea70ebf0d448e3514f1f2a9f831f21"}
Dec 01 10:03:13 crc kubenswrapper[4958]: I1201 10:03:13.660437 4958 generic.go:334] "Generic (PLEG): container finished" podID="2ce95067-2a10-44bb-8179-b2663072dd42" containerID="dca97a26a1bed69b598626ac5cda44fa6126e749d36a6fcdfb5680eff492d4be" exitCode=0
Dec 01 10:03:13 crc kubenswrapper[4958]: I1201 10:03:13.660505 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzbln" event={"ID":"2ce95067-2a10-44bb-8179-b2663072dd42","Type":"ContainerDied","Data":"dca97a26a1bed69b598626ac5cda44fa6126e749d36a6fcdfb5680eff492d4be"}
Dec 01 10:03:13 crc kubenswrapper[4958]: I1201 10:03:13.684161 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j5m4k" podStartSLOduration=4.4275559 podStartE2EDuration="1m13.68413064s" podCreationTimestamp="2025-12-01 10:02:00 +0000 UTC" firstStartedPulling="2025-12-01 10:02:03.040971369 +0000 UTC m=+170.549760406" lastFinishedPulling="2025-12-01 10:03:12.297546109 +0000 UTC m=+239.806335146" observedRunningTime="2025-12-01 10:03:13.679881608 +0000 UTC m=+241.188670645" watchObservedRunningTime="2025-12-01 10:03:13.68413064 +0000 UTC m=+241.192919677"
Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.232519 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mzbln"
Need to start a new one" pod="openshift-marketplace/certified-operators-mzbln" Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.325179 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ce95067-2a10-44bb-8179-b2663072dd42-utilities\") pod \"2ce95067-2a10-44bb-8179-b2663072dd42\" (UID: \"2ce95067-2a10-44bb-8179-b2663072dd42\") " Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.325326 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r55lf\" (UniqueName: \"kubernetes.io/projected/2ce95067-2a10-44bb-8179-b2663072dd42-kube-api-access-r55lf\") pod \"2ce95067-2a10-44bb-8179-b2663072dd42\" (UID: \"2ce95067-2a10-44bb-8179-b2663072dd42\") " Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.325392 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ce95067-2a10-44bb-8179-b2663072dd42-catalog-content\") pod \"2ce95067-2a10-44bb-8179-b2663072dd42\" (UID: \"2ce95067-2a10-44bb-8179-b2663072dd42\") " Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.326519 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ce95067-2a10-44bb-8179-b2663072dd42-utilities" (OuterVolumeSpecName: "utilities") pod "2ce95067-2a10-44bb-8179-b2663072dd42" (UID: "2ce95067-2a10-44bb-8179-b2663072dd42"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.335087 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ce95067-2a10-44bb-8179-b2663072dd42-kube-api-access-r55lf" (OuterVolumeSpecName: "kube-api-access-r55lf") pod "2ce95067-2a10-44bb-8179-b2663072dd42" (UID: "2ce95067-2a10-44bb-8179-b2663072dd42"). InnerVolumeSpecName "kube-api-access-r55lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.382372 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ce95067-2a10-44bb-8179-b2663072dd42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ce95067-2a10-44bb-8179-b2663072dd42" (UID: "2ce95067-2a10-44bb-8179-b2663072dd42"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.427399 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ce95067-2a10-44bb-8179-b2663072dd42-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.427453 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r55lf\" (UniqueName: \"kubernetes.io/projected/2ce95067-2a10-44bb-8179-b2663072dd42-kube-api-access-r55lf\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.427468 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ce95067-2a10-44bb-8179-b2663072dd42-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.670525 4958 generic.go:334] "Generic (PLEG): container finished" podID="40c8862e-a0c9-4bdc-9625-c5384c076732" containerID="114c5b48a72bea0c8ad6d721a2e1c7cb920f78ff4b02f12d03317d872c324e7b" exitCode=0 Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.670623 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqfvf" event={"ID":"40c8862e-a0c9-4bdc-9625-c5384c076732","Type":"ContainerDied","Data":"114c5b48a72bea0c8ad6d721a2e1c7cb920f78ff4b02f12d03317d872c324e7b"} Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.673946 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzbln" event={"ID":"2ce95067-2a10-44bb-8179-b2663072dd42","Type":"ContainerDied","Data":"59cc80a77a4876798df10415703953e541799a50e3b5f4ce236eace1fa2a8134"} Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.674012 4958 scope.go:117] "RemoveContainer" containerID="dca97a26a1bed69b598626ac5cda44fa6126e749d36a6fcdfb5680eff492d4be" Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.674043 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mzbln" Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.696325 4958 scope.go:117] "RemoveContainer" containerID="88067881ad5f2d1a6e14d8908f6a5cc4f4564bf37c0517b873709e7fce64d75c" Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.724946 4958 scope.go:117] "RemoveContainer" containerID="17c58ae59eebf353bf86a92748cbf9e98fd971b5ab703f2f6037bba64af4164a" Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.764581 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mzbln"] Dec 01 10:03:14 crc kubenswrapper[4958]: I1201 10:03:14.771158 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mzbln"] Dec 01 10:03:15 crc kubenswrapper[4958]: I1201 10:03:15.818959 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ce95067-2a10-44bb-8179-b2663072dd42" path="/var/lib/kubelet/pods/2ce95067-2a10-44bb-8179-b2663072dd42/volumes" Dec 01 10:03:17 crc kubenswrapper[4958]: I1201 10:03:17.913911 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z5z9s" Dec 01 10:03:17 crc kubenswrapper[4958]: I1201 10:03:17.914006 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z5z9s" Dec 01 10:03:17 crc kubenswrapper[4958]: I1201 10:03:17.966303 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z5z9s" Dec 01 10:03:18 crc kubenswrapper[4958]: I1201 10:03:18.738468 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z5z9s" Dec 01 10:03:19 crc kubenswrapper[4958]: I1201 10:03:19.706617 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqfvf" event={"ID":"40c8862e-a0c9-4bdc-9625-c5384c076732","Type":"ContainerStarted","Data":"559d1537f74668496e8e088cac2e8d4cd3b28c92658042a163402fa934163b8f"} Dec 01 10:03:19 crc kubenswrapper[4958]: I1201 10:03:19.728111 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nqfvf" podStartSLOduration=4.352901883 podStartE2EDuration="1m22.72808414s" podCreationTimestamp="2025-12-01 10:01:57 +0000 UTC" firstStartedPulling="2025-12-01 10:02:00.93611387 +0000 UTC m=+168.444902907" lastFinishedPulling="2025-12-01 10:03:19.311296127 +0000 UTC m=+246.820085164" observedRunningTime="2025-12-01 10:03:19.724793378 +0000 UTC m=+247.233582415" watchObservedRunningTime="2025-12-01 10:03:19.72808414 +0000 UTC m=+247.236873177" Dec 01 10:03:20 crc kubenswrapper[4958]: I1201 10:03:20.354566 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fdr2q" Dec 01 10:03:20 crc kubenswrapper[4958]: I1201 10:03:20.354660 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fdr2q" Dec 01 10:03:20 crc kubenswrapper[4958]: I1201 10:03:20.409572 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fdr2q" Dec 01 10:03:20 crc kubenswrapper[4958]: I1201 10:03:20.801562 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fdr2q" Dec 01 10:03:20 crc kubenswrapper[4958]: I1201 
10:03:20.832820 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ztnzr" Dec 01 10:03:20 crc kubenswrapper[4958]: I1201 10:03:20.880162 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ztnzr" Dec 01 10:03:21 crc kubenswrapper[4958]: I1201 10:03:21.090789 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j5m4k" Dec 01 10:03:21 crc kubenswrapper[4958]: I1201 10:03:21.090871 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j5m4k" Dec 01 10:03:21 crc kubenswrapper[4958]: I1201 10:03:21.145808 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j5m4k" Dec 01 10:03:21 crc kubenswrapper[4958]: I1201 10:03:21.296946 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2w7kl" Dec 01 10:03:21 crc kubenswrapper[4958]: I1201 10:03:21.297029 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2w7kl" Dec 01 10:03:21 crc kubenswrapper[4958]: I1201 10:03:21.460339 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2w7kl" Dec 01 10:03:21 crc kubenswrapper[4958]: I1201 10:03:21.761303 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j5m4k" Dec 01 10:03:21 crc kubenswrapper[4958]: I1201 10:03:21.943404 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2w7kl" Dec 01 10:03:22 crc kubenswrapper[4958]: I1201 10:03:22.838069 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztnzr"] Dec 01 10:03:22 crc kubenswrapper[4958]: I1201 10:03:22.838434 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ztnzr" podUID="a167bce5-ac1c-4cba-b065-5458a7840cd2" containerName="registry-server" containerID="cri-o://0f1c30e892fe63830e2f48cec88d6ed37482fe7eb96661f0cc4e8c6ec31f4ab6" gracePeriod=2 Dec 01 10:03:22 crc kubenswrapper[4958]: I1201 10:03:22.923586 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" podUID="416bbcbb-fc9f-489b-a1ac-eea205ce9152" containerName="oauth-openshift" containerID="cri-o://62d7f822f177647eb75dea7e58a0166d02bd6e86721f68cfab45c5081445a836" gracePeriod=15 Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.345917 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztnzr" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.403274 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a167bce5-ac1c-4cba-b065-5458a7840cd2-catalog-content\") pod \"a167bce5-ac1c-4cba-b065-5458a7840cd2\" (UID: \"a167bce5-ac1c-4cba-b065-5458a7840cd2\") " Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.403407 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a167bce5-ac1c-4cba-b065-5458a7840cd2-utilities\") pod \"a167bce5-ac1c-4cba-b065-5458a7840cd2\" (UID: \"a167bce5-ac1c-4cba-b065-5458a7840cd2\") " Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.403555 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxkpl\" (UniqueName: \"kubernetes.io/projected/a167bce5-ac1c-4cba-b065-5458a7840cd2-kube-api-access-pxkpl\") pod \"a167bce5-ac1c-4cba-b065-5458a7840cd2\" (UID: \"a167bce5-ac1c-4cba-b065-5458a7840cd2\") " Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.404579 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a167bce5-ac1c-4cba-b065-5458a7840cd2-utilities" (OuterVolumeSpecName: "utilities") pod "a167bce5-ac1c-4cba-b065-5458a7840cd2" (UID: "a167bce5-ac1c-4cba-b065-5458a7840cd2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.411899 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a167bce5-ac1c-4cba-b065-5458a7840cd2-kube-api-access-pxkpl" (OuterVolumeSpecName: "kube-api-access-pxkpl") pod "a167bce5-ac1c-4cba-b065-5458a7840cd2" (UID: "a167bce5-ac1c-4cba-b065-5458a7840cd2"). InnerVolumeSpecName "kube-api-access-pxkpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.422238 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a167bce5-ac1c-4cba-b065-5458a7840cd2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a167bce5-ac1c-4cba-b065-5458a7840cd2" (UID: "a167bce5-ac1c-4cba-b065-5458a7840cd2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.445116 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.505364 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8rxx\" (UniqueName: \"kubernetes.io/projected/416bbcbb-fc9f-489b-a1ac-eea205ce9152-kube-api-access-p8rxx\") pod \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.505431 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-template-login\") pod \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.505461 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-service-ca\") pod \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.505553 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-template-provider-selection\") pod \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.505587 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-router-certs\") pod \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.505644 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-trusted-ca-bundle\") pod \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.505665 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-idp-0-file-data\") pod \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.505696 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-serving-cert\") pod \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.505725 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/416bbcbb-fc9f-489b-a1ac-eea205ce9152-audit-dir\") pod \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " Dec 01 10:03:24 crc 
kubenswrapper[4958]: I1201 10:03:24.505762 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-ocp-branding-template\") pod \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.505799 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-audit-policies\") pod \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.505830 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-cliconfig\") pod \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.505892 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-session\") pod \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.505919 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-template-error\") pod \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\" (UID: \"416bbcbb-fc9f-489b-a1ac-eea205ce9152\") " Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.506211 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxkpl\" (UniqueName: \"kubernetes.io/projected/a167bce5-ac1c-4cba-b065-5458a7840cd2-kube-api-access-pxkpl\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.506225 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a167bce5-ac1c-4cba-b065-5458a7840cd2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.506236 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a167bce5-ac1c-4cba-b065-5458a7840cd2-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.507702 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "416bbcbb-fc9f-489b-a1ac-eea205ce9152" (UID: "416bbcbb-fc9f-489b-a1ac-eea205ce9152"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.508054 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/416bbcbb-fc9f-489b-a1ac-eea205ce9152-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "416bbcbb-fc9f-489b-a1ac-eea205ce9152" (UID: "416bbcbb-fc9f-489b-a1ac-eea205ce9152"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.508050 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "416bbcbb-fc9f-489b-a1ac-eea205ce9152" (UID: "416bbcbb-fc9f-489b-a1ac-eea205ce9152"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.508231 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "416bbcbb-fc9f-489b-a1ac-eea205ce9152" (UID: "416bbcbb-fc9f-489b-a1ac-eea205ce9152"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.508704 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "416bbcbb-fc9f-489b-a1ac-eea205ce9152" (UID: "416bbcbb-fc9f-489b-a1ac-eea205ce9152"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.512139 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "416bbcbb-fc9f-489b-a1ac-eea205ce9152" (UID: "416bbcbb-fc9f-489b-a1ac-eea205ce9152"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.512744 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "416bbcbb-fc9f-489b-a1ac-eea205ce9152" (UID: "416bbcbb-fc9f-489b-a1ac-eea205ce9152"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.513000 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "416bbcbb-fc9f-489b-a1ac-eea205ce9152" (UID: "416bbcbb-fc9f-489b-a1ac-eea205ce9152"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.513437 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/416bbcbb-fc9f-489b-a1ac-eea205ce9152-kube-api-access-p8rxx" (OuterVolumeSpecName: "kube-api-access-p8rxx") pod "416bbcbb-fc9f-489b-a1ac-eea205ce9152" (UID: "416bbcbb-fc9f-489b-a1ac-eea205ce9152"). InnerVolumeSpecName "kube-api-access-p8rxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.513533 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "416bbcbb-fc9f-489b-a1ac-eea205ce9152" (UID: "416bbcbb-fc9f-489b-a1ac-eea205ce9152"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.513766 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "416bbcbb-fc9f-489b-a1ac-eea205ce9152" (UID: "416bbcbb-fc9f-489b-a1ac-eea205ce9152"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.514387 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "416bbcbb-fc9f-489b-a1ac-eea205ce9152" (UID: "416bbcbb-fc9f-489b-a1ac-eea205ce9152"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.514644 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "416bbcbb-fc9f-489b-a1ac-eea205ce9152" (UID: "416bbcbb-fc9f-489b-a1ac-eea205ce9152"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.515251 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "416bbcbb-fc9f-489b-a1ac-eea205ce9152" (UID: "416bbcbb-fc9f-489b-a1ac-eea205ce9152"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.607521 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.607570 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.607586 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.607598 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.607611 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.607625 4958 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/416bbcbb-fc9f-489b-a1ac-eea205ce9152-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.607638 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.607671 4958 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.607681 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.607693 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.607704 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.607716 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8rxx\" (UniqueName: 
\"kubernetes.io/projected/416bbcbb-fc9f-489b-a1ac-eea205ce9152-kube-api-access-p8rxx\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.607727 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.607736 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/416bbcbb-fc9f-489b-a1ac-eea205ce9152-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.739491 4958 generic.go:334] "Generic (PLEG): container finished" podID="416bbcbb-fc9f-489b-a1ac-eea205ce9152" containerID="62d7f822f177647eb75dea7e58a0166d02bd6e86721f68cfab45c5081445a836" exitCode=0 Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.739535 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.739603 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" event={"ID":"416bbcbb-fc9f-489b-a1ac-eea205ce9152","Type":"ContainerDied","Data":"62d7f822f177647eb75dea7e58a0166d02bd6e86721f68cfab45c5081445a836"} Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.739642 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tp8nm" event={"ID":"416bbcbb-fc9f-489b-a1ac-eea205ce9152","Type":"ContainerDied","Data":"d9efc234b49de98d5c3704b4fac9d42f47cbc666b0a3d8416e5d0bf392dc10ca"} Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.739665 4958 scope.go:117] "RemoveContainer" containerID="62d7f822f177647eb75dea7e58a0166d02bd6e86721f68cfab45c5081445a836" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.743110 4958 generic.go:334] "Generic (PLEG): container finished" podID="a167bce5-ac1c-4cba-b065-5458a7840cd2" containerID="0f1c30e892fe63830e2f48cec88d6ed37482fe7eb96661f0cc4e8c6ec31f4ab6" exitCode=0 Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.743149 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztnzr" event={"ID":"a167bce5-ac1c-4cba-b065-5458a7840cd2","Type":"ContainerDied","Data":"0f1c30e892fe63830e2f48cec88d6ed37482fe7eb96661f0cc4e8c6ec31f4ab6"} Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.743171 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztnzr" event={"ID":"a167bce5-ac1c-4cba-b065-5458a7840cd2","Type":"ContainerDied","Data":"ee318a882dd05c359663798f1b248b443b3359cbcd269e57ff400766acd45860"} Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.743222 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztnzr" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.773140 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tp8nm"] Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.779829 4958 scope.go:117] "RemoveContainer" containerID="62d7f822f177647eb75dea7e58a0166d02bd6e86721f68cfab45c5081445a836" Dec 01 10:03:24 crc kubenswrapper[4958]: E1201 10:03:24.780493 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62d7f822f177647eb75dea7e58a0166d02bd6e86721f68cfab45c5081445a836\": container with ID starting with 62d7f822f177647eb75dea7e58a0166d02bd6e86721f68cfab45c5081445a836 not found: ID does not exist" containerID="62d7f822f177647eb75dea7e58a0166d02bd6e86721f68cfab45c5081445a836" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.780662 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62d7f822f177647eb75dea7e58a0166d02bd6e86721f68cfab45c5081445a836"} err="failed to get container status \"62d7f822f177647eb75dea7e58a0166d02bd6e86721f68cfab45c5081445a836\": rpc error: code = NotFound desc = could not find container \"62d7f822f177647eb75dea7e58a0166d02bd6e86721f68cfab45c5081445a836\": container with ID starting with 62d7f822f177647eb75dea7e58a0166d02bd6e86721f68cfab45c5081445a836 not found: ID does not exist" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.780779 4958 scope.go:117] "RemoveContainer" containerID="0f1c30e892fe63830e2f48cec88d6ed37482fe7eb96661f0cc4e8c6ec31f4ab6" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.782203 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tp8nm"] Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.791381 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztnzr"] Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.794233 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztnzr"] Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.801195 4958 scope.go:117] "RemoveContainer" containerID="10b182e6d1d346eb70feb51a8470e8cd696f5098ff823be29d2b4eb71a9fcda3" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.816996 4958 scope.go:117] "RemoveContainer" containerID="d6bcb91ca71613663ac1cf51d879c32f9d428b7dc98c3d2accc5241a19f72057" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.833206 4958 scope.go:117] "RemoveContainer" containerID="0f1c30e892fe63830e2f48cec88d6ed37482fe7eb96661f0cc4e8c6ec31f4ab6" Dec 01 10:03:24 crc kubenswrapper[4958]: E1201 10:03:24.833929 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f1c30e892fe63830e2f48cec88d6ed37482fe7eb96661f0cc4e8c6ec31f4ab6\": container with ID starting with 0f1c30e892fe63830e2f48cec88d6ed37482fe7eb96661f0cc4e8c6ec31f4ab6 not found: ID does not exist" containerID="0f1c30e892fe63830e2f48cec88d6ed37482fe7eb96661f0cc4e8c6ec31f4ab6" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.833985 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f1c30e892fe63830e2f48cec88d6ed37482fe7eb96661f0cc4e8c6ec31f4ab6"} err="failed to get container status \"0f1c30e892fe63830e2f48cec88d6ed37482fe7eb96661f0cc4e8c6ec31f4ab6\": rpc error: 
code = NotFound desc = could not find container \"0f1c30e892fe63830e2f48cec88d6ed37482fe7eb96661f0cc4e8c6ec31f4ab6\": container with ID starting with 0f1c30e892fe63830e2f48cec88d6ed37482fe7eb96661f0cc4e8c6ec31f4ab6 not found: ID does not exist" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.834024 4958 scope.go:117] "RemoveContainer" containerID="10b182e6d1d346eb70feb51a8470e8cd696f5098ff823be29d2b4eb71a9fcda3" Dec 01 10:03:24 crc kubenswrapper[4958]: E1201 10:03:24.834446 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10b182e6d1d346eb70feb51a8470e8cd696f5098ff823be29d2b4eb71a9fcda3\": container with ID starting with 10b182e6d1d346eb70feb51a8470e8cd696f5098ff823be29d2b4eb71a9fcda3 not found: ID does not exist" containerID="10b182e6d1d346eb70feb51a8470e8cd696f5098ff823be29d2b4eb71a9fcda3" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.834493 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b182e6d1d346eb70feb51a8470e8cd696f5098ff823be29d2b4eb71a9fcda3"} err="failed to get container status \"10b182e6d1d346eb70feb51a8470e8cd696f5098ff823be29d2b4eb71a9fcda3\": rpc error: code = NotFound desc = could not find container \"10b182e6d1d346eb70feb51a8470e8cd696f5098ff823be29d2b4eb71a9fcda3\": container with ID starting with 10b182e6d1d346eb70feb51a8470e8cd696f5098ff823be29d2b4eb71a9fcda3 not found: ID does not exist" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.834510 4958 scope.go:117] "RemoveContainer" containerID="d6bcb91ca71613663ac1cf51d879c32f9d428b7dc98c3d2accc5241a19f72057" Dec 01 10:03:24 crc kubenswrapper[4958]: E1201 10:03:24.834920 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6bcb91ca71613663ac1cf51d879c32f9d428b7dc98c3d2accc5241a19f72057\": container with ID starting with d6bcb91ca71613663ac1cf51d879c32f9d428b7dc98c3d2accc5241a19f72057 not found: ID does not exist" containerID="d6bcb91ca71613663ac1cf51d879c32f9d428b7dc98c3d2accc5241a19f72057" Dec 01 10:03:24 crc kubenswrapper[4958]: I1201 10:03:24.834963 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6bcb91ca71613663ac1cf51d879c32f9d428b7dc98c3d2accc5241a19f72057"} err="failed to get container status \"d6bcb91ca71613663ac1cf51d879c32f9d428b7dc98c3d2accc5241a19f72057\": rpc error: code = NotFound desc = could not find container \"d6bcb91ca71613663ac1cf51d879c32f9d428b7dc98c3d2accc5241a19f72057\": container with ID starting with d6bcb91ca71613663ac1cf51d879c32f9d428b7dc98c3d2accc5241a19f72057 not found: ID does not exist" Dec 01 10:03:25 crc kubenswrapper[4958]: I1201 10:03:25.241429 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2w7kl"] Dec 01 10:03:25 crc kubenswrapper[4958]: I1201 10:03:25.241781 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2w7kl" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" containerName="registry-server" containerID="cri-o://bdfb142a917171a592dc332e3483a6b02fe7724bde4a82bb642477bf0ec809e8" gracePeriod=2 Dec 01 10:03:25 crc kubenswrapper[4958]: I1201 10:03:25.806412 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="416bbcbb-fc9f-489b-a1ac-eea205ce9152" path="/var/lib/kubelet/pods/416bbcbb-fc9f-489b-a1ac-eea205ce9152/volumes" Dec 01 10:03:25 crc kubenswrapper[4958]: I1201 
10:03:25.807379 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a167bce5-ac1c-4cba-b065-5458a7840cd2" path="/var/lib/kubelet/pods/a167bce5-ac1c-4cba-b065-5458a7840cd2/volumes" Dec 01 10:03:26 crc kubenswrapper[4958]: I1201 10:03:26.757537 4958 generic.go:334] "Generic (PLEG): container finished" podID="15c34271-3e36-4e77-bc38-f06b06d499f6" containerID="bdfb142a917171a592dc332e3483a6b02fe7724bde4a82bb642477bf0ec809e8" exitCode=0 Dec 01 10:03:26 crc kubenswrapper[4958]: I1201 10:03:26.757592 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2w7kl" event={"ID":"15c34271-3e36-4e77-bc38-f06b06d499f6","Type":"ContainerDied","Data":"bdfb142a917171a592dc332e3483a6b02fe7724bde4a82bb642477bf0ec809e8"} Dec 01 10:03:26 crc kubenswrapper[4958]: I1201 10:03:26.873601 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2w7kl" Dec 01 10:03:26 crc kubenswrapper[4958]: I1201 10:03:26.953255 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15c34271-3e36-4e77-bc38-f06b06d499f6-utilities\") pod \"15c34271-3e36-4e77-bc38-f06b06d499f6\" (UID: \"15c34271-3e36-4e77-bc38-f06b06d499f6\") " Dec 01 10:03:26 crc kubenswrapper[4958]: I1201 10:03:26.953401 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfhst\" (UniqueName: \"kubernetes.io/projected/15c34271-3e36-4e77-bc38-f06b06d499f6-kube-api-access-vfhst\") pod \"15c34271-3e36-4e77-bc38-f06b06d499f6\" (UID: \"15c34271-3e36-4e77-bc38-f06b06d499f6\") " Dec 01 10:03:26 crc kubenswrapper[4958]: I1201 10:03:26.953542 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15c34271-3e36-4e77-bc38-f06b06d499f6-catalog-content\") pod \"15c34271-3e36-4e77-bc38-f06b06d499f6\" (UID: \"15c34271-3e36-4e77-bc38-f06b06d499f6\") " Dec 01 10:03:26 crc kubenswrapper[4958]: I1201 10:03:26.954370 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15c34271-3e36-4e77-bc38-f06b06d499f6-utilities" (OuterVolumeSpecName: "utilities") pod "15c34271-3e36-4e77-bc38-f06b06d499f6" (UID: "15c34271-3e36-4e77-bc38-f06b06d499f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:03:26 crc kubenswrapper[4958]: I1201 10:03:26.958588 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15c34271-3e36-4e77-bc38-f06b06d499f6-kube-api-access-vfhst" (OuterVolumeSpecName: "kube-api-access-vfhst") pod "15c34271-3e36-4e77-bc38-f06b06d499f6" (UID: "15c34271-3e36-4e77-bc38-f06b06d499f6"). InnerVolumeSpecName "kube-api-access-vfhst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.055312 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfhst\" (UniqueName: \"kubernetes.io/projected/15c34271-3e36-4e77-bc38-f06b06d499f6-kube-api-access-vfhst\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.055696 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15c34271-3e36-4e77-bc38-f06b06d499f6-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.062335 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15c34271-3e36-4e77-bc38-f06b06d499f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15c34271-3e36-4e77-bc38-f06b06d499f6" (UID: "15c34271-3e36-4e77-bc38-f06b06d499f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.156952 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15c34271-3e36-4e77-bc38-f06b06d499f6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.375044 4958 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.375956 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce95067-2a10-44bb-8179-b2663072dd42" containerName="registry-server" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.375986 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce95067-2a10-44bb-8179-b2663072dd42" containerName="registry-server" Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.376013 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce95067-2a10-44bb-8179-b2663072dd42" containerName="extract-content" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.376023 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce95067-2a10-44bb-8179-b2663072dd42" containerName="extract-content" Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.376037 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce95067-2a10-44bb-8179-b2663072dd42" containerName="extract-utilities" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.376047 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce95067-2a10-44bb-8179-b2663072dd42" containerName="extract-utilities" Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.376064 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="416bbcbb-fc9f-489b-a1ac-eea205ce9152" containerName="oauth-openshift" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.376072 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="416bbcbb-fc9f-489b-a1ac-eea205ce9152" containerName="oauth-openshift" Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.376086 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" containerName="extract-utilities" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.376095 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" containerName="extract-utilities" Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.376111 4958 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" containerName="extract-content" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.376122 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" containerName="extract-content" Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.376138 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="009f7729-a27a-415b-a008-cc328985af59" containerName="pruner" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.376151 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="009f7729-a27a-415b-a008-cc328985af59" containerName="pruner" Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.376169 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a167bce5-ac1c-4cba-b065-5458a7840cd2" containerName="extract-utilities" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.376179 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a167bce5-ac1c-4cba-b065-5458a7840cd2" containerName="extract-utilities" Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.376188 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a167bce5-ac1c-4cba-b065-5458a7840cd2" containerName="extract-content" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.376197 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a167bce5-ac1c-4cba-b065-5458a7840cd2" containerName="extract-content" Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.376209 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a167bce5-ac1c-4cba-b065-5458a7840cd2" containerName="registry-server" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.376218 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a167bce5-ac1c-4cba-b065-5458a7840cd2" containerName="registry-server" Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.376229 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" containerName="registry-server" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.376238 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" containerName="registry-server" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.376438 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" containerName="registry-server" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.376462 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="416bbcbb-fc9f-489b-a1ac-eea205ce9152" containerName="oauth-openshift" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.376473 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="009f7729-a27a-415b-a008-cc328985af59" containerName="pruner" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.376490 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ce95067-2a10-44bb-8179-b2663072dd42" containerName="registry-server" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.376501 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a167bce5-ac1c-4cba-b065-5458a7840cd2" containerName="registry-server" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.377090 4958 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.377123 4958 
kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.377292 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.377303 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.377319 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.377329 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.377340 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.377350 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.377361 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.377370 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.377388 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.377398 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.377411 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.377420 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.377431 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.377442 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.377563 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.377574 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.377587 4958 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.377599 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.377613 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.377630 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.377755 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.377766 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.377928 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.378220 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.378343 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff" gracePeriod=15 Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.378447 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524" gracePeriod=15 Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.378538 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476" gracePeriod=15 Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.378497 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53" gracePeriod=15 Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.378335 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f" gracePeriod=15 Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.384542 4958 status_manager.go:861] "Pod was 
deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.421292 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.471816 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.471913 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.471968 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.472005 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.472037 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.472071 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.472123 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.472177 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.573505 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.573576 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.573623 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.573658 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.573657 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.573703 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.573735 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.573755 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.573761 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.573797 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.573824 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.573884 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.573759 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.573929 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.573988 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.574045 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.717943 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 10:03:27 crc kubenswrapper[4958]: W1201 10:03:27.740425 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-aaf0d972eb0190526787793c597f3c69866875a683efa9c2d302168b25e337be WatchSource:0}: Error finding container aaf0d972eb0190526787793c597f3c69866875a683efa9c2d302168b25e337be: Status 404 returned error can't find the container with id aaf0d972eb0190526787793c597f3c69866875a683efa9c2d302168b25e337be Dec 01 10:03:27 crc kubenswrapper[4958]: E1201 10:03:27.743697 4958 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.216:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d0f429b1029e8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 10:03:27.742888424 +0000 UTC m=+255.251677461,LastTimestamp:2025-12-01 10:03:27.742888424 +0000 UTC m=+255.251677461,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.766052 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2w7kl" event={"ID":"15c34271-3e36-4e77-bc38-f06b06d499f6","Type":"ContainerDied","Data":"797e10a77307e8b435bc08565d24be3647153a948e8f7ea10f850ddf8cb03ef4"} Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.766124 4958 scope.go:117] "RemoveContainer" containerID="bdfb142a917171a592dc332e3483a6b02fe7724bde4a82bb642477bf0ec809e8" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.766282 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2w7kl" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.771413 4958 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.772277 4958 generic.go:334] "Generic (PLEG): container finished" podID="5867f258-1174-4cc1-b035-a68f57724d88" containerID="602990710fc705034fe1a384369a94c7b603e2e345135fc88c3757f40732cab0" exitCode=0 Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.772372 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5867f258-1174-4cc1-b035-a68f57724d88","Type":"ContainerDied","Data":"602990710fc705034fe1a384369a94c7b603e2e345135fc88c3757f40732cab0"} Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.772368 4958 status_manager.go:851] "Failed to get status for pod" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" pod="openshift-marketplace/redhat-operators-2w7kl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2w7kl\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.773072 4958 status_manager.go:851] "Failed to get status for pod" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" pod="openshift-marketplace/redhat-operators-2w7kl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2w7kl\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.773387 4958 status_manager.go:851] "Failed to get status for pod" podUID="5867f258-1174-4cc1-b035-a68f57724d88" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.773928 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"aaf0d972eb0190526787793c597f3c69866875a683efa9c2d302168b25e337be"} Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.773928 4958 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.780956 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.781491 4958 status_manager.go:851] "Failed to get status for pod" podUID="5867f258-1174-4cc1-b035-a68f57724d88" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial 
tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.781747 4958 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.782129 4958 status_manager.go:851] "Failed to get status for pod" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" pod="openshift-marketplace/redhat-operators-2w7kl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2w7kl\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.782809 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.783463 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff" exitCode=0 Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.783488 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53" exitCode=0 Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.783499 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524" exitCode=0 Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.783511 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476" exitCode=2 Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.789072 4958 scope.go:117] "RemoveContainer" containerID="b949e2826afc0f021e861409a564bf800db15eb9579104b9afe50e01b4c1dd5c" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.811054 4958 scope.go:117] "RemoveContainer" containerID="ef7669644e0dfa31000d4d392d3d75c47c67f277ae1e21f0f6639d71bcbd6171" Dec 01 10:03:27 crc kubenswrapper[4958]: I1201 10:03:27.832276 4958 scope.go:117] "RemoveContainer" containerID="60ca25f40b8a995ae427553657f995596f5744c84ecaaa56cebd742eee24f3a9" Dec 01 10:03:28 crc kubenswrapper[4958]: E1201 10:03:28.029409 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:03:28Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:03:28Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:03:28Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:03:28Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:817c0c66f8b4782d1b5ae27cc60e2a141dcf3b7f96fc730ef74dd043405a32a9\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:87e70b36dba63685af2ddff6719cdfcca2cd4cbbf8efe24cd4c86b9c2ecb4f8f\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1606452067},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:d0d3fce260dd5b7e90c22ae8dc5f447b01e6bb9e798d0bef999ca7abcdc664c0\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:e2569f586510e07a900470ff7716df01d9a339a305ce9148d93e2a2d7a4cafe8\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1202665305},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8f5aca4ec520648cd22b9a604afbdd80b85ef21750dda9d6da120ea09519f404\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:bec2ed7cd9ed15a19afd3d03ce96040c9df456946aafe4fc67bb7003d2eab164\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201206094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ebe9f523f5c211a3a0f2570331dddcd5be15b12c1fecd9b8b121f881bfaad029\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\
\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for 
node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:28 crc kubenswrapper[4958]: E1201 10:03:28.031421 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:28 crc kubenswrapper[4958]: E1201 10:03:28.032179 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:28 crc kubenswrapper[4958]: E1201 10:03:28.032472 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:28 crc kubenswrapper[4958]: E1201 10:03:28.032747 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:28 crc kubenswrapper[4958]: E1201 10:03:28.032770 4958 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:03:28 crc kubenswrapper[4958]: I1201 10:03:28.308348 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nqfvf" Dec 01 10:03:28 crc kubenswrapper[4958]: I1201 10:03:28.308442 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nqfvf" Dec 01 10:03:28 crc kubenswrapper[4958]: I1201 10:03:28.362241 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nqfvf" Dec 01 10:03:28 crc kubenswrapper[4958]: I1201 10:03:28.363395 4958 status_manager.go:851] "Failed to get status for pod" podUID="5867f258-1174-4cc1-b035-a68f57724d88" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:28 crc kubenswrapper[4958]: I1201 10:03:28.363999 4958 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:28 crc kubenswrapper[4958]: I1201 10:03:28.364575 4958 status_manager.go:851] "Failed to get status for pod" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" pod="openshift-marketplace/redhat-operators-2w7kl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2w7kl\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:28 crc kubenswrapper[4958]: I1201 10:03:28.364940 4958 status_manager.go:851] "Failed to get status for pod" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" pod="openshift-marketplace/community-operators-nqfvf" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nqfvf\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:28 crc kubenswrapper[4958]: I1201 10:03:28.793058 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"038d2043afaf16e0f06af36db39d335e5ad60abb5693fc63b8017702601adc5d"} Dec 01 10:03:28 crc kubenswrapper[4958]: I1201 10:03:28.794185 4958 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:28 crc kubenswrapper[4958]: I1201 10:03:28.794882 4958 status_manager.go:851] "Failed to get status for pod" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" pod="openshift-marketplace/redhat-operators-2w7kl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2w7kl\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:28 crc kubenswrapper[4958]: I1201 10:03:28.795296 4958 status_manager.go:851] "Failed to get status for pod" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" pod="openshift-marketplace/community-operators-nqfvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nqfvf\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:28 crc kubenswrapper[4958]: I1201 10:03:28.795761 4958 status_manager.go:851] "Failed to get status for pod" podUID="5867f258-1174-4cc1-b035-a68f57724d88" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:28 crc kubenswrapper[4958]: I1201 10:03:28.796744 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 10:03:28 crc kubenswrapper[4958]: I1201 10:03:28.848086 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nqfvf" Dec 01 10:03:28 crc kubenswrapper[4958]: I1201 10:03:28.849050 4958 status_manager.go:851] "Failed to get status for pod" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" pod="openshift-marketplace/community-operators-nqfvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nqfvf\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:28 crc kubenswrapper[4958]: I1201 10:03:28.849581 4958 status_manager.go:851] "Failed to get status for pod" podUID="5867f258-1174-4cc1-b035-a68f57724d88" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:28 crc kubenswrapper[4958]: I1201 10:03:28.850276 4958 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:28 crc kubenswrapper[4958]: I1201 10:03:28.850607 4958 status_manager.go:851] "Failed to get status for pod" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" pod="openshift-marketplace/redhat-operators-2w7kl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2w7kl\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.080406 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.081191 4958 status_manager.go:851] "Failed to get status for pod" podUID="5867f258-1174-4cc1-b035-a68f57724d88" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.081495 4958 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.081768 4958 status_manager.go:851] "Failed to get status for pod" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" pod="openshift-marketplace/redhat-operators-2w7kl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2w7kl\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.091009 4958 status_manager.go:851] "Failed to get status for pod" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" pod="openshift-marketplace/community-operators-nqfvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nqfvf\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.198078 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5867f258-1174-4cc1-b035-a68f57724d88-var-lock\") pod \"5867f258-1174-4cc1-b035-a68f57724d88\" (UID: \"5867f258-1174-4cc1-b035-a68f57724d88\") " Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.198190 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5867f258-1174-4cc1-b035-a68f57724d88-kube-api-access\") pod \"5867f258-1174-4cc1-b035-a68f57724d88\" (UID: \"5867f258-1174-4cc1-b035-a68f57724d88\") " Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.198250 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5867f258-1174-4cc1-b035-a68f57724d88-var-lock" (OuterVolumeSpecName: "var-lock") pod "5867f258-1174-4cc1-b035-a68f57724d88" (UID: "5867f258-1174-4cc1-b035-a68f57724d88"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.198296 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5867f258-1174-4cc1-b035-a68f57724d88-kubelet-dir\") pod \"5867f258-1174-4cc1-b035-a68f57724d88\" (UID: \"5867f258-1174-4cc1-b035-a68f57724d88\") " Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.198570 4958 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5867f258-1174-4cc1-b035-a68f57724d88-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.198602 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5867f258-1174-4cc1-b035-a68f57724d88-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5867f258-1174-4cc1-b035-a68f57724d88" (UID: "5867f258-1174-4cc1-b035-a68f57724d88"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.204764 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5867f258-1174-4cc1-b035-a68f57724d88-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5867f258-1174-4cc1-b035-a68f57724d88" (UID: "5867f258-1174-4cc1-b035-a68f57724d88"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:03:29 crc kubenswrapper[4958]: E1201 10:03:29.214963 4958 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.216:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d0f429b1029e8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 10:03:27.742888424 +0000 UTC m=+255.251677461,LastTimestamp:2025-12-01 10:03:27.742888424 +0000 UTC m=+255.251677461,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.300678 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5867f258-1174-4cc1-b035-a68f57724d88-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.300757 4958 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5867f258-1174-4cc1-b035-a68f57724d88-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.807451 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"5867f258-1174-4cc1-b035-a68f57724d88","Type":"ContainerDied","Data":"495b4e54bd384ff76d7386a96a24682b7b362ab4d6498c1ba48ff75eec0fd01a"} Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.808128 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="495b4e54bd384ff76d7386a96a24682b7b362ab4d6498c1ba48ff75eec0fd01a" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.807920 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.810115 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.811259 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.811556 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.812116 4958 status_manager.go:851] "Failed to get status for pod" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" pod="openshift-marketplace/community-operators-nqfvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nqfvf\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.812466 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f" exitCode=0 Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.812521 4958 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.812579 4958 scope.go:117] "RemoveContainer" containerID="ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.813184 4958 status_manager.go:851] "Failed to get status for pod" podUID="5867f258-1174-4cc1-b035-a68f57724d88" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.813490 4958 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.814346 4958 status_manager.go:851] "Failed to get status for pod" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" pod="openshift-marketplace/redhat-operators-2w7kl" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2w7kl\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.823065 4958 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.823431 4958 status_manager.go:851] "Failed to get status for pod" podUID="5867f258-1174-4cc1-b035-a68f57724d88" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.823770 4958 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.824154 4958 status_manager.go:851] "Failed to get status for pod" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" pod="openshift-marketplace/redhat-operators-2w7kl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2w7kl\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.824668 4958 status_manager.go:851] "Failed to get status for pod" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" pod="openshift-marketplace/community-operators-nqfvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nqfvf\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.830171 4958 scope.go:117] "RemoveContainer" containerID="24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.843542 4958 scope.go:117] "RemoveContainer" containerID="2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.856453 4958 scope.go:117] "RemoveContainer" containerID="a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.871432 4958 scope.go:117] "RemoveContainer" containerID="1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.890389 4958 scope.go:117] "RemoveContainer" containerID="9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.908794 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.908938 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.908966 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.909014 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.909088 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.909244 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.909282 4958 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.909381 4958 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.914978 4958 scope.go:117] "RemoveContainer" containerID="ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff" Dec 01 10:03:29 crc kubenswrapper[4958]: E1201 10:03:29.915739 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\": container with ID starting with ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff not found: ID does not exist" containerID="ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.915789 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff"} err="failed to get container status \"ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\": rpc error: code = NotFound desc = could not find container \"ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff\": container with ID starting with ad5aaf8735669bfb790fb19b6808b60fb8de9a34563c0a3df4c9a2eb33b238ff not found: ID does not exist" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.915833 
4958 scope.go:117] "RemoveContainer" containerID="24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53" Dec 01 10:03:29 crc kubenswrapper[4958]: E1201 10:03:29.916251 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\": container with ID starting with 24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53 not found: ID does not exist" containerID="24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.916302 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53"} err="failed to get container status \"24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\": rpc error: code = NotFound desc = could not find container \"24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53\": container with ID starting with 24c2ba059455e4bfc156f424747f12c63b34cbc47c962c544c851a3069266a53 not found: ID does not exist" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.916344 4958 scope.go:117] "RemoveContainer" containerID="2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524" Dec 01 10:03:29 crc kubenswrapper[4958]: E1201 10:03:29.916960 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\": container with ID starting with 2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524 not found: ID does not exist" containerID="2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.917000 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524"} err="failed to get container status \"2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\": rpc error: code = NotFound desc = could not find container \"2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524\": container with ID starting with 2c0f7102cf7dbd5a8b71af0a45e8cd28a8e2126aa2b0f1261a881d9f104b6524 not found: ID does not exist" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.917028 4958 scope.go:117] "RemoveContainer" containerID="a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476" Dec 01 10:03:29 crc kubenswrapper[4958]: E1201 10:03:29.917333 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\": container with ID starting with a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476 not found: ID does not exist" containerID="a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.917369 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476"} err="failed to get container status \"a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\": rpc error: code = NotFound desc = could not find container \"a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476\": container with ID starting with 
a5c0a170090e78755294a10bf5cfd3bfb03468d0cbdb311c88bcfa4c64729476 not found: ID does not exist" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.917387 4958 scope.go:117] "RemoveContainer" containerID="1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f" Dec 01 10:03:29 crc kubenswrapper[4958]: E1201 10:03:29.917824 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\": container with ID starting with 1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f not found: ID does not exist" containerID="1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.917876 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f"} err="failed to get container status \"1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\": rpc error: code = NotFound desc = could not find container \"1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f\": container with ID starting with 1c2f3f25e9e2b4e220f8cfb6eb165e32224a4375b9333f79b25ea0f0b317590f not found: ID does not exist" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.917897 4958 scope.go:117] "RemoveContainer" containerID="9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a" Dec 01 10:03:29 crc kubenswrapper[4958]: E1201 10:03:29.918443 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\": container with ID starting with 9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a not found: ID does not exist" containerID="9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a" Dec 01 10:03:29 crc kubenswrapper[4958]: I1201 10:03:29.918489 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a"} err="failed to get container status \"9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\": rpc error: code = NotFound desc = could not find container \"9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a\": container with ID starting with 9f6175ca833fae7d8180d96aff44ec1447601e173e1d87d16cd14b626a0c274a not found: ID does not exist" Dec 01 10:03:30 crc kubenswrapper[4958]: I1201 10:03:30.011088 4958 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:30 crc kubenswrapper[4958]: I1201 10:03:30.820272 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 10:03:30 crc kubenswrapper[4958]: I1201 10:03:30.834939 4958 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:30 crc kubenswrapper[4958]: I1201 10:03:30.835397 4958 status_manager.go:851] "Failed to get status for pod" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" pod="openshift-marketplace/redhat-operators-2w7kl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2w7kl\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:30 crc kubenswrapper[4958]: I1201 10:03:30.835686 4958 status_manager.go:851] "Failed to get status for pod" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" pod="openshift-marketplace/community-operators-nqfvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nqfvf\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:30 crc kubenswrapper[4958]: I1201 10:03:30.835919 4958 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:30 crc kubenswrapper[4958]: I1201 10:03:30.836143 4958 status_manager.go:851] "Failed to get status for pod" podUID="5867f258-1174-4cc1-b035-a68f57724d88" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:31 crc kubenswrapper[4958]: I1201 10:03:31.805307 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 01 10:03:31 crc kubenswrapper[4958]: E1201 10:03:31.876979 4958 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:31 crc kubenswrapper[4958]: E1201 10:03:31.877725 4958 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:31 crc kubenswrapper[4958]: E1201 10:03:31.878224 4958 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:31 crc kubenswrapper[4958]: E1201 10:03:31.878554 4958 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:31 crc kubenswrapper[4958]: E1201 10:03:31.878878 4958 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:31 crc kubenswrapper[4958]: I1201 10:03:31.878924 4958 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 01 10:03:31 crc kubenswrapper[4958]: E1201 10:03:31.879238 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" interval="200ms" Dec 01 10:03:32 crc kubenswrapper[4958]: E1201 10:03:32.081034 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" interval="400ms" Dec 01 10:03:32 crc kubenswrapper[4958]: E1201 10:03:32.482075 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" interval="800ms" Dec 01 10:03:33 crc kubenswrapper[4958]: E1201 10:03:33.283516 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" interval="1.6s" Dec 01 10:03:33 crc kubenswrapper[4958]: I1201 10:03:33.800041 4958 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:33 crc kubenswrapper[4958]: I1201 10:03:33.800412 4958 status_manager.go:851] "Failed to get status for pod" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" pod="openshift-marketplace/redhat-operators-2w7kl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2w7kl\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:33 crc kubenswrapper[4958]: I1201 10:03:33.801012 4958 status_manager.go:851] "Failed to get status for pod" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" pod="openshift-marketplace/community-operators-nqfvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nqfvf\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:33 crc kubenswrapper[4958]: I1201 10:03:33.801575 4958 status_manager.go:851] "Failed to get status for pod" podUID="5867f258-1174-4cc1-b035-a68f57724d88" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:34 crc kubenswrapper[4958]: E1201 10:03:34.884202 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" interval="3.2s" Dec 01 10:03:38 crc kubenswrapper[4958]: E1201 10:03:38.086311 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" interval="6.4s" Dec 01 10:03:38 crc kubenswrapper[4958]: E1201 10:03:38.113168 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:03:38Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:03:38Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:03:38Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T10:03:38Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:817c0c66f8b4782d1b5ae27cc60e2a141dcf3b7f96fc730ef74dd043405a32a9\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:87e70b36dba63685af2ddff6719cdfcca2cd4cbbf8efe24cd4c86b9c2ecb4f8f\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1606452067},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:d0d3fce260dd5b7e90c22ae8dc5f447b01e6bb9e798d0bef999ca7abcdc664c0\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:e2569f586510e07a900470ff7716df01d9a339a305ce9148d93e2a2d7a4cafe8\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1202665305},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8f5aca4ec520648cd22b9a604afbdd80b85ef21750dda9d6da120ea09519f404\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:bec2ed7cd9ed15a19afd3d03ce96040c9df456946aafe4fc67bb7003d2eab164\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201206094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ebe9f523f5c211a3a0f2570331dddcd5be15b12c1fecd9b8b121f881bfaad029\\\",\\\"registry.r
edhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:38 crc kubenswrapper[4958]: E1201 10:03:38.113628 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:38 crc kubenswrapper[4958]: E1201 10:03:38.113898 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:38 crc kubenswrapper[4958]: E1201 10:03:38.114317 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:38 crc kubenswrapper[4958]: E1201 10:03:38.114956 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:38 crc kubenswrapper[4958]: E1201 10:03:38.114991 4958 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 10:03:39 crc kubenswrapper[4958]: E1201 10:03:39.216150 4958 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.216:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d0f429b1029e8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 10:03:27.742888424 +0000 UTC m=+255.251677461,LastTimestamp:2025-12-01 10:03:27.742888424 +0000 UTC m=+255.251677461,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 10:03:41 crc kubenswrapper[4958]: I1201 10:03:41.895442 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 10:03:41 crc kubenswrapper[4958]: I1201 10:03:41.895868 4958 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd" exitCode=1 Dec 01 10:03:41 crc kubenswrapper[4958]: I1201 10:03:41.895908 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd"} Dec 01 10:03:41 crc kubenswrapper[4958]: I1201 10:03:41.896377 4958 scope.go:117] "RemoveContainer" containerID="9d097898466f1543088df5421381e6b479ba9ac222c6e498f0344c2be08750bd" Dec 01 10:03:41 crc kubenswrapper[4958]: I1201 10:03:41.896888 4958 status_manager.go:851] "Failed to get status for pod" podUID="5867f258-1174-4cc1-b035-a68f57724d88" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:41 crc kubenswrapper[4958]: I1201 10:03:41.897401 4958 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:41 crc kubenswrapper[4958]: I1201 10:03:41.898515 4958 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:41 crc kubenswrapper[4958]: I1201 10:03:41.899414 4958 status_manager.go:851] "Failed to get status for pod" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" pod="openshift-marketplace/redhat-operators-2w7kl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2w7kl\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:41 crc kubenswrapper[4958]: I1201 10:03:41.900117 4958 status_manager.go:851] "Failed to get status for pod" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" pod="openshift-marketplace/community-operators-nqfvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nqfvf\": dial tcp 38.129.56.216:6443: connect: connection refused" Dec 01 10:03:42 crc kubenswrapper[4958]: I1201 10:03:42.796455 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 10:03:42 crc kubenswrapper[4958]: I1201 10:03:42.798698 4958 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:42 crc kubenswrapper[4958]: I1201 10:03:42.799497 4958 status_manager.go:851] "Failed to get status for pod" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" pod="openshift-marketplace/redhat-operators-2w7kl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2w7kl\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:42 crc kubenswrapper[4958]: I1201 10:03:42.800083 4958 status_manager.go:851] "Failed to get status for pod" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" pod="openshift-marketplace/community-operators-nqfvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nqfvf\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:42 crc kubenswrapper[4958]: I1201 10:03:42.800589 4958 status_manager.go:851] "Failed to get status for pod" podUID="5867f258-1174-4cc1-b035-a68f57724d88" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:42 crc kubenswrapper[4958]: I1201 10:03:42.801035 4958 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:42 crc kubenswrapper[4958]: I1201 10:03:42.813193 4958 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a19e0dac-64a6-4b41-80e7-cc90db58399d"
Dec 01 10:03:42 crc kubenswrapper[4958]: I1201 10:03:42.813278 4958 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a19e0dac-64a6-4b41-80e7-cc90db58399d"
Dec 01 10:03:42 crc kubenswrapper[4958]: E1201 10:03:42.813968 4958 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 10:03:42 crc kubenswrapper[4958]: I1201 10:03:42.814973 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 10:03:42 crc kubenswrapper[4958]: I1201 10:03:42.908474 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Dec 01 10:03:42 crc kubenswrapper[4958]: I1201 10:03:42.908610 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1955fdc2fcb3b863bc9e969a79b066acea72c336067a83eb1c0388805d5d71d9"}
Dec 01 10:03:42 crc kubenswrapper[4958]: I1201 10:03:42.909825 4958 status_manager.go:851] "Failed to get status for pod" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" pod="openshift-marketplace/community-operators-nqfvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nqfvf\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:42 crc kubenswrapper[4958]: I1201 10:03:42.910385 4958 status_manager.go:851] "Failed to get status for pod" podUID="5867f258-1174-4cc1-b035-a68f57724d88" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:42 crc kubenswrapper[4958]: I1201 10:03:42.910612 4958 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:42 crc kubenswrapper[4958]: I1201 10:03:42.911020 4958 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:42 crc kubenswrapper[4958]: I1201 10:03:42.911055 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d1e3c54833c57f3308a591ac6dd73a41f20c962a451ef058cf264ccb093f016f"}
Dec 01 10:03:42 crc kubenswrapper[4958]: I1201 10:03:42.911779 4958 status_manager.go:851] "Failed to get status for pod" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" pod="openshift-marketplace/redhat-operators-2w7kl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2w7kl\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:43 crc kubenswrapper[4958]: I1201 10:03:43.802062 4958 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:43 crc kubenswrapper[4958]: I1201 10:03:43.802620 4958 status_manager.go:851] "Failed to get status for pod" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" pod="openshift-marketplace/redhat-operators-2w7kl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2w7kl\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:43 crc kubenswrapper[4958]: I1201 10:03:43.803247 4958 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:43 crc kubenswrapper[4958]: I1201 10:03:43.803751 4958 status_manager.go:851] "Failed to get status for pod" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" pod="openshift-marketplace/community-operators-nqfvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nqfvf\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:43 crc kubenswrapper[4958]: I1201 10:03:43.804063 4958 status_manager.go:851] "Failed to get status for pod" podUID="5867f258-1174-4cc1-b035-a68f57724d88" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:43 crc kubenswrapper[4958]: I1201 10:03:43.804380 4958 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:43 crc kubenswrapper[4958]: E1201 10:03:43.859810 4958 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.129.56.216:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" volumeName="registry-storage"
Dec 01 10:03:43 crc kubenswrapper[4958]: I1201 10:03:43.922279 4958 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="09361bffa9cac1c31ba5679eef42d9d0384b07db353f9d27763cbfe6313061b1" exitCode=0
Dec 01 10:03:43 crc kubenswrapper[4958]: I1201 10:03:43.922441 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"09361bffa9cac1c31ba5679eef42d9d0384b07db353f9d27763cbfe6313061b1"}
Dec 01 10:03:43 crc kubenswrapper[4958]: I1201 10:03:43.922761 4958 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a19e0dac-64a6-4b41-80e7-cc90db58399d"
Dec 01 10:03:43 crc kubenswrapper[4958]: I1201 10:03:43.922794 4958 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a19e0dac-64a6-4b41-80e7-cc90db58399d"
Dec 01 10:03:43 crc kubenswrapper[4958]: I1201 10:03:43.923310 4958 status_manager.go:851] "Failed to get status for pod" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" pod="openshift-marketplace/community-operators-nqfvf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-nqfvf\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:43 crc kubenswrapper[4958]: E1201 10:03:43.923560 4958 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.216:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 10:03:43 crc kubenswrapper[4958]: I1201 10:03:43.923801 4958 status_manager.go:851] "Failed to get status for pod" podUID="5867f258-1174-4cc1-b035-a68f57724d88" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:43 crc kubenswrapper[4958]: I1201 10:03:43.924381 4958 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:43 crc kubenswrapper[4958]: I1201 10:03:43.924670 4958 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:43 crc kubenswrapper[4958]: I1201 10:03:43.924954 4958 status_manager.go:851] "Failed to get status for pod" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" pod="openshift-marketplace/redhat-operators-2w7kl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-2w7kl\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:43 crc kubenswrapper[4958]: I1201 10:03:43.925276 4958 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.216:6443: connect: connection refused"
Dec 01 10:03:44 crc kubenswrapper[4958]: I1201 10:03:44.941945 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f5a2ade57da81cfe4d648783bcd700e57b6f7b18c554c421f63f5c0eca9354b1"}
Dec 01 10:03:44 crc kubenswrapper[4958]: I1201 10:03:44.942438 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ce1c047afb4b53b80f0633d457ee5aadedf090c313bd928772788ab51f96526e"}
Dec 01 10:03:44 crc kubenswrapper[4958]: I1201 10:03:44.942453 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3171e380497be27d36f4bc7016b48d38a686b7e2e48493e97d3e3734955df758"}
Dec 01 10:03:45 crc kubenswrapper[4958]: I1201 10:03:45.950950 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"60a9a0a70ed1edd15030f8a6e8e3542140fa5e2e28a9031234351772ff75ca2f"}
Dec 01 10:03:45 crc kubenswrapper[4958]: I1201 10:03:45.951011 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"20e248a84e97d7e8a4e3b1232436c7e1efa4eef6315686ea35f9deac16761e6a"}
Dec 01 10:03:45 crc kubenswrapper[4958]: I1201 10:03:45.951312 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 10:03:45 crc kubenswrapper[4958]: I1201 10:03:45.952106 4958 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a19e0dac-64a6-4b41-80e7-cc90db58399d"
Dec 01 10:03:45 crc kubenswrapper[4958]: I1201 10:03:45.952132 4958 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a19e0dac-64a6-4b41-80e7-cc90db58399d"
Dec 01 10:03:47 crc kubenswrapper[4958]: I1201 10:03:47.301326 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 01 10:03:47 crc kubenswrapper[4958]: I1201 10:03:47.815885 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 10:03:47 crc kubenswrapper[4958]: I1201 10:03:47.815951 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 10:03:47 crc kubenswrapper[4958]: I1201 10:03:47.822689 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 10:03:48 crc kubenswrapper[4958]: I1201 10:03:48.083176 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 01 10:03:48 crc kubenswrapper[4958]: I1201 10:03:48.083767 4958 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 01 10:03:48 crc kubenswrapper[4958]: I1201 10:03:48.083888 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 01 10:03:50 crc kubenswrapper[4958]: I1201 10:03:50.976444 4958 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 10:03:51 crc kubenswrapper[4958]: I1201 10:03:51.142117 4958 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a9cc726d-46be-47c8-ae00-bfbea85c3072"
Dec 01 10:03:51 crc kubenswrapper[4958]: I1201 10:03:51.991138 4958 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a19e0dac-64a6-4b41-80e7-cc90db58399d"
Dec 01 10:03:51 crc kubenswrapper[4958]: I1201 10:03:51.991183 4958 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a19e0dac-64a6-4b41-80e7-cc90db58399d"
Dec 01 10:03:51 crc kubenswrapper[4958]: I1201 10:03:51.997624 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 10:03:51 crc kubenswrapper[4958]: I1201 10:03:51.999021 4958 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a9cc726d-46be-47c8-ae00-bfbea85c3072"
Dec 01 10:03:52 crc kubenswrapper[4958]: I1201 10:03:52.999083 4958 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a19e0dac-64a6-4b41-80e7-cc90db58399d"
Dec 01 10:03:52 crc kubenswrapper[4958]: I1201 10:03:52.999568 4958 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a19e0dac-64a6-4b41-80e7-cc90db58399d"
Dec 01 10:03:53 crc kubenswrapper[4958]: I1201 10:03:53.004709 4958 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a9cc726d-46be-47c8-ae00-bfbea85c3072"
Dec 01 10:03:58 crc kubenswrapper[4958]: I1201 10:03:58.083996 4958 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 01 10:03:58 crc kubenswrapper[4958]: I1201 10:03:58.084671 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 01 10:04:01 crc kubenswrapper[4958]: I1201 10:04:01.563697 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 01 10:04:01 crc kubenswrapper[4958]: I1201 10:04:01.584983 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Dec 01 10:04:01 crc kubenswrapper[4958]: I1201 10:04:01.603424 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 01 10:04:01 crc kubenswrapper[4958]: I1201 10:04:01.671202 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 01 10:04:01 crc kubenswrapper[4958]: I1201 10:04:01.739819 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 01 10:04:01 crc kubenswrapper[4958]: I1201 10:04:01.868936 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 01 10:04:01 crc kubenswrapper[4958]: I1201 10:04:01.989211 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 01 10:04:02 crc kubenswrapper[4958]: I1201 10:04:02.308462 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Dec 01 10:04:02 crc kubenswrapper[4958]: I1201 10:04:02.438584 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 01 10:04:02 crc kubenswrapper[4958]: I1201 10:04:02.516589 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 01 10:04:02 crc kubenswrapper[4958]: I1201 10:04:02.607548 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 01 10:04:03 crc kubenswrapper[4958]: I1201 10:04:03.428653 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 01 10:04:03 crc kubenswrapper[4958]: I1201 10:04:03.734025 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 01 10:04:03 crc kubenswrapper[4958]: I1201 10:04:03.802783 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Dec 01 10:04:03 crc kubenswrapper[4958]: I1201 10:04:03.825831 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 01 10:04:03 crc kubenswrapper[4958]: I1201 10:04:03.837469 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 01 10:04:03 crc kubenswrapper[4958]: I1201 10:04:03.888951 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 01 10:04:03 crc kubenswrapper[4958]: I1201 10:04:03.888951 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Dec 01 10:04:03 crc kubenswrapper[4958]: I1201 10:04:03.894006 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 01 10:04:04 crc kubenswrapper[4958]: I1201 10:04:04.028059 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 01 10:04:04 crc kubenswrapper[4958]: I1201 10:04:04.106389 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 01 10:04:04 crc kubenswrapper[4958]: I1201 10:04:04.143612 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 01 10:04:04 crc kubenswrapper[4958]: I1201 10:04:04.174385 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 01 10:04:04 crc kubenswrapper[4958]: I1201 10:04:04.322311 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 01 10:04:04 crc kubenswrapper[4958]: I1201 10:04:04.405236 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 01 10:04:04 crc kubenswrapper[4958]: I1201 10:04:04.440944 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 01 10:04:04 crc kubenswrapper[4958]: I1201 10:04:04.442484 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 01 10:04:04 crc kubenswrapper[4958]: I1201 10:04:04.469411 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 01 10:04:04 crc kubenswrapper[4958]: I1201 10:04:04.496104 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 01 10:04:04 crc kubenswrapper[4958]: I1201 10:04:04.542278 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Dec 01 10:04:04 crc kubenswrapper[4958]: I1201 10:04:04.708312 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 01 10:04:04 crc kubenswrapper[4958]: I1201 10:04:04.821663 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 01 10:04:04 crc kubenswrapper[4958]: I1201 10:04:04.935450 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Dec 01 10:04:04 crc kubenswrapper[4958]: I1201 10:04:04.977468 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 01 10:04:05 crc kubenswrapper[4958]: I1201 10:04:05.072063 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Dec 01 10:04:05 crc kubenswrapper[4958]: I1201 10:04:05.081343 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 01 10:04:05 crc kubenswrapper[4958]: I1201 10:04:05.143018 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 01 10:04:05 crc kubenswrapper[4958]: I1201 10:04:05.193640 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Dec 01 10:04:05 crc kubenswrapper[4958]: I1201 10:04:05.197003 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 01 10:04:05 crc kubenswrapper[4958]: I1201 10:04:05.254028 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 01 10:04:05 crc kubenswrapper[4958]: I1201 10:04:05.317611 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 01 10:04:05 crc kubenswrapper[4958]: I1201 10:04:05.392005 4958 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 01 10:04:05 crc kubenswrapper[4958]: I1201 10:04:05.413905 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 01 10:04:05 crc kubenswrapper[4958]: I1201 10:04:05.419930 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Dec 01 10:04:05 crc kubenswrapper[4958]: I1201 10:04:05.433681 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 01 10:04:05 crc kubenswrapper[4958]: I1201 10:04:05.572888 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 01 10:04:05 crc kubenswrapper[4958]: I1201 10:04:05.679683 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Dec 01 10:04:05 crc kubenswrapper[4958]: I1201 10:04:05.681211 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 01 10:04:05 crc kubenswrapper[4958]: I1201 10:04:05.813055 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 01 10:04:05 crc kubenswrapper[4958]: I1201 10:04:05.929763 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 01 10:04:05 crc kubenswrapper[4958]: I1201 10:04:05.935660 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Dec 01 10:04:06 crc kubenswrapper[4958]: I1201 10:04:06.089944 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Dec 01 10:04:06 crc kubenswrapper[4958]: I1201 10:04:06.121603 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Dec 01 10:04:06 crc kubenswrapper[4958]: I1201 10:04:06.197088 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 01 10:04:06 crc kubenswrapper[4958]: I1201 10:04:06.206316 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 01 10:04:06 crc kubenswrapper[4958]: I1201 10:04:06.486654 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 01 10:04:06 crc kubenswrapper[4958]: I1201 10:04:06.522311 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 01 10:04:06 crc kubenswrapper[4958]: I1201 10:04:06.590971 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 01 10:04:06 crc kubenswrapper[4958]: I1201 10:04:06.607719 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Dec 01 10:04:06 crc kubenswrapper[4958]: I1201 10:04:06.688509 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 01 10:04:06 crc kubenswrapper[4958]: I1201 10:04:06.694373 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 01 10:04:06 crc kubenswrapper[4958]: I1201 10:04:06.777814 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 01 10:04:06 crc kubenswrapper[4958]: I1201 10:04:06.782199 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 01 10:04:06 crc kubenswrapper[4958]: I1201 10:04:06.862290 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 01 10:04:06 crc kubenswrapper[4958]: I1201 10:04:06.886334 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Dec 01 10:04:06 crc kubenswrapper[4958]: I1201 10:04:06.988323 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.024703 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.113863 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.116991 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.152578 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.274343 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.305753 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.318925 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.382727 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.384977 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.404489 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.441062 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.502592 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.542549 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.552701 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.580383 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.602971 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.611007 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.630023 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.659129 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.676292 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.676398 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.710887 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.747346 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.774294 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.796592 4958 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.869230 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.873429 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Dec 01 10:04:07 crc kubenswrapper[4958]: I1201 10:04:07.918335 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.088589 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.093906 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.113385 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.355014 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.405488 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.412046 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.429447 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.591357 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.593269 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.593756 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.596776 4958 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.598151 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=41.598119335 podStartE2EDuration="41.598119335s" podCreationTimestamp="2025-12-01 10:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:03:51.136462225 +0000 UTC m=+278.645251262" watchObservedRunningTime="2025-12-01 10:04:08.598119335 +0000 UTC m=+296.106908372"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.603537 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-marketplace/redhat-operators-2w7kl"]
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.603642 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"]
Dec 01 10:04:08 crc kubenswrapper[4958]: E1201 10:04:08.604009 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5867f258-1174-4cc1-b035-a68f57724d88" containerName="installer"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.604035 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="5867f258-1174-4cc1-b035-a68f57724d88" containerName="installer"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.604168 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="5867f258-1174-4cc1-b035-a68f57724d88" containerName="installer"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.604169 4958 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a19e0dac-64a6-4b41-80e7-cc90db58399d"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.604223 4958 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a19e0dac-64a6-4b41-80e7-cc90db58399d"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.604872 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.610120 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.610369 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.610620 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.610698 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.610917 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.610954 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.611416 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.611502 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.611419 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.612312 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.613267 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.614125 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.616594 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.617881 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.626583 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.627159 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.664234 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.666017 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=18.665995954 podStartE2EDuration="18.665995954s" podCreationTimestamp="2025-12-01 10:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:04:08.660737792 +0000 UTC m=+296.169526849" watchObservedRunningTime="2025-12-01 10:04:08.665995954 +0000 UTC m=+296.174784991"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.701385 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-router-certs\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.701479 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-audit-policies\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.701525 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-audit-dir\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.701555 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-user-template-login\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.701583 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-service-ca\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.701616 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.701654 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2zcx\" (UniqueName: \"kubernetes.io/projected/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-kube-api-access-l2zcx\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.701690 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.701742 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-session\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.701767 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-user-template-error\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.701793 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.701819 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.701856 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.701879 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.794549 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.806810 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.806930 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-session\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.806965 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-user-template-error\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.808255 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.808320 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.808341 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.808369 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.808402 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-router-certs\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.808444 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-audit-policies\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.808466 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-audit-dir\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.808494 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-user-template-login\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.808517 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-service-ca\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.808536 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.808577 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2zcx\" (UniqueName: \"kubernetes.io/projected/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-kube-api-access-l2zcx\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.808612 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-audit-dir\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.809480 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-audit-policies\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.809710 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.810264 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.813128 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-service-ca\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.815745 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-router-certs\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.815873 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.816779 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-user-template-error\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.817208 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-session\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.817596 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.817633 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.817713 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-user-template-login\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.818151 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.822306 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.829019 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2zcx\" (UniqueName: \"kubernetes.io/projected/3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a-kube-api-access-l2zcx\") pod \"oauth-openshift-9d745f8b5-xhrxp\" (UID: \"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a\") " pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.909120 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 01 10:04:08 crc kubenswrapper[4958]: I1201 10:04:08.935914 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.028315 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.160816 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.170498 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.202360 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.243498 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.272109 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.381980 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.409735 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.456391 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.483884 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.484013 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.569673 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.585532 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.641282 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.690888 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.774035 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.808607 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15c34271-3e36-4e77-bc38-f06b06d499f6" path="/var/lib/kubelet/pods/15c34271-3e36-4e77-bc38-f06b06d499f6/volumes"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.854577 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.855447 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.943602 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 01 10:04:09 crc kubenswrapper[4958]: I1201 10:04:09.973020 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.062082 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.079758 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.107523 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.234284 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.242610 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.280323 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.323744 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.329782 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.438225 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"]
Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.475243 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.535768 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.538012 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.606582 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.666787 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.667055 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.816258 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api"
pods=["openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"] Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.854451 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.980005 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.981775 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 01 10:04:10 crc kubenswrapper[4958]: I1201 10:04:10.991366 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.063566 4958 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.109372 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" event={"ID":"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a","Type":"ContainerStarted","Data":"875f4d741dd5a7fd1b32cd1ed3b6a306547aab2c922c76601c0a3cbf70d55869"} Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.127062 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.234723 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.238141 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.256128 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.305246 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.343536 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.354360 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.362667 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.398341 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.546638 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.655340 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.687804 4958 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-stats-default" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.707684 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.745411 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.745593 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.803778 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.917420 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 10:04:11 crc kubenswrapper[4958]: I1201 10:04:11.961143 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.092656 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.111205 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.117133 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.118061 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-9d745f8b5-xhrxp_3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a/oauth-openshift/0.log" Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.118108 4958 generic.go:334] "Generic (PLEG): container finished" podID="3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a" containerID="af7e5a10d822ece1369f0759691e960df648c7149e7b71c22f2423660f0dc347" exitCode=255 Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.118150 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" event={"ID":"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a","Type":"ContainerDied","Data":"af7e5a10d822ece1369f0759691e960df648c7149e7b71c22f2423660f0dc347"} Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.118790 4958 scope.go:117] "RemoveContainer" containerID="af7e5a10d822ece1369f0759691e960df648c7149e7b71c22f2423660f0dc347" Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.162513 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.170773 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.188619 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.247015 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 
Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.344645 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.432728 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.439307 4958 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.439697 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://038d2043afaf16e0f06af36db39d335e5ad60abb5693fc63b8017702601adc5d" gracePeriod=5
Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.448579 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.485621 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.511079 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.592796 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.655751 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 01 10:04:12 crc kubenswrapper[4958]: I1201 10:04:12.795908 4958 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.125502 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-9d745f8b5-xhrxp_3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a/oauth-openshift/0.log"
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.125595 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" event={"ID":"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a","Type":"ContainerStarted","Data":"3028e478c55918202fcda8acddb4eec345086afee03d8d8935a2c398a47ed0f5"}
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.126081 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp"
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.152816 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" podStartSLOduration=76.152787229 podStartE2EDuration="1m16.152787229s" podCreationTimestamp="2025-12-01 10:02:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:04:13.147826667 +0000 UTC m=+300.656615724" watchObservedRunningTime="2025-12-01 10:04:13.152787229 +0000 UTC m=+300.661576266"
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.183104 4958 patch_prober.go:28] interesting pod/oauth-openshift-9d745f8b5-xhrxp container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": read tcp 10.217.0.2:57210->10.217.0.56:6443: read: connection reset by peer" start-of-body=
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.183229 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" podUID="3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": read tcp 10.217.0.2:57210->10.217.0.56:6443: read: connection reset by peer"
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.203453 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.248299 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.356253 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.362464 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.367258 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.421523 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.436478 4958 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.454394 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.536289 4958 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.563705 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.641670 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.810045 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Dec 01 10:04:13 crc kubenswrapper[4958]: I1201 10:04:13.846244 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 01 10:04:14 crc kubenswrapper[4958]: I1201 10:04:14.046192 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 01 10:04:14 crc kubenswrapper[4958]: I1201 10:04:14.133002 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-9d745f8b5-xhrxp_3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a/oauth-openshift/1.log"
Dec 01 10:04:14 crc kubenswrapper[4958]: I1201 10:04:14.133534 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-9d745f8b5-xhrxp_3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a/oauth-openshift/0.log"
Dec 01 10:04:14 crc kubenswrapper[4958]: I1201 10:04:14.133592 4958 generic.go:334] "Generic (PLEG): container finished" podID="3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a" containerID="3028e478c55918202fcda8acddb4eec345086afee03d8d8935a2c398a47ed0f5" exitCode=255
Dec 01 10:04:14 crc kubenswrapper[4958]: I1201 10:04:14.133642 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" event={"ID":"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a","Type":"ContainerDied","Data":"3028e478c55918202fcda8acddb4eec345086afee03d8d8935a2c398a47ed0f5"}
Dec 01 10:04:14 crc kubenswrapper[4958]: I1201 10:04:14.133697 4958 scope.go:117] "RemoveContainer" containerID="af7e5a10d822ece1369f0759691e960df648c7149e7b71c22f2423660f0dc347"
Dec 01 10:04:14 crc kubenswrapper[4958]: I1201 10:04:14.134378 4958 scope.go:117] "RemoveContainer" containerID="3028e478c55918202fcda8acddb4eec345086afee03d8d8935a2c398a47ed0f5"
Dec 01 10:04:14 crc kubenswrapper[4958]: E1201 10:04:14.134799 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-9d745f8b5-xhrxp_openshift-authentication(3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a)\"" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" podUID="3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a"
Dec 01 10:04:14 crc kubenswrapper[4958]: I1201 10:04:14.256165 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 01 10:04:14 crc kubenswrapper[4958]: I1201 10:04:14.362838 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 01 10:04:14 crc kubenswrapper[4958]: I1201 10:04:14.390279 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 01 10:04:14 crc kubenswrapper[4958]: I1201 10:04:14.569285 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 01 10:04:14 crc kubenswrapper[4958]: I1201 10:04:14.576428 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 01 10:04:14 crc kubenswrapper[4958]: I1201 10:04:14.604223 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Dec 01 10:04:14 crc kubenswrapper[4958]: I1201 10:04:14.718576 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 01 10:04:14 crc kubenswrapper[4958]: I1201 10:04:14.752458 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 01 10:04:14 crc kubenswrapper[4958]: I1201 10:04:14.920524 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Dec 01 10:04:14 crc kubenswrapper[4958]: I1201 10:04:14.955182 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 01 10:04:15 crc kubenswrapper[4958]: I1201 10:04:15.089934 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Dec 01 10:04:15 crc kubenswrapper[4958]: I1201 10:04:15.140958 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-9d745f8b5-xhrxp_3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a/oauth-openshift/1.log"
Dec 01 10:04:15 crc kubenswrapper[4958]: I1201 10:04:15.141825 4958 scope.go:117] "RemoveContainer" containerID="3028e478c55918202fcda8acddb4eec345086afee03d8d8935a2c398a47ed0f5"
Dec 01 10:04:15 crc kubenswrapper[4958]: E1201 10:04:15.142118 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-9d745f8b5-xhrxp_openshift-authentication(3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a)\"" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" podUID="3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a"
Dec 01 10:04:15 crc kubenswrapper[4958]: I1201 10:04:15.283215 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 01 10:04:15 crc kubenswrapper[4958]: I1201 10:04:15.452130 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Dec 01 10:04:15 crc kubenswrapper[4958]: I1201 10:04:15.462366 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 01 10:04:15 crc kubenswrapper[4958]: I1201 10:04:15.474975 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 01 10:04:15 crc kubenswrapper[4958]: I1201 10:04:15.559700 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 01 10:04:15 crc kubenswrapper[4958]: I1201 10:04:15.642996 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 01 10:04:15 crc kubenswrapper[4958]: I1201 10:04:15.704667 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Dec 01 10:04:15 crc kubenswrapper[4958]: I1201 10:04:15.795571 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 01 10:04:16 crc kubenswrapper[4958]: I1201 10:04:16.018489 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 01 10:04:16 crc kubenswrapper[4958]: I1201 10:04:16.084519 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Dec 01 10:04:16 crc kubenswrapper[4958]: I1201 10:04:16.228310 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 01 10:04:16 crc kubenswrapper[4958]: I1201 10:04:16.267704 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 01 10:04:16 crc kubenswrapper[4958]: I1201 10:04:16.356658 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 01 10:04:16 crc kubenswrapper[4958]: I1201 10:04:16.393904 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 01 10:04:16 crc kubenswrapper[4958]: I1201 10:04:16.406352 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 01 10:04:16 crc kubenswrapper[4958]: I1201 10:04:16.502977 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 01 10:04:16 crc kubenswrapper[4958]: I1201 10:04:16.582240 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 01 10:04:16 crc kubenswrapper[4958]: I1201 10:04:16.698528 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 01 10:04:16 crc kubenswrapper[4958]: I1201 10:04:16.747591 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Dec 01 10:04:17 crc kubenswrapper[4958]: I1201 10:04:17.126295 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 01 10:04:17 crc kubenswrapper[4958]: I1201 10:04:17.352153 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 01 10:04:17 crc kubenswrapper[4958]: E1201 10:04:17.621991 4958 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-038d2043afaf16e0f06af36db39d335e5ad60abb5693fc63b8017702601adc5d.scope\": RecentStats: unable to find data in memory cache]"
Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.021817 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.022244 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.159181 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.159252 4958 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="038d2043afaf16e0f06af36db39d335e5ad60abb5693fc63b8017702601adc5d" exitCode=137
Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.159322 4958 scope.go:117] "RemoveContainer" containerID="038d2043afaf16e0f06af36db39d335e5ad60abb5693fc63b8017702601adc5d"
Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.159355 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.173634 4958 scope.go:117] "RemoveContainer" containerID="038d2043afaf16e0f06af36db39d335e5ad60abb5693fc63b8017702601adc5d"
Dec 01 10:04:18 crc kubenswrapper[4958]: E1201 10:04:18.174202 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"038d2043afaf16e0f06af36db39d335e5ad60abb5693fc63b8017702601adc5d\": container with ID starting with 038d2043afaf16e0f06af36db39d335e5ad60abb5693fc63b8017702601adc5d not found: ID does not exist" containerID="038d2043afaf16e0f06af36db39d335e5ad60abb5693fc63b8017702601adc5d"
Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.174257 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"038d2043afaf16e0f06af36db39d335e5ad60abb5693fc63b8017702601adc5d"} err="failed to get container status \"038d2043afaf16e0f06af36db39d335e5ad60abb5693fc63b8017702601adc5d\": rpc error: code = NotFound desc = could not find container \"038d2043afaf16e0f06af36db39d335e5ad60abb5693fc63b8017702601adc5d\": container with ID starting with 038d2043afaf16e0f06af36db39d335e5ad60abb5693fc63b8017702601adc5d not found: ID does not exist"
Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.177027 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.177131 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.177170 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.177219 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.177237 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.177571 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.177613 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.177691 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.177685 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.197221 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.279289 4958 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.279343 4958 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.279355 4958 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.279364 4958 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.279373 4958 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.937135 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" Dec 01 10:04:18 crc kubenswrapper[4958]: I1201 10:04:18.938114 4958 scope.go:117] "RemoveContainer" containerID="3028e478c55918202fcda8acddb4eec345086afee03d8d8935a2c398a47ed0f5" Dec 01 10:04:18 crc kubenswrapper[4958]: E1201 10:04:18.938350 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-9d745f8b5-xhrxp_openshift-authentication(3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a)\"" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" podUID="3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a" Dec 01 10:04:19 crc kubenswrapper[4958]: I1201 10:04:19.804979 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 01 10:04:19 crc kubenswrapper[4958]: I1201 10:04:19.805818 4958 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 01 10:04:19 crc kubenswrapper[4958]: I1201 10:04:19.818065 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 10:04:19 crc kubenswrapper[4958]: I1201 10:04:19.818121 4958 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="54b67dbf-6e93-48a8-b77c-26fb7ec9ef5d" Dec 01 10:04:19 crc kubenswrapper[4958]: I1201 10:04:19.822215 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 10:04:19 crc kubenswrapper[4958]: I1201 10:04:19.822264 4958 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="54b67dbf-6e93-48a8-b77c-26fb7ec9ef5d" Dec 01 10:04:27 crc kubenswrapper[4958]: I1201 10:04:27.990658 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 10:04:29 crc kubenswrapper[4958]: I1201 10:04:29.546286 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 10:04:29 crc kubenswrapper[4958]: I1201 10:04:29.797905 4958 scope.go:117] "RemoveContainer" containerID="3028e478c55918202fcda8acddb4eec345086afee03d8d8935a2c398a47ed0f5" Dec 01 10:04:30 crc kubenswrapper[4958]: I1201 10:04:30.237762 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-9d745f8b5-xhrxp_3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a/oauth-openshift/1.log" Dec 01 10:04:30 crc kubenswrapper[4958]: I1201 10:04:30.238290 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" event={"ID":"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a","Type":"ContainerStarted","Data":"662f2e2d0f482f187c21da372522f15c70b231b44d71eddca60701f98004547a"} Dec 01 10:04:30 crc kubenswrapper[4958]: I1201 10:04:30.239048 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" Dec 01 10:04:30 crc kubenswrapper[4958]: I1201 10:04:30.361243 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 10:04:30 crc kubenswrapper[4958]: I1201 10:04:30.405252 4958 patch_prober.go:28] interesting pod/oauth-openshift-9d745f8b5-xhrxp container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": read tcp 10.217.0.2:58086->10.217.0.56:6443: read: connection reset by peer" start-of-body= Dec 01 10:04:30 crc 
kubenswrapper[4958]: I1201 10:04:30.405344 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" podUID="3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": read tcp 10.217.0.2:58086->10.217.0.56:6443: read: connection reset by peer" Dec 01 10:04:31 crc kubenswrapper[4958]: I1201 10:04:31.246381 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-9d745f8b5-xhrxp_3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a/oauth-openshift/2.log" Dec 01 10:04:31 crc kubenswrapper[4958]: I1201 10:04:31.249584 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-9d745f8b5-xhrxp_3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a/oauth-openshift/1.log" Dec 01 10:04:31 crc kubenswrapper[4958]: I1201 10:04:31.249656 4958 generic.go:334] "Generic (PLEG): container finished" podID="3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a" containerID="662f2e2d0f482f187c21da372522f15c70b231b44d71eddca60701f98004547a" exitCode=255 Dec 01 10:04:31 crc kubenswrapper[4958]: I1201 10:04:31.249695 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" event={"ID":"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a","Type":"ContainerDied","Data":"662f2e2d0f482f187c21da372522f15c70b231b44d71eddca60701f98004547a"} Dec 01 10:04:31 crc kubenswrapper[4958]: I1201 10:04:31.249740 4958 scope.go:117] "RemoveContainer" containerID="3028e478c55918202fcda8acddb4eec345086afee03d8d8935a2c398a47ed0f5" Dec 01 10:04:31 crc kubenswrapper[4958]: I1201 10:04:31.250804 4958 scope.go:117] "RemoveContainer" containerID="662f2e2d0f482f187c21da372522f15c70b231b44d71eddca60701f98004547a" Dec 01 10:04:31 crc kubenswrapper[4958]: E1201 10:04:31.251120 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-9d745f8b5-xhrxp_openshift-authentication(3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a)\"" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" podUID="3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a" Dec 01 10:04:32 crc kubenswrapper[4958]: I1201 10:04:32.078965 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 10:04:32 crc kubenswrapper[4958]: I1201 10:04:32.258727 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-9d745f8b5-xhrxp_3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a/oauth-openshift/2.log" Dec 01 10:04:32 crc kubenswrapper[4958]: I1201 10:04:32.259589 4958 scope.go:117] "RemoveContainer" containerID="662f2e2d0f482f187c21da372522f15c70b231b44d71eddca60701f98004547a" Dec 01 10:04:32 crc kubenswrapper[4958]: E1201 10:04:32.259819 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-9d745f8b5-xhrxp_openshift-authentication(3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a)\"" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" podUID="3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a" Dec 01 10:04:38 crc kubenswrapper[4958]: I1201 10:04:38.937133 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" Dec 01 10:04:38 crc kubenswrapper[4958]: I1201 10:04:38.938679 4958 scope.go:117] "RemoveContainer" containerID="662f2e2d0f482f187c21da372522f15c70b231b44d71eddca60701f98004547a" Dec 01 10:04:38 crc kubenswrapper[4958]: E1201 10:04:38.939072 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-9d745f8b5-xhrxp_openshift-authentication(3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a)\"" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" podUID="3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a" Dec 01 10:04:40 crc kubenswrapper[4958]: I1201 10:04:40.728933 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 10:04:41 crc kubenswrapper[4958]: I1201 10:04:41.541418 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 10:04:45 crc kubenswrapper[4958]: I1201 10:04:45.737775 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 01 10:04:47 crc kubenswrapper[4958]: I1201 10:04:47.041019 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 10:04:51 crc kubenswrapper[4958]: I1201 10:04:51.847782 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 10:04:52 crc kubenswrapper[4958]: I1201 10:04:52.369020 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 10:04:53 crc kubenswrapper[4958]: I1201 10:04:53.801030 4958 scope.go:117] "RemoveContainer" containerID="662f2e2d0f482f187c21da372522f15c70b231b44d71eddca60701f98004547a" Dec 01 10:04:54 crc kubenswrapper[4958]: I1201 10:04:54.401012 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-9d745f8b5-xhrxp_3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a/oauth-openshift/2.log" Dec 01 10:04:54 crc kubenswrapper[4958]: I1201 10:04:54.401915 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" event={"ID":"3c0ec892-9bf6-4f2b-b5cb-5f20f7aad99a","Type":"ContainerStarted","Data":"126ddc1404933c9f61e0c0024f2fce3730c7ac1be67d224a970fbc49b79a2f28"} Dec 01 10:04:54 crc kubenswrapper[4958]: I1201 10:04:54.402581 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" Dec 01 10:04:54 crc kubenswrapper[4958]: I1201 10:04:54.408178 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-9d745f8b5-xhrxp" Dec 01 10:04:55 crc kubenswrapper[4958]: I1201 10:04:55.388323 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 10:04:57 crc kubenswrapper[4958]: I1201 10:04:57.118763 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 01 10:04:57 crc kubenswrapper[4958]: I1201 10:04:57.420415 4958 generic.go:334] "Generic (PLEG): container finished" podID="fc711edd-1026-4f47-ab7f-92bd6e1fb964" 
containerID="70c4a51cee69f977e551d4ceb94bec24d6afb0627929b0d25855d6c7450fa721" exitCode=0 Dec 01 10:04:57 crc kubenswrapper[4958]: I1201 10:04:57.420489 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" event={"ID":"fc711edd-1026-4f47-ab7f-92bd6e1fb964","Type":"ContainerDied","Data":"70c4a51cee69f977e551d4ceb94bec24d6afb0627929b0d25855d6c7450fa721"} Dec 01 10:04:57 crc kubenswrapper[4958]: I1201 10:04:57.421326 4958 scope.go:117] "RemoveContainer" containerID="70c4a51cee69f977e551d4ceb94bec24d6afb0627929b0d25855d6c7450fa721" Dec 01 10:04:58 crc kubenswrapper[4958]: I1201 10:04:58.431509 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" event={"ID":"fc711edd-1026-4f47-ab7f-92bd6e1fb964","Type":"ContainerStarted","Data":"53b61920890014ab54b7f45823e67d89fe098871b9eebac21eb46e5a7e3f5d45"} Dec 01 10:04:58 crc kubenswrapper[4958]: I1201 10:04:58.432812 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" Dec 01 10:04:58 crc kubenswrapper[4958]: I1201 10:04:58.434937 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" Dec 01 10:05:20 crc kubenswrapper[4958]: I1201 10:05:20.949450 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nqfvf"] Dec 01 10:05:20 crc kubenswrapper[4958]: I1201 10:05:20.950442 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nqfvf" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" containerName="registry-server" containerID="cri-o://559d1537f74668496e8e088cac2e8d4cd3b28c92658042a163402fa934163b8f" gracePeriod=2 Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.321119 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqfvf" Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.482132 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40c8862e-a0c9-4bdc-9625-c5384c076732-utilities\") pod \"40c8862e-a0c9-4bdc-9625-c5384c076732\" (UID: \"40c8862e-a0c9-4bdc-9625-c5384c076732\") " Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.482296 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40c8862e-a0c9-4bdc-9625-c5384c076732-catalog-content\") pod \"40c8862e-a0c9-4bdc-9625-c5384c076732\" (UID: \"40c8862e-a0c9-4bdc-9625-c5384c076732\") " Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.482359 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvbpx\" (UniqueName: \"kubernetes.io/projected/40c8862e-a0c9-4bdc-9625-c5384c076732-kube-api-access-pvbpx\") pod \"40c8862e-a0c9-4bdc-9625-c5384c076732\" (UID: \"40c8862e-a0c9-4bdc-9625-c5384c076732\") " Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.483061 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40c8862e-a0c9-4bdc-9625-c5384c076732-utilities" (OuterVolumeSpecName: "utilities") pod "40c8862e-a0c9-4bdc-9625-c5384c076732" (UID: "40c8862e-a0c9-4bdc-9625-c5384c076732"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.491376 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c8862e-a0c9-4bdc-9625-c5384c076732-kube-api-access-pvbpx" (OuterVolumeSpecName: "kube-api-access-pvbpx") pod "40c8862e-a0c9-4bdc-9625-c5384c076732" (UID: "40c8862e-a0c9-4bdc-9625-c5384c076732"). InnerVolumeSpecName "kube-api-access-pvbpx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.543358 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40c8862e-a0c9-4bdc-9625-c5384c076732-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40c8862e-a0c9-4bdc-9625-c5384c076732" (UID: "40c8862e-a0c9-4bdc-9625-c5384c076732"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.572450 4958 generic.go:334] "Generic (PLEG): container finished" podID="40c8862e-a0c9-4bdc-9625-c5384c076732" containerID="559d1537f74668496e8e088cac2e8d4cd3b28c92658042a163402fa934163b8f" exitCode=0
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.572526 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqfvf" event={"ID":"40c8862e-a0c9-4bdc-9625-c5384c076732","Type":"ContainerDied","Data":"559d1537f74668496e8e088cac2e8d4cd3b28c92658042a163402fa934163b8f"}
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.572624 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqfvf" event={"ID":"40c8862e-a0c9-4bdc-9625-c5384c076732","Type":"ContainerDied","Data":"aebfc997b2bf47670bc1d9bb44d5b14f92c23155b3c30f16e349e6d15318dc72"}
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.572663 4958 scope.go:117] "RemoveContainer" containerID="559d1537f74668496e8e088cac2e8d4cd3b28c92658042a163402fa934163b8f"
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.572703 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqfvf"
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.583481 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40c8862e-a0c9-4bdc-9625-c5384c076732-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.583517 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40c8862e-a0c9-4bdc-9625-c5384c076732-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.583530 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvbpx\" (UniqueName: \"kubernetes.io/projected/40c8862e-a0c9-4bdc-9625-c5384c076732-kube-api-access-pvbpx\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.594152 4958 scope.go:117] "RemoveContainer" containerID="114c5b48a72bea0c8ad6d721a2e1c7cb920f78ff4b02f12d03317d872c324e7b"
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.629007 4958 scope.go:117] "RemoveContainer" containerID="bd91dd6051e92eaec2b5b7da0b927aa8818102d7608ba10dba05f9179fcd2a20"
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.632160 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nqfvf"]
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.639558 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nqfvf"]
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.645823 4958 scope.go:117] "RemoveContainer" containerID="559d1537f74668496e8e088cac2e8d4cd3b28c92658042a163402fa934163b8f"
Dec 01 10:05:21 crc kubenswrapper[4958]: E1201 10:05:21.646413 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"559d1537f74668496e8e088cac2e8d4cd3b28c92658042a163402fa934163b8f\": container with ID starting with 559d1537f74668496e8e088cac2e8d4cd3b28c92658042a163402fa934163b8f not found: ID does not exist" containerID="559d1537f74668496e8e088cac2e8d4cd3b28c92658042a163402fa934163b8f"
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.646474 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"559d1537f74668496e8e088cac2e8d4cd3b28c92658042a163402fa934163b8f"} err="failed to get container status \"559d1537f74668496e8e088cac2e8d4cd3b28c92658042a163402fa934163b8f\": rpc error: code = NotFound desc = could not find container \"559d1537f74668496e8e088cac2e8d4cd3b28c92658042a163402fa934163b8f\": container with ID starting with 559d1537f74668496e8e088cac2e8d4cd3b28c92658042a163402fa934163b8f not found: ID does not exist"
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.646703 4958 scope.go:117] "RemoveContainer" containerID="114c5b48a72bea0c8ad6d721a2e1c7cb920f78ff4b02f12d03317d872c324e7b"
Dec 01 10:05:21 crc kubenswrapper[4958]: E1201 10:05:21.647454 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"114c5b48a72bea0c8ad6d721a2e1c7cb920f78ff4b02f12d03317d872c324e7b\": container with ID starting with 114c5b48a72bea0c8ad6d721a2e1c7cb920f78ff4b02f12d03317d872c324e7b not found: ID does not exist" containerID="114c5b48a72bea0c8ad6d721a2e1c7cb920f78ff4b02f12d03317d872c324e7b"
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.647661 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"114c5b48a72bea0c8ad6d721a2e1c7cb920f78ff4b02f12d03317d872c324e7b"} err="failed to get container status \"114c5b48a72bea0c8ad6d721a2e1c7cb920f78ff4b02f12d03317d872c324e7b\": rpc error: code = NotFound desc = could not find container \"114c5b48a72bea0c8ad6d721a2e1c7cb920f78ff4b02f12d03317d872c324e7b\": container with ID starting with 114c5b48a72bea0c8ad6d721a2e1c7cb920f78ff4b02f12d03317d872c324e7b not found: ID does not exist"
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.647831 4958 scope.go:117] "RemoveContainer" containerID="bd91dd6051e92eaec2b5b7da0b927aa8818102d7608ba10dba05f9179fcd2a20"
Dec 01 10:05:21 crc kubenswrapper[4958]: E1201 10:05:21.648525 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd91dd6051e92eaec2b5b7da0b927aa8818102d7608ba10dba05f9179fcd2a20\": container with ID starting with bd91dd6051e92eaec2b5b7da0b927aa8818102d7608ba10dba05f9179fcd2a20 not found: ID does not exist" containerID="bd91dd6051e92eaec2b5b7da0b927aa8818102d7608ba10dba05f9179fcd2a20"
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.648569 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd91dd6051e92eaec2b5b7da0b927aa8818102d7608ba10dba05f9179fcd2a20"} err="failed to get container status \"bd91dd6051e92eaec2b5b7da0b927aa8818102d7608ba10dba05f9179fcd2a20\": rpc error: code = NotFound desc = could not find container \"bd91dd6051e92eaec2b5b7da0b927aa8818102d7608ba10dba05f9179fcd2a20\": container with ID starting with bd91dd6051e92eaec2b5b7da0b927aa8818102d7608ba10dba05f9179fcd2a20 not found: ID does not exist"
Dec 01 10:05:21 crc kubenswrapper[4958]: I1201 10:05:21.804203 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" path="/var/lib/kubelet/pods/40c8862e-a0c9-4bdc-9625-c5384c076732/volumes"
Dec 01 10:05:28 crc kubenswrapper[4958]: I1201 10:05:28.210085 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 10:05:28 crc kubenswrapper[4958]: I1201 10:05:28.210553 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 10:05:28 crc kubenswrapper[4958]: I1201 10:05:28.846563 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9f7tt"]
Dec 01 10:05:28 crc kubenswrapper[4958]: I1201 10:05:28.847401 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" podUID="478689fc-5c07-45bc-ab87-8c58e68c348b" containerName="controller-manager" containerID="cri-o://2664bb00c03ef18744a7319f577d5b9df6a805dbe151389b1366466c6496604b" gracePeriod=30
Dec 01 10:05:28 crc kubenswrapper[4958]: I1201 10:05:28.895701 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf"]
Dec 01 10:05:28 crc kubenswrapper[4958]: I1201 10:05:28.896057 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" podUID="ea105b82-fec3-4f5d-b056-fdd566619645" containerName="route-controller-manager" containerID="cri-o://5d6d33216058c33b2bfbc64185d41787097ba70ba1d8810539171d12bd3d951d" gracePeriod=30
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.282529 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.348704 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.400552 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7z78\" (UniqueName: \"kubernetes.io/projected/478689fc-5c07-45bc-ab87-8c58e68c348b-kube-api-access-g7z78\") pod \"478689fc-5c07-45bc-ab87-8c58e68c348b\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") "
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.400602 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/478689fc-5c07-45bc-ab87-8c58e68c348b-serving-cert\") pod \"478689fc-5c07-45bc-ab87-8c58e68c348b\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") "
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.400672 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/478689fc-5c07-45bc-ab87-8c58e68c348b-config\") pod \"478689fc-5c07-45bc-ab87-8c58e68c348b\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") "
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.400759 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/478689fc-5c07-45bc-ab87-8c58e68c348b-proxy-ca-bundles\") pod \"478689fc-5c07-45bc-ab87-8c58e68c348b\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") "
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.400805 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/478689fc-5c07-45bc-ab87-8c58e68c348b-client-ca\") pod \"478689fc-5c07-45bc-ab87-8c58e68c348b\" (UID: \"478689fc-5c07-45bc-ab87-8c58e68c348b\") "
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.401728 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/478689fc-5c07-45bc-ab87-8c58e68c348b-client-ca" (OuterVolumeSpecName: "client-ca") pod "478689fc-5c07-45bc-ab87-8c58e68c348b" (UID: "478689fc-5c07-45bc-ab87-8c58e68c348b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.403977 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/478689fc-5c07-45bc-ab87-8c58e68c348b-config" (OuterVolumeSpecName: "config") pod "478689fc-5c07-45bc-ab87-8c58e68c348b" (UID: "478689fc-5c07-45bc-ab87-8c58e68c348b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.408600 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/478689fc-5c07-45bc-ab87-8c58e68c348b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "478689fc-5c07-45bc-ab87-8c58e68c348b" (UID: "478689fc-5c07-45bc-ab87-8c58e68c348b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.410335 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/478689fc-5c07-45bc-ab87-8c58e68c348b-kube-api-access-g7z78" (OuterVolumeSpecName: "kube-api-access-g7z78") pod "478689fc-5c07-45bc-ab87-8c58e68c348b" (UID: "478689fc-5c07-45bc-ab87-8c58e68c348b"). InnerVolumeSpecName "kube-api-access-g7z78". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.411825 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/478689fc-5c07-45bc-ab87-8c58e68c348b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "478689fc-5c07-45bc-ab87-8c58e68c348b" (UID: "478689fc-5c07-45bc-ab87-8c58e68c348b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.503002 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea105b82-fec3-4f5d-b056-fdd566619645-config\") pod \"ea105b82-fec3-4f5d-b056-fdd566619645\" (UID: \"ea105b82-fec3-4f5d-b056-fdd566619645\") "
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.503448 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea105b82-fec3-4f5d-b056-fdd566619645-serving-cert\") pod \"ea105b82-fec3-4f5d-b056-fdd566619645\" (UID: \"ea105b82-fec3-4f5d-b056-fdd566619645\") "
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.503507 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkkj9\" (UniqueName: \"kubernetes.io/projected/ea105b82-fec3-4f5d-b056-fdd566619645-kube-api-access-dkkj9\") pod \"ea105b82-fec3-4f5d-b056-fdd566619645\" (UID: \"ea105b82-fec3-4f5d-b056-fdd566619645\") "
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.503565 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea105b82-fec3-4f5d-b056-fdd566619645-client-ca\") pod \"ea105b82-fec3-4f5d-b056-fdd566619645\" (UID: \"ea105b82-fec3-4f5d-b056-fdd566619645\") "
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.503904 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/478689fc-5c07-45bc-ab87-8c58e68c348b-client-ca\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.503927 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7z78\" (UniqueName: \"kubernetes.io/projected/478689fc-5c07-45bc-ab87-8c58e68c348b-kube-api-access-g7z78\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.503940 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/478689fc-5c07-45bc-ab87-8c58e68c348b-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.503953 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/478689fc-5c07-45bc-ab87-8c58e68c348b-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.503970 4958 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/478689fc-5c07-45bc-ab87-8c58e68c348b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.504772 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea105b82-fec3-4f5d-b056-fdd566619645-client-ca" (OuterVolumeSpecName: "client-ca") pod "ea105b82-fec3-4f5d-b056-fdd566619645" (UID: "ea105b82-fec3-4f5d-b056-fdd566619645"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.504813 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea105b82-fec3-4f5d-b056-fdd566619645-config" (OuterVolumeSpecName: "config") pod "ea105b82-fec3-4f5d-b056-fdd566619645" (UID: "ea105b82-fec3-4f5d-b056-fdd566619645"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.509028 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea105b82-fec3-4f5d-b056-fdd566619645-kube-api-access-dkkj9" (OuterVolumeSpecName: "kube-api-access-dkkj9") pod "ea105b82-fec3-4f5d-b056-fdd566619645" (UID: "ea105b82-fec3-4f5d-b056-fdd566619645"). InnerVolumeSpecName "kube-api-access-dkkj9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.509236 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea105b82-fec3-4f5d-b056-fdd566619645-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ea105b82-fec3-4f5d-b056-fdd566619645" (UID: "ea105b82-fec3-4f5d-b056-fdd566619645"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.606183 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea105b82-fec3-4f5d-b056-fdd566619645-client-ca\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.606234 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea105b82-fec3-4f5d-b056-fdd566619645-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.606244 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea105b82-fec3-4f5d-b056-fdd566619645-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.606255 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkkj9\" (UniqueName: \"kubernetes.io/projected/ea105b82-fec3-4f5d-b056-fdd566619645-kube-api-access-dkkj9\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.623000 4958 generic.go:334] "Generic (PLEG): container finished" podID="ea105b82-fec3-4f5d-b056-fdd566619645" containerID="5d6d33216058c33b2bfbc64185d41787097ba70ba1d8810539171d12bd3d951d" exitCode=0
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.623074 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" event={"ID":"ea105b82-fec3-4f5d-b056-fdd566619645","Type":"ContainerDied","Data":"5d6d33216058c33b2bfbc64185d41787097ba70ba1d8810539171d12bd3d951d"}
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.623146 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf" event={"ID":"ea105b82-fec3-4f5d-b056-fdd566619645","Type":"ContainerDied","Data":"6733dc635566afd9330b71b35c9756f01b0294c6e995d7a2bba7e6c68f0437cd"}
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.623176 4958 scope.go:117] "RemoveContainer" containerID="5d6d33216058c33b2bfbc64185d41787097ba70ba1d8810539171d12bd3d951d"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.623535 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.626125 4958 generic.go:334] "Generic (PLEG): container finished" podID="478689fc-5c07-45bc-ab87-8c58e68c348b" containerID="2664bb00c03ef18744a7319f577d5b9df6a805dbe151389b1366466c6496604b" exitCode=0
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.626168 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.626192 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" event={"ID":"478689fc-5c07-45bc-ab87-8c58e68c348b","Type":"ContainerDied","Data":"2664bb00c03ef18744a7319f577d5b9df6a805dbe151389b1366466c6496604b"}
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.626238 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9f7tt" event={"ID":"478689fc-5c07-45bc-ab87-8c58e68c348b","Type":"ContainerDied","Data":"80606ffb7572ecc1674726a52016096f571e96ee51d3a98e829c72e19d545927"}
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.643681 4958 scope.go:117] "RemoveContainer" containerID="5d6d33216058c33b2bfbc64185d41787097ba70ba1d8810539171d12bd3d951d"
Dec 01 10:05:29 crc kubenswrapper[4958]: E1201 10:05:29.644378 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d6d33216058c33b2bfbc64185d41787097ba70ba1d8810539171d12bd3d951d\": container with ID starting with 5d6d33216058c33b2bfbc64185d41787097ba70ba1d8810539171d12bd3d951d not found: ID does not exist" containerID="5d6d33216058c33b2bfbc64185d41787097ba70ba1d8810539171d12bd3d951d"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.644425 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6d33216058c33b2bfbc64185d41787097ba70ba1d8810539171d12bd3d951d"} err="failed to get container status \"5d6d33216058c33b2bfbc64185d41787097ba70ba1d8810539171d12bd3d951d\": rpc error: code = NotFound desc = could not find container \"5d6d33216058c33b2bfbc64185d41787097ba70ba1d8810539171d12bd3d951d\": container with ID starting with 5d6d33216058c33b2bfbc64185d41787097ba70ba1d8810539171d12bd3d951d not found: ID does not exist"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.644456 4958 scope.go:117] "RemoveContainer" containerID="2664bb00c03ef18744a7319f577d5b9df6a805dbe151389b1366466c6496604b"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.662084 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf"]
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.668637 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-46smf"]
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.670262 4958 scope.go:117] "RemoveContainer" containerID="2664bb00c03ef18744a7319f577d5b9df6a805dbe151389b1366466c6496604b"
Dec 01 10:05:29 crc kubenswrapper[4958]: E1201 10:05:29.671092 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2664bb00c03ef18744a7319f577d5b9df6a805dbe151389b1366466c6496604b\": container with ID starting with 2664bb00c03ef18744a7319f577d5b9df6a805dbe151389b1366466c6496604b not found: ID does not exist" containerID="2664bb00c03ef18744a7319f577d5b9df6a805dbe151389b1366466c6496604b"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.671143 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2664bb00c03ef18744a7319f577d5b9df6a805dbe151389b1366466c6496604b"} err="failed to get container status \"2664bb00c03ef18744a7319f577d5b9df6a805dbe151389b1366466c6496604b\": rpc error: code = NotFound desc = could not find container \"2664bb00c03ef18744a7319f577d5b9df6a805dbe151389b1366466c6496604b\": container with ID starting with 2664bb00c03ef18744a7319f577d5b9df6a805dbe151389b1366466c6496604b not found: ID does not exist"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.684450 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9f7tt"]
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.689399 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9f7tt"]
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.737956 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"]
Dec 01 10:05:29 crc kubenswrapper[4958]: E1201 10:05:29.738292 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.738307 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Dec 01 10:05:29 crc kubenswrapper[4958]: E1201 10:05:29.738319 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea105b82-fec3-4f5d-b056-fdd566619645" containerName="route-controller-manager"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.738326 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea105b82-fec3-4f5d-b056-fdd566619645" containerName="route-controller-manager"
Dec 01 10:05:29 crc kubenswrapper[4958]: E1201 10:05:29.738391 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" containerName="registry-server"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.738400 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" containerName="registry-server"
Dec 01 10:05:29 crc kubenswrapper[4958]: E1201 10:05:29.738422 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" containerName="extract-utilities"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.738429 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" containerName="extract-utilities"
Dec 01 10:05:29 crc kubenswrapper[4958]: E1201 10:05:29.738438 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478689fc-5c07-45bc-ab87-8c58e68c348b" containerName="controller-manager"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.738444 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="478689fc-5c07-45bc-ab87-8c58e68c348b" containerName="controller-manager"
Dec 01 10:05:29 crc kubenswrapper[4958]: E1201 10:05:29.738455 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" containerName="extract-content"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.738460 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" containerName="extract-content"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.738553 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.738565 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c8862e-a0c9-4bdc-9625-c5384c076732" containerName="registry-server"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.738575 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea105b82-fec3-4f5d-b056-fdd566619645" containerName="route-controller-manager"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.738588 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="478689fc-5c07-45bc-ab87-8c58e68c348b" containerName="controller-manager"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.739072 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.744705 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.745155 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.745071 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.745161 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.745425 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.746052 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.757315 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.758628 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"]
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.806554 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="478689fc-5c07-45bc-ab87-8c58e68c348b" path="/var/lib/kubelet/pods/478689fc-5c07-45bc-ab87-8c58e68c348b/volumes"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.807356 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea105b82-fec3-4f5d-b056-fdd566619645" path="/var/lib/kubelet/pods/ea105b82-fec3-4f5d-b056-fdd566619645/volumes"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.869244 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"]
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.870383 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.873870 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.874054 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.874447 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.875094 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.876013 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.883230 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"]
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.885233 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.910089 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2b54ea-577d-49c8-895f-1166691be46e-config\") pod \"controller-manager-55cf5b8bc-kj2d9\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") " pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.910167 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e2b54ea-577d-49c8-895f-1166691be46e-serving-cert\") pod \"controller-manager-55cf5b8bc-kj2d9\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") " pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.910208 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e2b54ea-577d-49c8-895f-1166691be46e-client-ca\") pod \"controller-manager-55cf5b8bc-kj2d9\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") " pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.910232 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh7pf\" (UniqueName: \"kubernetes.io/projected/7e2b54ea-577d-49c8-895f-1166691be46e-kube-api-access-vh7pf\") pod \"controller-manager-55cf5b8bc-kj2d9\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") " pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:29 crc kubenswrapper[4958]: I1201 10:05:29.910268 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e2b54ea-577d-49c8-895f-1166691be46e-proxy-ca-bundles\") pod \"controller-manager-55cf5b8bc-kj2d9\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") " pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.011740 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e2b54ea-577d-49c8-895f-1166691be46e-proxy-ca-bundles\") pod \"controller-manager-55cf5b8bc-kj2d9\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") " pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.011824 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d6fe35-e373-4e06-825a-d5596b7e78c8-config\") pod \"route-controller-manager-6587bc8775-jd2q9\" (UID: \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\") " pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.011918 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8d6fe35-e373-4e06-825a-d5596b7e78c8-client-ca\") pod \"route-controller-manager-6587bc8775-jd2q9\" (UID: \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\") " pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.011950 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8d6fe35-e373-4e06-825a-d5596b7e78c8-serving-cert\") pod \"route-controller-manager-6587bc8775-jd2q9\" (UID: \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\") " pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.012017 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2b54ea-577d-49c8-895f-1166691be46e-config\") pod \"controller-manager-55cf5b8bc-kj2d9\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") " pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.012048 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e2b54ea-577d-49c8-895f-1166691be46e-serving-cert\") pod \"controller-manager-55cf5b8bc-kj2d9\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") " pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.012079 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e2b54ea-577d-49c8-895f-1166691be46e-client-ca\") pod \"controller-manager-55cf5b8bc-kj2d9\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") " pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.012105 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh7pf\" (UniqueName: \"kubernetes.io/projected/7e2b54ea-577d-49c8-895f-1166691be46e-kube-api-access-vh7pf\") pod \"controller-manager-55cf5b8bc-kj2d9\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") " pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.012135 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6knwj\" (UniqueName: \"kubernetes.io/projected/c8d6fe35-e373-4e06-825a-d5596b7e78c8-kube-api-access-6knwj\") pod \"route-controller-manager-6587bc8775-jd2q9\" (UID: \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\") " pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.014291 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e2b54ea-577d-49c8-895f-1166691be46e-proxy-ca-bundles\") pod \"controller-manager-55cf5b8bc-kj2d9\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") " pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.014421 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2b54ea-577d-49c8-895f-1166691be46e-config\") pod \"controller-manager-55cf5b8bc-kj2d9\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") " pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.015396 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e2b54ea-577d-49c8-895f-1166691be46e-client-ca\") pod \"controller-manager-55cf5b8bc-kj2d9\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") " pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.019737 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e2b54ea-577d-49c8-895f-1166691be46e-serving-cert\") pod \"controller-manager-55cf5b8bc-kj2d9\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") " pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.034057 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh7pf\" (UniqueName: \"kubernetes.io/projected/7e2b54ea-577d-49c8-895f-1166691be46e-kube-api-access-vh7pf\") pod \"controller-manager-55cf5b8bc-kj2d9\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") " pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.105131 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.113531 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d6fe35-e373-4e06-825a-d5596b7e78c8-config\") pod \"route-controller-manager-6587bc8775-jd2q9\" (UID: \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\") " pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.113595 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8d6fe35-e373-4e06-825a-d5596b7e78c8-client-ca\") pod \"route-controller-manager-6587bc8775-jd2q9\" (UID: \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\") " pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.113629 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8d6fe35-e373-4e06-825a-d5596b7e78c8-serving-cert\") pod \"route-controller-manager-6587bc8775-jd2q9\" (UID: \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\") " pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.113708 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6knwj\" (UniqueName: \"kubernetes.io/projected/c8d6fe35-e373-4e06-825a-d5596b7e78c8-kube-api-access-6knwj\") pod \"route-controller-manager-6587bc8775-jd2q9\" (UID: \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\") " pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.115079 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d6fe35-e373-4e06-825a-d5596b7e78c8-config\") pod \"route-controller-manager-6587bc8775-jd2q9\" (UID: \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\") " pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.115164 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8d6fe35-e373-4e06-825a-d5596b7e78c8-client-ca\") pod \"route-controller-manager-6587bc8775-jd2q9\" (UID: \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\") " pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.122452 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8d6fe35-e373-4e06-825a-d5596b7e78c8-serving-cert\") pod \"route-controller-manager-6587bc8775-jd2q9\" (UID: \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\") " pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.137379 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6knwj\" (UniqueName: \"kubernetes.io/projected/c8d6fe35-e373-4e06-825a-d5596b7e78c8-kube-api-access-6knwj\") pod \"route-controller-manager-6587bc8775-jd2q9\" (UID: \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\") " pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.193470 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.315717 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"]
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.436765 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"]
Dec 01 10:05:30 crc kubenswrapper[4958]: W1201 10:05:30.442943 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8d6fe35_e373_4e06_825a_d5596b7e78c8.slice/crio-3f6b0187409195f52434d4b61cafd9ed5769de95470c6afe11671b92f98ee6d0 WatchSource:0}: Error finding container 3f6b0187409195f52434d4b61cafd9ed5769de95470c6afe11671b92f98ee6d0: Status 404 returned error can't find the container with id 3f6b0187409195f52434d4b61cafd9ed5769de95470c6afe11671b92f98ee6d0
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.635726 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9" event={"ID":"c8d6fe35-e373-4e06-825a-d5596b7e78c8","Type":"ContainerStarted","Data":"ac8813e8293099678bc89b4cbafe49de1792db4cd816091667292b3f65edc562"}
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.636145 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9" event={"ID":"c8d6fe35-e373-4e06-825a-d5596b7e78c8","Type":"ContainerStarted","Data":"3f6b0187409195f52434d4b61cafd9ed5769de95470c6afe11671b92f98ee6d0"}
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.639089 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9" event={"ID":"7e2b54ea-577d-49c8-895f-1166691be46e","Type":"ContainerStarted","Data":"c666b51b01f4c9317a9fac484ae621b23a96b14616305688ce237dbf7c633490"}
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.639148 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9" event={"ID":"7e2b54ea-577d-49c8-895f-1166691be46e","Type":"ContainerStarted","Data":"16514c88976e5f99fd7a5663a4adeef4692f0e2504774f77991cfbc9819cf1e0"}
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.639357 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.645720 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.658085 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9" podStartSLOduration=1.658049384 podStartE2EDuration="1.658049384s" podCreationTimestamp="2025-12-01 10:05:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:05:30.652409683 +0000 UTC m=+378.161198730" watchObservedRunningTime="2025-12-01 10:05:30.658049384 +0000 UTC m=+378.166838421"
Dec 01 10:05:30 crc kubenswrapper[4958]: I1201 10:05:30.672873 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9" podStartSLOduration=1.672819005 podStartE2EDuration="1.672819005s" podCreationTimestamp="2025-12-01 10:05:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:05:30.669730377 +0000 UTC m=+378.178519434" watchObservedRunningTime="2025-12-01 10:05:30.672819005 +0000 UTC m=+378.181608242"
Dec 01 10:05:31 crc kubenswrapper[4958]: I1201 10:05:31.647829 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"
Dec 01 10:05:31 crc kubenswrapper[4958]: I1201 10:05:31.654301 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"
Dec 01 10:05:48 crc kubenswrapper[4958]: I1201 10:05:48.949477 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"]
Dec 01 10:05:48 crc kubenswrapper[4958]: I1201 10:05:48.950504 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9" podUID="7e2b54ea-577d-49c8-895f-1166691be46e" containerName="controller-manager" containerID="cri-o://c666b51b01f4c9317a9fac484ae621b23a96b14616305688ce237dbf7c633490" gracePeriod=30
Dec 01 10:05:48 crc kubenswrapper[4958]: I1201 10:05:48.963664 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"]
Dec 01 10:05:48 crc kubenswrapper[4958]: I1201 10:05:48.964036 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9" podUID="c8d6fe35-e373-4e06-825a-d5596b7e78c8" containerName="route-controller-manager" containerID="cri-o://ac8813e8293099678bc89b4cbafe49de1792db4cd816091667292b3f65edc562" gracePeriod=30
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.553961 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.602991 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.717921 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8d6fe35-e373-4e06-825a-d5596b7e78c8-client-ca\") pod \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\" (UID: \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\") "
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.718071 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8d6fe35-e373-4e06-825a-d5596b7e78c8-serving-cert\") pod \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\" (UID: \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\") "
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.718117 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e2b54ea-577d-49c8-895f-1166691be46e-proxy-ca-bundles\") pod \"7e2b54ea-577d-49c8-895f-1166691be46e\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") "
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.718142 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d6fe35-e373-4e06-825a-d5596b7e78c8-config\") pod \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\" (UID: \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\") "
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.718237 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6knwj\" (UniqueName: \"kubernetes.io/projected/c8d6fe35-e373-4e06-825a-d5596b7e78c8-kube-api-access-6knwj\") pod \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\" (UID: \"c8d6fe35-e373-4e06-825a-d5596b7e78c8\") "
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.718265 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh7pf\" (UniqueName: \"kubernetes.io/projected/7e2b54ea-577d-49c8-895f-1166691be46e-kube-api-access-vh7pf\") pod \"7e2b54ea-577d-49c8-895f-1166691be46e\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") "
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.718298 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e2b54ea-577d-49c8-895f-1166691be46e-client-ca\") pod \"7e2b54ea-577d-49c8-895f-1166691be46e\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") "
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.718317 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2b54ea-577d-49c8-895f-1166691be46e-config\") pod \"7e2b54ea-577d-49c8-895f-1166691be46e\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") "
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.718340 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e2b54ea-577d-49c8-895f-1166691be46e-serving-cert\") pod \"7e2b54ea-577d-49c8-895f-1166691be46e\" (UID: \"7e2b54ea-577d-49c8-895f-1166691be46e\") "
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.718962 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8d6fe35-e373-4e06-825a-d5596b7e78c8-config" (OuterVolumeSpecName: "config") pod "c8d6fe35-e373-4e06-825a-d5596b7e78c8" (UID: "c8d6fe35-e373-4e06-825a-d5596b7e78c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.719100 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2b54ea-577d-49c8-895f-1166691be46e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7e2b54ea-577d-49c8-895f-1166691be46e" (UID: "7e2b54ea-577d-49c8-895f-1166691be46e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.719257 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8d6fe35-e373-4e06-825a-d5596b7e78c8-client-ca" (OuterVolumeSpecName: "client-ca") pod "c8d6fe35-e373-4e06-825a-d5596b7e78c8" (UID: "c8d6fe35-e373-4e06-825a-d5596b7e78c8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.719343 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2b54ea-577d-49c8-895f-1166691be46e-client-ca" (OuterVolumeSpecName: "client-ca") pod "7e2b54ea-577d-49c8-895f-1166691be46e" (UID: "7e2b54ea-577d-49c8-895f-1166691be46e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.719944 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2b54ea-577d-49c8-895f-1166691be46e-config" (OuterVolumeSpecName: "config") pod "7e2b54ea-577d-49c8-895f-1166691be46e" (UID: "7e2b54ea-577d-49c8-895f-1166691be46e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.724273 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e2b54ea-577d-49c8-895f-1166691be46e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7e2b54ea-577d-49c8-895f-1166691be46e" (UID: "7e2b54ea-577d-49c8-895f-1166691be46e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.724989 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8d6fe35-e373-4e06-825a-d5596b7e78c8-kube-api-access-6knwj" (OuterVolumeSpecName: "kube-api-access-6knwj") pod "c8d6fe35-e373-4e06-825a-d5596b7e78c8" (UID: "c8d6fe35-e373-4e06-825a-d5596b7e78c8"). InnerVolumeSpecName "kube-api-access-6knwj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.725052 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e2b54ea-577d-49c8-895f-1166691be46e-kube-api-access-vh7pf" (OuterVolumeSpecName: "kube-api-access-vh7pf") pod "7e2b54ea-577d-49c8-895f-1166691be46e" (UID: "7e2b54ea-577d-49c8-895f-1166691be46e"). InnerVolumeSpecName "kube-api-access-vh7pf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.725130 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8d6fe35-e373-4e06-825a-d5596b7e78c8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c8d6fe35-e373-4e06-825a-d5596b7e78c8" (UID: "c8d6fe35-e373-4e06-825a-d5596b7e78c8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.756059 4958 generic.go:334] "Generic (PLEG): container finished" podID="c8d6fe35-e373-4e06-825a-d5596b7e78c8" containerID="ac8813e8293099678bc89b4cbafe49de1792db4cd816091667292b3f65edc562" exitCode=0
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.756131 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9" event={"ID":"c8d6fe35-e373-4e06-825a-d5596b7e78c8","Type":"ContainerDied","Data":"ac8813e8293099678bc89b4cbafe49de1792db4cd816091667292b3f65edc562"}
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.756171 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.756214 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9" event={"ID":"c8d6fe35-e373-4e06-825a-d5596b7e78c8","Type":"ContainerDied","Data":"3f6b0187409195f52434d4b61cafd9ed5769de95470c6afe11671b92f98ee6d0"}
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.756232 4958 scope.go:117] "RemoveContainer" containerID="ac8813e8293099678bc89b4cbafe49de1792db4cd816091667292b3f65edc562"
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.759525 4958 generic.go:334] "Generic (PLEG): container finished" podID="7e2b54ea-577d-49c8-895f-1166691be46e" containerID="c666b51b01f4c9317a9fac484ae621b23a96b14616305688ce237dbf7c633490" exitCode=0
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.759583 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.759581 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9" event={"ID":"7e2b54ea-577d-49c8-895f-1166691be46e","Type":"ContainerDied","Data":"c666b51b01f4c9317a9fac484ae621b23a96b14616305688ce237dbf7c633490"}
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.759752 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9" event={"ID":"7e2b54ea-577d-49c8-895f-1166691be46e","Type":"ContainerDied","Data":"16514c88976e5f99fd7a5663a4adeef4692f0e2504774f77991cfbc9819cf1e0"}
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.785636 4958 scope.go:117] "RemoveContainer" containerID="ac8813e8293099678bc89b4cbafe49de1792db4cd816091667292b3f65edc562"
Dec 01 10:05:49 crc kubenswrapper[4958]: E1201 10:05:49.786152 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac8813e8293099678bc89b4cbafe49de1792db4cd816091667292b3f65edc562\": container with ID starting with ac8813e8293099678bc89b4cbafe49de1792db4cd816091667292b3f65edc562 not found: ID does not exist" containerID="ac8813e8293099678bc89b4cbafe49de1792db4cd816091667292b3f65edc562"
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.786211 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac8813e8293099678bc89b4cbafe49de1792db4cd816091667292b3f65edc562"} err="failed to get container status \"ac8813e8293099678bc89b4cbafe49de1792db4cd816091667292b3f65edc562\": rpc error: code = NotFound desc = could not find container \"ac8813e8293099678bc89b4cbafe49de1792db4cd816091667292b3f65edc562\": container with ID starting with ac8813e8293099678bc89b4cbafe49de1792db4cd816091667292b3f65edc562 not found: ID does not exist"
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.786246 4958 scope.go:117] "RemoveContainer" containerID="c666b51b01f4c9317a9fac484ae621b23a96b14616305688ce237dbf7c633490"
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.814959 4958 scope.go:117] "RemoveContainer" containerID="c666b51b01f4c9317a9fac484ae621b23a96b14616305688ce237dbf7c633490"
Dec 01 10:05:49 crc kubenswrapper[4958]: E1201 10:05:49.817892 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c666b51b01f4c9317a9fac484ae621b23a96b14616305688ce237dbf7c633490\": container with ID starting with c666b51b01f4c9317a9fac484ae621b23a96b14616305688ce237dbf7c633490 not found: ID does not exist" containerID="c666b51b01f4c9317a9fac484ae621b23a96b14616305688ce237dbf7c633490"
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.817931 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c666b51b01f4c9317a9fac484ae621b23a96b14616305688ce237dbf7c633490"} err="failed to get container status \"c666b51b01f4c9317a9fac484ae621b23a96b14616305688ce237dbf7c633490\": rpc error: code = NotFound desc = could not find container \"c666b51b01f4c9317a9fac484ae621b23a96b14616305688ce237dbf7c633490\": container with ID starting with c666b51b01f4c9317a9fac484ae621b23a96b14616305688ce237dbf7c633490 not found: ID does not exist"
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.818990 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"]
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.819035 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6587bc8775-jd2q9"]
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.820738 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e2b54ea-577d-49c8-895f-1166691be46e-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.820770 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8d6fe35-e373-4e06-825a-d5596b7e78c8-client-ca\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.820781 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8d6fe35-e373-4e06-825a-d5596b7e78c8-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.820791 4958 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e2b54ea-577d-49c8-895f-1166691be46e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.820804 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d6fe35-e373-4e06-825a-d5596b7e78c8-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.820814 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6knwj\" (UniqueName: \"kubernetes.io/projected/c8d6fe35-e373-4e06-825a-d5596b7e78c8-kube-api-access-6knwj\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.820825 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh7pf\" (UniqueName: \"kubernetes.io/projected/7e2b54ea-577d-49c8-895f-1166691be46e-kube-api-access-vh7pf\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.820835 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e2b54ea-577d-49c8-895f-1166691be46e-client-ca\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.820864 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2b54ea-577d-49c8-895f-1166691be46e-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.824290 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"]
Dec 01 10:05:49 crc kubenswrapper[4958]: I1201 10:05:49.830506 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-55cf5b8bc-kj2d9"]
Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.044724 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86d6955784-9gc7j"]
Dec 01 10:05:50 crc kubenswrapper[4958]: E1201 10:05:50.045578 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2b54ea-577d-49c8-895f-1166691be46e" containerName="controller-manager"
Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.045601 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2b54ea-577d-49c8-895f-1166691be46e" containerName="controller-manager"
Dec 01 10:05:50 crc kubenswrapper[4958]: E1201 10:05:50.045630 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8d6fe35-e373-4e06-825a-d5596b7e78c8" containerName="route-controller-manager"
Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.045638 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8d6fe35-e373-4e06-825a-d5596b7e78c8" containerName="route-controller-manager"
Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.045765 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2b54ea-577d-49c8-895f-1166691be46e" containerName="controller-manager"
Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.045787 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8d6fe35-e373-4e06-825a-d5596b7e78c8" containerName="route-controller-manager"
Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.046419 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j"
Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.048895 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4"]
Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.049188 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.049292 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.049592 4958 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.049743 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.050026 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.050216 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.050230 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.051395 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.052436 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.052618 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.052772 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.054255 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.054493 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.060596 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.077931 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4"] Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.079875 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86d6955784-9gc7j"] Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.126468 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afcbbc20-2c1a-499f-8fdc-675ef761ca61-proxy-ca-bundles\") pod \"controller-manager-86d6955784-9gc7j\" (UID: \"afcbbc20-2c1a-499f-8fdc-675ef761ca61\") " pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.126565 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7906a376-2f96-4487-a629-cabb7f7cad15-serving-cert\") pod \"route-controller-manager-df97466d4-fmrl4\" (UID: \"7906a376-2f96-4487-a629-cabb7f7cad15\") " pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.126599 4958 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afcbbc20-2c1a-499f-8fdc-675ef761ca61-client-ca\") pod \"controller-manager-86d6955784-9gc7j\" (UID: \"afcbbc20-2c1a-499f-8fdc-675ef761ca61\") " pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.126657 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7906a376-2f96-4487-a629-cabb7f7cad15-client-ca\") pod \"route-controller-manager-df97466d4-fmrl4\" (UID: \"7906a376-2f96-4487-a629-cabb7f7cad15\") " pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.126945 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nvs6\" (UniqueName: \"kubernetes.io/projected/afcbbc20-2c1a-499f-8fdc-675ef761ca61-kube-api-access-5nvs6\") pod \"controller-manager-86d6955784-9gc7j\" (UID: \"afcbbc20-2c1a-499f-8fdc-675ef761ca61\") " pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.127095 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7906a376-2f96-4487-a629-cabb7f7cad15-config\") pod \"route-controller-manager-df97466d4-fmrl4\" (UID: \"7906a376-2f96-4487-a629-cabb7f7cad15\") " pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.127193 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afcbbc20-2c1a-499f-8fdc-675ef761ca61-config\") pod \"controller-manager-86d6955784-9gc7j\" (UID: \"afcbbc20-2c1a-499f-8fdc-675ef761ca61\") " pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.127254 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afcbbc20-2c1a-499f-8fdc-675ef761ca61-serving-cert\") pod \"controller-manager-86d6955784-9gc7j\" (UID: \"afcbbc20-2c1a-499f-8fdc-675ef761ca61\") " pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.127362 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qprc6\" (UniqueName: \"kubernetes.io/projected/7906a376-2f96-4487-a629-cabb7f7cad15-kube-api-access-qprc6\") pod \"route-controller-manager-df97466d4-fmrl4\" (UID: \"7906a376-2f96-4487-a629-cabb7f7cad15\") " pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.227955 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7906a376-2f96-4487-a629-cabb7f7cad15-client-ca\") pod \"route-controller-manager-df97466d4-fmrl4\" (UID: \"7906a376-2f96-4487-a629-cabb7f7cad15\") " pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.228023 4958 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5nvs6\" (UniqueName: \"kubernetes.io/projected/afcbbc20-2c1a-499f-8fdc-675ef761ca61-kube-api-access-5nvs6\") pod \"controller-manager-86d6955784-9gc7j\" (UID: \"afcbbc20-2c1a-499f-8fdc-675ef761ca61\") " pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.228065 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7906a376-2f96-4487-a629-cabb7f7cad15-config\") pod \"route-controller-manager-df97466d4-fmrl4\" (UID: \"7906a376-2f96-4487-a629-cabb7f7cad15\") " pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.228093 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afcbbc20-2c1a-499f-8fdc-675ef761ca61-config\") pod \"controller-manager-86d6955784-9gc7j\" (UID: \"afcbbc20-2c1a-499f-8fdc-675ef761ca61\") " pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.228121 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afcbbc20-2c1a-499f-8fdc-675ef761ca61-serving-cert\") pod \"controller-manager-86d6955784-9gc7j\" (UID: \"afcbbc20-2c1a-499f-8fdc-675ef761ca61\") " pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.228148 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qprc6\" (UniqueName: \"kubernetes.io/projected/7906a376-2f96-4487-a629-cabb7f7cad15-kube-api-access-qprc6\") pod \"route-controller-manager-df97466d4-fmrl4\" (UID: \"7906a376-2f96-4487-a629-cabb7f7cad15\") " pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.228169 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afcbbc20-2c1a-499f-8fdc-675ef761ca61-proxy-ca-bundles\") pod \"controller-manager-86d6955784-9gc7j\" (UID: \"afcbbc20-2c1a-499f-8fdc-675ef761ca61\") " pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.228206 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7906a376-2f96-4487-a629-cabb7f7cad15-serving-cert\") pod \"route-controller-manager-df97466d4-fmrl4\" (UID: \"7906a376-2f96-4487-a629-cabb7f7cad15\") " pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.228225 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afcbbc20-2c1a-499f-8fdc-675ef761ca61-client-ca\") pod \"controller-manager-86d6955784-9gc7j\" (UID: \"afcbbc20-2c1a-499f-8fdc-675ef761ca61\") " pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.229200 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7906a376-2f96-4487-a629-cabb7f7cad15-client-ca\") pod 
\"route-controller-manager-df97466d4-fmrl4\" (UID: \"7906a376-2f96-4487-a629-cabb7f7cad15\") " pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.229203 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afcbbc20-2c1a-499f-8fdc-675ef761ca61-client-ca\") pod \"controller-manager-86d6955784-9gc7j\" (UID: \"afcbbc20-2c1a-499f-8fdc-675ef761ca61\") " pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.229563 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7906a376-2f96-4487-a629-cabb7f7cad15-config\") pod \"route-controller-manager-df97466d4-fmrl4\" (UID: \"7906a376-2f96-4487-a629-cabb7f7cad15\") " pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.230671 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afcbbc20-2c1a-499f-8fdc-675ef761ca61-config\") pod \"controller-manager-86d6955784-9gc7j\" (UID: \"afcbbc20-2c1a-499f-8fdc-675ef761ca61\") " pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.230682 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afcbbc20-2c1a-499f-8fdc-675ef761ca61-proxy-ca-bundles\") pod \"controller-manager-86d6955784-9gc7j\" (UID: \"afcbbc20-2c1a-499f-8fdc-675ef761ca61\") " pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.232729 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afcbbc20-2c1a-499f-8fdc-675ef761ca61-serving-cert\") pod \"controller-manager-86d6955784-9gc7j\" (UID: \"afcbbc20-2c1a-499f-8fdc-675ef761ca61\") " pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.236744 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7906a376-2f96-4487-a629-cabb7f7cad15-serving-cert\") pod \"route-controller-manager-df97466d4-fmrl4\" (UID: \"7906a376-2f96-4487-a629-cabb7f7cad15\") " pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.248138 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nvs6\" (UniqueName: \"kubernetes.io/projected/afcbbc20-2c1a-499f-8fdc-675ef761ca61-kube-api-access-5nvs6\") pod \"controller-manager-86d6955784-9gc7j\" (UID: \"afcbbc20-2c1a-499f-8fdc-675ef761ca61\") " pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.258470 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qprc6\" (UniqueName: \"kubernetes.io/projected/7906a376-2f96-4487-a629-cabb7f7cad15-kube-api-access-qprc6\") pod \"route-controller-manager-df97466d4-fmrl4\" (UID: \"7906a376-2f96-4487-a629-cabb7f7cad15\") " pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" Dec 01 10:05:50 crc kubenswrapper[4958]: 
I1201 10:05:50.366041 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.393546 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.614027 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86d6955784-9gc7j"] Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.665347 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4"] Dec 01 10:05:50 crc kubenswrapper[4958]: W1201 10:05:50.673551 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7906a376_2f96_4487_a629_cabb7f7cad15.slice/crio-b582f97968dd6ea1e9a73cc09963455821e3cf82a227e9547eb0802e8a281479 WatchSource:0}: Error finding container b582f97968dd6ea1e9a73cc09963455821e3cf82a227e9547eb0802e8a281479: Status 404 returned error can't find the container with id b582f97968dd6ea1e9a73cc09963455821e3cf82a227e9547eb0802e8a281479 Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.778916 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" event={"ID":"7906a376-2f96-4487-a629-cabb7f7cad15","Type":"ContainerStarted","Data":"b582f97968dd6ea1e9a73cc09963455821e3cf82a227e9547eb0802e8a281479"} Dec 01 10:05:50 crc kubenswrapper[4958]: I1201 10:05:50.782674 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" event={"ID":"afcbbc20-2c1a-499f-8fdc-675ef761ca61","Type":"ContainerStarted","Data":"f51683dfe9f8ce22d830deae47d8bcfd08ce63ce6c82730e9c43661e1816fe18"} Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.791192 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" event={"ID":"afcbbc20-2c1a-499f-8fdc-675ef761ca61","Type":"ContainerStarted","Data":"26f423101872fa61364068f6ab407e77e4aa07fe15d5adb48729cc276520bce2"} Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.792221 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.793866 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" event={"ID":"7906a376-2f96-4487-a629-cabb7f7cad15","Type":"ContainerStarted","Data":"6fef1f65359a26dcaf05aa3a31b0c48c0f17af37e4f74ecb7a77df4c0a425f63"} Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.794156 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.811507 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e2b54ea-577d-49c8-895f-1166691be46e" path="/var/lib/kubelet/pods/7e2b54ea-577d-49c8-895f-1166691be46e/volumes" Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.813102 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8d6fe35-e373-4e06-825a-d5596b7e78c8" 
path="/var/lib/kubelet/pods/c8d6fe35-e373-4e06-825a-d5596b7e78c8/volumes" Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.814139 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.814193 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.817645 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86d6955784-9gc7j" podStartSLOduration=3.817612604 podStartE2EDuration="3.817612604s" podCreationTimestamp="2025-12-01 10:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:05:51.812713964 +0000 UTC m=+399.321503001" watchObservedRunningTime="2025-12-01 10:05:51.817612604 +0000 UTC m=+399.326401641" Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.834342 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-df97466d4-fmrl4" podStartSLOduration=3.834316669 podStartE2EDuration="3.834316669s" podCreationTimestamp="2025-12-01 10:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:05:51.833025942 +0000 UTC m=+399.341814999" watchObservedRunningTime="2025-12-01 10:05:51.834316669 +0000 UTC m=+399.343105706" Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.884035 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9k5l5"] Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.884399 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9k5l5" podUID="2b165b67-cdac-49bb-969f-aff516fa216d" containerName="registry-server" containerID="cri-o://9ec9967a380f01ef0c04a2d3f653ff836502bec2c28bb610cd54f4b06ec17497" gracePeriod=30 Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.911013 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z5z9s"] Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.911435 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-z5z9s" podUID="144ccc0c-43b4-4146-a91a-b235916a6a0e" containerName="registry-server" containerID="cri-o://bdc6300112db424166d36fa4dd3752e7f900cc6dac0def383c1a9af828a052e1" gracePeriod=30 Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.953979 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6q7kk"] Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.954347 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" podUID="fc711edd-1026-4f47-ab7f-92bd6e1fb964" containerName="marketplace-operator" containerID="cri-o://53b61920890014ab54b7f45823e67d89fe098871b9eebac21eb46e5a7e3f5d45" gracePeriod=30 Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.966952 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fdr2q"] Dec 01 10:05:51 crc 
kubenswrapper[4958]: I1201 10:05:51.967387 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fdr2q" podUID="ba3175fa-f3c1-4377-9489-ef0fbed933bc" containerName="registry-server" containerID="cri-o://b12639309786fed55ebfc0e78a845544721d11f42202f2c78ff26599fdd21145" gracePeriod=30 Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.979065 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j5m4k"] Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.979491 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j5m4k" podUID="d56eb44a-f6f7-4b29-be59-84e10ab7c34a" containerName="registry-server" containerID="cri-o://956a417111f579456b8caae3b09aebc7baea70ebf0d448e3514f1f2a9f831f21" gracePeriod=30 Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.985131 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c99nd"] Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.988442 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-c99nd" Dec 01 10:05:51 crc kubenswrapper[4958]: I1201 10:05:51.989251 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c99nd"] Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.160626 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/905b7c43-8ebe-4cd3-a076-ff77e791de7b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-c99nd\" (UID: \"905b7c43-8ebe-4cd3-a076-ff77e791de7b\") " pod="openshift-marketplace/marketplace-operator-79b997595-c99nd" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.160698 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgwv2\" (UniqueName: \"kubernetes.io/projected/905b7c43-8ebe-4cd3-a076-ff77e791de7b-kube-api-access-sgwv2\") pod \"marketplace-operator-79b997595-c99nd\" (UID: \"905b7c43-8ebe-4cd3-a076-ff77e791de7b\") " pod="openshift-marketplace/marketplace-operator-79b997595-c99nd" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.160733 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/905b7c43-8ebe-4cd3-a076-ff77e791de7b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-c99nd\" (UID: \"905b7c43-8ebe-4cd3-a076-ff77e791de7b\") " pod="openshift-marketplace/marketplace-operator-79b997595-c99nd" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.264229 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/905b7c43-8ebe-4cd3-a076-ff77e791de7b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-c99nd\" (UID: \"905b7c43-8ebe-4cd3-a076-ff77e791de7b\") " pod="openshift-marketplace/marketplace-operator-79b997595-c99nd" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.264303 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgwv2\" (UniqueName: \"kubernetes.io/projected/905b7c43-8ebe-4cd3-a076-ff77e791de7b-kube-api-access-sgwv2\") pod 
\"marketplace-operator-79b997595-c99nd\" (UID: \"905b7c43-8ebe-4cd3-a076-ff77e791de7b\") " pod="openshift-marketplace/marketplace-operator-79b997595-c99nd" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.264333 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/905b7c43-8ebe-4cd3-a076-ff77e791de7b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-c99nd\" (UID: \"905b7c43-8ebe-4cd3-a076-ff77e791de7b\") " pod="openshift-marketplace/marketplace-operator-79b997595-c99nd" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.266773 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/905b7c43-8ebe-4cd3-a076-ff77e791de7b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-c99nd\" (UID: \"905b7c43-8ebe-4cd3-a076-ff77e791de7b\") " pod="openshift-marketplace/marketplace-operator-79b997595-c99nd" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.273799 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/905b7c43-8ebe-4cd3-a076-ff77e791de7b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-c99nd\" (UID: \"905b7c43-8ebe-4cd3-a076-ff77e791de7b\") " pod="openshift-marketplace/marketplace-operator-79b997595-c99nd" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.286728 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgwv2\" (UniqueName: \"kubernetes.io/projected/905b7c43-8ebe-4cd3-a076-ff77e791de7b-kube-api-access-sgwv2\") pod \"marketplace-operator-79b997595-c99nd\" (UID: \"905b7c43-8ebe-4cd3-a076-ff77e791de7b\") " pod="openshift-marketplace/marketplace-operator-79b997595-c99nd" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.350636 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-c99nd" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.462842 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9k5l5" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.569543 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b165b67-cdac-49bb-969f-aff516fa216d-catalog-content\") pod \"2b165b67-cdac-49bb-969f-aff516fa216d\" (UID: \"2b165b67-cdac-49bb-969f-aff516fa216d\") " Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.569824 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b165b67-cdac-49bb-969f-aff516fa216d-utilities\") pod \"2b165b67-cdac-49bb-969f-aff516fa216d\" (UID: \"2b165b67-cdac-49bb-969f-aff516fa216d\") " Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.569922 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns986\" (UniqueName: \"kubernetes.io/projected/2b165b67-cdac-49bb-969f-aff516fa216d-kube-api-access-ns986\") pod \"2b165b67-cdac-49bb-969f-aff516fa216d\" (UID: \"2b165b67-cdac-49bb-969f-aff516fa216d\") " Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.571918 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b165b67-cdac-49bb-969f-aff516fa216d-utilities" (OuterVolumeSpecName: "utilities") pod "2b165b67-cdac-49bb-969f-aff516fa216d" (UID: "2b165b67-cdac-49bb-969f-aff516fa216d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.576112 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b165b67-cdac-49bb-969f-aff516fa216d-kube-api-access-ns986" (OuterVolumeSpecName: "kube-api-access-ns986") pod "2b165b67-cdac-49bb-969f-aff516fa216d" (UID: "2b165b67-cdac-49bb-969f-aff516fa216d"). InnerVolumeSpecName "kube-api-access-ns986". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.611420 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fdr2q" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.639234 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b165b67-cdac-49bb-969f-aff516fa216d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b165b67-cdac-49bb-969f-aff516fa216d" (UID: "2b165b67-cdac-49bb-969f-aff516fa216d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.641160 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.643034 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j5m4k" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.655257 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z5z9s" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.671983 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b165b67-cdac-49bb-969f-aff516fa216d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.672026 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns986\" (UniqueName: \"kubernetes.io/projected/2b165b67-cdac-49bb-969f-aff516fa216d-kube-api-access-ns986\") on node \"crc\" DevicePath \"\"" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.672037 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b165b67-cdac-49bb-969f-aff516fa216d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.775195 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5f8x\" (UniqueName: \"kubernetes.io/projected/144ccc0c-43b4-4146-a91a-b235916a6a0e-kube-api-access-c5f8x\") pod \"144ccc0c-43b4-4146-a91a-b235916a6a0e\" (UID: \"144ccc0c-43b4-4146-a91a-b235916a6a0e\") " Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.775511 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3175fa-f3c1-4377-9489-ef0fbed933bc-catalog-content\") pod \"ba3175fa-f3c1-4377-9489-ef0fbed933bc\" (UID: \"ba3175fa-f3c1-4377-9489-ef0fbed933bc\") " Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.775545 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d56eb44a-f6f7-4b29-be59-84e10ab7c34a-utilities\") pod \"d56eb44a-f6f7-4b29-be59-84e10ab7c34a\" (UID: \"d56eb44a-f6f7-4b29-be59-84e10ab7c34a\") " Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.775578 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5zch\" (UniqueName: \"kubernetes.io/projected/ba3175fa-f3c1-4377-9489-ef0fbed933bc-kube-api-access-q5zch\") pod \"ba3175fa-f3c1-4377-9489-ef0fbed933bc\" (UID: \"ba3175fa-f3c1-4377-9489-ef0fbed933bc\") " Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.775605 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d56eb44a-f6f7-4b29-be59-84e10ab7c34a-catalog-content\") pod \"d56eb44a-f6f7-4b29-be59-84e10ab7c34a\" (UID: \"d56eb44a-f6f7-4b29-be59-84e10ab7c34a\") " Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.775632 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc711edd-1026-4f47-ab7f-92bd6e1fb964-marketplace-operator-metrics\") pod \"fc711edd-1026-4f47-ab7f-92bd6e1fb964\" (UID: \"fc711edd-1026-4f47-ab7f-92bd6e1fb964\") " Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.775677 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/144ccc0c-43b4-4146-a91a-b235916a6a0e-catalog-content\") pod \"144ccc0c-43b4-4146-a91a-b235916a6a0e\" (UID: \"144ccc0c-43b4-4146-a91a-b235916a6a0e\") " Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.775705 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc711edd-1026-4f47-ab7f-92bd6e1fb964-marketplace-trusted-ca\") pod \"fc711edd-1026-4f47-ab7f-92bd6e1fb964\" (UID: \"fc711edd-1026-4f47-ab7f-92bd6e1fb964\") " Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.775723 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/144ccc0c-43b4-4146-a91a-b235916a6a0e-utilities\") pod \"144ccc0c-43b4-4146-a91a-b235916a6a0e\" (UID: \"144ccc0c-43b4-4146-a91a-b235916a6a0e\") " Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.775739 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3175fa-f3c1-4377-9489-ef0fbed933bc-utilities\") pod \"ba3175fa-f3c1-4377-9489-ef0fbed933bc\" (UID: \"ba3175fa-f3c1-4377-9489-ef0fbed933bc\") " Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.775760 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx9b7\" (UniqueName: \"kubernetes.io/projected/d56eb44a-f6f7-4b29-be59-84e10ab7c34a-kube-api-access-bx9b7\") pod \"d56eb44a-f6f7-4b29-be59-84e10ab7c34a\" (UID: \"d56eb44a-f6f7-4b29-be59-84e10ab7c34a\") " Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.775784 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9fnw\" (UniqueName: \"kubernetes.io/projected/fc711edd-1026-4f47-ab7f-92bd6e1fb964-kube-api-access-v9fnw\") pod \"fc711edd-1026-4f47-ab7f-92bd6e1fb964\" (UID: \"fc711edd-1026-4f47-ab7f-92bd6e1fb964\") " Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.778197 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba3175fa-f3c1-4377-9489-ef0fbed933bc-utilities" (OuterVolumeSpecName: "utilities") pod "ba3175fa-f3c1-4377-9489-ef0fbed933bc" (UID: "ba3175fa-f3c1-4377-9489-ef0fbed933bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.778506 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d56eb44a-f6f7-4b29-be59-84e10ab7c34a-utilities" (OuterVolumeSpecName: "utilities") pod "d56eb44a-f6f7-4b29-be59-84e10ab7c34a" (UID: "d56eb44a-f6f7-4b29-be59-84e10ab7c34a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.778932 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc711edd-1026-4f47-ab7f-92bd6e1fb964-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "fc711edd-1026-4f47-ab7f-92bd6e1fb964" (UID: "fc711edd-1026-4f47-ab7f-92bd6e1fb964"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.784334 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3175fa-f3c1-4377-9489-ef0fbed933bc-kube-api-access-q5zch" (OuterVolumeSpecName: "kube-api-access-q5zch") pod "ba3175fa-f3c1-4377-9489-ef0fbed933bc" (UID: "ba3175fa-f3c1-4377-9489-ef0fbed933bc"). InnerVolumeSpecName "kube-api-access-q5zch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.794907 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/144ccc0c-43b4-4146-a91a-b235916a6a0e-utilities" (OuterVolumeSpecName: "utilities") pod "144ccc0c-43b4-4146-a91a-b235916a6a0e" (UID: "144ccc0c-43b4-4146-a91a-b235916a6a0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.798182 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba3175fa-f3c1-4377-9489-ef0fbed933bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba3175fa-f3c1-4377-9489-ef0fbed933bc" (UID: "ba3175fa-f3c1-4377-9489-ef0fbed933bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.801692 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc711edd-1026-4f47-ab7f-92bd6e1fb964-kube-api-access-v9fnw" (OuterVolumeSpecName: "kube-api-access-v9fnw") pod "fc711edd-1026-4f47-ab7f-92bd6e1fb964" (UID: "fc711edd-1026-4f47-ab7f-92bd6e1fb964"). InnerVolumeSpecName "kube-api-access-v9fnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.802070 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc711edd-1026-4f47-ab7f-92bd6e1fb964-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "fc711edd-1026-4f47-ab7f-92bd6e1fb964" (UID: "fc711edd-1026-4f47-ab7f-92bd6e1fb964"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.802451 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d56eb44a-f6f7-4b29-be59-84e10ab7c34a-kube-api-access-bx9b7" (OuterVolumeSpecName: "kube-api-access-bx9b7") pod "d56eb44a-f6f7-4b29-be59-84e10ab7c34a" (UID: "d56eb44a-f6f7-4b29-be59-84e10ab7c34a"). InnerVolumeSpecName "kube-api-access-bx9b7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.813900 4958 generic.go:334] "Generic (PLEG): container finished" podID="fc711edd-1026-4f47-ab7f-92bd6e1fb964" containerID="53b61920890014ab54b7f45823e67d89fe098871b9eebac21eb46e5a7e3f5d45" exitCode=0 Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.817421 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" event={"ID":"fc711edd-1026-4f47-ab7f-92bd6e1fb964","Type":"ContainerDied","Data":"53b61920890014ab54b7f45823e67d89fe098871b9eebac21eb46e5a7e3f5d45"} Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.817643 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" event={"ID":"fc711edd-1026-4f47-ab7f-92bd6e1fb964","Type":"ContainerDied","Data":"19d856b604097473be43874de21bb8d3cdb41bbad7f12e995774e2eb81410177"} Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.817754 4958 scope.go:117] "RemoveContainer" containerID="53b61920890014ab54b7f45823e67d89fe098871b9eebac21eb46e5a7e3f5d45" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.816384 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/144ccc0c-43b4-4146-a91a-b235916a6a0e-kube-api-access-c5f8x" (OuterVolumeSpecName: "kube-api-access-c5f8x") pod "144ccc0c-43b4-4146-a91a-b235916a6a0e" (UID: "144ccc0c-43b4-4146-a91a-b235916a6a0e"). InnerVolumeSpecName "kube-api-access-c5f8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.816596 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6q7kk" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.855712 4958 generic.go:334] "Generic (PLEG): container finished" podID="d56eb44a-f6f7-4b29-be59-84e10ab7c34a" containerID="956a417111f579456b8caae3b09aebc7baea70ebf0d448e3514f1f2a9f831f21" exitCode=0 Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.855868 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5m4k" event={"ID":"d56eb44a-f6f7-4b29-be59-84e10ab7c34a","Type":"ContainerDied","Data":"956a417111f579456b8caae3b09aebc7baea70ebf0d448e3514f1f2a9f831f21"} Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.855913 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5m4k" event={"ID":"d56eb44a-f6f7-4b29-be59-84e10ab7c34a","Type":"ContainerDied","Data":"75b010d0861591344f4e3c5035f247b24cee9c76c97b2d598f5f2b7f2ea5dccd"} Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.856033 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j5m4k" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.856845 4958 scope.go:117] "RemoveContainer" containerID="70c4a51cee69f977e551d4ceb94bec24d6afb0627929b0d25855d6c7450fa721" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.867563 4958 generic.go:334] "Generic (PLEG): container finished" podID="2b165b67-cdac-49bb-969f-aff516fa216d" containerID="9ec9967a380f01ef0c04a2d3f653ff836502bec2c28bb610cd54f4b06ec17497" exitCode=0 Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.867660 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9k5l5" event={"ID":"2b165b67-cdac-49bb-969f-aff516fa216d","Type":"ContainerDied","Data":"9ec9967a380f01ef0c04a2d3f653ff836502bec2c28bb610cd54f4b06ec17497"} Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.867707 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9k5l5" event={"ID":"2b165b67-cdac-49bb-969f-aff516fa216d","Type":"ContainerDied","Data":"eaa57abd71f2847084b7e3da5c033662a056024569580b5040f172dac09beba3"} Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.867830 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9k5l5" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.870889 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6q7kk"] Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.876481 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5f8x\" (UniqueName: \"kubernetes.io/projected/144ccc0c-43b4-4146-a91a-b235916a6a0e-kube-api-access-c5f8x\") on node \"crc\" DevicePath \"\"" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.877237 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3175fa-f3c1-4377-9489-ef0fbed933bc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.877359 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d56eb44a-f6f7-4b29-be59-84e10ab7c34a-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.877449 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5zch\" (UniqueName: \"kubernetes.io/projected/ba3175fa-f3c1-4377-9489-ef0fbed933bc-kube-api-access-q5zch\") on node \"crc\" DevicePath \"\"" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.877541 4958 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc711edd-1026-4f47-ab7f-92bd6e1fb964-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.877629 4958 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc711edd-1026-4f47-ab7f-92bd6e1fb964-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.877727 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3175fa-f3c1-4377-9489-ef0fbed933bc-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.877811 4958 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/144ccc0c-43b4-4146-a91a-b235916a6a0e-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.878104 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx9b7\" (UniqueName: \"kubernetes.io/projected/d56eb44a-f6f7-4b29-be59-84e10ab7c34a-kube-api-access-bx9b7\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.878290 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9fnw\" (UniqueName: \"kubernetes.io/projected/fc711edd-1026-4f47-ab7f-92bd6e1fb964-kube-api-access-v9fnw\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.878166 4958 scope.go:117] "RemoveContainer" containerID="53b61920890014ab54b7f45823e67d89fe098871b9eebac21eb46e5a7e3f5d45"
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.877726 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6q7kk"]
Dec 01 10:05:52 crc kubenswrapper[4958]: E1201 10:05:52.879199 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53b61920890014ab54b7f45823e67d89fe098871b9eebac21eb46e5a7e3f5d45\": container with ID starting with 53b61920890014ab54b7f45823e67d89fe098871b9eebac21eb46e5a7e3f5d45 not found: ID does not exist" containerID="53b61920890014ab54b7f45823e67d89fe098871b9eebac21eb46e5a7e3f5d45"
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.879333 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53b61920890014ab54b7f45823e67d89fe098871b9eebac21eb46e5a7e3f5d45"} err="failed to get container status \"53b61920890014ab54b7f45823e67d89fe098871b9eebac21eb46e5a7e3f5d45\": rpc error: code = NotFound desc = could not find container \"53b61920890014ab54b7f45823e67d89fe098871b9eebac21eb46e5a7e3f5d45\": container with ID starting with 53b61920890014ab54b7f45823e67d89fe098871b9eebac21eb46e5a7e3f5d45 not found: ID does not exist"
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.879460 4958 scope.go:117] "RemoveContainer" containerID="70c4a51cee69f977e551d4ceb94bec24d6afb0627929b0d25855d6c7450fa721"
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.879919 4958 generic.go:334] "Generic (PLEG): container finished" podID="144ccc0c-43b4-4146-a91a-b235916a6a0e" containerID="bdc6300112db424166d36fa4dd3752e7f900cc6dac0def383c1a9af828a052e1" exitCode=0
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.880077 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z5z9s"
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.880127 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z5z9s" event={"ID":"144ccc0c-43b4-4146-a91a-b235916a6a0e","Type":"ContainerDied","Data":"bdc6300112db424166d36fa4dd3752e7f900cc6dac0def383c1a9af828a052e1"}
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.880181 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z5z9s" event={"ID":"144ccc0c-43b4-4146-a91a-b235916a6a0e","Type":"ContainerDied","Data":"6cb109a0e73fa2e51570b5bd8fc175edb9cb9135507e6ea67e29a90e057f117f"}
Dec 01 10:05:52 crc kubenswrapper[4958]: E1201 10:05:52.879947 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70c4a51cee69f977e551d4ceb94bec24d6afb0627929b0d25855d6c7450fa721\": container with ID starting with 70c4a51cee69f977e551d4ceb94bec24d6afb0627929b0d25855d6c7450fa721 not found: ID does not exist" containerID="70c4a51cee69f977e551d4ceb94bec24d6afb0627929b0d25855d6c7450fa721"
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.880423 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70c4a51cee69f977e551d4ceb94bec24d6afb0627929b0d25855d6c7450fa721"} err="failed to get container status \"70c4a51cee69f977e551d4ceb94bec24d6afb0627929b0d25855d6c7450fa721\": rpc error: code = NotFound desc = could not find container \"70c4a51cee69f977e551d4ceb94bec24d6afb0627929b0d25855d6c7450fa721\": container with ID starting with 70c4a51cee69f977e551d4ceb94bec24d6afb0627929b0d25855d6c7450fa721 not found: ID does not exist"
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.880836 4958 scope.go:117] "RemoveContainer" containerID="956a417111f579456b8caae3b09aebc7baea70ebf0d448e3514f1f2a9f831f21"
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.890557 4958 generic.go:334] "Generic (PLEG): container finished" podID="ba3175fa-f3c1-4377-9489-ef0fbed933bc" containerID="b12639309786fed55ebfc0e78a845544721d11f42202f2c78ff26599fdd21145" exitCode=0
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.890870 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdr2q" event={"ID":"ba3175fa-f3c1-4377-9489-ef0fbed933bc","Type":"ContainerDied","Data":"b12639309786fed55ebfc0e78a845544721d11f42202f2c78ff26599fdd21145"}
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.890906 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fdr2q" event={"ID":"ba3175fa-f3c1-4377-9489-ef0fbed933bc","Type":"ContainerDied","Data":"a1ea33e24bd76cab747bdf5a02afadafeccfec2c52d1a85457a5917e0cd02484"}
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.891003 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fdr2q"
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.910254 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/144ccc0c-43b4-4146-a91a-b235916a6a0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "144ccc0c-43b4-4146-a91a-b235916a6a0e" (UID: "144ccc0c-43b4-4146-a91a-b235916a6a0e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.923402 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9k5l5"]
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.926330 4958 scope.go:117] "RemoveContainer" containerID="61102d693a1a106dbe9b99faeabb85f2f476706b5a783b050c9e05027c237469"
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.944999 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d56eb44a-f6f7-4b29-be59-84e10ab7c34a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d56eb44a-f6f7-4b29-be59-84e10ab7c34a" (UID: "d56eb44a-f6f7-4b29-be59-84e10ab7c34a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.963809 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9k5l5"]
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.976667 4958 scope.go:117] "RemoveContainer" containerID="80c7c4c0f61f0bfee0b53db3c69c2cf567ca5522601eb4e56d12697dad469ba7"
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.980559 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d56eb44a-f6f7-4b29-be59-84e10ab7c34a-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.980616 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/144ccc0c-43b4-4146-a91a-b235916a6a0e-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.988498 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c99nd"]
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.994638 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fdr2q"]
Dec 01 10:05:52 crc kubenswrapper[4958]: I1201 10:05:52.998844 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fdr2q"]
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.018761 4958 scope.go:117] "RemoveContainer" containerID="956a417111f579456b8caae3b09aebc7baea70ebf0d448e3514f1f2a9f831f21"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.019586 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"956a417111f579456b8caae3b09aebc7baea70ebf0d448e3514f1f2a9f831f21\": container with ID starting with 956a417111f579456b8caae3b09aebc7baea70ebf0d448e3514f1f2a9f831f21 not found: ID does not exist" containerID="956a417111f579456b8caae3b09aebc7baea70ebf0d448e3514f1f2a9f831f21"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.019658 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"956a417111f579456b8caae3b09aebc7baea70ebf0d448e3514f1f2a9f831f21"} err="failed to get container status \"956a417111f579456b8caae3b09aebc7baea70ebf0d448e3514f1f2a9f831f21\": rpc error: code = NotFound desc = could not find container \"956a417111f579456b8caae3b09aebc7baea70ebf0d448e3514f1f2a9f831f21\": container with ID starting with 956a417111f579456b8caae3b09aebc7baea70ebf0d448e3514f1f2a9f831f21 not found: ID does not exist"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.019700 4958 scope.go:117] "RemoveContainer" containerID="61102d693a1a106dbe9b99faeabb85f2f476706b5a783b050c9e05027c237469"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.021344 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61102d693a1a106dbe9b99faeabb85f2f476706b5a783b050c9e05027c237469\": container with ID starting with 61102d693a1a106dbe9b99faeabb85f2f476706b5a783b050c9e05027c237469 not found: ID does not exist" containerID="61102d693a1a106dbe9b99faeabb85f2f476706b5a783b050c9e05027c237469"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.021444 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61102d693a1a106dbe9b99faeabb85f2f476706b5a783b050c9e05027c237469"} err="failed to get container status \"61102d693a1a106dbe9b99faeabb85f2f476706b5a783b050c9e05027c237469\": rpc error: code = NotFound desc = could not find container \"61102d693a1a106dbe9b99faeabb85f2f476706b5a783b050c9e05027c237469\": container with ID starting with 61102d693a1a106dbe9b99faeabb85f2f476706b5a783b050c9e05027c237469 not found: ID does not exist"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.021511 4958 scope.go:117] "RemoveContainer" containerID="80c7c4c0f61f0bfee0b53db3c69c2cf567ca5522601eb4e56d12697dad469ba7"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.022040 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80c7c4c0f61f0bfee0b53db3c69c2cf567ca5522601eb4e56d12697dad469ba7\": container with ID starting with 80c7c4c0f61f0bfee0b53db3c69c2cf567ca5522601eb4e56d12697dad469ba7 not found: ID does not exist" containerID="80c7c4c0f61f0bfee0b53db3c69c2cf567ca5522601eb4e56d12697dad469ba7"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.022110 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80c7c4c0f61f0bfee0b53db3c69c2cf567ca5522601eb4e56d12697dad469ba7"} err="failed to get container status \"80c7c4c0f61f0bfee0b53db3c69c2cf567ca5522601eb4e56d12697dad469ba7\": rpc error: code = NotFound desc = could not find container \"80c7c4c0f61f0bfee0b53db3c69c2cf567ca5522601eb4e56d12697dad469ba7\": container with ID starting with 80c7c4c0f61f0bfee0b53db3c69c2cf567ca5522601eb4e56d12697dad469ba7 not found: ID does not exist"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.022156 4958 scope.go:117] "RemoveContainer" containerID="9ec9967a380f01ef0c04a2d3f653ff836502bec2c28bb610cd54f4b06ec17497"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.049230 4958 scope.go:117] "RemoveContainer" containerID="9dcf1ffd900ccf29bbca9eef654ea7ea54069b2f89d3b53faa7ac37743247f58"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.076012 4958 scope.go:117] "RemoveContainer" containerID="630a7de865e1ec749a4744d4e19b70a190cf1583a9631f9c886e1750323a982f"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.150862 4958 scope.go:117] "RemoveContainer" containerID="9ec9967a380f01ef0c04a2d3f653ff836502bec2c28bb610cd54f4b06ec17497"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.151550 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ec9967a380f01ef0c04a2d3f653ff836502bec2c28bb610cd54f4b06ec17497\": container with ID starting with 9ec9967a380f01ef0c04a2d3f653ff836502bec2c28bb610cd54f4b06ec17497 not found: ID does not exist" containerID="9ec9967a380f01ef0c04a2d3f653ff836502bec2c28bb610cd54f4b06ec17497"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.151638 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ec9967a380f01ef0c04a2d3f653ff836502bec2c28bb610cd54f4b06ec17497"} err="failed to get container status \"9ec9967a380f01ef0c04a2d3f653ff836502bec2c28bb610cd54f4b06ec17497\": rpc error: code = NotFound desc = could not find container \"9ec9967a380f01ef0c04a2d3f653ff836502bec2c28bb610cd54f4b06ec17497\": container with ID starting with 9ec9967a380f01ef0c04a2d3f653ff836502bec2c28bb610cd54f4b06ec17497 not found: ID does not exist"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.151704 4958 scope.go:117] "RemoveContainer" containerID="9dcf1ffd900ccf29bbca9eef654ea7ea54069b2f89d3b53faa7ac37743247f58"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.152522 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dcf1ffd900ccf29bbca9eef654ea7ea54069b2f89d3b53faa7ac37743247f58\": container with ID starting with 9dcf1ffd900ccf29bbca9eef654ea7ea54069b2f89d3b53faa7ac37743247f58 not found: ID does not exist" containerID="9dcf1ffd900ccf29bbca9eef654ea7ea54069b2f89d3b53faa7ac37743247f58"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.152550 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dcf1ffd900ccf29bbca9eef654ea7ea54069b2f89d3b53faa7ac37743247f58"} err="failed to get container status \"9dcf1ffd900ccf29bbca9eef654ea7ea54069b2f89d3b53faa7ac37743247f58\": rpc error: code = NotFound desc = could not find container \"9dcf1ffd900ccf29bbca9eef654ea7ea54069b2f89d3b53faa7ac37743247f58\": container with ID starting with 9dcf1ffd900ccf29bbca9eef654ea7ea54069b2f89d3b53faa7ac37743247f58 not found: ID does not exist"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.152571 4958 scope.go:117] "RemoveContainer" containerID="630a7de865e1ec749a4744d4e19b70a190cf1583a9631f9c886e1750323a982f"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.153078 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"630a7de865e1ec749a4744d4e19b70a190cf1583a9631f9c886e1750323a982f\": container with ID starting with 630a7de865e1ec749a4744d4e19b70a190cf1583a9631f9c886e1750323a982f not found: ID does not exist" containerID="630a7de865e1ec749a4744d4e19b70a190cf1583a9631f9c886e1750323a982f"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.153125 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"630a7de865e1ec749a4744d4e19b70a190cf1583a9631f9c886e1750323a982f"} err="failed to get container status \"630a7de865e1ec749a4744d4e19b70a190cf1583a9631f9c886e1750323a982f\": rpc error: code = NotFound desc = could not find container \"630a7de865e1ec749a4744d4e19b70a190cf1583a9631f9c886e1750323a982f\": container with ID starting with 630a7de865e1ec749a4744d4e19b70a190cf1583a9631f9c886e1750323a982f not found: ID does not exist"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.153151 4958 scope.go:117] "RemoveContainer" containerID="bdc6300112db424166d36fa4dd3752e7f900cc6dac0def383c1a9af828a052e1"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.172421 4958 scope.go:117] "RemoveContainer" containerID="1ed0145328afa0d9d7c60acb79124a94d6022b2adc54c601380ec8cc5c048653"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.207919 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j5m4k"]
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.214610 4958 scope.go:117] "RemoveContainer" containerID="acdb3e271b998405aa28d104560ea30abc905b4f14607379ece280a7ae3b83a1"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.216893 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j5m4k"]
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.242113 4958 scope.go:117] "RemoveContainer" containerID="bdc6300112db424166d36fa4dd3752e7f900cc6dac0def383c1a9af828a052e1"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.250404 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdc6300112db424166d36fa4dd3752e7f900cc6dac0def383c1a9af828a052e1\": container with ID starting with bdc6300112db424166d36fa4dd3752e7f900cc6dac0def383c1a9af828a052e1 not found: ID does not exist" containerID="bdc6300112db424166d36fa4dd3752e7f900cc6dac0def383c1a9af828a052e1"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.250458 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdc6300112db424166d36fa4dd3752e7f900cc6dac0def383c1a9af828a052e1"} err="failed to get container status \"bdc6300112db424166d36fa4dd3752e7f900cc6dac0def383c1a9af828a052e1\": rpc error: code = NotFound desc = could not find container \"bdc6300112db424166d36fa4dd3752e7f900cc6dac0def383c1a9af828a052e1\": container with ID starting with bdc6300112db424166d36fa4dd3752e7f900cc6dac0def383c1a9af828a052e1 not found: ID does not exist"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.250494 4958 scope.go:117] "RemoveContainer" containerID="1ed0145328afa0d9d7c60acb79124a94d6022b2adc54c601380ec8cc5c048653"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.250910 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ed0145328afa0d9d7c60acb79124a94d6022b2adc54c601380ec8cc5c048653\": container with ID starting with 1ed0145328afa0d9d7c60acb79124a94d6022b2adc54c601380ec8cc5c048653 not found: ID does not exist" containerID="1ed0145328afa0d9d7c60acb79124a94d6022b2adc54c601380ec8cc5c048653"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.250950 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ed0145328afa0d9d7c60acb79124a94d6022b2adc54c601380ec8cc5c048653"} err="failed to get container status \"1ed0145328afa0d9d7c60acb79124a94d6022b2adc54c601380ec8cc5c048653\": rpc error: code = NotFound desc = could not find container \"1ed0145328afa0d9d7c60acb79124a94d6022b2adc54c601380ec8cc5c048653\": container with ID starting with 1ed0145328afa0d9d7c60acb79124a94d6022b2adc54c601380ec8cc5c048653 not found: ID does not exist"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.250966 4958 scope.go:117] "RemoveContainer" containerID="acdb3e271b998405aa28d104560ea30abc905b4f14607379ece280a7ae3b83a1"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.251195 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acdb3e271b998405aa28d104560ea30abc905b4f14607379ece280a7ae3b83a1\": container with ID starting with acdb3e271b998405aa28d104560ea30abc905b4f14607379ece280a7ae3b83a1 not found: ID does not exist" containerID="acdb3e271b998405aa28d104560ea30abc905b4f14607379ece280a7ae3b83a1"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.251226 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acdb3e271b998405aa28d104560ea30abc905b4f14607379ece280a7ae3b83a1"} err="failed to get container status \"acdb3e271b998405aa28d104560ea30abc905b4f14607379ece280a7ae3b83a1\": rpc error: code = NotFound desc = could not find container \"acdb3e271b998405aa28d104560ea30abc905b4f14607379ece280a7ae3b83a1\": container with ID starting with acdb3e271b998405aa28d104560ea30abc905b4f14607379ece280a7ae3b83a1 not found: ID does not exist"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.251241 4958 scope.go:117] "RemoveContainer" containerID="b12639309786fed55ebfc0e78a845544721d11f42202f2c78ff26599fdd21145"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.256288 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z5z9s"]
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.260096 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-z5z9s"]
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.270992 4958 scope.go:117] "RemoveContainer" containerID="55960e88dcf9709402ee1413e6ade89b7f5e5ce053f4a70242da2942f4fd11d3"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.288707 4958 scope.go:117] "RemoveContainer" containerID="33683e4659a67aec75997581589984fd2f1cbc40d6522a57863c9a9cd6aa9d21"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.312014 4958 scope.go:117] "RemoveContainer" containerID="b12639309786fed55ebfc0e78a845544721d11f42202f2c78ff26599fdd21145"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.313665 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b12639309786fed55ebfc0e78a845544721d11f42202f2c78ff26599fdd21145\": container with ID starting with b12639309786fed55ebfc0e78a845544721d11f42202f2c78ff26599fdd21145 not found: ID does not exist" containerID="b12639309786fed55ebfc0e78a845544721d11f42202f2c78ff26599fdd21145"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.313741 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b12639309786fed55ebfc0e78a845544721d11f42202f2c78ff26599fdd21145"} err="failed to get container status \"b12639309786fed55ebfc0e78a845544721d11f42202f2c78ff26599fdd21145\": rpc error: code = NotFound desc = could not find container \"b12639309786fed55ebfc0e78a845544721d11f42202f2c78ff26599fdd21145\": container with ID starting with b12639309786fed55ebfc0e78a845544721d11f42202f2c78ff26599fdd21145 not found: ID does not exist"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.313790 4958 scope.go:117] "RemoveContainer" containerID="55960e88dcf9709402ee1413e6ade89b7f5e5ce053f4a70242da2942f4fd11d3"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.316306 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55960e88dcf9709402ee1413e6ade89b7f5e5ce053f4a70242da2942f4fd11d3\": container with ID starting with 55960e88dcf9709402ee1413e6ade89b7f5e5ce053f4a70242da2942f4fd11d3 not found: ID does not exist" containerID="55960e88dcf9709402ee1413e6ade89b7f5e5ce053f4a70242da2942f4fd11d3"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.316367 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55960e88dcf9709402ee1413e6ade89b7f5e5ce053f4a70242da2942f4fd11d3"} err="failed to get container status \"55960e88dcf9709402ee1413e6ade89b7f5e5ce053f4a70242da2942f4fd11d3\": rpc error: code = NotFound desc = could not find container \"55960e88dcf9709402ee1413e6ade89b7f5e5ce053f4a70242da2942f4fd11d3\": container with ID starting with 55960e88dcf9709402ee1413e6ade89b7f5e5ce053f4a70242da2942f4fd11d3 not found: ID does not exist"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.316421 4958 scope.go:117] "RemoveContainer" containerID="33683e4659a67aec75997581589984fd2f1cbc40d6522a57863c9a9cd6aa9d21"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.317041 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33683e4659a67aec75997581589984fd2f1cbc40d6522a57863c9a9cd6aa9d21\": container with ID starting with 33683e4659a67aec75997581589984fd2f1cbc40d6522a57863c9a9cd6aa9d21 not found: ID does not exist" containerID="33683e4659a67aec75997581589984fd2f1cbc40d6522a57863c9a9cd6aa9d21"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.317110 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33683e4659a67aec75997581589984fd2f1cbc40d6522a57863c9a9cd6aa9d21"} err="failed to get container status \"33683e4659a67aec75997581589984fd2f1cbc40d6522a57863c9a9cd6aa9d21\": rpc error: code = NotFound desc = could not find container \"33683e4659a67aec75997581589984fd2f1cbc40d6522a57863c9a9cd6aa9d21\": container with ID starting with 33683e4659a67aec75997581589984fd2f1cbc40d6522a57863c9a9cd6aa9d21 not found: ID does not exist"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.758776 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dk9nc"]
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.759116 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc711edd-1026-4f47-ab7f-92bd6e1fb964" containerName="marketplace-operator"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759133 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc711edd-1026-4f47-ab7f-92bd6e1fb964" containerName="marketplace-operator"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.759144 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b165b67-cdac-49bb-969f-aff516fa216d" containerName="extract-utilities"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759152 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b165b67-cdac-49bb-969f-aff516fa216d" containerName="extract-utilities"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.759163 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="144ccc0c-43b4-4146-a91a-b235916a6a0e" containerName="registry-server"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759170 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="144ccc0c-43b4-4146-a91a-b235916a6a0e" containerName="registry-server"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.759180 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3175fa-f3c1-4377-9489-ef0fbed933bc" containerName="extract-utilities"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759185 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3175fa-f3c1-4377-9489-ef0fbed933bc" containerName="extract-utilities"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.759193 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="144ccc0c-43b4-4146-a91a-b235916a6a0e" containerName="extract-utilities"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759199 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="144ccc0c-43b4-4146-a91a-b235916a6a0e" containerName="extract-utilities"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.759206 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56eb44a-f6f7-4b29-be59-84e10ab7c34a" containerName="extract-utilities"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759212 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56eb44a-f6f7-4b29-be59-84e10ab7c34a" containerName="extract-utilities"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.759224 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b165b67-cdac-49bb-969f-aff516fa216d" containerName="registry-server"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759230 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b165b67-cdac-49bb-969f-aff516fa216d" containerName="registry-server"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.759238 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="144ccc0c-43b4-4146-a91a-b235916a6a0e" containerName="extract-content"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759245 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="144ccc0c-43b4-4146-a91a-b235916a6a0e" containerName="extract-content"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.759256 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3175fa-f3c1-4377-9489-ef0fbed933bc" containerName="extract-content"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759262 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3175fa-f3c1-4377-9489-ef0fbed933bc" containerName="extract-content"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.759271 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b165b67-cdac-49bb-969f-aff516fa216d" containerName="extract-content"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759277 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b165b67-cdac-49bb-969f-aff516fa216d" containerName="extract-content"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.759286 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56eb44a-f6f7-4b29-be59-84e10ab7c34a" containerName="extract-content"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759292 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56eb44a-f6f7-4b29-be59-84e10ab7c34a" containerName="extract-content"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.759299 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc711edd-1026-4f47-ab7f-92bd6e1fb964" containerName="marketplace-operator"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759306 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc711edd-1026-4f47-ab7f-92bd6e1fb964" containerName="marketplace-operator"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.759313 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3175fa-f3c1-4377-9489-ef0fbed933bc" containerName="registry-server"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759320 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3175fa-f3c1-4377-9489-ef0fbed933bc" containerName="registry-server"
Dec 01 10:05:53 crc kubenswrapper[4958]: E1201 10:05:53.759328 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56eb44a-f6f7-4b29-be59-84e10ab7c34a" containerName="registry-server"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759335 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56eb44a-f6f7-4b29-be59-84e10ab7c34a" containerName="registry-server"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759434 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc711edd-1026-4f47-ab7f-92bd6e1fb964" containerName="marketplace-operator"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759444 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56eb44a-f6f7-4b29-be59-84e10ab7c34a" containerName="registry-server"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759456 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba3175fa-f3c1-4377-9489-ef0fbed933bc" containerName="registry-server"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759467 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b165b67-cdac-49bb-969f-aff516fa216d" containerName="registry-server"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.759473 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="144ccc0c-43b4-4146-a91a-b235916a6a0e" containerName="registry-server"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.760122 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.782026 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dk9nc"]
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.811733 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="144ccc0c-43b4-4146-a91a-b235916a6a0e" path="/var/lib/kubelet/pods/144ccc0c-43b4-4146-a91a-b235916a6a0e/volumes"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.812560 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b165b67-cdac-49bb-969f-aff516fa216d" path="/var/lib/kubelet/pods/2b165b67-cdac-49bb-969f-aff516fa216d/volumes"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.813195 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba3175fa-f3c1-4377-9489-ef0fbed933bc" path="/var/lib/kubelet/pods/ba3175fa-f3c1-4377-9489-ef0fbed933bc/volumes"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.814271 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d56eb44a-f6f7-4b29-be59-84e10ab7c34a" path="/var/lib/kubelet/pods/d56eb44a-f6f7-4b29-be59-84e10ab7c34a/volumes"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.815814 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc711edd-1026-4f47-ab7f-92bd6e1fb964" path="/var/lib/kubelet/pods/fc711edd-1026-4f47-ab7f-92bd6e1fb964/volumes"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.901054 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c1642367-ed3e-4753-9a9a-16b9481629b2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.901127 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vn57\" (UniqueName: \"kubernetes.io/projected/c1642367-ed3e-4753-9a9a-16b9481629b2-kube-api-access-8vn57\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.901185 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c1642367-ed3e-4753-9a9a-16b9481629b2-registry-certificates\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.901466 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.901608 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1642367-ed3e-4753-9a9a-16b9481629b2-registry-tls\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.901798 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c1642367-ed3e-4753-9a9a-16b9481629b2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.901981 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1642367-ed3e-4753-9a9a-16b9481629b2-bound-sa-token\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.902210 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1642367-ed3e-4753-9a9a-16b9481629b2-trusted-ca\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.904744 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-c99nd" event={"ID":"905b7c43-8ebe-4cd3-a076-ff77e791de7b","Type":"ContainerStarted","Data":"9e722227e6bf5955a50db1b4e6d48d3aad966920adc31ab80b01b142f4a3e50c"}
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.904783 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-c99nd" event={"ID":"905b7c43-8ebe-4cd3-a076-ff77e791de7b","Type":"ContainerStarted","Data":"100d8f467c211639a1e3a2992a9040819309e8c0b670487fbc1e8a7058c9f829"}
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.905408 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-c99nd"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.913196 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-c99nd"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.940487 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:53 crc kubenswrapper[4958]: I1201 10:05:53.952187 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-c99nd" podStartSLOduration=2.952148168 podStartE2EDuration="2.952148168s" podCreationTimestamp="2025-12-01 10:05:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:05:53.929897475 +0000 UTC m=+401.438686522" watchObservedRunningTime="2025-12-01 10:05:53.952148168 +0000 UTC m=+401.460937205"
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.003777 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1642367-ed3e-4753-9a9a-16b9481629b2-trusted-ca\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.003928 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c1642367-ed3e-4753-9a9a-16b9481629b2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.003961 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vn57\" (UniqueName: \"kubernetes.io/projected/c1642367-ed3e-4753-9a9a-16b9481629b2-kube-api-access-8vn57\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.004062 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c1642367-ed3e-4753-9a9a-16b9481629b2-registry-certificates\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.004126 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1642367-ed3e-4753-9a9a-16b9481629b2-registry-tls\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.004171 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c1642367-ed3e-4753-9a9a-16b9481629b2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.004222 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1642367-ed3e-4753-9a9a-16b9481629b2-bound-sa-token\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.004826 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c1642367-ed3e-4753-9a9a-16b9481629b2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.005908 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1642367-ed3e-4753-9a9a-16b9481629b2-trusted-ca\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.010266 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c1642367-ed3e-4753-9a9a-16b9481629b2-registry-certificates\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.012069 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1642367-ed3e-4753-9a9a-16b9481629b2-registry-tls\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.013129 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c1642367-ed3e-4753-9a9a-16b9481629b2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.024721 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vn57\" (UniqueName: \"kubernetes.io/projected/c1642367-ed3e-4753-9a9a-16b9481629b2-kube-api-access-8vn57\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.025229 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1642367-ed3e-4753-9a9a-16b9481629b2-bound-sa-token\") pod \"image-registry-66df7c8f76-dk9nc\" (UID: \"c1642367-ed3e-4753-9a9a-16b9481629b2\") " pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.077732 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:54 crc kubenswrapper[4958]: W1201 10:05:54.318636 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1642367_ed3e_4753_9a9a_16b9481629b2.slice/crio-e5b4b34f0ef0b5b195fc77a501a19087843650d31173574ff8bd077ac7959617 WatchSource:0}: Error finding container e5b4b34f0ef0b5b195fc77a501a19087843650d31173574ff8bd077ac7959617: Status 404 returned error can't find the container with id e5b4b34f0ef0b5b195fc77a501a19087843650d31173574ff8bd077ac7959617
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.320363 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dk9nc"]
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.919587 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc" event={"ID":"c1642367-ed3e-4753-9a9a-16b9481629b2","Type":"ContainerStarted","Data":"9e5c7147f0dc6683f3444757140276678fc514f796c8f440793eb23412644c0f"}
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.921063 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc"
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.921137 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc" event={"ID":"c1642367-ed3e-4753-9a9a-16b9481629b2","Type":"ContainerStarted","Data":"e5b4b34f0ef0b5b195fc77a501a19087843650d31173574ff8bd077ac7959617"}
Dec 01 10:05:54 crc kubenswrapper[4958]: I1201 10:05:54.956487 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc" podStartSLOduration=1.95645904 podStartE2EDuration="1.95645904s" podCreationTimestamp="2025-12-01 10:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:05:54.945114877 +0000 UTC m=+402.453903924" watchObservedRunningTime="2025-12-01 10:05:54.95645904 +0000 UTC m=+402.465248077"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.276943 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mmqlv"]
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.277433 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc711edd-1026-4f47-ab7f-92bd6e1fb964" containerName="marketplace-operator"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.278599 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mmqlv"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.282105 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.302893 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mmqlv"]
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.440781 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/525654b6-b62b-46d0-88d0-2a94cc23278c-utilities\") pod \"redhat-marketplace-mmqlv\" (UID: \"525654b6-b62b-46d0-88d0-2a94cc23278c\") " pod="openshift-marketplace/redhat-marketplace-mmqlv"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.440871 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57j2m\" (UniqueName: \"kubernetes.io/projected/525654b6-b62b-46d0-88d0-2a94cc23278c-kube-api-access-57j2m\") pod \"redhat-marketplace-mmqlv\" (UID: \"525654b6-b62b-46d0-88d0-2a94cc23278c\") " pod="openshift-marketplace/redhat-marketplace-mmqlv"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.441027 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/525654b6-b62b-46d0-88d0-2a94cc23278c-catalog-content\") pod \"redhat-marketplace-mmqlv\" (UID: \"525654b6-b62b-46d0-88d0-2a94cc23278c\") " pod="openshift-marketplace/redhat-marketplace-mmqlv"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.542889 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/525654b6-b62b-46d0-88d0-2a94cc23278c-catalog-content\") pod \"redhat-marketplace-mmqlv\" (UID: \"525654b6-b62b-46d0-88d0-2a94cc23278c\") " pod="openshift-marketplace/redhat-marketplace-mmqlv"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.542990 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/525654b6-b62b-46d0-88d0-2a94cc23278c-utilities\") pod \"redhat-marketplace-mmqlv\" (UID: \"525654b6-b62b-46d0-88d0-2a94cc23278c\") " pod="openshift-marketplace/redhat-marketplace-mmqlv"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.543027 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57j2m\" (UniqueName: \"kubernetes.io/projected/525654b6-b62b-46d0-88d0-2a94cc23278c-kube-api-access-57j2m\") pod \"redhat-marketplace-mmqlv\" (UID: \"525654b6-b62b-46d0-88d0-2a94cc23278c\") " pod="openshift-marketplace/redhat-marketplace-mmqlv"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.543638 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/525654b6-b62b-46d0-88d0-2a94cc23278c-catalog-content\") pod \"redhat-marketplace-mmqlv\" (UID: \"525654b6-b62b-46d0-88d0-2a94cc23278c\") " pod="openshift-marketplace/redhat-marketplace-mmqlv"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.543646 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/525654b6-b62b-46d0-88d0-2a94cc23278c-utilities\") pod \"redhat-marketplace-mmqlv\" (UID: \"525654b6-b62b-46d0-88d0-2a94cc23278c\") " pod="openshift-marketplace/redhat-marketplace-mmqlv"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.562723 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57j2m\" (UniqueName: \"kubernetes.io/projected/525654b6-b62b-46d0-88d0-2a94cc23278c-kube-api-access-57j2m\") pod \"redhat-marketplace-mmqlv\" (UID: \"525654b6-b62b-46d0-88d0-2a94cc23278c\") " pod="openshift-marketplace/redhat-marketplace-mmqlv"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.597252 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mmqlv"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.676065 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hmn6l"]
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.677877 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmn6l"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.680425 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.693140 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hmn6l"]
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.848295 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b4db44-22c2-4597-8353-34af1d55f49a-utilities\") pod \"redhat-operators-hmn6l\" (UID: \"d4b4db44-22c2-4597-8353-34af1d55f49a\") " pod="openshift-marketplace/redhat-operators-hmn6l"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.848784 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv7wv\" (UniqueName: \"kubernetes.io/projected/d4b4db44-22c2-4597-8353-34af1d55f49a-kube-api-access-sv7wv\") pod \"redhat-operators-hmn6l\" (UID: \"d4b4db44-22c2-4597-8353-34af1d55f49a\") " pod="openshift-marketplace/redhat-operators-hmn6l"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.848881 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b4db44-22c2-4597-8353-34af1d55f49a-catalog-content\") pod \"redhat-operators-hmn6l\" (UID: \"d4b4db44-22c2-4597-8353-34af1d55f49a\") " pod="openshift-marketplace/redhat-operators-hmn6l"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.951163 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b4db44-22c2-4597-8353-34af1d55f49a-catalog-content\") pod \"redhat-operators-hmn6l\" (UID: \"d4b4db44-22c2-4597-8353-34af1d55f49a\") " pod="openshift-marketplace/redhat-operators-hmn6l"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.951264 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b4db44-22c2-4597-8353-34af1d55f49a-utilities\") pod \"redhat-operators-hmn6l\" (UID: \"d4b4db44-22c2-4597-8353-34af1d55f49a\") " pod="openshift-marketplace/redhat-operators-hmn6l"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.951334 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv7wv\" (UniqueName: \"kubernetes.io/projected/d4b4db44-22c2-4597-8353-34af1d55f49a-kube-api-access-sv7wv\") pod \"redhat-operators-hmn6l\" (UID: \"d4b4db44-22c2-4597-8353-34af1d55f49a\") " pod="openshift-marketplace/redhat-operators-hmn6l"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.952158 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b4db44-22c2-4597-8353-34af1d55f49a-catalog-content\") pod \"redhat-operators-hmn6l\" (UID: \"d4b4db44-22c2-4597-8353-34af1d55f49a\") " pod="openshift-marketplace/redhat-operators-hmn6l"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.952268 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b4db44-22c2-4597-8353-34af1d55f49a-utilities\") pod \"redhat-operators-hmn6l\" (UID: \"d4b4db44-22c2-4597-8353-34af1d55f49a\") " pod="openshift-marketplace/redhat-operators-hmn6l"
Dec 01 10:05:55 crc kubenswrapper[4958]: I1201 10:05:55.978216 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv7wv\" (UniqueName: \"kubernetes.io/projected/d4b4db44-22c2-4597-8353-34af1d55f49a-kube-api-access-sv7wv\") pod \"redhat-operators-hmn6l\" (UID: \"d4b4db44-22c2-4597-8353-34af1d55f49a\") " pod="openshift-marketplace/redhat-operators-hmn6l"
Dec 01 10:05:56 crc kubenswrapper[4958]: I1201 10:05:56.003060 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmn6l"
Dec 01 10:05:56 crc kubenswrapper[4958]: I1201 10:05:56.068245 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mmqlv"]
Dec 01 10:05:56 crc kubenswrapper[4958]: W1201 10:05:56.085605 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod525654b6_b62b_46d0_88d0_2a94cc23278c.slice/crio-d01f326e146a273d61c4c3170313538b1cb3abf4dd9f25e3eeb05a6f716ccf4b WatchSource:0}: Error finding container d01f326e146a273d61c4c3170313538b1cb3abf4dd9f25e3eeb05a6f716ccf4b: Status 404 returned error can't find the container with id d01f326e146a273d61c4c3170313538b1cb3abf4dd9f25e3eeb05a6f716ccf4b
Dec 01 10:05:56 crc kubenswrapper[4958]: I1201 10:05:56.499312 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hmn6l"]
Dec 01 10:05:56 crc kubenswrapper[4958]: W1201 10:05:56.513957 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4b4db44_22c2_4597_8353_34af1d55f49a.slice/crio-815e0d1cbaba30e7ff2f730304eeee7a55ccc5701c1a89a35b58beb2d8ad60c4 WatchSource:0}: Error finding container 815e0d1cbaba30e7ff2f730304eeee7a55ccc5701c1a89a35b58beb2d8ad60c4: Status 404 returned error can't find the container with id 815e0d1cbaba30e7ff2f730304eeee7a55ccc5701c1a89a35b58beb2d8ad60c4
Dec 01 10:05:56 crc kubenswrapper[4958]: I1201 10:05:56.934715 4958 generic.go:334] "Generic (PLEG): container finished" podID="525654b6-b62b-46d0-88d0-2a94cc23278c" containerID="40aa6e40e50f58790a6350ad388dcc21de855ba996af9b4150426af20b3e1f83" exitCode=0
Dec 01 10:05:56 crc kubenswrapper[4958]: I1201 10:05:56.934807 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mmqlv" event={"ID":"525654b6-b62b-46d0-88d0-2a94cc23278c","Type":"ContainerDied","Data":"40aa6e40e50f58790a6350ad388dcc21de855ba996af9b4150426af20b3e1f83"}
Dec 01 10:05:56 crc kubenswrapper[4958]: I1201 10:05:56.935312 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mmqlv" event={"ID":"525654b6-b62b-46d0-88d0-2a94cc23278c","Type":"ContainerStarted","Data":"d01f326e146a273d61c4c3170313538b1cb3abf4dd9f25e3eeb05a6f716ccf4b"}
Dec 01 10:05:56 crc kubenswrapper[4958]: I1201 10:05:56.937202 4958 generic.go:334] "Generic (PLEG): container finished" podID="d4b4db44-22c2-4597-8353-34af1d55f49a" containerID="ac53b611ca612c64fc4f4a774d8b160fa5ba5120cb58c8bb1f8e5b85f7102ada" exitCode=0
Dec 01 10:05:56 crc kubenswrapper[4958]: I1201 10:05:56.939080 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmn6l" event={"ID":"d4b4db44-22c2-4597-8353-34af1d55f49a","Type":"ContainerDied","Data":"ac53b611ca612c64fc4f4a774d8b160fa5ba5120cb58c8bb1f8e5b85f7102ada"}
Dec 01 10:05:56 crc kubenswrapper[4958]: I1201 10:05:56.939120 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmn6l" event={"ID":"d4b4db44-22c2-4597-8353-34af1d55f49a","Type":"ContainerStarted","Data":"815e0d1cbaba30e7ff2f730304eeee7a55ccc5701c1a89a35b58beb2d8ad60c4"}
Dec 01 10:05:57 crc kubenswrapper[4958]: I1201 10:05:57.479541 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6kz6p"]
Dec 01 10:05:57 crc kubenswrapper[4958]: I1201 10:05:57.485947 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6kz6p"
Dec 01 10:05:57 crc kubenswrapper[4958]: I1201 10:05:57.494688 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 01 10:05:57 crc kubenswrapper[4958]: I1201 10:05:57.503354 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6kz6p"]
Dec 01 10:05:57 crc kubenswrapper[4958]: I1201 10:05:57.591640 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7zx4\" (UniqueName: \"kubernetes.io/projected/80b9af95-d772-4612-b667-f59849b393d8-kube-api-access-b7zx4\") pod \"community-operators-6kz6p\" (UID: \"80b9af95-d772-4612-b667-f59849b393d8\") " pod="openshift-marketplace/community-operators-6kz6p"
Dec 01 10:05:57 crc kubenswrapper[4958]: I1201 10:05:57.591729 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80b9af95-d772-4612-b667-f59849b393d8-catalog-content\") pod \"community-operators-6kz6p\" (UID: \"80b9af95-d772-4612-b667-f59849b393d8\") " pod="openshift-marketplace/community-operators-6kz6p"
Dec 01 10:05:57 crc kubenswrapper[4958]: I1201 10:05:57.591796 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80b9af95-d772-4612-b667-f59849b393d8-utilities\") pod \"community-operators-6kz6p\" (UID: \"80b9af95-d772-4612-b667-f59849b393d8\") " pod="openshift-marketplace/community-operators-6kz6p"
Dec 01 10:05:57 crc kubenswrapper[4958]: I1201 10:05:57.693948 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7zx4\" (UniqueName: \"kubernetes.io/projected/80b9af95-d772-4612-b667-f59849b393d8-kube-api-access-b7zx4\") pod \"community-operators-6kz6p\" (UID: \"80b9af95-d772-4612-b667-f59849b393d8\") " pod="openshift-marketplace/community-operators-6kz6p"
Dec 01 10:05:57 crc kubenswrapper[4958]: I1201 10:05:57.694029 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80b9af95-d772-4612-b667-f59849b393d8-catalog-content\") pod \"community-operators-6kz6p\" (UID: \"80b9af95-d772-4612-b667-f59849b393d8\") " pod="openshift-marketplace/community-operators-6kz6p"
Dec 01 10:05:57 crc kubenswrapper[4958]: I1201 10:05:57.694115 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80b9af95-d772-4612-b667-f59849b393d8-utilities\") pod \"community-operators-6kz6p\" (UID: \"80b9af95-d772-4612-b667-f59849b393d8\") " pod="openshift-marketplace/community-operators-6kz6p"
Dec 01 10:05:57 crc kubenswrapper[4958]: I1201 10:05:57.694644 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80b9af95-d772-4612-b667-f59849b393d8-utilities\") pod \"community-operators-6kz6p\" (UID: \"80b9af95-d772-4612-b667-f59849b393d8\") " pod="openshift-marketplace/community-operators-6kz6p"
Dec 01 10:05:57 crc kubenswrapper[4958]: I1201 10:05:57.694698 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80b9af95-d772-4612-b667-f59849b393d8-catalog-content\") pod \"community-operators-6kz6p\" (UID: \"80b9af95-d772-4612-b667-f59849b393d8\") " pod="openshift-marketplace/community-operators-6kz6p"
Dec 01 10:05:57 crc kubenswrapper[4958]: I1201 10:05:57.716765 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7zx4\" (UniqueName: \"kubernetes.io/projected/80b9af95-d772-4612-b667-f59849b393d8-kube-api-access-b7zx4\") pod \"community-operators-6kz6p\" (UID: \"80b9af95-d772-4612-b667-f59849b393d8\") " pod="openshift-marketplace/community-operators-6kz6p"
Dec 01 10:05:57 crc kubenswrapper[4958]: I1201 10:05:57.841619 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6kz6p"
Dec 01 10:05:57 crc kubenswrapper[4958]: I1201 10:05:57.964092 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmn6l" event={"ID":"d4b4db44-22c2-4597-8353-34af1d55f49a","Type":"ContainerStarted","Data":"b6655ebe7fcad8af0e9dc1f4e77b533423b7e796a4a8640cfc7993be8ac0ee1b"}
Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.078998 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dr9h9"]
Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.080747 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dr9h9"
Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.083179 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.099088 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dr9h9"]
Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.206477 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888fe73a-98b3-4b4e-891e-9d75f26c64af-utilities\") pod \"certified-operators-dr9h9\" (UID: \"888fe73a-98b3-4b4e-891e-9d75f26c64af\") " pod="openshift-marketplace/certified-operators-dr9h9"
Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.206565 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888fe73a-98b3-4b4e-891e-9d75f26c64af-catalog-content\") pod \"certified-operators-dr9h9\" (UID: \"888fe73a-98b3-4b4e-891e-9d75f26c64af\") " pod="openshift-marketplace/certified-operators-dr9h9"
Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.206622 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mbkz\" (UniqueName: \"kubernetes.io/projected/888fe73a-98b3-4b4e-891e-9d75f26c64af-kube-api-access-4mbkz\") pod \"certified-operators-dr9h9\" (UID: \"888fe73a-98b3-4b4e-891e-9d75f26c64af\") " pod="openshift-marketplace/certified-operators-dr9h9"
Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.210477 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.210552 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.286125 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6kz6p"]
Dec 01 10:05:58 crc kubenswrapper[4958]: W1201 10:05:58.287901 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80b9af95_d772_4612_b667_f59849b393d8.slice/crio-75589b48570ed2e9f8cc5b14088cb53f90dac47d6a0979f772c5e87fea9fab2d WatchSource:0}: Error finding container 75589b48570ed2e9f8cc5b14088cb53f90dac47d6a0979f772c5e87fea9fab2d: Status 404 returned error can't find the container with id 75589b48570ed2e9f8cc5b14088cb53f90dac47d6a0979f772c5e87fea9fab2d
Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.308943 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888fe73a-98b3-4b4e-891e-9d75f26c64af-utilities\") pod \"certified-operators-dr9h9\" (UID: \"888fe73a-98b3-4b4e-891e-9d75f26c64af\") " pod="openshift-marketplace/certified-operators-dr9h9"
Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.309068 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888fe73a-98b3-4b4e-891e-9d75f26c64af-catalog-content\") pod \"certified-operators-dr9h9\" (UID: \"888fe73a-98b3-4b4e-891e-9d75f26c64af\") " pod="openshift-marketplace/certified-operators-dr9h9"
Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.309148 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mbkz\" (UniqueName: \"kubernetes.io/projected/888fe73a-98b3-4b4e-891e-9d75f26c64af-kube-api-access-4mbkz\") pod \"certified-operators-dr9h9\" (UID: \"888fe73a-98b3-4b4e-891e-9d75f26c64af\") " pod="openshift-marketplace/certified-operators-dr9h9"
Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.310537 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888fe73a-98b3-4b4e-891e-9d75f26c64af-catalog-content\") pod \"certified-operators-dr9h9\" (UID: \"888fe73a-98b3-4b4e-891e-9d75f26c64af\") " pod="openshift-marketplace/certified-operators-dr9h9"
Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.310772 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888fe73a-98b3-4b4e-891e-9d75f26c64af-utilities\") pod \"certified-operators-dr9h9\" (UID: \"888fe73a-98b3-4b4e-891e-9d75f26c64af\") " pod="openshift-marketplace/certified-operators-dr9h9"
Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.341296 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mbkz\" (UniqueName: \"kubernetes.io/projected/888fe73a-98b3-4b4e-891e-9d75f26c64af-kube-api-access-4mbkz\") pod \"certified-operators-dr9h9\" (UID: \"888fe73a-98b3-4b4e-891e-9d75f26c64af\") " pod="openshift-marketplace/certified-operators-dr9h9"
Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.413421 4958 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-dr9h9" Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.860473 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dr9h9"] Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.974439 4958 generic.go:334] "Generic (PLEG): container finished" podID="525654b6-b62b-46d0-88d0-2a94cc23278c" containerID="1f95e54b6f71968c51ee300cdc819206b396f7e11a0430cc88048c03f7c373fb" exitCode=0 Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.975059 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mmqlv" event={"ID":"525654b6-b62b-46d0-88d0-2a94cc23278c","Type":"ContainerDied","Data":"1f95e54b6f71968c51ee300cdc819206b396f7e11a0430cc88048c03f7c373fb"} Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.984304 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dr9h9" event={"ID":"888fe73a-98b3-4b4e-891e-9d75f26c64af","Type":"ContainerStarted","Data":"674dcc573b2475be344fc224e37b1bd089c74c7579320ddf734cd6770280c1a7"} Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.988445 4958 generic.go:334] "Generic (PLEG): container finished" podID="80b9af95-d772-4612-b667-f59849b393d8" containerID="6365746b267e7c2d78d91475add44699020f46938a58dac1772fb522b04b1730" exitCode=0 Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.988552 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kz6p" event={"ID":"80b9af95-d772-4612-b667-f59849b393d8","Type":"ContainerDied","Data":"6365746b267e7c2d78d91475add44699020f46938a58dac1772fb522b04b1730"} Dec 01 10:05:58 crc kubenswrapper[4958]: I1201 10:05:58.988633 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kz6p" event={"ID":"80b9af95-d772-4612-b667-f59849b393d8","Type":"ContainerStarted","Data":"75589b48570ed2e9f8cc5b14088cb53f90dac47d6a0979f772c5e87fea9fab2d"} Dec 01 10:05:59 crc kubenswrapper[4958]: I1201 10:05:59.000884 4958 generic.go:334] "Generic (PLEG): container finished" podID="d4b4db44-22c2-4597-8353-34af1d55f49a" containerID="b6655ebe7fcad8af0e9dc1f4e77b533423b7e796a4a8640cfc7993be8ac0ee1b" exitCode=0 Dec 01 10:05:59 crc kubenswrapper[4958]: I1201 10:05:59.001018 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmn6l" event={"ID":"d4b4db44-22c2-4597-8353-34af1d55f49a","Type":"ContainerDied","Data":"b6655ebe7fcad8af0e9dc1f4e77b533423b7e796a4a8640cfc7993be8ac0ee1b"} Dec 01 10:05:59 crc kubenswrapper[4958]: E1201 10:05:59.120593 4958 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod888fe73a_98b3_4b4e_891e_9d75f26c64af.slice/crio-1bb507e3d2cfbdd6ed5756f7a14567a9c72683cd702420a29f53cdf51ad0abaf.scope\": RecentStats: unable to find data in memory cache]" Dec 01 10:06:00 crc kubenswrapper[4958]: I1201 10:06:00.010091 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kz6p" event={"ID":"80b9af95-d772-4612-b667-f59849b393d8","Type":"ContainerStarted","Data":"24f587cdd9052ac4e48b9b75e4a384d1c232d049a52c5a8e11bf73abaa11c5f4"} Dec 01 10:06:00 crc kubenswrapper[4958]: I1201 10:06:00.014573 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmn6l" 
event={"ID":"d4b4db44-22c2-4597-8353-34af1d55f49a","Type":"ContainerStarted","Data":"23dda11c5bdd6f5013671460531d5744391968c28fd8ff6f67a840cc00b2703b"} Dec 01 10:06:00 crc kubenswrapper[4958]: I1201 10:06:00.019050 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mmqlv" event={"ID":"525654b6-b62b-46d0-88d0-2a94cc23278c","Type":"ContainerStarted","Data":"4465da543d542baf4f8935f0475c7bef574e78e834a9203f3840b139a604c0ce"} Dec 01 10:06:00 crc kubenswrapper[4958]: I1201 10:06:00.021020 4958 generic.go:334] "Generic (PLEG): container finished" podID="888fe73a-98b3-4b4e-891e-9d75f26c64af" containerID="1bb507e3d2cfbdd6ed5756f7a14567a9c72683cd702420a29f53cdf51ad0abaf" exitCode=0 Dec 01 10:06:00 crc kubenswrapper[4958]: I1201 10:06:00.021120 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dr9h9" event={"ID":"888fe73a-98b3-4b4e-891e-9d75f26c64af","Type":"ContainerDied","Data":"1bb507e3d2cfbdd6ed5756f7a14567a9c72683cd702420a29f53cdf51ad0abaf"} Dec 01 10:06:00 crc kubenswrapper[4958]: I1201 10:06:00.073657 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mmqlv" podStartSLOduration=2.367464231 podStartE2EDuration="5.073629969s" podCreationTimestamp="2025-12-01 10:05:55 +0000 UTC" firstStartedPulling="2025-12-01 10:05:56.939381422 +0000 UTC m=+404.448170459" lastFinishedPulling="2025-12-01 10:05:59.64554716 +0000 UTC m=+407.154336197" observedRunningTime="2025-12-01 10:06:00.07051735 +0000 UTC m=+407.579306397" watchObservedRunningTime="2025-12-01 10:06:00.073629969 +0000 UTC m=+407.582419006" Dec 01 10:06:00 crc kubenswrapper[4958]: I1201 10:06:00.119160 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hmn6l" podStartSLOduration=2.487930428 podStartE2EDuration="5.119135833s" podCreationTimestamp="2025-12-01 10:05:55 +0000 UTC" firstStartedPulling="2025-12-01 10:05:56.939827315 +0000 UTC m=+404.448616352" lastFinishedPulling="2025-12-01 10:05:59.57103272 +0000 UTC m=+407.079821757" observedRunningTime="2025-12-01 10:06:00.094563234 +0000 UTC m=+407.603352271" watchObservedRunningTime="2025-12-01 10:06:00.119135833 +0000 UTC m=+407.627924870" Dec 01 10:06:01 crc kubenswrapper[4958]: I1201 10:06:01.029586 4958 generic.go:334] "Generic (PLEG): container finished" podID="80b9af95-d772-4612-b667-f59849b393d8" containerID="24f587cdd9052ac4e48b9b75e4a384d1c232d049a52c5a8e11bf73abaa11c5f4" exitCode=0 Dec 01 10:06:01 crc kubenswrapper[4958]: I1201 10:06:01.029797 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kz6p" event={"ID":"80b9af95-d772-4612-b667-f59849b393d8","Type":"ContainerDied","Data":"24f587cdd9052ac4e48b9b75e4a384d1c232d049a52c5a8e11bf73abaa11c5f4"} Dec 01 10:06:01 crc kubenswrapper[4958]: I1201 10:06:01.034233 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dr9h9" event={"ID":"888fe73a-98b3-4b4e-891e-9d75f26c64af","Type":"ContainerStarted","Data":"f4a1e6d654879980f992c938806d175471e355a4e8e21cf95eede41556300271"} Dec 01 10:06:02 crc kubenswrapper[4958]: I1201 10:06:02.041958 4958 generic.go:334] "Generic (PLEG): container finished" podID="888fe73a-98b3-4b4e-891e-9d75f26c64af" containerID="f4a1e6d654879980f992c938806d175471e355a4e8e21cf95eede41556300271" exitCode=0 Dec 01 10:06:02 crc kubenswrapper[4958]: I1201 10:06:02.042058 4958 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dr9h9" event={"ID":"888fe73a-98b3-4b4e-891e-9d75f26c64af","Type":"ContainerDied","Data":"f4a1e6d654879980f992c938806d175471e355a4e8e21cf95eede41556300271"} Dec 01 10:06:03 crc kubenswrapper[4958]: I1201 10:06:03.053663 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kz6p" event={"ID":"80b9af95-d772-4612-b667-f59849b393d8","Type":"ContainerStarted","Data":"ccf47961f5b80d62b4234e21e74266f87db3b416f3cf829012eac6d48602cf02"} Dec 01 10:06:03 crc kubenswrapper[4958]: I1201 10:06:03.080450 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6kz6p" podStartSLOduration=3.482696996 podStartE2EDuration="6.080425659s" podCreationTimestamp="2025-12-01 10:05:57 +0000 UTC" firstStartedPulling="2025-12-01 10:05:58.990530206 +0000 UTC m=+406.499319243" lastFinishedPulling="2025-12-01 10:06:01.588258869 +0000 UTC m=+409.097047906" observedRunningTime="2025-12-01 10:06:03.075593822 +0000 UTC m=+410.584382869" watchObservedRunningTime="2025-12-01 10:06:03.080425659 +0000 UTC m=+410.589214696" Dec 01 10:06:04 crc kubenswrapper[4958]: I1201 10:06:04.061828 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dr9h9" event={"ID":"888fe73a-98b3-4b4e-891e-9d75f26c64af","Type":"ContainerStarted","Data":"1b7981bf2475439dc21a70b11e151caffc7c74754a3c011118a59607fc3ddae6"} Dec 01 10:06:04 crc kubenswrapper[4958]: I1201 10:06:04.085500 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dr9h9" podStartSLOduration=3.146910622 podStartE2EDuration="6.085468302s" podCreationTimestamp="2025-12-01 10:05:58 +0000 UTC" firstStartedPulling="2025-12-01 10:06:00.022958917 +0000 UTC m=+407.531747954" lastFinishedPulling="2025-12-01 10:06:02.961516597 +0000 UTC m=+410.470305634" observedRunningTime="2025-12-01 10:06:04.08222347 +0000 UTC m=+411.591012507" watchObservedRunningTime="2025-12-01 10:06:04.085468302 +0000 UTC m=+411.594257339" Dec 01 10:06:05 crc kubenswrapper[4958]: I1201 10:06:05.597895 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mmqlv" Dec 01 10:06:05 crc kubenswrapper[4958]: I1201 10:06:05.600046 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mmqlv" Dec 01 10:06:05 crc kubenswrapper[4958]: I1201 10:06:05.641153 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mmqlv" Dec 01 10:06:06 crc kubenswrapper[4958]: I1201 10:06:06.003947 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hmn6l" Dec 01 10:06:06 crc kubenswrapper[4958]: I1201 10:06:06.004112 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hmn6l" Dec 01 10:06:06 crc kubenswrapper[4958]: I1201 10:06:06.048096 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hmn6l" Dec 01 10:06:06 crc kubenswrapper[4958]: I1201 10:06:06.120928 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mmqlv" Dec 01 10:06:06 crc kubenswrapper[4958]: I1201 10:06:06.123463 4958 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hmn6l" Dec 01 10:06:07 crc kubenswrapper[4958]: I1201 10:06:07.841768 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6kz6p" Dec 01 10:06:07 crc kubenswrapper[4958]: I1201 10:06:07.842260 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6kz6p" Dec 01 10:06:07 crc kubenswrapper[4958]: I1201 10:06:07.895198 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6kz6p" Dec 01 10:06:08 crc kubenswrapper[4958]: I1201 10:06:08.139431 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6kz6p" Dec 01 10:06:08 crc kubenswrapper[4958]: I1201 10:06:08.413966 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dr9h9" Dec 01 10:06:08 crc kubenswrapper[4958]: I1201 10:06:08.414046 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dr9h9" Dec 01 10:06:08 crc kubenswrapper[4958]: I1201 10:06:08.457939 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dr9h9" Dec 01 10:06:09 crc kubenswrapper[4958]: I1201 10:06:09.147048 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dr9h9" Dec 01 10:06:14 crc kubenswrapper[4958]: I1201 10:06:14.084187 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-dk9nc" Dec 01 10:06:14 crc kubenswrapper[4958]: I1201 10:06:14.147858 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kwjfj"] Dec 01 10:06:28 crc kubenswrapper[4958]: I1201 10:06:28.210759 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:06:28 crc kubenswrapper[4958]: I1201 10:06:28.211794 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:06:28 crc kubenswrapper[4958]: I1201 10:06:28.211873 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 10:06:28 crc kubenswrapper[4958]: I1201 10:06:28.214116 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"05ba5424dacdb14134224ea91b05efb8f315f1c84935a65ba19d25bc0a734c9b"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:06:28 crc kubenswrapper[4958]: I1201 10:06:28.214249 4958 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://05ba5424dacdb14134224ea91b05efb8f315f1c84935a65ba19d25bc0a734c9b" gracePeriod=600 Dec 01 10:06:29 crc kubenswrapper[4958]: I1201 10:06:29.210932 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="05ba5424dacdb14134224ea91b05efb8f315f1c84935a65ba19d25bc0a734c9b" exitCode=0 Dec 01 10:06:29 crc kubenswrapper[4958]: I1201 10:06:29.211048 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"05ba5424dacdb14134224ea91b05efb8f315f1c84935a65ba19d25bc0a734c9b"} Dec 01 10:06:29 crc kubenswrapper[4958]: I1201 10:06:29.211367 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"bc88794b242cce63fbf1563e75610218b5fad774d5c91eba0fcffff250c9df1f"} Dec 01 10:06:29 crc kubenswrapper[4958]: I1201 10:06:29.211398 4958 scope.go:117] "RemoveContainer" containerID="2c744aaf9578ec77e74e8c51ce8af38bbde6c63fc7731c1be532d3869d57b214" Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.191403 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" podUID="4fe0fe33-c2ce-4622-b6f4-3f587718b006" containerName="registry" containerID="cri-o://634b5a2a302180e81e8740699930bde121939e075bd7a020b0febc1c499e9593" gracePeriod=30 Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.567761 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.666638 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22zjt\" (UniqueName: \"kubernetes.io/projected/4fe0fe33-c2ce-4622-b6f4-3f587718b006-kube-api-access-22zjt\") pod \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.666707 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fe0fe33-c2ce-4622-b6f4-3f587718b006-trusted-ca\") pod \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.666748 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fe0fe33-c2ce-4622-b6f4-3f587718b006-bound-sa-token\") pod \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.666777 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fe0fe33-c2ce-4622-b6f4-3f587718b006-registry-tls\") pod \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.666819 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4fe0fe33-c2ce-4622-b6f4-3f587718b006-registry-certificates\") pod \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.666874 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4fe0fe33-c2ce-4622-b6f4-3f587718b006-ca-trust-extracted\") pod \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.666922 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4fe0fe33-c2ce-4622-b6f4-3f587718b006-installation-pull-secrets\") pod \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.667071 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\" (UID: \"4fe0fe33-c2ce-4622-b6f4-3f587718b006\") " Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.668476 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fe0fe33-c2ce-4622-b6f4-3f587718b006-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4fe0fe33-c2ce-4622-b6f4-3f587718b006" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.668518 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fe0fe33-c2ce-4622-b6f4-3f587718b006-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4fe0fe33-c2ce-4622-b6f4-3f587718b006" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.674581 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe0fe33-c2ce-4622-b6f4-3f587718b006-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4fe0fe33-c2ce-4622-b6f4-3f587718b006" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.681834 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "4fe0fe33-c2ce-4622-b6f4-3f587718b006" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.682214 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fe0fe33-c2ce-4622-b6f4-3f587718b006-kube-api-access-22zjt" (OuterVolumeSpecName: "kube-api-access-22zjt") pod "4fe0fe33-c2ce-4622-b6f4-3f587718b006" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006"). InnerVolumeSpecName "kube-api-access-22zjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.682531 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fe0fe33-c2ce-4622-b6f4-3f587718b006-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4fe0fe33-c2ce-4622-b6f4-3f587718b006" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.683216 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fe0fe33-c2ce-4622-b6f4-3f587718b006-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4fe0fe33-c2ce-4622-b6f4-3f587718b006" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.690806 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fe0fe33-c2ce-4622-b6f4-3f587718b006-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4fe0fe33-c2ce-4622-b6f4-3f587718b006" (UID: "4fe0fe33-c2ce-4622-b6f4-3f587718b006"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.768749 4958 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4fe0fe33-c2ce-4622-b6f4-3f587718b006-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.768805 4958 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4fe0fe33-c2ce-4622-b6f4-3f587718b006-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.768815 4958 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4fe0fe33-c2ce-4622-b6f4-3f587718b006-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.768825 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22zjt\" (UniqueName: \"kubernetes.io/projected/4fe0fe33-c2ce-4622-b6f4-3f587718b006-kube-api-access-22zjt\") on node \"crc\" DevicePath \"\"" Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.768857 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fe0fe33-c2ce-4622-b6f4-3f587718b006-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.768867 4958 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fe0fe33-c2ce-4622-b6f4-3f587718b006-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 10:06:39 crc kubenswrapper[4958]: I1201 10:06:39.768876 4958 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fe0fe33-c2ce-4622-b6f4-3f587718b006-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:06:40 crc kubenswrapper[4958]: I1201 10:06:40.281069 4958 generic.go:334] "Generic (PLEG): container finished" podID="4fe0fe33-c2ce-4622-b6f4-3f587718b006" containerID="634b5a2a302180e81e8740699930bde121939e075bd7a020b0febc1c499e9593" exitCode=0 Dec 01 10:06:40 crc kubenswrapper[4958]: I1201 10:06:40.281184 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" Dec 01 10:06:40 crc kubenswrapper[4958]: I1201 10:06:40.281206 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" event={"ID":"4fe0fe33-c2ce-4622-b6f4-3f587718b006","Type":"ContainerDied","Data":"634b5a2a302180e81e8740699930bde121939e075bd7a020b0febc1c499e9593"} Dec 01 10:06:40 crc kubenswrapper[4958]: I1201 10:06:40.282919 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kwjfj" event={"ID":"4fe0fe33-c2ce-4622-b6f4-3f587718b006","Type":"ContainerDied","Data":"d0426679f2990325b31969797ca22d440a92bbf788e864992e2231ed29a65081"} Dec 01 10:06:40 crc kubenswrapper[4958]: I1201 10:06:40.282951 4958 scope.go:117] "RemoveContainer" containerID="634b5a2a302180e81e8740699930bde121939e075bd7a020b0febc1c499e9593" Dec 01 10:06:40 crc kubenswrapper[4958]: I1201 10:06:40.310221 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kwjfj"] Dec 01 10:06:40 crc kubenswrapper[4958]: I1201 10:06:40.312705 4958 scope.go:117] "RemoveContainer" containerID="634b5a2a302180e81e8740699930bde121939e075bd7a020b0febc1c499e9593" Dec 01 10:06:40 crc kubenswrapper[4958]: E1201 10:06:40.313381 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"634b5a2a302180e81e8740699930bde121939e075bd7a020b0febc1c499e9593\": container with ID starting with 634b5a2a302180e81e8740699930bde121939e075bd7a020b0febc1c499e9593 not found: ID does not exist" containerID="634b5a2a302180e81e8740699930bde121939e075bd7a020b0febc1c499e9593" Dec 01 10:06:40 crc kubenswrapper[4958]: I1201 10:06:40.313440 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"634b5a2a302180e81e8740699930bde121939e075bd7a020b0febc1c499e9593"} err="failed to get container status \"634b5a2a302180e81e8740699930bde121939e075bd7a020b0febc1c499e9593\": rpc error: code = NotFound desc = could not find container \"634b5a2a302180e81e8740699930bde121939e075bd7a020b0febc1c499e9593\": container with ID starting with 634b5a2a302180e81e8740699930bde121939e075bd7a020b0febc1c499e9593 not found: ID does not exist" Dec 01 10:06:40 crc kubenswrapper[4958]: I1201 10:06:40.313989 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kwjfj"] Dec 01 10:06:41 crc kubenswrapper[4958]: I1201 10:06:41.803753 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fe0fe33-c2ce-4622-b6f4-3f587718b006" path="/var/lib/kubelet/pods/4fe0fe33-c2ce-4622-b6f4-3f587718b006/volumes" Dec 01 10:08:28 crc kubenswrapper[4958]: I1201 10:08:28.210668 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:08:28 crc kubenswrapper[4958]: I1201 10:08:28.211633 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:08:58 crc 
kubenswrapper[4958]: I1201 10:08:58.211090 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:08:58 crc kubenswrapper[4958]: I1201 10:08:58.211805 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:09:28 crc kubenswrapper[4958]: I1201 10:09:28.210289 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:09:28 crc kubenswrapper[4958]: I1201 10:09:28.211014 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:09:28 crc kubenswrapper[4958]: I1201 10:09:28.211103 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 10:09:28 crc kubenswrapper[4958]: I1201 10:09:28.211989 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bc88794b242cce63fbf1563e75610218b5fad774d5c91eba0fcffff250c9df1f"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:09:28 crc kubenswrapper[4958]: I1201 10:09:28.212049 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://bc88794b242cce63fbf1563e75610218b5fad774d5c91eba0fcffff250c9df1f" gracePeriod=600 Dec 01 10:09:28 crc kubenswrapper[4958]: I1201 10:09:28.620865 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="bc88794b242cce63fbf1563e75610218b5fad774d5c91eba0fcffff250c9df1f" exitCode=0 Dec 01 10:09:28 crc kubenswrapper[4958]: I1201 10:09:28.620878 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"bc88794b242cce63fbf1563e75610218b5fad774d5c91eba0fcffff250c9df1f"} Dec 01 10:09:28 crc kubenswrapper[4958]: I1201 10:09:28.621300 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"cdfd5b8d3cf87b66034ae0097ea85e9ffd561bfa11131d8fe9310abc382db18d"} Dec 01 10:09:28 crc kubenswrapper[4958]: I1201 10:09:28.621323 4958 scope.go:117] "RemoveContainer" 
containerID="05ba5424dacdb14134224ea91b05efb8f315f1c84935a65ba19d25bc0a734c9b" Dec 01 10:11:28 crc kubenswrapper[4958]: I1201 10:11:28.211443 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:11:28 crc kubenswrapper[4958]: I1201 10:11:28.212203 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:11:58 crc kubenswrapper[4958]: I1201 10:11:58.210585 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:11:58 crc kubenswrapper[4958]: I1201 10:11:58.211341 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:12:01 crc kubenswrapper[4958]: I1201 10:12:01.694109 4958 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 10:12:28 crc kubenswrapper[4958]: I1201 10:12:28.211068 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:12:28 crc kubenswrapper[4958]: I1201 10:12:28.211791 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:12:28 crc kubenswrapper[4958]: I1201 10:12:28.211916 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 10:12:28 crc kubenswrapper[4958]: I1201 10:12:28.212648 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cdfd5b8d3cf87b66034ae0097ea85e9ffd561bfa11131d8fe9310abc382db18d"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:12:28 crc kubenswrapper[4958]: I1201 10:12:28.212727 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://cdfd5b8d3cf87b66034ae0097ea85e9ffd561bfa11131d8fe9310abc382db18d" gracePeriod=600 Dec 01 
10:12:28 crc kubenswrapper[4958]: I1201 10:12:28.773619 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="cdfd5b8d3cf87b66034ae0097ea85e9ffd561bfa11131d8fe9310abc382db18d" exitCode=0 Dec 01 10:12:28 crc kubenswrapper[4958]: I1201 10:12:28.773681 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"cdfd5b8d3cf87b66034ae0097ea85e9ffd561bfa11131d8fe9310abc382db18d"} Dec 01 10:12:28 crc kubenswrapper[4958]: I1201 10:12:28.774521 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"5883cf8aac81784159b4994d61b5520f0d5f082b5d1aeaaa67bc8390bfe65ba5"} Dec 01 10:12:28 crc kubenswrapper[4958]: I1201 10:12:28.774565 4958 scope.go:117] "RemoveContainer" containerID="bc88794b242cce63fbf1563e75610218b5fad774d5c91eba0fcffff250c9df1f" Dec 01 10:14:28 crc kubenswrapper[4958]: I1201 10:14:28.211059 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:14:28 crc kubenswrapper[4958]: I1201 10:14:28.211807 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:14:36 crc kubenswrapper[4958]: I1201 10:14:36.476000 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-v7749"] Dec 01 10:14:36 crc kubenswrapper[4958]: E1201 10:14:36.476989 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe0fe33-c2ce-4622-b6f4-3f587718b006" containerName="registry" Dec 01 10:14:36 crc kubenswrapper[4958]: I1201 10:14:36.477006 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe0fe33-c2ce-4622-b6f4-3f587718b006" containerName="registry" Dec 01 10:14:36 crc kubenswrapper[4958]: I1201 10:14:36.477125 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fe0fe33-c2ce-4622-b6f4-3f587718b006" containerName="registry" Dec 01 10:14:36 crc kubenswrapper[4958]: I1201 10:14:36.477658 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-v7749" Dec 01 10:14:36 crc kubenswrapper[4958]: I1201 10:14:36.480799 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 01 10:14:36 crc kubenswrapper[4958]: I1201 10:14:36.480805 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 01 10:14:36 crc kubenswrapper[4958]: I1201 10:14:36.481274 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 01 10:14:36 crc kubenswrapper[4958]: I1201 10:14:36.488157 4958 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-mnz7h" Dec 01 10:14:36 crc kubenswrapper[4958]: I1201 10:14:36.489103 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-v7749"] Dec 01 10:14:36 crc kubenswrapper[4958]: I1201 10:14:36.672109 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckc62\" (UniqueName: \"kubernetes.io/projected/24715a95-e6a7-4208-8bff-92a10e50cb9e-kube-api-access-ckc62\") pod \"crc-storage-crc-v7749\" (UID: \"24715a95-e6a7-4208-8bff-92a10e50cb9e\") " pod="crc-storage/crc-storage-crc-v7749" Dec 01 10:14:36 crc kubenswrapper[4958]: I1201 10:14:36.672200 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/24715a95-e6a7-4208-8bff-92a10e50cb9e-crc-storage\") pod \"crc-storage-crc-v7749\" (UID: \"24715a95-e6a7-4208-8bff-92a10e50cb9e\") " pod="crc-storage/crc-storage-crc-v7749" Dec 01 10:14:36 crc kubenswrapper[4958]: I1201 10:14:36.672507 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/24715a95-e6a7-4208-8bff-92a10e50cb9e-node-mnt\") pod \"crc-storage-crc-v7749\" (UID: \"24715a95-e6a7-4208-8bff-92a10e50cb9e\") " pod="crc-storage/crc-storage-crc-v7749" Dec 01 10:14:36 crc kubenswrapper[4958]: I1201 10:14:36.773621 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/24715a95-e6a7-4208-8bff-92a10e50cb9e-crc-storage\") pod \"crc-storage-crc-v7749\" (UID: \"24715a95-e6a7-4208-8bff-92a10e50cb9e\") " pod="crc-storage/crc-storage-crc-v7749" Dec 01 10:14:36 crc kubenswrapper[4958]: I1201 10:14:36.773871 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/24715a95-e6a7-4208-8bff-92a10e50cb9e-node-mnt\") pod \"crc-storage-crc-v7749\" (UID: \"24715a95-e6a7-4208-8bff-92a10e50cb9e\") " pod="crc-storage/crc-storage-crc-v7749" Dec 01 10:14:36 crc kubenswrapper[4958]: I1201 10:14:36.773928 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckc62\" (UniqueName: \"kubernetes.io/projected/24715a95-e6a7-4208-8bff-92a10e50cb9e-kube-api-access-ckc62\") pod \"crc-storage-crc-v7749\" (UID: \"24715a95-e6a7-4208-8bff-92a10e50cb9e\") " pod="crc-storage/crc-storage-crc-v7749" Dec 01 10:14:36 crc kubenswrapper[4958]: I1201 10:14:36.774377 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/24715a95-e6a7-4208-8bff-92a10e50cb9e-node-mnt\") pod \"crc-storage-crc-v7749\" (UID: \"24715a95-e6a7-4208-8bff-92a10e50cb9e\") " 
pod="crc-storage/crc-storage-crc-v7749" Dec 01 10:14:36 crc kubenswrapper[4958]: I1201 10:14:36.774902 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/24715a95-e6a7-4208-8bff-92a10e50cb9e-crc-storage\") pod \"crc-storage-crc-v7749\" (UID: \"24715a95-e6a7-4208-8bff-92a10e50cb9e\") " pod="crc-storage/crc-storage-crc-v7749" Dec 01 10:14:36 crc kubenswrapper[4958]: I1201 10:14:36.797488 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckc62\" (UniqueName: \"kubernetes.io/projected/24715a95-e6a7-4208-8bff-92a10e50cb9e-kube-api-access-ckc62\") pod \"crc-storage-crc-v7749\" (UID: \"24715a95-e6a7-4208-8bff-92a10e50cb9e\") " pod="crc-storage/crc-storage-crc-v7749" Dec 01 10:14:37 crc kubenswrapper[4958]: I1201 10:14:37.098466 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-v7749" Dec 01 10:14:37 crc kubenswrapper[4958]: I1201 10:14:37.373187 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-v7749"] Dec 01 10:14:37 crc kubenswrapper[4958]: I1201 10:14:37.384280 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:14:37 crc kubenswrapper[4958]: I1201 10:14:37.550749 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-v7749" event={"ID":"24715a95-e6a7-4208-8bff-92a10e50cb9e","Type":"ContainerStarted","Data":"c87a28f1c0fb1c429285d7278dd24e745662965d4586a8c482a84b330021f36f"} Dec 01 10:14:39 crc kubenswrapper[4958]: I1201 10:14:39.567088 4958 generic.go:334] "Generic (PLEG): container finished" podID="24715a95-e6a7-4208-8bff-92a10e50cb9e" containerID="f2e9ee82e9f79caa3babbdd1502de2961277e032f8f0ecb08906a21621608a8e" exitCode=0 Dec 01 10:14:39 crc kubenswrapper[4958]: I1201 10:14:39.567199 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-v7749" event={"ID":"24715a95-e6a7-4208-8bff-92a10e50cb9e","Type":"ContainerDied","Data":"f2e9ee82e9f79caa3babbdd1502de2961277e032f8f0ecb08906a21621608a8e"} Dec 01 10:14:40 crc kubenswrapper[4958]: I1201 10:14:40.799590 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-v7749" Dec 01 10:14:40 crc kubenswrapper[4958]: I1201 10:14:40.936876 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/24715a95-e6a7-4208-8bff-92a10e50cb9e-node-mnt\") pod \"24715a95-e6a7-4208-8bff-92a10e50cb9e\" (UID: \"24715a95-e6a7-4208-8bff-92a10e50cb9e\") " Dec 01 10:14:40 crc kubenswrapper[4958]: I1201 10:14:40.936960 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckc62\" (UniqueName: \"kubernetes.io/projected/24715a95-e6a7-4208-8bff-92a10e50cb9e-kube-api-access-ckc62\") pod \"24715a95-e6a7-4208-8bff-92a10e50cb9e\" (UID: \"24715a95-e6a7-4208-8bff-92a10e50cb9e\") " Dec 01 10:14:40 crc kubenswrapper[4958]: I1201 10:14:40.937028 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/24715a95-e6a7-4208-8bff-92a10e50cb9e-crc-storage\") pod \"24715a95-e6a7-4208-8bff-92a10e50cb9e\" (UID: \"24715a95-e6a7-4208-8bff-92a10e50cb9e\") " Dec 01 10:14:40 crc kubenswrapper[4958]: I1201 10:14:40.937026 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24715a95-e6a7-4208-8bff-92a10e50cb9e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "24715a95-e6a7-4208-8bff-92a10e50cb9e" (UID: "24715a95-e6a7-4208-8bff-92a10e50cb9e"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:14:40 crc kubenswrapper[4958]: I1201 10:14:40.937424 4958 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/24715a95-e6a7-4208-8bff-92a10e50cb9e-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:40 crc kubenswrapper[4958]: I1201 10:14:40.943412 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24715a95-e6a7-4208-8bff-92a10e50cb9e-kube-api-access-ckc62" (OuterVolumeSpecName: "kube-api-access-ckc62") pod "24715a95-e6a7-4208-8bff-92a10e50cb9e" (UID: "24715a95-e6a7-4208-8bff-92a10e50cb9e"). InnerVolumeSpecName "kube-api-access-ckc62". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:14:40 crc kubenswrapper[4958]: I1201 10:14:40.956229 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24715a95-e6a7-4208-8bff-92a10e50cb9e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "24715a95-e6a7-4208-8bff-92a10e50cb9e" (UID: "24715a95-e6a7-4208-8bff-92a10e50cb9e"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:14:41 crc kubenswrapper[4958]: I1201 10:14:41.039193 4958 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/24715a95-e6a7-4208-8bff-92a10e50cb9e-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:41 crc kubenswrapper[4958]: I1201 10:14:41.039246 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckc62\" (UniqueName: \"kubernetes.io/projected/24715a95-e6a7-4208-8bff-92a10e50cb9e-kube-api-access-ckc62\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:41 crc kubenswrapper[4958]: I1201 10:14:41.583057 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-v7749" event={"ID":"24715a95-e6a7-4208-8bff-92a10e50cb9e","Type":"ContainerDied","Data":"c87a28f1c0fb1c429285d7278dd24e745662965d4586a8c482a84b330021f36f"} Dec 01 10:14:41 crc kubenswrapper[4958]: I1201 10:14:41.583129 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-v7749" Dec 01 10:14:41 crc kubenswrapper[4958]: I1201 10:14:41.583139 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c87a28f1c0fb1c429285d7278dd24e745662965d4586a8c482a84b330021f36f" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.403795 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-976fz"] Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.405006 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovn-controller" containerID="cri-o://6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6" gracePeriod=30 Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.405304 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="sbdb" containerID="cri-o://c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819" gracePeriod=30 Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.405334 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="northd" containerID="cri-o://a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40" gracePeriod=30 Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.405390 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="kube-rbac-proxy-node" containerID="cri-o://28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218" gracePeriod=30 Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.405445 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovn-acl-logging" containerID="cri-o://1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744" gracePeriod=30 Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.405429 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" 
containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b" gracePeriod=30 Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.405420 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="nbdb" containerID="cri-o://ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592" gracePeriod=30 Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.475739 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovnkube-controller" containerID="cri-o://8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435" gracePeriod=30 Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.597381 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7z6wb_46276a58-9607-4a8a-bcfc-ca41ab441ec2/kube-multus/2.log" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.597755 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7z6wb_46276a58-9607-4a8a-bcfc-ca41ab441ec2/kube-multus/1.log" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.597793 4958 generic.go:334] "Generic (PLEG): container finished" podID="46276a58-9607-4a8a-bcfc-ca41ab441ec2" containerID="d698e3b7f37fc411ed5e6e7830ba2b95824e3ac78e832cb3a588b63feb7b4f6c" exitCode=2 Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.597877 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7z6wb" event={"ID":"46276a58-9607-4a8a-bcfc-ca41ab441ec2","Type":"ContainerDied","Data":"d698e3b7f37fc411ed5e6e7830ba2b95824e3ac78e832cb3a588b63feb7b4f6c"} Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.597925 4958 scope.go:117] "RemoveContainer" containerID="a6b4bf86c08b1005a58bbd3b9ef7e02e72d8650e4b71d869e9db857ac91dd20e" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.598735 4958 scope.go:117] "RemoveContainer" containerID="d698e3b7f37fc411ed5e6e7830ba2b95824e3ac78e832cb3a588b63feb7b4f6c" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.605467 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovnkube-controller/3.log" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.608054 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovn-acl-logging/0.log" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.608687 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovn-controller/0.log" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.609158 4958 generic.go:334] "Generic (PLEG): container finished" podID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerID="62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b" exitCode=0 Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.609202 4958 generic.go:334] "Generic (PLEG): container finished" podID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerID="28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218" exitCode=0 Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.609216 4958 generic.go:334] "Generic (PLEG): container finished" 
podID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerID="1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744" exitCode=143 Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.609231 4958 generic.go:334] "Generic (PLEG): container finished" podID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerID="6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6" exitCode=143 Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.609261 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerDied","Data":"62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b"} Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.609304 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerDied","Data":"28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218"} Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.609319 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerDied","Data":"1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744"} Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.609334 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerDied","Data":"6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6"} Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.770057 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovnkube-controller/3.log" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.773057 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovn-acl-logging/0.log" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.773772 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovn-controller/0.log" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.774312 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.842606 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vd762"] Dec 01 10:14:43 crc kubenswrapper[4958]: E1201 10:14:43.842906 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovn-acl-logging" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.842925 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovn-acl-logging" Dec 01 10:14:43 crc kubenswrapper[4958]: E1201 10:14:43.842937 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="kube-rbac-proxy-node" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.842946 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="kube-rbac-proxy-node" Dec 01 10:14:43 crc kubenswrapper[4958]: E1201 10:14:43.842961 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="kubecfg-setup" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.842969 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="kubecfg-setup" Dec 01 10:14:43 crc kubenswrapper[4958]: E1201 10:14:43.842981 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="northd" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.842992 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="northd" Dec 01 10:14:43 crc kubenswrapper[4958]: E1201 10:14:43.843009 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="nbdb" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843019 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="nbdb" Dec 01 10:14:43 crc kubenswrapper[4958]: E1201 10:14:43.843031 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovnkube-controller" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843040 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovnkube-controller" Dec 01 10:14:43 crc kubenswrapper[4958]: E1201 10:14:43.843049 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovnkube-controller" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843057 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovnkube-controller" Dec 01 10:14:43 crc kubenswrapper[4958]: E1201 10:14:43.843069 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovnkube-controller" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843078 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovnkube-controller" Dec 01 10:14:43 crc kubenswrapper[4958]: E1201 10:14:43.843090 4958 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843097 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 10:14:43 crc kubenswrapper[4958]: E1201 10:14:43.843109 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovn-controller" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843117 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovn-controller" Dec 01 10:14:43 crc kubenswrapper[4958]: E1201 10:14:43.843128 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24715a95-e6a7-4208-8bff-92a10e50cb9e" containerName="storage" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843135 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="24715a95-e6a7-4208-8bff-92a10e50cb9e" containerName="storage" Dec 01 10:14:43 crc kubenswrapper[4958]: E1201 10:14:43.843145 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="sbdb" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843153 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="sbdb" Dec 01 10:14:43 crc kubenswrapper[4958]: E1201 10:14:43.843163 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovnkube-controller" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843171 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovnkube-controller" Dec 01 10:14:43 crc kubenswrapper[4958]: E1201 10:14:43.843182 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovnkube-controller" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843192 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovnkube-controller" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843333 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="sbdb" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843351 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="24715a95-e6a7-4208-8bff-92a10e50cb9e" containerName="storage" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843361 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovnkube-controller" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843373 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovnkube-controller" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843382 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="nbdb" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843393 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="northd" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843400 4958 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovn-controller" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843414 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="kube-rbac-proxy-node" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843424 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovn-acl-logging" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843433 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843444 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovnkube-controller" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843691 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovnkube-controller" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.843706 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerName="ovnkube-controller" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.845903 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.889261 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/96173cf0-4be1-4ef7-b063-4c93c1731c20-ovnkube-config\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.889715 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-kubelet\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.889867 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.889901 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96173cf0-4be1-4ef7-b063-4c93c1731c20-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.890051 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-run-netns\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.890135 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.908015 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-run-systemd\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.908124 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-var-lib-cni-networks-ovn-kubernetes\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.908179 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/96173cf0-4be1-4ef7-b063-4c93c1731c20-ovnkube-script-lib\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.908215 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-etc-openvswitch\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.908253 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-run-ovn-kubernetes\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.908303 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-node-log\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.908330 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpxpt\" (UniqueName: \"kubernetes.io/projected/96173cf0-4be1-4ef7-b063-4c93c1731c20-kube-api-access-rpxpt\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.908353 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/96173cf0-4be1-4ef7-b063-4c93c1731c20-env-overrides\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.908409 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-cni-netd\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.908438 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-slash\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.908482 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-run-ovn\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.908503 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-run-openvswitch\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.908536 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-cni-bin\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.908584 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/96173cf0-4be1-4ef7-b063-4c93c1731c20-ovn-node-metrics-cert\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.908628 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-log-socket\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.908655 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-var-lib-openvswitch\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.908684 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-systemd-units\") pod \"96173cf0-4be1-4ef7-b063-4c93c1731c20\" (UID: \"96173cf0-4be1-4ef7-b063-4c93c1731c20\") " Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.908967 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-run-ovn\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.909036 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-node-log\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.909070 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-slash\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.909128 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-systemd-units\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.909174 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-ovn-node-metrics-cert\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.909225 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-kubelet\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.909239 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.909255 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-env-overrides\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.909294 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.909340 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.910734 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96173cf0-4be1-4ef7-b063-4c93c1731c20-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.910779 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.910807 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.910834 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-node-log" (OuterVolumeSpecName: "node-log") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.911539 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96173cf0-4be1-4ef7-b063-4c93c1731c20-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.911653 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.911752 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-slash" (OuterVolumeSpecName: "host-slash") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.911864 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.909299 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-log-socket\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.912093 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-var-lib-openvswitch\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.912211 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-run-systemd\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.912309 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-run-openvswitch\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.912508 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-ovnkube-script-lib\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.912630 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-cni-bin\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.912722 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.913124 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-etc-openvswitch\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.916541 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-run-ovn-kubernetes\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.917045 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-run-netns\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.917174 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb695\" (UniqueName: \"kubernetes.io/projected/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-kube-api-access-vb695\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.917280 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-cni-netd\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.917382 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-ovnkube-config\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.917573 4958 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/96173cf0-4be1-4ef7-b063-4c93c1731c20-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.917669 4958 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.917736 4958 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.917802 4958 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.917889 4958 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/96173cf0-4be1-4ef7-b063-4c93c1731c20-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.917966 4958 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.918030 4958 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.918104 4958 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-node-log\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.918168 4958 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/96173cf0-4be1-4ef7-b063-4c93c1731c20-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.918230 4958 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.918291 4958 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-slash\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.918356 4958 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.918423 4958 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.918492 4958 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.917035 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-log-socket" (OuterVolumeSpecName: "log-socket") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.917082 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.917109 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.952416 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.954237 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96173cf0-4be1-4ef7-b063-4c93c1731c20-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:14:43 crc kubenswrapper[4958]: I1201 10:14:43.957426 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96173cf0-4be1-4ef7-b063-4c93c1731c20-kube-api-access-rpxpt" (OuterVolumeSpecName: "kube-api-access-rpxpt") pod "96173cf0-4be1-4ef7-b063-4c93c1731c20" (UID: "96173cf0-4be1-4ef7-b063-4c93c1731c20"). InnerVolumeSpecName "kube-api-access-rpxpt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020446 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-systemd-units\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020529 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-ovn-node-metrics-cert\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020557 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-kubelet\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020596 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-env-overrides\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020630 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-log-socket\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020668 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-var-lib-openvswitch\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020663 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-systemd-units\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020699 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-run-systemd\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020728 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-run-openvswitch\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 
10:14:44.020771 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-ovnkube-script-lib\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020779 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-kubelet\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020809 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-cni-bin\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020839 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020893 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-etc-openvswitch\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020899 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-run-openvswitch\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020928 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-run-ovn-kubernetes\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020937 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-var-lib-openvswitch\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020983 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-run-ovn-kubernetes\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020990 4958 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-run-netns\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021017 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb695\" (UniqueName: \"kubernetes.io/projected/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-kube-api-access-vb695\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021070 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-cni-netd\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021107 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-ovnkube-config\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021153 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-run-ovn\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021242 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-node-log\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021294 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-slash\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021455 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-cni-bin\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021520 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-run-systemd\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021494 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-slash\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021467 4958 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021593 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpxpt\" (UniqueName: \"kubernetes.io/projected/96173cf0-4be1-4ef7-b063-4c93c1731c20-kube-api-access-rpxpt\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021610 4958 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/96173cf0-4be1-4ef7-b063-4c93c1731c20-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021613 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-env-overrides\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021622 4958 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-log-socket\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021690 4958 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021708 4958 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/96173cf0-4be1-4ef7-b063-4c93c1731c20-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.020770 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-log-socket\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021771 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021801 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-run-ovn\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021828 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-run-netns\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021961 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-host-cni-netd\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.022008 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-node-log\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021992 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-ovnkube-script-lib\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.021966 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-etc-openvswitch\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.022594 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-ovnkube-config\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.026268 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-ovn-node-metrics-cert\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.041577 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb695\" (UniqueName: \"kubernetes.io/projected/7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3-kube-api-access-vb695\") pod \"ovnkube-node-vd762\" (UID: \"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3\") " pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.161952 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:44 crc kubenswrapper[4958]: W1201 10:14:44.198105 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a48b3be_e2fd_4949_a366_2a4aa3fa9cf3.slice/crio-269b6a1134595136034eba04165101ee35f4c92c1e0b58f07b0816b6e7e7ef3e WatchSource:0}: Error finding container 269b6a1134595136034eba04165101ee35f4c92c1e0b58f07b0816b6e7e7ef3e: Status 404 returned error can't find the container with id 269b6a1134595136034eba04165101ee35f4c92c1e0b58f07b0816b6e7e7ef3e Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.617823 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7z6wb_46276a58-9607-4a8a-bcfc-ca41ab441ec2/kube-multus/2.log" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.618321 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7z6wb" event={"ID":"46276a58-9607-4a8a-bcfc-ca41ab441ec2","Type":"ContainerStarted","Data":"4673e010aae10edaf6f3c9b54109ece22c7e388015e5ab2880610ceccb8e8992"} Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.622120 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovnkube-controller/3.log" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.626058 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovn-acl-logging/0.log" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.626786 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-976fz_96173cf0-4be1-4ef7-b063-4c93c1731c20/ovn-controller/0.log" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.627229 4958 generic.go:334] "Generic (PLEG): container finished" podID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerID="8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435" exitCode=0 Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.627258 4958 generic.go:334] "Generic (PLEG): container finished" podID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerID="c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819" exitCode=0 Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.627266 4958 generic.go:334] "Generic (PLEG): container finished" podID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerID="ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592" exitCode=0 Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.627273 4958 generic.go:334] "Generic (PLEG): container finished" podID="96173cf0-4be1-4ef7-b063-4c93c1731c20" containerID="a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40" exitCode=0 Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.627263 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerDied","Data":"8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435"} Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.627319 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerDied","Data":"c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819"} Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.627340 4958 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerDied","Data":"ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592"} Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.627340 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.627355 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerDied","Data":"a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40"} Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.627372 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-976fz" event={"ID":"96173cf0-4be1-4ef7-b063-4c93c1731c20","Type":"ContainerDied","Data":"e7ceaec1b97d30313c5fb972e729c451f24391d68e54738ccf7806c7abf1c830"} Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.627383 4958 scope.go:117] "RemoveContainer" containerID="8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.629086 4958 generic.go:334] "Generic (PLEG): container finished" podID="7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3" containerID="97fc14f0f7298c7bf7d38b69a65d80e7aa24d717a7e9b1a49cf9436d517dba6c" exitCode=0 Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.629104 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" event={"ID":"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3","Type":"ContainerDied","Data":"97fc14f0f7298c7bf7d38b69a65d80e7aa24d717a7e9b1a49cf9436d517dba6c"} Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.629144 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" event={"ID":"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3","Type":"ContainerStarted","Data":"269b6a1134595136034eba04165101ee35f4c92c1e0b58f07b0816b6e7e7ef3e"} Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.661095 4958 scope.go:117] "RemoveContainer" containerID="ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.690690 4958 scope.go:117] "RemoveContainer" containerID="c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.711758 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-976fz"] Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.716598 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-976fz"] Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.719547 4958 scope.go:117] "RemoveContainer" containerID="ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.743238 4958 scope.go:117] "RemoveContainer" containerID="a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.776339 4958 scope.go:117] "RemoveContainer" containerID="62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.795422 4958 scope.go:117] "RemoveContainer" 
containerID="28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.813224 4958 scope.go:117] "RemoveContainer" containerID="1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.835936 4958 scope.go:117] "RemoveContainer" containerID="6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.865823 4958 scope.go:117] "RemoveContainer" containerID="d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.882896 4958 scope.go:117] "RemoveContainer" containerID="8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435" Dec 01 10:14:44 crc kubenswrapper[4958]: E1201 10:14:44.883485 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435\": container with ID starting with 8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435 not found: ID does not exist" containerID="8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.883561 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435"} err="failed to get container status \"8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435\": rpc error: code = NotFound desc = could not find container \"8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435\": container with ID starting with 8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.883617 4958 scope.go:117] "RemoveContainer" containerID="ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e" Dec 01 10:14:44 crc kubenswrapper[4958]: E1201 10:14:44.884083 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e\": container with ID starting with ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e not found: ID does not exist" containerID="ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.884120 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e"} err="failed to get container status \"ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e\": rpc error: code = NotFound desc = could not find container \"ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e\": container with ID starting with ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.884144 4958 scope.go:117] "RemoveContainer" containerID="c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819" Dec 01 10:14:44 crc kubenswrapper[4958]: E1201 10:14:44.884398 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\": container with ID starting 
with c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819 not found: ID does not exist" containerID="c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.884463 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819"} err="failed to get container status \"c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\": rpc error: code = NotFound desc = could not find container \"c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\": container with ID starting with c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.884481 4958 scope.go:117] "RemoveContainer" containerID="ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592" Dec 01 10:14:44 crc kubenswrapper[4958]: E1201 10:14:44.884754 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\": container with ID starting with ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592 not found: ID does not exist" containerID="ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.884782 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592"} err="failed to get container status \"ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\": rpc error: code = NotFound desc = could not find container \"ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\": container with ID starting with ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.884803 4958 scope.go:117] "RemoveContainer" containerID="a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40" Dec 01 10:14:44 crc kubenswrapper[4958]: E1201 10:14:44.885262 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\": container with ID starting with a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40 not found: ID does not exist" containerID="a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.885290 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40"} err="failed to get container status \"a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\": rpc error: code = NotFound desc = could not find container \"a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\": container with ID starting with a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.885314 4958 scope.go:117] "RemoveContainer" containerID="62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b" Dec 01 10:14:44 crc kubenswrapper[4958]: E1201 10:14:44.885722 4958 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\": container with ID starting with 62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b not found: ID does not exist" containerID="62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.885754 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b"} err="failed to get container status \"62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\": rpc error: code = NotFound desc = could not find container \"62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\": container with ID starting with 62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.885773 4958 scope.go:117] "RemoveContainer" containerID="28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218" Dec 01 10:14:44 crc kubenswrapper[4958]: E1201 10:14:44.886080 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\": container with ID starting with 28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218 not found: ID does not exist" containerID="28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.886108 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218"} err="failed to get container status \"28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\": rpc error: code = NotFound desc = could not find container \"28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\": container with ID starting with 28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.886134 4958 scope.go:117] "RemoveContainer" containerID="1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744" Dec 01 10:14:44 crc kubenswrapper[4958]: E1201 10:14:44.886556 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\": container with ID starting with 1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744 not found: ID does not exist" containerID="1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.886582 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744"} err="failed to get container status \"1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\": rpc error: code = NotFound desc = could not find container \"1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\": container with ID starting with 1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.886601 4958 scope.go:117] "RemoveContainer" 
containerID="6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6" Dec 01 10:14:44 crc kubenswrapper[4958]: E1201 10:14:44.886953 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\": container with ID starting with 6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6 not found: ID does not exist" containerID="6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.886981 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6"} err="failed to get container status \"6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\": rpc error: code = NotFound desc = could not find container \"6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\": container with ID starting with 6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.887000 4958 scope.go:117] "RemoveContainer" containerID="d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937" Dec 01 10:14:44 crc kubenswrapper[4958]: E1201 10:14:44.887372 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\": container with ID starting with d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937 not found: ID does not exist" containerID="d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.887403 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937"} err="failed to get container status \"d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\": rpc error: code = NotFound desc = could not find container \"d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\": container with ID starting with d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.887425 4958 scope.go:117] "RemoveContainer" containerID="8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.887703 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435"} err="failed to get container status \"8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435\": rpc error: code = NotFound desc = could not find container \"8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435\": container with ID starting with 8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.887732 4958 scope.go:117] "RemoveContainer" containerID="ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.888086 4958 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e"} err="failed to get container status \"ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e\": rpc error: code = NotFound desc = could not find container \"ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e\": container with ID starting with ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.888107 4958 scope.go:117] "RemoveContainer" containerID="c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.888369 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819"} err="failed to get container status \"c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\": rpc error: code = NotFound desc = could not find container \"c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\": container with ID starting with c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.888394 4958 scope.go:117] "RemoveContainer" containerID="ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.888772 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592"} err="failed to get container status \"ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\": rpc error: code = NotFound desc = could not find container \"ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\": container with ID starting with ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.888796 4958 scope.go:117] "RemoveContainer" containerID="a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.889101 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40"} err="failed to get container status \"a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\": rpc error: code = NotFound desc = could not find container \"a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\": container with ID starting with a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.889126 4958 scope.go:117] "RemoveContainer" containerID="62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.889490 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b"} err="failed to get container status \"62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\": rpc error: code = NotFound desc = could not find container \"62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\": container with ID starting with 62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b not found: ID does not exist" Dec 
01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.889516 4958 scope.go:117] "RemoveContainer" containerID="28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.889801 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218"} err="failed to get container status \"28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\": rpc error: code = NotFound desc = could not find container \"28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\": container with ID starting with 28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.889827 4958 scope.go:117] "RemoveContainer" containerID="1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.890253 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744"} err="failed to get container status \"1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\": rpc error: code = NotFound desc = could not find container \"1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\": container with ID starting with 1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.890275 4958 scope.go:117] "RemoveContainer" containerID="6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.890734 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6"} err="failed to get container status \"6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\": rpc error: code = NotFound desc = could not find container \"6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\": container with ID starting with 6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.890790 4958 scope.go:117] "RemoveContainer" containerID="d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.891513 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937"} err="failed to get container status \"d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\": rpc error: code = NotFound desc = could not find container \"d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\": container with ID starting with d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.891540 4958 scope.go:117] "RemoveContainer" containerID="8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.891952 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435"} err="failed to get container status 
\"8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435\": rpc error: code = NotFound desc = could not find container \"8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435\": container with ID starting with 8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.892001 4958 scope.go:117] "RemoveContainer" containerID="ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.892370 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e"} err="failed to get container status \"ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e\": rpc error: code = NotFound desc = could not find container \"ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e\": container with ID starting with ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.892434 4958 scope.go:117] "RemoveContainer" containerID="c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.892773 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819"} err="failed to get container status \"c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\": rpc error: code = NotFound desc = could not find container \"c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\": container with ID starting with c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.892803 4958 scope.go:117] "RemoveContainer" containerID="ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.893422 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592"} err="failed to get container status \"ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\": rpc error: code = NotFound desc = could not find container \"ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\": container with ID starting with ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.893448 4958 scope.go:117] "RemoveContainer" containerID="a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.894392 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40"} err="failed to get container status \"a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\": rpc error: code = NotFound desc = could not find container \"a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\": container with ID starting with a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.894512 4958 scope.go:117] "RemoveContainer" 
containerID="62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.895051 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b"} err="failed to get container status \"62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\": rpc error: code = NotFound desc = could not find container \"62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\": container with ID starting with 62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.895082 4958 scope.go:117] "RemoveContainer" containerID="28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.895685 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218"} err="failed to get container status \"28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\": rpc error: code = NotFound desc = could not find container \"28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\": container with ID starting with 28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.895735 4958 scope.go:117] "RemoveContainer" containerID="1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.896178 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744"} err="failed to get container status \"1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\": rpc error: code = NotFound desc = could not find container \"1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\": container with ID starting with 1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.896228 4958 scope.go:117] "RemoveContainer" containerID="6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.896774 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6"} err="failed to get container status \"6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\": rpc error: code = NotFound desc = could not find container \"6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\": container with ID starting with 6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.896814 4958 scope.go:117] "RemoveContainer" containerID="d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.897142 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937"} err="failed to get container status \"d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\": rpc error: code = NotFound desc = could not find 
container \"d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\": container with ID starting with d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.897176 4958 scope.go:117] "RemoveContainer" containerID="8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.897649 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435"} err="failed to get container status \"8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435\": rpc error: code = NotFound desc = could not find container \"8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435\": container with ID starting with 8af66b386e8326429a449629e11361435b84a6e66877ac55ad82f1fdebd01435 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.897674 4958 scope.go:117] "RemoveContainer" containerID="ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.897979 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e"} err="failed to get container status \"ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e\": rpc error: code = NotFound desc = could not find container \"ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e\": container with ID starting with ccbd0df0bc30c1551d8deb0b47c85fc9ba9b155d8ea05c94cf47a4fb2d372f5e not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.898022 4958 scope.go:117] "RemoveContainer" containerID="c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.898396 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819"} err="failed to get container status \"c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\": rpc error: code = NotFound desc = could not find container \"c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819\": container with ID starting with c1a418031c249889ebad89ad1c7eaf3249a915f2bb83f9b191f2821df2ecc819 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.898419 4958 scope.go:117] "RemoveContainer" containerID="ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.898715 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592"} err="failed to get container status \"ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\": rpc error: code = NotFound desc = could not find container \"ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592\": container with ID starting with ee2ce26fe0dec973397348324050dde4c5731a50446f2ffb19adc6e94aa5c592 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.898749 4958 scope.go:117] "RemoveContainer" containerID="a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.899147 4958 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40"} err="failed to get container status \"a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\": rpc error: code = NotFound desc = could not find container \"a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40\": container with ID starting with a0f67f3edc07a5e998d5bf35da679b82f65c9e7a4982d3741bf0dc670ba39d40 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.899211 4958 scope.go:117] "RemoveContainer" containerID="62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.899553 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b"} err="failed to get container status \"62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\": rpc error: code = NotFound desc = could not find container \"62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b\": container with ID starting with 62cf1972172b478a48f1a7f2be837af4befe5a49c821932871bedc904514058b not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.899582 4958 scope.go:117] "RemoveContainer" containerID="28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.899942 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218"} err="failed to get container status \"28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\": rpc error: code = NotFound desc = could not find container \"28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218\": container with ID starting with 28075427e0f6da43362fae112d3a23a33401e4bb3607843b0de2e681e8e09218 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.899971 4958 scope.go:117] "RemoveContainer" containerID="1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.900305 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744"} err="failed to get container status \"1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\": rpc error: code = NotFound desc = could not find container \"1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744\": container with ID starting with 1c43b2c6877cb90d605621783a4a561dfe17f616fdaf95764a25c749f3322744 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.900331 4958 scope.go:117] "RemoveContainer" containerID="6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.900755 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6"} err="failed to get container status \"6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\": rpc error: code = NotFound desc = could not find container \"6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6\": container with ID starting with 
6619f7be1f3dfa7a1b9ca3ed8a85d53d226b25aa040eb0bc844d714e1a8ecff6 not found: ID does not exist" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.900782 4958 scope.go:117] "RemoveContainer" containerID="d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937" Dec 01 10:14:44 crc kubenswrapper[4958]: I1201 10:14:44.901154 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937"} err="failed to get container status \"d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\": rpc error: code = NotFound desc = could not find container \"d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937\": container with ID starting with d7e2b8f422fadbf901f7257ca93a62979ce6f138389b364d4845678a53e6a937 not found: ID does not exist" Dec 01 10:14:45 crc kubenswrapper[4958]: I1201 10:14:45.642188 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" event={"ID":"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3","Type":"ContainerStarted","Data":"31b20c048c7910e01a84d1723667f31074f428d2eab66aa252840b4003181cb7"} Dec 01 10:14:45 crc kubenswrapper[4958]: I1201 10:14:45.642656 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" event={"ID":"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3","Type":"ContainerStarted","Data":"6bda0c388b7c0bf8036a502dd39345d3b09b69664d528ce92a264a77fd59f7f5"} Dec 01 10:14:45 crc kubenswrapper[4958]: I1201 10:14:45.642671 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" event={"ID":"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3","Type":"ContainerStarted","Data":"14d38ff2e9837d8461dbe7c34e44badf49587acd2b62ecee09c51c513297d7f1"} Dec 01 10:14:45 crc kubenswrapper[4958]: I1201 10:14:45.642681 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" event={"ID":"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3","Type":"ContainerStarted","Data":"f0a26eaaefa1fbd36109f3a7597e0eab0e2f6f28aa4c41bce5094f64f37a5911"} Dec 01 10:14:45 crc kubenswrapper[4958]: I1201 10:14:45.642690 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" event={"ID":"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3","Type":"ContainerStarted","Data":"d1476bfd80a44a550d3b69adb8afcf84b3aea32a1a9d93665759317853c3f903"} Dec 01 10:14:45 crc kubenswrapper[4958]: I1201 10:14:45.642699 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" event={"ID":"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3","Type":"ContainerStarted","Data":"a58bcfb6d7351aa81dc9ba8f0ea17658f7c7209ec54d39653d8dfa1083ec3cc7"} Dec 01 10:14:45 crc kubenswrapper[4958]: I1201 10:14:45.806307 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96173cf0-4be1-4ef7-b063-4c93c1731c20" path="/var/lib/kubelet/pods/96173cf0-4be1-4ef7-b063-4c93c1731c20/volumes" Dec 01 10:14:47 crc kubenswrapper[4958]: I1201 10:14:47.661061 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" event={"ID":"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3","Type":"ContainerStarted","Data":"c5bf13889078c4b53cd2f95767e103fe49460f637ca72bde19502acff9ff5fd1"} Dec 01 10:14:48 crc kubenswrapper[4958]: I1201 10:14:48.704584 4958 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5"] Dec 01 10:14:48 crc kubenswrapper[4958]: I1201 10:14:48.706219 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:14:48 crc kubenswrapper[4958]: I1201 10:14:48.709423 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 10:14:48 crc kubenswrapper[4958]: I1201 10:14:48.800117 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d21f6ab4-e618-4de3-b662-bad657e0dd96-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5\" (UID: \"d21f6ab4-e618-4de3-b662-bad657e0dd96\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:14:48 crc kubenswrapper[4958]: I1201 10:14:48.800206 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d21f6ab4-e618-4de3-b662-bad657e0dd96-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5\" (UID: \"d21f6ab4-e618-4de3-b662-bad657e0dd96\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:14:48 crc kubenswrapper[4958]: I1201 10:14:48.800249 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkvrv\" (UniqueName: \"kubernetes.io/projected/d21f6ab4-e618-4de3-b662-bad657e0dd96-kube-api-access-wkvrv\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5\" (UID: \"d21f6ab4-e618-4de3-b662-bad657e0dd96\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:14:48 crc kubenswrapper[4958]: I1201 10:14:48.901661 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d21f6ab4-e618-4de3-b662-bad657e0dd96-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5\" (UID: \"d21f6ab4-e618-4de3-b662-bad657e0dd96\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:14:48 crc kubenswrapper[4958]: I1201 10:14:48.901728 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkvrv\" (UniqueName: \"kubernetes.io/projected/d21f6ab4-e618-4de3-b662-bad657e0dd96-kube-api-access-wkvrv\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5\" (UID: \"d21f6ab4-e618-4de3-b662-bad657e0dd96\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:14:48 crc kubenswrapper[4958]: I1201 10:14:48.901859 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d21f6ab4-e618-4de3-b662-bad657e0dd96-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5\" (UID: \"d21f6ab4-e618-4de3-b662-bad657e0dd96\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:14:48 crc kubenswrapper[4958]: I1201 10:14:48.902310 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d21f6ab4-e618-4de3-b662-bad657e0dd96-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5\" (UID: \"d21f6ab4-e618-4de3-b662-bad657e0dd96\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:14:48 crc kubenswrapper[4958]: I1201 10:14:48.903116 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d21f6ab4-e618-4de3-b662-bad657e0dd96-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5\" (UID: \"d21f6ab4-e618-4de3-b662-bad657e0dd96\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:14:48 crc kubenswrapper[4958]: I1201 10:14:48.925038 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkvrv\" (UniqueName: \"kubernetes.io/projected/d21f6ab4-e618-4de3-b662-bad657e0dd96-kube-api-access-wkvrv\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5\" (UID: \"d21f6ab4-e618-4de3-b662-bad657e0dd96\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:14:49 crc kubenswrapper[4958]: I1201 10:14:49.023653 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:14:49 crc kubenswrapper[4958]: E1201 10:14:49.048600 4958 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_openshift-marketplace_d21f6ab4-e618-4de3-b662-bad657e0dd96_0(b2be1c93727e85baf523ca7273dd07a44842c24d8dd57478ae00d3f7e7dac191): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 10:14:49 crc kubenswrapper[4958]: E1201 10:14:49.048731 4958 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_openshift-marketplace_d21f6ab4-e618-4de3-b662-bad657e0dd96_0(b2be1c93727e85baf523ca7273dd07a44842c24d8dd57478ae00d3f7e7dac191): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:14:49 crc kubenswrapper[4958]: E1201 10:14:49.048780 4958 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_openshift-marketplace_d21f6ab4-e618-4de3-b662-bad657e0dd96_0(b2be1c93727e85baf523ca7273dd07a44842c24d8dd57478ae00d3f7e7dac191): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:14:49 crc kubenswrapper[4958]: E1201 10:14:49.048904 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_openshift-marketplace(d21f6ab4-e618-4de3-b662-bad657e0dd96)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_openshift-marketplace(d21f6ab4-e618-4de3-b662-bad657e0dd96)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_openshift-marketplace_d21f6ab4-e618-4de3-b662-bad657e0dd96_0(b2be1c93727e85baf523ca7273dd07a44842c24d8dd57478ae00d3f7e7dac191): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" podUID="d21f6ab4-e618-4de3-b662-bad657e0dd96" Dec 01 10:14:50 crc kubenswrapper[4958]: I1201 10:14:50.683171 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" event={"ID":"7a48b3be-e2fd-4949-a366-2a4aa3fa9cf3","Type":"ContainerStarted","Data":"8e3d457be83358b24b05937d4b3e44563f430d7cae72d8fd10041485e8f9e5c7"} Dec 01 10:14:50 crc kubenswrapper[4958]: I1201 10:14:50.683615 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:50 crc kubenswrapper[4958]: I1201 10:14:50.715765 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" podStartSLOduration=7.71573903 podStartE2EDuration="7.71573903s" podCreationTimestamp="2025-12-01 10:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:14:50.714514155 +0000 UTC m=+938.223303212" watchObservedRunningTime="2025-12-01 10:14:50.71573903 +0000 UTC m=+938.224528067" Dec 01 10:14:50 crc kubenswrapper[4958]: I1201 10:14:50.816730 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:50 crc kubenswrapper[4958]: I1201 10:14:50.877287 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-glt6k"] Dec 01 10:14:50 crc kubenswrapper[4958]: I1201 10:14:50.878573 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:14:50 crc kubenswrapper[4958]: I1201 10:14:50.930279 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5b5de43-effb-4846-a0dc-31bec473f363-utilities\") pod \"redhat-operators-glt6k\" (UID: \"c5b5de43-effb-4846-a0dc-31bec473f363\") " pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:14:50 crc kubenswrapper[4958]: I1201 10:14:50.930331 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4dpl\" (UniqueName: \"kubernetes.io/projected/c5b5de43-effb-4846-a0dc-31bec473f363-kube-api-access-p4dpl\") pod \"redhat-operators-glt6k\" (UID: \"c5b5de43-effb-4846-a0dc-31bec473f363\") " pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:14:50 crc kubenswrapper[4958]: I1201 10:14:50.930442 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5b5de43-effb-4846-a0dc-31bec473f363-catalog-content\") pod \"redhat-operators-glt6k\" (UID: \"c5b5de43-effb-4846-a0dc-31bec473f363\") " pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:14:51 crc kubenswrapper[4958]: I1201 10:14:51.031502 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5b5de43-effb-4846-a0dc-31bec473f363-utilities\") pod \"redhat-operators-glt6k\" (UID: \"c5b5de43-effb-4846-a0dc-31bec473f363\") " pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:14:51 crc kubenswrapper[4958]: I1201 10:14:51.031572 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4dpl\" (UniqueName: \"kubernetes.io/projected/c5b5de43-effb-4846-a0dc-31bec473f363-kube-api-access-p4dpl\") pod \"redhat-operators-glt6k\" (UID: \"c5b5de43-effb-4846-a0dc-31bec473f363\") " pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:14:51 crc kubenswrapper[4958]: I1201 10:14:51.031668 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5b5de43-effb-4846-a0dc-31bec473f363-catalog-content\") pod \"redhat-operators-glt6k\" (UID: \"c5b5de43-effb-4846-a0dc-31bec473f363\") " pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:14:51 crc kubenswrapper[4958]: I1201 10:14:51.032195 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5b5de43-effb-4846-a0dc-31bec473f363-utilities\") pod \"redhat-operators-glt6k\" (UID: \"c5b5de43-effb-4846-a0dc-31bec473f363\") " pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:14:51 crc kubenswrapper[4958]: I1201 10:14:51.032234 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5b5de43-effb-4846-a0dc-31bec473f363-catalog-content\") pod \"redhat-operators-glt6k\" (UID: \"c5b5de43-effb-4846-a0dc-31bec473f363\") " pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:14:51 crc kubenswrapper[4958]: I1201 10:14:51.055389 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4dpl\" (UniqueName: \"kubernetes.io/projected/c5b5de43-effb-4846-a0dc-31bec473f363-kube-api-access-p4dpl\") pod \"redhat-operators-glt6k\" (UID: 
\"c5b5de43-effb-4846-a0dc-31bec473f363\") " pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:14:51 crc kubenswrapper[4958]: I1201 10:14:51.198401 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:14:51 crc kubenswrapper[4958]: E1201 10:14:51.242578 4958 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-glt6k_openshift-marketplace_c5b5de43-effb-4846-a0dc-31bec473f363_0(0b95446696e8f1b9bfea0c2a869e0a23b647f834b97e0d5ba9c8df5298c66a9c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 10:14:51 crc kubenswrapper[4958]: E1201 10:14:51.242673 4958 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-glt6k_openshift-marketplace_c5b5de43-effb-4846-a0dc-31bec473f363_0(0b95446696e8f1b9bfea0c2a869e0a23b647f834b97e0d5ba9c8df5298c66a9c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:14:51 crc kubenswrapper[4958]: E1201 10:14:51.242701 4958 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-glt6k_openshift-marketplace_c5b5de43-effb-4846-a0dc-31bec473f363_0(0b95446696e8f1b9bfea0c2a869e0a23b647f834b97e0d5ba9c8df5298c66a9c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:14:51 crc kubenswrapper[4958]: E1201 10:14:51.242774 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-operators-glt6k_openshift-marketplace(c5b5de43-effb-4846-a0dc-31bec473f363)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"redhat-operators-glt6k_openshift-marketplace(c5b5de43-effb-4846-a0dc-31bec473f363)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-glt6k_openshift-marketplace_c5b5de43-effb-4846-a0dc-31bec473f363_0(0b95446696e8f1b9bfea0c2a869e0a23b647f834b97e0d5ba9c8df5298c66a9c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/redhat-operators-glt6k" podUID="c5b5de43-effb-4846-a0dc-31bec473f363" Dec 01 10:14:51 crc kubenswrapper[4958]: I1201 10:14:51.690676 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:51 crc kubenswrapper[4958]: I1201 10:14:51.690750 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:51 crc kubenswrapper[4958]: I1201 10:14:51.716292 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-glt6k"] Dec 01 10:14:51 crc kubenswrapper[4958]: I1201 10:14:51.716452 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:14:51 crc kubenswrapper[4958]: I1201 10:14:51.717005 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:14:51 crc kubenswrapper[4958]: I1201 10:14:51.739139 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5"] Dec 01 10:14:51 crc kubenswrapper[4958]: I1201 10:14:51.739281 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:14:51 crc kubenswrapper[4958]: I1201 10:14:51.739931 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:14:51 crc kubenswrapper[4958]: E1201 10:14:51.742994 4958 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-glt6k_openshift-marketplace_c5b5de43-effb-4846-a0dc-31bec473f363_0(987c85756904bfe9f95bdf2375d3ff92c8560e69123ce4e6b2ced5fcfbff6b6b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 10:14:51 crc kubenswrapper[4958]: E1201 10:14:51.743058 4958 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-glt6k_openshift-marketplace_c5b5de43-effb-4846-a0dc-31bec473f363_0(987c85756904bfe9f95bdf2375d3ff92c8560e69123ce4e6b2ced5fcfbff6b6b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:14:51 crc kubenswrapper[4958]: E1201 10:14:51.743083 4958 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-glt6k_openshift-marketplace_c5b5de43-effb-4846-a0dc-31bec473f363_0(987c85756904bfe9f95bdf2375d3ff92c8560e69123ce4e6b2ced5fcfbff6b6b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:14:51 crc kubenswrapper[4958]: E1201 10:14:51.743182 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-operators-glt6k_openshift-marketplace(c5b5de43-effb-4846-a0dc-31bec473f363)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"redhat-operators-glt6k_openshift-marketplace(c5b5de43-effb-4846-a0dc-31bec473f363)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-glt6k_openshift-marketplace_c5b5de43-effb-4846-a0dc-31bec473f363_0(987c85756904bfe9f95bdf2375d3ff92c8560e69123ce4e6b2ced5fcfbff6b6b): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/redhat-operators-glt6k" podUID="c5b5de43-effb-4846-a0dc-31bec473f363" Dec 01 10:14:51 crc kubenswrapper[4958]: I1201 10:14:51.747143 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:14:51 crc kubenswrapper[4958]: E1201 10:14:51.776306 4958 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_openshift-marketplace_d21f6ab4-e618-4de3-b662-bad657e0dd96_0(a389391580105ea00719478ba1def53a3273e33d85f619df84b0135a954d5b40): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 10:14:51 crc kubenswrapper[4958]: E1201 10:14:51.776454 4958 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_openshift-marketplace_d21f6ab4-e618-4de3-b662-bad657e0dd96_0(a389391580105ea00719478ba1def53a3273e33d85f619df84b0135a954d5b40): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:14:51 crc kubenswrapper[4958]: E1201 10:14:51.776487 4958 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_openshift-marketplace_d21f6ab4-e618-4de3-b662-bad657e0dd96_0(a389391580105ea00719478ba1def53a3273e33d85f619df84b0135a954d5b40): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:14:51 crc kubenswrapper[4958]: E1201 10:14:51.776566 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_openshift-marketplace(d21f6ab4-e618-4de3-b662-bad657e0dd96)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_openshift-marketplace(d21f6ab4-e618-4de3-b662-bad657e0dd96)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_openshift-marketplace_d21f6ab4-e618-4de3-b662-bad657e0dd96_0(a389391580105ea00719478ba1def53a3273e33d85f619df84b0135a954d5b40): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
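Every RunPodSandbox attempt above fails for the same reason: the runtime finds no network config under /etc/kubernetes/cni/net.d/, which on this node is written by ovn-kubernetes once ovnkube-node is up (the ovnkube-node-vd762 readiness probes flipping to ready shortly afterwards are consistent with that). A rough stand-in for the check, assuming CRI-O's usual accepted extensions (.conf, .conflist, .json); this is not CRI-O's actual code:

```go
// A minimal approximation of the gate behind "no CNI configuration file in
// /etc/kubernetes/cni/net.d/": does any network config exist there yet?
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	if err != nil || !ok {
		fmt.Println("no CNI configuration yet; sandbox creation keeps failing")
		return
	}
	fmt.Println("CNI configuration present; RunPodSandbox can proceed")
}
```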
Has your network provider started?\"" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" podUID="d21f6ab4-e618-4de3-b662-bad657e0dd96" Dec 01 10:14:58 crc kubenswrapper[4958]: I1201 10:14:58.210770 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:14:58 crc kubenswrapper[4958]: I1201 10:14:58.211604 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:15:00 crc kubenswrapper[4958]: I1201 10:15:00.173258 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp"] Dec 01 10:15:00 crc kubenswrapper[4958]: I1201 10:15:00.174293 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp" Dec 01 10:15:00 crc kubenswrapper[4958]: I1201 10:15:00.177757 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 10:15:00 crc kubenswrapper[4958]: I1201 10:15:00.177755 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 10:15:00 crc kubenswrapper[4958]: I1201 10:15:00.189058 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp"] Dec 01 10:15:00 crc kubenswrapper[4958]: I1201 10:15:00.262797 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/195376b1-020b-42ee-8caa-6f9cefba2c55-config-volume\") pod \"collect-profiles-29409735-rjwtp\" (UID: \"195376b1-020b-42ee-8caa-6f9cefba2c55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp" Dec 01 10:15:00 crc kubenswrapper[4958]: I1201 10:15:00.262962 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/195376b1-020b-42ee-8caa-6f9cefba2c55-secret-volume\") pod \"collect-profiles-29409735-rjwtp\" (UID: \"195376b1-020b-42ee-8caa-6f9cefba2c55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp" Dec 01 10:15:00 crc kubenswrapper[4958]: I1201 10:15:00.263040 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq9fj\" (UniqueName: \"kubernetes.io/projected/195376b1-020b-42ee-8caa-6f9cefba2c55-kube-api-access-nq9fj\") pod \"collect-profiles-29409735-rjwtp\" (UID: \"195376b1-020b-42ee-8caa-6f9cefba2c55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp" Dec 01 10:15:00 crc kubenswrapper[4958]: I1201 10:15:00.364986 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq9fj\" (UniqueName: \"kubernetes.io/projected/195376b1-020b-42ee-8caa-6f9cefba2c55-kube-api-access-nq9fj\") pod 
\"collect-profiles-29409735-rjwtp\" (UID: \"195376b1-020b-42ee-8caa-6f9cefba2c55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp" Dec 01 10:15:00 crc kubenswrapper[4958]: I1201 10:15:00.365195 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/195376b1-020b-42ee-8caa-6f9cefba2c55-config-volume\") pod \"collect-profiles-29409735-rjwtp\" (UID: \"195376b1-020b-42ee-8caa-6f9cefba2c55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp" Dec 01 10:15:00 crc kubenswrapper[4958]: I1201 10:15:00.365241 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/195376b1-020b-42ee-8caa-6f9cefba2c55-secret-volume\") pod \"collect-profiles-29409735-rjwtp\" (UID: \"195376b1-020b-42ee-8caa-6f9cefba2c55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp" Dec 01 10:15:00 crc kubenswrapper[4958]: I1201 10:15:00.366355 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/195376b1-020b-42ee-8caa-6f9cefba2c55-config-volume\") pod \"collect-profiles-29409735-rjwtp\" (UID: \"195376b1-020b-42ee-8caa-6f9cefba2c55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp" Dec 01 10:15:00 crc kubenswrapper[4958]: I1201 10:15:00.373387 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/195376b1-020b-42ee-8caa-6f9cefba2c55-secret-volume\") pod \"collect-profiles-29409735-rjwtp\" (UID: \"195376b1-020b-42ee-8caa-6f9cefba2c55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp" Dec 01 10:15:00 crc kubenswrapper[4958]: I1201 10:15:00.390786 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq9fj\" (UniqueName: \"kubernetes.io/projected/195376b1-020b-42ee-8caa-6f9cefba2c55-kube-api-access-nq9fj\") pod \"collect-profiles-29409735-rjwtp\" (UID: \"195376b1-020b-42ee-8caa-6f9cefba2c55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp" Dec 01 10:15:00 crc kubenswrapper[4958]: I1201 10:15:00.497994 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp" Dec 01 10:15:00 crc kubenswrapper[4958]: I1201 10:15:00.711883 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp"] Dec 01 10:15:00 crc kubenswrapper[4958]: I1201 10:15:00.749538 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp" event={"ID":"195376b1-020b-42ee-8caa-6f9cefba2c55","Type":"ContainerStarted","Data":"c45f62b35f9e5468dabbd8d127de7033bdabe1ad505e36fd098f883ac4d44585"} Dec 01 10:15:01 crc kubenswrapper[4958]: I1201 10:15:01.758531 4958 generic.go:334] "Generic (PLEG): container finished" podID="195376b1-020b-42ee-8caa-6f9cefba2c55" containerID="a578f68bac9d99796ff49d82051d72bbdfee8c2f663053b97cd4ca98dda49375" exitCode=0 Dec 01 10:15:01 crc kubenswrapper[4958]: I1201 10:15:01.758622 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp" event={"ID":"195376b1-020b-42ee-8caa-6f9cefba2c55","Type":"ContainerDied","Data":"a578f68bac9d99796ff49d82051d72bbdfee8c2f663053b97cd4ca98dda49375"} Dec 01 10:15:02 crc kubenswrapper[4958]: I1201 10:15:02.998481 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp" Dec 01 10:15:03 crc kubenswrapper[4958]: I1201 10:15:03.104739 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/195376b1-020b-42ee-8caa-6f9cefba2c55-config-volume\") pod \"195376b1-020b-42ee-8caa-6f9cefba2c55\" (UID: \"195376b1-020b-42ee-8caa-6f9cefba2c55\") " Dec 01 10:15:03 crc kubenswrapper[4958]: I1201 10:15:03.104826 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq9fj\" (UniqueName: \"kubernetes.io/projected/195376b1-020b-42ee-8caa-6f9cefba2c55-kube-api-access-nq9fj\") pod \"195376b1-020b-42ee-8caa-6f9cefba2c55\" (UID: \"195376b1-020b-42ee-8caa-6f9cefba2c55\") " Dec 01 10:15:03 crc kubenswrapper[4958]: I1201 10:15:03.104949 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/195376b1-020b-42ee-8caa-6f9cefba2c55-secret-volume\") pod \"195376b1-020b-42ee-8caa-6f9cefba2c55\" (UID: \"195376b1-020b-42ee-8caa-6f9cefba2c55\") " Dec 01 10:15:03 crc kubenswrapper[4958]: I1201 10:15:03.106149 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/195376b1-020b-42ee-8caa-6f9cefba2c55-config-volume" (OuterVolumeSpecName: "config-volume") pod "195376b1-020b-42ee-8caa-6f9cefba2c55" (UID: "195376b1-020b-42ee-8caa-6f9cefba2c55"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:15:03 crc kubenswrapper[4958]: I1201 10:15:03.110961 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/195376b1-020b-42ee-8caa-6f9cefba2c55-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "195376b1-020b-42ee-8caa-6f9cefba2c55" (UID: "195376b1-020b-42ee-8caa-6f9cefba2c55"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:15:03 crc kubenswrapper[4958]: I1201 10:15:03.111011 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/195376b1-020b-42ee-8caa-6f9cefba2c55-kube-api-access-nq9fj" (OuterVolumeSpecName: "kube-api-access-nq9fj") pod "195376b1-020b-42ee-8caa-6f9cefba2c55" (UID: "195376b1-020b-42ee-8caa-6f9cefba2c55"). InnerVolumeSpecName "kube-api-access-nq9fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:15:03 crc kubenswrapper[4958]: I1201 10:15:03.206895 4958 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/195376b1-020b-42ee-8caa-6f9cefba2c55-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:03 crc kubenswrapper[4958]: I1201 10:15:03.206968 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/195376b1-020b-42ee-8caa-6f9cefba2c55-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:03 crc kubenswrapper[4958]: I1201 10:15:03.206984 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq9fj\" (UniqueName: \"kubernetes.io/projected/195376b1-020b-42ee-8caa-6f9cefba2c55-kube-api-access-nq9fj\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:03 crc kubenswrapper[4958]: I1201 10:15:03.771688 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp" event={"ID":"195376b1-020b-42ee-8caa-6f9cefba2c55","Type":"ContainerDied","Data":"c45f62b35f9e5468dabbd8d127de7033bdabe1ad505e36fd098f883ac4d44585"} Dec 01 10:15:03 crc kubenswrapper[4958]: I1201 10:15:03.771753 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c45f62b35f9e5468dabbd8d127de7033bdabe1ad505e36fd098f883ac4d44585" Dec 01 10:15:03 crc kubenswrapper[4958]: I1201 10:15:03.771793 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp" Dec 01 10:15:04 crc kubenswrapper[4958]: I1201 10:15:04.796938 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:15:04 crc kubenswrapper[4958]: I1201 10:15:04.796999 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:15:04 crc kubenswrapper[4958]: I1201 10:15:04.797604 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:15:04 crc kubenswrapper[4958]: I1201 10:15:04.797610 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:15:05 crc kubenswrapper[4958]: I1201 10:15:05.084901 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5"] Dec 01 10:15:05 crc kubenswrapper[4958]: W1201 10:15:05.100120 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd21f6ab4_e618_4de3_b662_bad657e0dd96.slice/crio-7611e2168e037597f51051a2e7ceb169cd36b37f4b1a27bc87c5fa4b544639a3 WatchSource:0}: Error finding container 7611e2168e037597f51051a2e7ceb169cd36b37f4b1a27bc87c5fa4b544639a3: Status 404 returned error can't find the container with id 7611e2168e037597f51051a2e7ceb169cd36b37f4b1a27bc87c5fa4b544639a3 Dec 01 10:15:05 crc kubenswrapper[4958]: I1201 10:15:05.143803 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-glt6k"] Dec 01 10:15:05 crc kubenswrapper[4958]: I1201 10:15:05.783572 4958 generic.go:334] "Generic (PLEG): container finished" podID="c5b5de43-effb-4846-a0dc-31bec473f363" containerID="1839fca31f5ca51e4e23f40f3c16a65a7589b15f2fa5feb02e83fa1eedc8f45f" exitCode=0 Dec 01 10:15:05 crc kubenswrapper[4958]: I1201 10:15:05.783694 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glt6k" event={"ID":"c5b5de43-effb-4846-a0dc-31bec473f363","Type":"ContainerDied","Data":"1839fca31f5ca51e4e23f40f3c16a65a7589b15f2fa5feb02e83fa1eedc8f45f"} Dec 01 10:15:05 crc kubenswrapper[4958]: I1201 10:15:05.784143 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glt6k" event={"ID":"c5b5de43-effb-4846-a0dc-31bec473f363","Type":"ContainerStarted","Data":"81a3789813b48884d45f4b8d2d9c3f591ec78f0883c66ee0a2103ff3c20c8ddd"} Dec 01 10:15:05 crc kubenswrapper[4958]: I1201 10:15:05.795862 4958 generic.go:334] "Generic (PLEG): container finished" podID="d21f6ab4-e618-4de3-b662-bad657e0dd96" containerID="324d82ea4abd3c398608898551e9036b2e321492c05ae529b87d7806aa4309fe" exitCode=0 Dec 01 10:15:05 crc kubenswrapper[4958]: I1201 10:15:05.795921 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" event={"ID":"d21f6ab4-e618-4de3-b662-bad657e0dd96","Type":"ContainerDied","Data":"324d82ea4abd3c398608898551e9036b2e321492c05ae529b87d7806aa4309fe"} Dec 01 10:15:05 crc kubenswrapper[4958]: I1201 10:15:05.795967 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" event={"ID":"d21f6ab4-e618-4de3-b662-bad657e0dd96","Type":"ContainerStarted","Data":"7611e2168e037597f51051a2e7ceb169cd36b37f4b1a27bc87c5fa4b544639a3"} Dec 01 10:15:07 crc kubenswrapper[4958]: I1201 10:15:07.810938 4958 generic.go:334] "Generic (PLEG): container finished" podID="c5b5de43-effb-4846-a0dc-31bec473f363" containerID="00ff4c4a40e6e36e1ddb25b8b1dcfae96a65cd9c96f9eaedd7568c006941b0a7" exitCode=0 Dec 01 10:15:07 crc kubenswrapper[4958]: I1201 10:15:07.811085 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glt6k" event={"ID":"c5b5de43-effb-4846-a0dc-31bec473f363","Type":"ContainerDied","Data":"00ff4c4a40e6e36e1ddb25b8b1dcfae96a65cd9c96f9eaedd7568c006941b0a7"} Dec 01 10:15:07 crc kubenswrapper[4958]: I1201 10:15:07.816973 4958 
generic.go:334] "Generic (PLEG): container finished" podID="d21f6ab4-e618-4de3-b662-bad657e0dd96" containerID="6e54dcc3a64496b928c71adb4afaadf85912b20f1c8c0c857bf5ddffbd2d2725" exitCode=0 Dec 01 10:15:07 crc kubenswrapper[4958]: I1201 10:15:07.817066 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" event={"ID":"d21f6ab4-e618-4de3-b662-bad657e0dd96","Type":"ContainerDied","Data":"6e54dcc3a64496b928c71adb4afaadf85912b20f1c8c0c857bf5ddffbd2d2725"} Dec 01 10:15:08 crc kubenswrapper[4958]: I1201 10:15:08.824571 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glt6k" event={"ID":"c5b5de43-effb-4846-a0dc-31bec473f363","Type":"ContainerStarted","Data":"6694444af97b6795669a1589e0cf9327b09f0a7578e5607b2a23a0d92a5eed85"} Dec 01 10:15:08 crc kubenswrapper[4958]: I1201 10:15:08.826669 4958 generic.go:334] "Generic (PLEG): container finished" podID="d21f6ab4-e618-4de3-b662-bad657e0dd96" containerID="e4c909c2fd7112655ce3e8fffa4427174c93036db07695a8611cc33ec298e998" exitCode=0 Dec 01 10:15:08 crc kubenswrapper[4958]: I1201 10:15:08.826730 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" event={"ID":"d21f6ab4-e618-4de3-b662-bad657e0dd96","Type":"ContainerDied","Data":"e4c909c2fd7112655ce3e8fffa4427174c93036db07695a8611cc33ec298e998"} Dec 01 10:15:08 crc kubenswrapper[4958]: I1201 10:15:08.854230 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-glt6k" podStartSLOduration=16.387256033 podStartE2EDuration="18.854195607s" podCreationTimestamp="2025-12-01 10:14:50 +0000 UTC" firstStartedPulling="2025-12-01 10:15:05.790195366 +0000 UTC m=+953.298984443" lastFinishedPulling="2025-12-01 10:15:08.25713497 +0000 UTC m=+955.765924017" observedRunningTime="2025-12-01 10:15:08.851753437 +0000 UTC m=+956.360542474" watchObservedRunningTime="2025-12-01 10:15:08.854195607 +0000 UTC m=+956.362984644" Dec 01 10:15:10 crc kubenswrapper[4958]: I1201 10:15:10.093044 4958 util.go:48] "No ready sandbox for pod can be found. 
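The "Observed pod startup duration" entry just above can be checked by hand: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (10:15:08.854195607 - 10:14:50 = 18.854195607s), and podStartSLOduration appears to be that figure minus the image-pull window (lastFinishedPulling - firstStartedPulling, taken from the monotonic m=+ readings klog prints alongside the wall times). A worked check using the logged numbers:

```go
// Reproducing the startup-latency arithmetic from the entry above, under my
// reading that podStartSLOduration excludes image-pull time.
package main

import "fmt"

func main() {
	const (
		firstStartedPulling = 953.298984443 // m=+ monotonic offset, seconds
		lastFinishedPulling = 955.765924017
		e2e                 = 18.854195607 // podStartE2EDuration, seconds
	)
	pull := lastFinishedPulling - firstStartedPulling // 2.466939574s of pulling
	fmt.Printf("podStartSLOduration ~= %.9f s\n", e2e-pull)
	// Prints 16.387256033, matching the logged podStartSLOduration exactly.
}
```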
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:15:10 crc kubenswrapper[4958]: I1201 10:15:10.210377 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d21f6ab4-e618-4de3-b662-bad657e0dd96-bundle\") pod \"d21f6ab4-e618-4de3-b662-bad657e0dd96\" (UID: \"d21f6ab4-e618-4de3-b662-bad657e0dd96\") " Dec 01 10:15:10 crc kubenswrapper[4958]: I1201 10:15:10.210498 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkvrv\" (UniqueName: \"kubernetes.io/projected/d21f6ab4-e618-4de3-b662-bad657e0dd96-kube-api-access-wkvrv\") pod \"d21f6ab4-e618-4de3-b662-bad657e0dd96\" (UID: \"d21f6ab4-e618-4de3-b662-bad657e0dd96\") " Dec 01 10:15:10 crc kubenswrapper[4958]: I1201 10:15:10.210608 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d21f6ab4-e618-4de3-b662-bad657e0dd96-util\") pod \"d21f6ab4-e618-4de3-b662-bad657e0dd96\" (UID: \"d21f6ab4-e618-4de3-b662-bad657e0dd96\") " Dec 01 10:15:10 crc kubenswrapper[4958]: I1201 10:15:10.211320 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d21f6ab4-e618-4de3-b662-bad657e0dd96-bundle" (OuterVolumeSpecName: "bundle") pod "d21f6ab4-e618-4de3-b662-bad657e0dd96" (UID: "d21f6ab4-e618-4de3-b662-bad657e0dd96"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:15:10 crc kubenswrapper[4958]: I1201 10:15:10.218241 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d21f6ab4-e618-4de3-b662-bad657e0dd96-kube-api-access-wkvrv" (OuterVolumeSpecName: "kube-api-access-wkvrv") pod "d21f6ab4-e618-4de3-b662-bad657e0dd96" (UID: "d21f6ab4-e618-4de3-b662-bad657e0dd96"). InnerVolumeSpecName "kube-api-access-wkvrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:15:10 crc kubenswrapper[4958]: I1201 10:15:10.224426 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d21f6ab4-e618-4de3-b662-bad657e0dd96-util" (OuterVolumeSpecName: "util") pod "d21f6ab4-e618-4de3-b662-bad657e0dd96" (UID: "d21f6ab4-e618-4de3-b662-bad657e0dd96"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:15:10 crc kubenswrapper[4958]: I1201 10:15:10.312230 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkvrv\" (UniqueName: \"kubernetes.io/projected/d21f6ab4-e618-4de3-b662-bad657e0dd96-kube-api-access-wkvrv\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:10 crc kubenswrapper[4958]: I1201 10:15:10.312300 4958 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d21f6ab4-e618-4de3-b662-bad657e0dd96-util\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:10 crc kubenswrapper[4958]: I1201 10:15:10.312315 4958 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d21f6ab4-e618-4de3-b662-bad657e0dd96-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:10 crc kubenswrapper[4958]: I1201 10:15:10.843263 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" event={"ID":"d21f6ab4-e618-4de3-b662-bad657e0dd96","Type":"ContainerDied","Data":"7611e2168e037597f51051a2e7ceb169cd36b37f4b1a27bc87c5fa4b544639a3"} Dec 01 10:15:10 crc kubenswrapper[4958]: I1201 10:15:10.843323 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7611e2168e037597f51051a2e7ceb169cd36b37f4b1a27bc87c5fa4b544639a3" Dec 01 10:15:10 crc kubenswrapper[4958]: I1201 10:15:10.843345 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5" Dec 01 10:15:11 crc kubenswrapper[4958]: I1201 10:15:11.199698 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:15:11 crc kubenswrapper[4958]: I1201 10:15:11.199788 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:15:12 crc kubenswrapper[4958]: I1201 10:15:12.247306 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-glt6k" podUID="c5b5de43-effb-4846-a0dc-31bec473f363" containerName="registry-server" probeResult="failure" output=< Dec 01 10:15:12 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s Dec 01 10:15:12 crc kubenswrapper[4958]: > Dec 01 10:15:14 crc kubenswrapper[4958]: I1201 10:15:14.192293 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vd762" Dec 01 10:15:15 crc kubenswrapper[4958]: I1201 10:15:15.364530 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-59st7"] Dec 01 10:15:15 crc kubenswrapper[4958]: E1201 10:15:15.364835 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d21f6ab4-e618-4de3-b662-bad657e0dd96" containerName="pull" Dec 01 10:15:15 crc kubenswrapper[4958]: I1201 10:15:15.364920 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d21f6ab4-e618-4de3-b662-bad657e0dd96" containerName="pull" Dec 01 10:15:15 crc kubenswrapper[4958]: E1201 10:15:15.364937 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d21f6ab4-e618-4de3-b662-bad657e0dd96" containerName="extract" Dec 01 10:15:15 crc kubenswrapper[4958]: I1201 10:15:15.364943 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d21f6ab4-e618-4de3-b662-bad657e0dd96" 
containerName="extract" Dec 01 10:15:15 crc kubenswrapper[4958]: E1201 10:15:15.364952 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="195376b1-020b-42ee-8caa-6f9cefba2c55" containerName="collect-profiles" Dec 01 10:15:15 crc kubenswrapper[4958]: I1201 10:15:15.364958 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="195376b1-020b-42ee-8caa-6f9cefba2c55" containerName="collect-profiles" Dec 01 10:15:15 crc kubenswrapper[4958]: E1201 10:15:15.364972 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d21f6ab4-e618-4de3-b662-bad657e0dd96" containerName="util" Dec 01 10:15:15 crc kubenswrapper[4958]: I1201 10:15:15.364978 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d21f6ab4-e618-4de3-b662-bad657e0dd96" containerName="util" Dec 01 10:15:15 crc kubenswrapper[4958]: I1201 10:15:15.365078 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="195376b1-020b-42ee-8caa-6f9cefba2c55" containerName="collect-profiles" Dec 01 10:15:15 crc kubenswrapper[4958]: I1201 10:15:15.365091 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d21f6ab4-e618-4de3-b662-bad657e0dd96" containerName="extract" Dec 01 10:15:15 crc kubenswrapper[4958]: I1201 10:15:15.365590 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-59st7" Dec 01 10:15:15 crc kubenswrapper[4958]: I1201 10:15:15.367673 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-tsclg" Dec 01 10:15:15 crc kubenswrapper[4958]: I1201 10:15:15.367748 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 01 10:15:15 crc kubenswrapper[4958]: I1201 10:15:15.368726 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 01 10:15:15 crc kubenswrapper[4958]: I1201 10:15:15.381338 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-59st7"] Dec 01 10:15:15 crc kubenswrapper[4958]: I1201 10:15:15.489800 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfl5k\" (UniqueName: \"kubernetes.io/projected/a08d88ea-e101-479e-bfc2-9b4933ef0319-kube-api-access-bfl5k\") pod \"nmstate-operator-5b5b58f5c8-59st7\" (UID: \"a08d88ea-e101-479e-bfc2-9b4933ef0319\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-59st7" Dec 01 10:15:15 crc kubenswrapper[4958]: I1201 10:15:15.591287 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfl5k\" (UniqueName: \"kubernetes.io/projected/a08d88ea-e101-479e-bfc2-9b4933ef0319-kube-api-access-bfl5k\") pod \"nmstate-operator-5b5b58f5c8-59st7\" (UID: \"a08d88ea-e101-479e-bfc2-9b4933ef0319\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-59st7" Dec 01 10:15:15 crc kubenswrapper[4958]: I1201 10:15:15.641892 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfl5k\" (UniqueName: \"kubernetes.io/projected/a08d88ea-e101-479e-bfc2-9b4933ef0319-kube-api-access-bfl5k\") pod \"nmstate-operator-5b5b58f5c8-59st7\" (UID: \"a08d88ea-e101-479e-bfc2-9b4933ef0319\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-59st7" Dec 01 10:15:15 crc kubenswrapper[4958]: I1201 10:15:15.681951 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-59st7" Dec 01 10:15:15 crc kubenswrapper[4958]: I1201 10:15:15.892675 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-59st7"] Dec 01 10:15:16 crc kubenswrapper[4958]: I1201 10:15:16.892483 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-59st7" event={"ID":"a08d88ea-e101-479e-bfc2-9b4933ef0319","Type":"ContainerStarted","Data":"329b90fcb2093b80e5e82a37630f2954baf3a5ceb8f811b3537c7a97f9d94b6c"} Dec 01 10:15:19 crc kubenswrapper[4958]: I1201 10:15:19.917730 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-59st7" event={"ID":"a08d88ea-e101-479e-bfc2-9b4933ef0319","Type":"ContainerStarted","Data":"213ccb706a02e3c8b636d7b4bc901f07394ad1f206f9c34694869aca61a4eee7"} Dec 01 10:15:19 crc kubenswrapper[4958]: I1201 10:15:19.941169 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-59st7" podStartSLOduration=1.084561063 podStartE2EDuration="4.941143653s" podCreationTimestamp="2025-12-01 10:15:15 +0000 UTC" firstStartedPulling="2025-12-01 10:15:15.900208383 +0000 UTC m=+963.408997420" lastFinishedPulling="2025-12-01 10:15:19.756790973 +0000 UTC m=+967.265580010" observedRunningTime="2025-12-01 10:15:19.938355963 +0000 UTC m=+967.447145010" watchObservedRunningTime="2025-12-01 10:15:19.941143653 +0000 UTC m=+967.449932690" Dec 01 10:15:20 crc kubenswrapper[4958]: I1201 10:15:20.869890 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-bww98"] Dec 01 10:15:20 crc kubenswrapper[4958]: I1201 10:15:20.871205 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bww98" Dec 01 10:15:20 crc kubenswrapper[4958]: I1201 10:15:20.874821 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-7tfct" Dec 01 10:15:20 crc kubenswrapper[4958]: I1201 10:15:20.887606 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-bww98"] Dec 01 10:15:20 crc kubenswrapper[4958]: I1201 10:15:20.894740 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78pgf"] Dec 01 10:15:20 crc kubenswrapper[4958]: I1201 10:15:20.895825 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78pgf" Dec 01 10:15:20 crc kubenswrapper[4958]: I1201 10:15:20.898383 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 01 10:15:20 crc kubenswrapper[4958]: I1201 10:15:20.911622 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78pgf"] Dec 01 10:15:20 crc kubenswrapper[4958]: I1201 10:15:20.929657 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-csd4v"] Dec 01 10:15:20 crc kubenswrapper[4958]: I1201 10:15:20.930808 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-csd4v" Dec 01 10:15:20 crc kubenswrapper[4958]: I1201 10:15:20.970547 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8zgb\" (UniqueName: \"kubernetes.io/projected/a721f6d8-7d75-4fc8-9cc5-930e5764c7f6-kube-api-access-h8zgb\") pod \"nmstate-metrics-7f946cbc9-bww98\" (UID: \"a721f6d8-7d75-4fc8-9cc5-930e5764c7f6\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bww98" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.072786 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a-dbus-socket\") pod \"nmstate-handler-csd4v\" (UID: \"dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a\") " pod="openshift-nmstate/nmstate-handler-csd4v" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.072952 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgsjm\" (UniqueName: \"kubernetes.io/projected/8ffef51c-a736-4393-8e14-ba5441744873-kube-api-access-tgsjm\") pod \"nmstate-webhook-5f6d4c5ccb-78pgf\" (UID: \"8ffef51c-a736-4393-8e14-ba5441744873\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78pgf" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.073036 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8ffef51c-a736-4393-8e14-ba5441744873-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-78pgf\" (UID: \"8ffef51c-a736-4393-8e14-ba5441744873\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78pgf" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.073110 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a-nmstate-lock\") pod \"nmstate-handler-csd4v\" (UID: \"dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a\") " pod="openshift-nmstate/nmstate-handler-csd4v" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.073157 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a-ovs-socket\") pod \"nmstate-handler-csd4v\" (UID: \"dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a\") " pod="openshift-nmstate/nmstate-handler-csd4v" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.073222 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8zgb\" (UniqueName: \"kubernetes.io/projected/a721f6d8-7d75-4fc8-9cc5-930e5764c7f6-kube-api-access-h8zgb\") pod \"nmstate-metrics-7f946cbc9-bww98\" (UID: \"a721f6d8-7d75-4fc8-9cc5-930e5764c7f6\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bww98" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.073250 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5kk8\" (UniqueName: \"kubernetes.io/projected/dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a-kube-api-access-r5kk8\") pod \"nmstate-handler-csd4v\" (UID: \"dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a\") " pod="openshift-nmstate/nmstate-handler-csd4v" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.133323 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8zgb\" 
(UniqueName: \"kubernetes.io/projected/a721f6d8-7d75-4fc8-9cc5-930e5764c7f6-kube-api-access-h8zgb\") pod \"nmstate-metrics-7f946cbc9-bww98\" (UID: \"a721f6d8-7d75-4fc8-9cc5-930e5764c7f6\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bww98" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.175770 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8ffef51c-a736-4393-8e14-ba5441744873-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-78pgf\" (UID: \"8ffef51c-a736-4393-8e14-ba5441744873\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78pgf" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.176240 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a-nmstate-lock\") pod \"nmstate-handler-csd4v\" (UID: \"dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a\") " pod="openshift-nmstate/nmstate-handler-csd4v" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.176324 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a-nmstate-lock\") pod \"nmstate-handler-csd4v\" (UID: \"dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a\") " pod="openshift-nmstate/nmstate-handler-csd4v" Dec 01 10:15:21 crc kubenswrapper[4958]: E1201 10:15:21.176010 4958 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 01 10:15:21 crc kubenswrapper[4958]: E1201 10:15:21.176449 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ffef51c-a736-4393-8e14-ba5441744873-tls-key-pair podName:8ffef51c-a736-4393-8e14-ba5441744873 nodeName:}" failed. No retries permitted until 2025-12-01 10:15:21.676421202 +0000 UTC m=+969.185210409 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/8ffef51c-a736-4393-8e14-ba5441744873-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-78pgf" (UID: "8ffef51c-a736-4393-8e14-ba5441744873") : secret "openshift-nmstate-webhook" not found Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.176353 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a-ovs-socket\") pod \"nmstate-handler-csd4v\" (UID: \"dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a\") " pod="openshift-nmstate/nmstate-handler-csd4v" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.176575 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a-ovs-socket\") pod \"nmstate-handler-csd4v\" (UID: \"dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a\") " pod="openshift-nmstate/nmstate-handler-csd4v" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.176641 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5kk8\" (UniqueName: \"kubernetes.io/projected/dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a-kube-api-access-r5kk8\") pod \"nmstate-handler-csd4v\" (UID: \"dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a\") " pod="openshift-nmstate/nmstate-handler-csd4v" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.176714 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a-dbus-socket\") pod \"nmstate-handler-csd4v\" (UID: \"dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a\") " pod="openshift-nmstate/nmstate-handler-csd4v" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.176767 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgsjm\" (UniqueName: \"kubernetes.io/projected/8ffef51c-a736-4393-8e14-ba5441744873-kube-api-access-tgsjm\") pod \"nmstate-webhook-5f6d4c5ccb-78pgf\" (UID: \"8ffef51c-a736-4393-8e14-ba5441744873\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78pgf" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.177607 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a-dbus-socket\") pod \"nmstate-handler-csd4v\" (UID: \"dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a\") " pod="openshift-nmstate/nmstate-handler-csd4v" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.194241 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-j4v58"] Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.195542 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-j4v58" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.196089 4958 util.go:30] "No sandbox for pod can be found. 
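The "No retries permitted until ... (durationBeforeRetry 500ms)" entry above is the first step of per-volume exponential backoff: while the openshift-nmstate-webhook secret does not exist yet, each failed MountVolume.SetUp roughly doubles the wait before the next attempt, up to a cap; the same pattern repeats below for plugin-serving-cert. A sketch with constants that match my recollection of kubelet's defaults (500ms initial, factor 2, cap just over two minutes); treat them as illustrative:

```go
// Illustrative exponential backoff behind "durationBeforeRetry".
package main

import (
	"fmt"
	"time"
)

func main() {
	backoff := 500 * time.Millisecond          // initial wait, as logged
	maxDelay := 2*time.Minute + 2*time.Second  // cap; illustrative value
	for attempt := 1; attempt <= 6; attempt++ {
		fmt.Printf("attempt %d: no retries permitted for %v\n", attempt, backoff)
		backoff *= 2
		if backoff > maxDelay {
			backoff = maxDelay
		}
	}
}
```

In this log the backoff never gets far: the secrets are created moments later and the retried SetUp calls succeed (tls-key-pair at 10:15:21.701221, plugin-serving-cert at 10:15:22.008944).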
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bww98" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.198239 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.199645 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.202082 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-v7tnq" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.215279 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-j4v58"] Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.225440 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgsjm\" (UniqueName: \"kubernetes.io/projected/8ffef51c-a736-4393-8e14-ba5441744873-kube-api-access-tgsjm\") pod \"nmstate-webhook-5f6d4c5ccb-78pgf\" (UID: \"8ffef51c-a736-4393-8e14-ba5441744873\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78pgf" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.241658 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5kk8\" (UniqueName: \"kubernetes.io/projected/dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a-kube-api-access-r5kk8\") pod \"nmstate-handler-csd4v\" (UID: \"dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a\") " pod="openshift-nmstate/nmstate-handler-csd4v" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.266938 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-csd4v" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.323548 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:15:21 crc kubenswrapper[4958]: W1201 10:15:21.350112 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc34ddf1_ab69_497f_a2e9_8ba0a328bc4a.slice/crio-4fc646ae0e25bb62838bc552f071947a99f02ea6daf52d4a75cd7efe6efff7f6 WatchSource:0}: Error finding container 4fc646ae0e25bb62838bc552f071947a99f02ea6daf52d4a75cd7efe6efff7f6: Status 404 returned error can't find the container with id 4fc646ae0e25bb62838bc552f071947a99f02ea6daf52d4a75cd7efe6efff7f6 Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.382808 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/34a50f5b-9514-4292-bbc6-ed76bafccf22-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-j4v58\" (UID: \"34a50f5b-9514-4292-bbc6-ed76bafccf22\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-j4v58" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.382955 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btppt\" (UniqueName: \"kubernetes.io/projected/34a50f5b-9514-4292-bbc6-ed76bafccf22-kube-api-access-btppt\") pod \"nmstate-console-plugin-7fbb5f6569-j4v58\" (UID: \"34a50f5b-9514-4292-bbc6-ed76bafccf22\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-j4v58" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.383037 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/34a50f5b-9514-4292-bbc6-ed76bafccf22-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-j4v58\" (UID: \"34a50f5b-9514-4292-bbc6-ed76bafccf22\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-j4v58" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.388360 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-66c7556f7f-dqjmz"] Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.391556 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.396230 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.419045 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66c7556f7f-dqjmz"] Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.484414 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btppt\" (UniqueName: \"kubernetes.io/projected/34a50f5b-9514-4292-bbc6-ed76bafccf22-kube-api-access-btppt\") pod \"nmstate-console-plugin-7fbb5f6569-j4v58\" (UID: \"34a50f5b-9514-4292-bbc6-ed76bafccf22\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-j4v58" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.490420 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/34a50f5b-9514-4292-bbc6-ed76bafccf22-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-j4v58\" (UID: \"34a50f5b-9514-4292-bbc6-ed76bafccf22\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-j4v58" Dec 01 10:15:21 crc kubenswrapper[4958]: E1201 10:15:21.490585 4958 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.490811 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/34a50f5b-9514-4292-bbc6-ed76bafccf22-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-j4v58\" (UID: \"34a50f5b-9514-4292-bbc6-ed76bafccf22\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-j4v58" Dec 01 10:15:21 crc kubenswrapper[4958]: E1201 10:15:21.490937 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a50f5b-9514-4292-bbc6-ed76bafccf22-plugin-serving-cert podName:34a50f5b-9514-4292-bbc6-ed76bafccf22 nodeName:}" failed. No retries permitted until 2025-12-01 10:15:21.990836098 +0000 UTC m=+969.499625135 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/34a50f5b-9514-4292-bbc6-ed76bafccf22-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-j4v58" (UID: "34a50f5b-9514-4292-bbc6-ed76bafccf22") : secret "plugin-serving-cert" not found Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.491796 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/34a50f5b-9514-4292-bbc6-ed76bafccf22-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-j4v58\" (UID: \"34a50f5b-9514-4292-bbc6-ed76bafccf22\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-j4v58" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.511689 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btppt\" (UniqueName: \"kubernetes.io/projected/34a50f5b-9514-4292-bbc6-ed76bafccf22-kube-api-access-btppt\") pod \"nmstate-console-plugin-7fbb5f6569-j4v58\" (UID: \"34a50f5b-9514-4292-bbc6-ed76bafccf22\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-j4v58" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.593825 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d712bf47-e868-4060-b6c1-2de40c9f23e4-oauth-serving-cert\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.593947 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d712bf47-e868-4060-b6c1-2de40c9f23e4-console-serving-cert\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.593986 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8495m\" (UniqueName: \"kubernetes.io/projected/d712bf47-e868-4060-b6c1-2de40c9f23e4-kube-api-access-8495m\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.594028 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d712bf47-e868-4060-b6c1-2de40c9f23e4-trusted-ca-bundle\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.594067 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d712bf47-e868-4060-b6c1-2de40c9f23e4-console-oauth-config\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.594124 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d712bf47-e868-4060-b6c1-2de40c9f23e4-console-config\") pod \"console-66c7556f7f-dqjmz\" (UID: 
\"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.594153 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d712bf47-e868-4060-b6c1-2de40c9f23e4-service-ca\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.696085 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d712bf47-e868-4060-b6c1-2de40c9f23e4-console-serving-cert\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.696549 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8495m\" (UniqueName: \"kubernetes.io/projected/d712bf47-e868-4060-b6c1-2de40c9f23e4-kube-api-access-8495m\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.696600 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d712bf47-e868-4060-b6c1-2de40c9f23e4-trusted-ca-bundle\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.696670 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d712bf47-e868-4060-b6c1-2de40c9f23e4-console-oauth-config\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.696707 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d712bf47-e868-4060-b6c1-2de40c9f23e4-console-config\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.696733 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d712bf47-e868-4060-b6c1-2de40c9f23e4-service-ca\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.696793 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8ffef51c-a736-4393-8e14-ba5441744873-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-78pgf\" (UID: \"8ffef51c-a736-4393-8e14-ba5441744873\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78pgf" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.696823 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d712bf47-e868-4060-b6c1-2de40c9f23e4-oauth-serving-cert\") pod \"console-66c7556f7f-dqjmz\" 
(UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.697772 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d712bf47-e868-4060-b6c1-2de40c9f23e4-console-config\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.697809 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d712bf47-e868-4060-b6c1-2de40c9f23e4-service-ca\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.698132 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d712bf47-e868-4060-b6c1-2de40c9f23e4-oauth-serving-cert\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.698998 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d712bf47-e868-4060-b6c1-2de40c9f23e4-trusted-ca-bundle\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.701221 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8ffef51c-a736-4393-8e14-ba5441744873-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-78pgf\" (UID: \"8ffef51c-a736-4393-8e14-ba5441744873\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78pgf" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.701666 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d712bf47-e868-4060-b6c1-2de40c9f23e4-console-oauth-config\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.702201 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d712bf47-e868-4060-b6c1-2de40c9f23e4-console-serving-cert\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.717822 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8495m\" (UniqueName: \"kubernetes.io/projected/d712bf47-e868-4060-b6c1-2de40c9f23e4-kube-api-access-8495m\") pod \"console-66c7556f7f-dqjmz\" (UID: \"d712bf47-e868-4060-b6c1-2de40c9f23e4\") " pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.732576 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.779575 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-bww98"] Dec 01 10:15:21 crc kubenswrapper[4958]: W1201 10:15:21.794255 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda721f6d8_7d75_4fc8_9cc5_930e5764c7f6.slice/crio-e4a8bf0c444b9f13030d8656ae1733daf9cd37e58ced74b855e5daec0d7932a9 WatchSource:0}: Error finding container e4a8bf0c444b9f13030d8656ae1733daf9cd37e58ced74b855e5daec0d7932a9: Status 404 returned error can't find the container with id e4a8bf0c444b9f13030d8656ae1733daf9cd37e58ced74b855e5daec0d7932a9 Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.819162 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78pgf" Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.934632 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bww98" event={"ID":"a721f6d8-7d75-4fc8-9cc5-930e5764c7f6","Type":"ContainerStarted","Data":"e4a8bf0c444b9f13030d8656ae1733daf9cd37e58ced74b855e5daec0d7932a9"} Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.938130 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-csd4v" event={"ID":"dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a","Type":"ContainerStarted","Data":"4fc646ae0e25bb62838bc552f071947a99f02ea6daf52d4a75cd7efe6efff7f6"} Dec 01 10:15:21 crc kubenswrapper[4958]: I1201 10:15:21.986834 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66c7556f7f-dqjmz"] Dec 01 10:15:21 crc kubenswrapper[4958]: W1201 10:15:21.999337 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd712bf47_e868_4060_b6c1_2de40c9f23e4.slice/crio-03fcb5b6cd2baee6e684d82f262e9939a45f84afda071396af7eb90e598f9fb7 WatchSource:0}: Error finding container 03fcb5b6cd2baee6e684d82f262e9939a45f84afda071396af7eb90e598f9fb7: Status 404 returned error can't find the container with id 03fcb5b6cd2baee6e684d82f262e9939a45f84afda071396af7eb90e598f9fb7 Dec 01 10:15:22 crc kubenswrapper[4958]: I1201 10:15:22.001755 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/34a50f5b-9514-4292-bbc6-ed76bafccf22-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-j4v58\" (UID: \"34a50f5b-9514-4292-bbc6-ed76bafccf22\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-j4v58" Dec 01 10:15:22 crc kubenswrapper[4958]: I1201 10:15:22.008944 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/34a50f5b-9514-4292-bbc6-ed76bafccf22-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-j4v58\" (UID: \"34a50f5b-9514-4292-bbc6-ed76bafccf22\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-j4v58" Dec 01 10:15:22 crc kubenswrapper[4958]: I1201 10:15:22.121562 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78pgf"] Dec 01 10:15:22 crc kubenswrapper[4958]: W1201 10:15:22.125257 4958 manager.go:1169] Failed to process watch event {EventType:0 
Dec 01 10:15:22 crc kubenswrapper[4958]: W1201 10:15:22.125257 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ffef51c_a736_4393_8e14_ba5441744873.slice/crio-f97806e558b28ee5428d482d0a59640ce86e71d722ab7c8006661334328dab5c WatchSource:0}: Error finding container f97806e558b28ee5428d482d0a59640ce86e71d722ab7c8006661334328dab5c: Status 404 returned error can't find the container with id f97806e558b28ee5428d482d0a59640ce86e71d722ab7c8006661334328dab5c
Dec 01 10:15:22 crc kubenswrapper[4958]: I1201 10:15:22.171390 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-j4v58"
Dec 01 10:15:22 crc kubenswrapper[4958]: I1201 10:15:22.372761 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-j4v58"]
Dec 01 10:15:22 crc kubenswrapper[4958]: W1201 10:15:22.382566 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34a50f5b_9514_4292_bbc6_ed76bafccf22.slice/crio-d86c89513c034af566c96d806fbe31e479906d4c5b91b7f3e419af281269ddf4 WatchSource:0}: Error finding container d86c89513c034af566c96d806fbe31e479906d4c5b91b7f3e419af281269ddf4: Status 404 returned error can't find the container with id d86c89513c034af566c96d806fbe31e479906d4c5b91b7f3e419af281269ddf4
Dec 01 10:15:22 crc kubenswrapper[4958]: I1201 10:15:22.947886 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-j4v58" event={"ID":"34a50f5b-9514-4292-bbc6-ed76bafccf22","Type":"ContainerStarted","Data":"d86c89513c034af566c96d806fbe31e479906d4c5b91b7f3e419af281269ddf4"}
Dec 01 10:15:22 crc kubenswrapper[4958]: I1201 10:15:22.950093 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78pgf" event={"ID":"8ffef51c-a736-4393-8e14-ba5441744873","Type":"ContainerStarted","Data":"f97806e558b28ee5428d482d0a59640ce86e71d722ab7c8006661334328dab5c"}
Dec 01 10:15:22 crc kubenswrapper[4958]: I1201 10:15:22.951557 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66c7556f7f-dqjmz" event={"ID":"d712bf47-e868-4060-b6c1-2de40c9f23e4","Type":"ContainerStarted","Data":"7c75204b203cc195ffde30f66e370550a59b5c98ab90fa387ed335a26d3e109c"}
Dec 01 10:15:22 crc kubenswrapper[4958]: I1201 10:15:22.951581 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66c7556f7f-dqjmz" event={"ID":"d712bf47-e868-4060-b6c1-2de40c9f23e4","Type":"ContainerStarted","Data":"03fcb5b6cd2baee6e684d82f262e9939a45f84afda071396af7eb90e598f9fb7"}
Dec 01 10:15:22 crc kubenswrapper[4958]: I1201 10:15:22.977781 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66c7556f7f-dqjmz" podStartSLOduration=1.9777593260000002 podStartE2EDuration="1.977759326s" podCreationTimestamp="2025-12-01 10:15:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:15:22.975543772 +0000 UTC m=+970.484332809" watchObservedRunningTime="2025-12-01 10:15:22.977759326 +0000 UTC m=+970.486548363"
Dec 01 10:15:23 crc kubenswrapper[4958]: I1201 10:15:23.127555 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-glt6k"]
Dec 01 10:15:23 crc kubenswrapper[4958]: I1201 10:15:23.127922 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-glt6k" podUID="c5b5de43-effb-4846-a0dc-31bec473f363" containerName="registry-server" containerID="cri-o://6694444af97b6795669a1589e0cf9327b09f0a7578e5607b2a23a0d92a5eed85" gracePeriod=2
Dec 01 10:15:23 crc kubenswrapper[4958]: I1201 10:15:23.506864 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-glt6k"
Dec 01 10:15:23 crc kubenswrapper[4958]: I1201 10:15:23.528204 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4dpl\" (UniqueName: \"kubernetes.io/projected/c5b5de43-effb-4846-a0dc-31bec473f363-kube-api-access-p4dpl\") pod \"c5b5de43-effb-4846-a0dc-31bec473f363\" (UID: \"c5b5de43-effb-4846-a0dc-31bec473f363\") "
Dec 01 10:15:23 crc kubenswrapper[4958]: I1201 10:15:23.528409 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5b5de43-effb-4846-a0dc-31bec473f363-utilities\") pod \"c5b5de43-effb-4846-a0dc-31bec473f363\" (UID: \"c5b5de43-effb-4846-a0dc-31bec473f363\") "
Dec 01 10:15:23 crc kubenswrapper[4958]: I1201 10:15:23.528490 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5b5de43-effb-4846-a0dc-31bec473f363-catalog-content\") pod \"c5b5de43-effb-4846-a0dc-31bec473f363\" (UID: \"c5b5de43-effb-4846-a0dc-31bec473f363\") "
Dec 01 10:15:23 crc kubenswrapper[4958]: I1201 10:15:23.530385 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5b5de43-effb-4846-a0dc-31bec473f363-utilities" (OuterVolumeSpecName: "utilities") pod "c5b5de43-effb-4846-a0dc-31bec473f363" (UID: "c5b5de43-effb-4846-a0dc-31bec473f363"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:15:23 crc kubenswrapper[4958]: I1201 10:15:23.538718 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5b5de43-effb-4846-a0dc-31bec473f363-kube-api-access-p4dpl" (OuterVolumeSpecName: "kube-api-access-p4dpl") pod "c5b5de43-effb-4846-a0dc-31bec473f363" (UID: "c5b5de43-effb-4846-a0dc-31bec473f363"). InnerVolumeSpecName "kube-api-access-p4dpl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:15:23 crc kubenswrapper[4958]: I1201 10:15:23.630192 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5b5de43-effb-4846-a0dc-31bec473f363-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 10:15:23 crc kubenswrapper[4958]: I1201 10:15:23.630646 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4dpl\" (UniqueName: \"kubernetes.io/projected/c5b5de43-effb-4846-a0dc-31bec473f363-kube-api-access-p4dpl\") on node \"crc\" DevicePath \"\""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:15:23 crc kubenswrapper[4958]: I1201 10:15:23.731461 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5b5de43-effb-4846-a0dc-31bec473f363-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:23 crc kubenswrapper[4958]: I1201 10:15:23.961920 4958 generic.go:334] "Generic (PLEG): container finished" podID="c5b5de43-effb-4846-a0dc-31bec473f363" containerID="6694444af97b6795669a1589e0cf9327b09f0a7578e5607b2a23a0d92a5eed85" exitCode=0 Dec 01 10:15:23 crc kubenswrapper[4958]: I1201 10:15:23.961987 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glt6k" event={"ID":"c5b5de43-effb-4846-a0dc-31bec473f363","Type":"ContainerDied","Data":"6694444af97b6795669a1589e0cf9327b09f0a7578e5607b2a23a0d92a5eed85"} Dec 01 10:15:23 crc kubenswrapper[4958]: I1201 10:15:23.962036 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-glt6k" Dec 01 10:15:23 crc kubenswrapper[4958]: I1201 10:15:23.962068 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glt6k" event={"ID":"c5b5de43-effb-4846-a0dc-31bec473f363","Type":"ContainerDied","Data":"81a3789813b48884d45f4b8d2d9c3f591ec78f0883c66ee0a2103ff3c20c8ddd"} Dec 01 10:15:23 crc kubenswrapper[4958]: I1201 10:15:23.962100 4958 scope.go:117] "RemoveContainer" containerID="6694444af97b6795669a1589e0cf9327b09f0a7578e5607b2a23a0d92a5eed85" Dec 01 10:15:23 crc kubenswrapper[4958]: I1201 10:15:23.987283 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-glt6k"] Dec 01 10:15:23 crc kubenswrapper[4958]: I1201 10:15:23.991629 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-glt6k"] Dec 01 10:15:24 crc kubenswrapper[4958]: I1201 10:15:24.349622 4958 scope.go:117] "RemoveContainer" containerID="00ff4c4a40e6e36e1ddb25b8b1dcfae96a65cd9c96f9eaedd7568c006941b0a7" Dec 01 10:15:24 crc kubenswrapper[4958]: I1201 10:15:24.413867 4958 scope.go:117] "RemoveContainer" containerID="1839fca31f5ca51e4e23f40f3c16a65a7589b15f2fa5feb02e83fa1eedc8f45f" Dec 01 10:15:24 crc kubenswrapper[4958]: I1201 10:15:24.436521 4958 scope.go:117] "RemoveContainer" containerID="6694444af97b6795669a1589e0cf9327b09f0a7578e5607b2a23a0d92a5eed85" Dec 01 10:15:24 crc kubenswrapper[4958]: E1201 10:15:24.437238 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6694444af97b6795669a1589e0cf9327b09f0a7578e5607b2a23a0d92a5eed85\": container with ID starting with 6694444af97b6795669a1589e0cf9327b09f0a7578e5607b2a23a0d92a5eed85 not found: ID does not exist" containerID="6694444af97b6795669a1589e0cf9327b09f0a7578e5607b2a23a0d92a5eed85" Dec 01 10:15:24 crc kubenswrapper[4958]: I1201 10:15:24.437307 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6694444af97b6795669a1589e0cf9327b09f0a7578e5607b2a23a0d92a5eed85"} err="failed to get container status \"6694444af97b6795669a1589e0cf9327b09f0a7578e5607b2a23a0d92a5eed85\": rpc error: code = NotFound desc = could not find container \"6694444af97b6795669a1589e0cf9327b09f0a7578e5607b2a23a0d92a5eed85\": container with ID starting with 6694444af97b6795669a1589e0cf9327b09f0a7578e5607b2a23a0d92a5eed85 not found: ID does not exist" Dec 01 10:15:24 crc 
kubenswrapper[4958]: I1201 10:15:24.437347 4958 scope.go:117] "RemoveContainer" containerID="00ff4c4a40e6e36e1ddb25b8b1dcfae96a65cd9c96f9eaedd7568c006941b0a7" Dec 01 10:15:24 crc kubenswrapper[4958]: E1201 10:15:24.437908 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00ff4c4a40e6e36e1ddb25b8b1dcfae96a65cd9c96f9eaedd7568c006941b0a7\": container with ID starting with 00ff4c4a40e6e36e1ddb25b8b1dcfae96a65cd9c96f9eaedd7568c006941b0a7 not found: ID does not exist" containerID="00ff4c4a40e6e36e1ddb25b8b1dcfae96a65cd9c96f9eaedd7568c006941b0a7" Dec 01 10:15:24 crc kubenswrapper[4958]: I1201 10:15:24.437954 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00ff4c4a40e6e36e1ddb25b8b1dcfae96a65cd9c96f9eaedd7568c006941b0a7"} err="failed to get container status \"00ff4c4a40e6e36e1ddb25b8b1dcfae96a65cd9c96f9eaedd7568c006941b0a7\": rpc error: code = NotFound desc = could not find container \"00ff4c4a40e6e36e1ddb25b8b1dcfae96a65cd9c96f9eaedd7568c006941b0a7\": container with ID starting with 00ff4c4a40e6e36e1ddb25b8b1dcfae96a65cd9c96f9eaedd7568c006941b0a7 not found: ID does not exist" Dec 01 10:15:24 crc kubenswrapper[4958]: I1201 10:15:24.437986 4958 scope.go:117] "RemoveContainer" containerID="1839fca31f5ca51e4e23f40f3c16a65a7589b15f2fa5feb02e83fa1eedc8f45f" Dec 01 10:15:24 crc kubenswrapper[4958]: E1201 10:15:24.438504 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1839fca31f5ca51e4e23f40f3c16a65a7589b15f2fa5feb02e83fa1eedc8f45f\": container with ID starting with 1839fca31f5ca51e4e23f40f3c16a65a7589b15f2fa5feb02e83fa1eedc8f45f not found: ID does not exist" containerID="1839fca31f5ca51e4e23f40f3c16a65a7589b15f2fa5feb02e83fa1eedc8f45f" Dec 01 10:15:24 crc kubenswrapper[4958]: I1201 10:15:24.438534 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1839fca31f5ca51e4e23f40f3c16a65a7589b15f2fa5feb02e83fa1eedc8f45f"} err="failed to get container status \"1839fca31f5ca51e4e23f40f3c16a65a7589b15f2fa5feb02e83fa1eedc8f45f\": rpc error: code = NotFound desc = could not find container \"1839fca31f5ca51e4e23f40f3c16a65a7589b15f2fa5feb02e83fa1eedc8f45f\": container with ID starting with 1839fca31f5ca51e4e23f40f3c16a65a7589b15f2fa5feb02e83fa1eedc8f45f not found: ID does not exist" Dec 01 10:15:24 crc kubenswrapper[4958]: I1201 10:15:24.973342 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bww98" event={"ID":"a721f6d8-7d75-4fc8-9cc5-930e5764c7f6","Type":"ContainerStarted","Data":"958abac99acb9426de623d015f6e91883f317760e2a86fa8aca60c9e47689872"} Dec 01 10:15:24 crc kubenswrapper[4958]: I1201 10:15:24.974663 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78pgf" event={"ID":"8ffef51c-a736-4393-8e14-ba5441744873","Type":"ContainerStarted","Data":"4fc46a27efd1cd2f52fe039bd8f6eb2b6f7185122f98bff3f7335f85795edeca"} Dec 01 10:15:24 crc kubenswrapper[4958]: I1201 10:15:24.974836 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78pgf" Dec 01 10:15:24 crc kubenswrapper[4958]: I1201 10:15:24.976757 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-csd4v" 
event={"ID":"dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a","Type":"ContainerStarted","Data":"6d3cf7965aef41e2c4c23b3a46528ad2af434690c4b921866510d45187d72312"} Dec 01 10:15:24 crc kubenswrapper[4958]: I1201 10:15:24.976891 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-csd4v" Dec 01 10:15:25 crc kubenswrapper[4958]: I1201 10:15:25.001602 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78pgf" podStartSLOduration=2.660739586 podStartE2EDuration="5.00157366s" podCreationTimestamp="2025-12-01 10:15:20 +0000 UTC" firstStartedPulling="2025-12-01 10:15:22.128062732 +0000 UTC m=+969.636851769" lastFinishedPulling="2025-12-01 10:15:24.468896806 +0000 UTC m=+971.977685843" observedRunningTime="2025-12-01 10:15:24.996805106 +0000 UTC m=+972.505594133" watchObservedRunningTime="2025-12-01 10:15:25.00157366 +0000 UTC m=+972.510362697" Dec 01 10:15:25 crc kubenswrapper[4958]: I1201 10:15:25.019594 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-csd4v" podStartSLOduration=1.885443355 podStartE2EDuration="5.019562655s" podCreationTimestamp="2025-12-01 10:15:20 +0000 UTC" firstStartedPulling="2025-12-01 10:15:21.355099208 +0000 UTC m=+968.863888235" lastFinishedPulling="2025-12-01 10:15:24.489218498 +0000 UTC m=+971.998007535" observedRunningTime="2025-12-01 10:15:25.017486027 +0000 UTC m=+972.526275064" watchObservedRunningTime="2025-12-01 10:15:25.019562655 +0000 UTC m=+972.528351692" Dec 01 10:15:25 crc kubenswrapper[4958]: I1201 10:15:25.803444 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5b5de43-effb-4846-a0dc-31bec473f363" path="/var/lib/kubelet/pods/c5b5de43-effb-4846-a0dc-31bec473f363/volumes" Dec 01 10:15:25 crc kubenswrapper[4958]: I1201 10:15:25.986103 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-j4v58" event={"ID":"34a50f5b-9514-4292-bbc6-ed76bafccf22","Type":"ContainerStarted","Data":"a6098abc34133c1ff7c4a0f01fd7932877e71b48919359ccfe320c5f68977421"} Dec 01 10:15:26 crc kubenswrapper[4958]: I1201 10:15:26.006669 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-j4v58" podStartSLOduration=1.7480420479999998 podStartE2EDuration="5.006645422s" podCreationTimestamp="2025-12-01 10:15:21 +0000 UTC" firstStartedPulling="2025-12-01 10:15:22.384720015 +0000 UTC m=+969.893509052" lastFinishedPulling="2025-12-01 10:15:25.643323389 +0000 UTC m=+973.152112426" observedRunningTime="2025-12-01 10:15:26.003680559 +0000 UTC m=+973.512469596" watchObservedRunningTime="2025-12-01 10:15:26.006645422 +0000 UTC m=+973.515434459" Dec 01 10:15:28 crc kubenswrapper[4958]: I1201 10:15:28.001675 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bww98" event={"ID":"a721f6d8-7d75-4fc8-9cc5-930e5764c7f6","Type":"ContainerStarted","Data":"b84ee979741b4d4ccd40c0fdf8b61332a2721b6d2cd46dd646da8a852a1c29ba"} Dec 01 10:15:28 crc kubenswrapper[4958]: I1201 10:15:28.211537 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:15:28 crc kubenswrapper[4958]: I1201 
10:15:28.211891 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:15:28 crc kubenswrapper[4958]: I1201 10:15:28.212007 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 10:15:28 crc kubenswrapper[4958]: I1201 10:15:28.212674 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5883cf8aac81784159b4994d61b5520f0d5f082b5d1aeaaa67bc8390bfe65ba5"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:15:28 crc kubenswrapper[4958]: I1201 10:15:28.212833 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://5883cf8aac81784159b4994d61b5520f0d5f082b5d1aeaaa67bc8390bfe65ba5" gracePeriod=600 Dec 01 10:15:29 crc kubenswrapper[4958]: I1201 10:15:29.011161 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="5883cf8aac81784159b4994d61b5520f0d5f082b5d1aeaaa67bc8390bfe65ba5" exitCode=0 Dec 01 10:15:29 crc kubenswrapper[4958]: I1201 10:15:29.011227 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"5883cf8aac81784159b4994d61b5520f0d5f082b5d1aeaaa67bc8390bfe65ba5"} Dec 01 10:15:29 crc kubenswrapper[4958]: I1201 10:15:29.012154 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"269d028bc69c92127b6b1ad5b3c8a371b2530bdef6d634e5295d699f9be45b99"} Dec 01 10:15:29 crc kubenswrapper[4958]: I1201 10:15:29.012180 4958 scope.go:117] "RemoveContainer" containerID="cdfd5b8d3cf87b66034ae0097ea85e9ffd561bfa11131d8fe9310abc382db18d" Dec 01 10:15:29 crc kubenswrapper[4958]: I1201 10:15:29.032155 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bww98" podStartSLOduration=3.544685713 podStartE2EDuration="9.032124427s" podCreationTimestamp="2025-12-01 10:15:20 +0000 UTC" firstStartedPulling="2025-12-01 10:15:21.796798971 +0000 UTC m=+969.305588018" lastFinishedPulling="2025-12-01 10:15:27.284237695 +0000 UTC m=+974.793026732" observedRunningTime="2025-12-01 10:15:28.029499484 +0000 UTC m=+975.538288531" watchObservedRunningTime="2025-12-01 10:15:29.032124427 +0000 UTC m=+976.540913464" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.303744 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-csd4v" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.537376 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b8j4b"] Dec 01 10:15:31 crc kubenswrapper[4958]: E1201 10:15:31.537741 4958 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b5de43-effb-4846-a0dc-31bec473f363" containerName="extract-utilities" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.537759 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b5de43-effb-4846-a0dc-31bec473f363" containerName="extract-utilities" Dec 01 10:15:31 crc kubenswrapper[4958]: E1201 10:15:31.537774 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b5de43-effb-4846-a0dc-31bec473f363" containerName="extract-content" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.537782 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b5de43-effb-4846-a0dc-31bec473f363" containerName="extract-content" Dec 01 10:15:31 crc kubenswrapper[4958]: E1201 10:15:31.537798 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b5de43-effb-4846-a0dc-31bec473f363" containerName="registry-server" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.537805 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b5de43-effb-4846-a0dc-31bec473f363" containerName="registry-server" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.537941 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5b5de43-effb-4846-a0dc-31bec473f363" containerName="registry-server" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.538827 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b8j4b" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.555067 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b8j4b"] Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.650907 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8-catalog-content\") pod \"community-operators-b8j4b\" (UID: \"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8\") " pod="openshift-marketplace/community-operators-b8j4b" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.651150 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9v8w\" (UniqueName: \"kubernetes.io/projected/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8-kube-api-access-l9v8w\") pod \"community-operators-b8j4b\" (UID: \"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8\") " pod="openshift-marketplace/community-operators-b8j4b" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.651235 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8-utilities\") pod \"community-operators-b8j4b\" (UID: \"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8\") " pod="openshift-marketplace/community-operators-b8j4b" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.733612 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.733694 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.739993 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 
10:15:31.754177 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8-utilities\") pod \"community-operators-b8j4b\" (UID: \"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8\") " pod="openshift-marketplace/community-operators-b8j4b" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.754327 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8-catalog-content\") pod \"community-operators-b8j4b\" (UID: \"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8\") " pod="openshift-marketplace/community-operators-b8j4b" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.754400 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9v8w\" (UniqueName: \"kubernetes.io/projected/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8-kube-api-access-l9v8w\") pod \"community-operators-b8j4b\" (UID: \"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8\") " pod="openshift-marketplace/community-operators-b8j4b" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.755803 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8-utilities\") pod \"community-operators-b8j4b\" (UID: \"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8\") " pod="openshift-marketplace/community-operators-b8j4b" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.756034 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8-catalog-content\") pod \"community-operators-b8j4b\" (UID: \"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8\") " pod="openshift-marketplace/community-operators-b8j4b" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.783505 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9v8w\" (UniqueName: \"kubernetes.io/projected/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8-kube-api-access-l9v8w\") pod \"community-operators-b8j4b\" (UID: \"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8\") " pod="openshift-marketplace/community-operators-b8j4b" Dec 01 10:15:31 crc kubenswrapper[4958]: I1201 10:15:31.858850 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b8j4b" Dec 01 10:15:32 crc kubenswrapper[4958]: I1201 10:15:32.052992 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66c7556f7f-dqjmz" Dec 01 10:15:32 crc kubenswrapper[4958]: I1201 10:15:32.150115 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rxrcb"] Dec 01 10:15:32 crc kubenswrapper[4958]: I1201 10:15:32.411215 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b8j4b"] Dec 01 10:15:32 crc kubenswrapper[4958]: W1201 10:15:32.423833 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5f301b7_d1c9_4217_a1fe_50bc7f3f37e8.slice/crio-b5c93745c178894077d69b3186d47b8772925e7c2dce799147171b44295f70ca WatchSource:0}: Error finding container b5c93745c178894077d69b3186d47b8772925e7c2dce799147171b44295f70ca: Status 404 returned error can't find the container with id b5c93745c178894077d69b3186d47b8772925e7c2dce799147171b44295f70ca Dec 01 10:15:33 crc kubenswrapper[4958]: I1201 10:15:33.051075 4958 generic.go:334] "Generic (PLEG): container finished" podID="a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8" containerID="8b142ccc517aa4ec0f0f1003dc66c48f8dab595ee9bb00a51e6e5442fcb0505a" exitCode=0 Dec 01 10:15:33 crc kubenswrapper[4958]: I1201 10:15:33.051152 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8j4b" event={"ID":"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8","Type":"ContainerDied","Data":"8b142ccc517aa4ec0f0f1003dc66c48f8dab595ee9bb00a51e6e5442fcb0505a"} Dec 01 10:15:33 crc kubenswrapper[4958]: I1201 10:15:33.051529 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8j4b" event={"ID":"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8","Type":"ContainerStarted","Data":"b5c93745c178894077d69b3186d47b8772925e7c2dce799147171b44295f70ca"} Dec 01 10:15:37 crc kubenswrapper[4958]: I1201 10:15:37.091350 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8j4b" event={"ID":"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8","Type":"ContainerStarted","Data":"a3a651038949630ce36d2d1bddac361ecc50e5b397ca663c127e87cf83f93b03"} Dec 01 10:15:38 crc kubenswrapper[4958]: I1201 10:15:38.101405 4958 generic.go:334] "Generic (PLEG): container finished" podID="a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8" containerID="a3a651038949630ce36d2d1bddac361ecc50e5b397ca663c127e87cf83f93b03" exitCode=0 Dec 01 10:15:38 crc kubenswrapper[4958]: I1201 10:15:38.101539 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8j4b" event={"ID":"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8","Type":"ContainerDied","Data":"a3a651038949630ce36d2d1bddac361ecc50e5b397ca663c127e87cf83f93b03"} Dec 01 10:15:40 crc kubenswrapper[4958]: I1201 10:15:40.121602 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8j4b" event={"ID":"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8","Type":"ContainerStarted","Data":"1ada15a2e534223ffe15bca166fcb82aab8d0d2963e76d384f0ad86a24088bc6"} Dec 01 10:15:40 crc kubenswrapper[4958]: I1201 10:15:40.140819 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b8j4b" podStartSLOduration=3.272483942 podStartE2EDuration="9.140792236s" 
podCreationTimestamp="2025-12-01 10:15:31 +0000 UTC" firstStartedPulling="2025-12-01 10:15:33.052717695 +0000 UTC m=+980.561506732" lastFinishedPulling="2025-12-01 10:15:38.921025959 +0000 UTC m=+986.429815026" observedRunningTime="2025-12-01 10:15:40.137922875 +0000 UTC m=+987.646711912" watchObservedRunningTime="2025-12-01 10:15:40.140792236 +0000 UTC m=+987.649581273" Dec 01 10:15:41 crc kubenswrapper[4958]: I1201 10:15:41.859927 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b8j4b" Dec 01 10:15:41 crc kubenswrapper[4958]: I1201 10:15:41.860099 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b8j4b" Dec 01 10:15:41 crc kubenswrapper[4958]: I1201 10:15:41.870039 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-78pgf" Dec 01 10:15:41 crc kubenswrapper[4958]: I1201 10:15:41.933487 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b8j4b" Dec 01 10:15:51 crc kubenswrapper[4958]: I1201 10:15:51.913539 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b8j4b" Dec 01 10:15:54 crc kubenswrapper[4958]: I1201 10:15:54.943659 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b8j4b"] Dec 01 10:15:55 crc kubenswrapper[4958]: I1201 10:15:55.330444 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ctx2z"] Dec 01 10:15:55 crc kubenswrapper[4958]: I1201 10:15:55.334866 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ctx2z" Dec 01 10:15:55 crc kubenswrapper[4958]: I1201 10:15:55.355302 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ctx2z"] Dec 01 10:15:55 crc kubenswrapper[4958]: I1201 10:15:55.451240 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f458091-ee40-45af-b928-6470d5b6099a-utilities\") pod \"certified-operators-ctx2z\" (UID: \"2f458091-ee40-45af-b928-6470d5b6099a\") " pod="openshift-marketplace/certified-operators-ctx2z" Dec 01 10:15:55 crc kubenswrapper[4958]: I1201 10:15:55.451387 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f458091-ee40-45af-b928-6470d5b6099a-catalog-content\") pod \"certified-operators-ctx2z\" (UID: \"2f458091-ee40-45af-b928-6470d5b6099a\") " pod="openshift-marketplace/certified-operators-ctx2z" Dec 01 10:15:55 crc kubenswrapper[4958]: I1201 10:15:55.451482 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpq88\" (UniqueName: \"kubernetes.io/projected/2f458091-ee40-45af-b928-6470d5b6099a-kube-api-access-wpq88\") pod \"certified-operators-ctx2z\" (UID: \"2f458091-ee40-45af-b928-6470d5b6099a\") " pod="openshift-marketplace/certified-operators-ctx2z" Dec 01 10:15:55 crc kubenswrapper[4958]: I1201 10:15:55.517533 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6kz6p"] Dec 01 10:15:55 crc kubenswrapper[4958]: I1201 10:15:55.517859 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6kz6p" podUID="80b9af95-d772-4612-b667-f59849b393d8" containerName="registry-server" containerID="cri-o://ccf47961f5b80d62b4234e21e74266f87db3b416f3cf829012eac6d48602cf02" gracePeriod=2 Dec 01 10:15:55 crc kubenswrapper[4958]: I1201 10:15:55.552474 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f458091-ee40-45af-b928-6470d5b6099a-catalog-content\") pod \"certified-operators-ctx2z\" (UID: \"2f458091-ee40-45af-b928-6470d5b6099a\") " pod="openshift-marketplace/certified-operators-ctx2z" Dec 01 10:15:55 crc kubenswrapper[4958]: I1201 10:15:55.552560 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpq88\" (UniqueName: \"kubernetes.io/projected/2f458091-ee40-45af-b928-6470d5b6099a-kube-api-access-wpq88\") pod \"certified-operators-ctx2z\" (UID: \"2f458091-ee40-45af-b928-6470d5b6099a\") " pod="openshift-marketplace/certified-operators-ctx2z" Dec 01 10:15:55 crc kubenswrapper[4958]: I1201 10:15:55.552592 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f458091-ee40-45af-b928-6470d5b6099a-utilities\") pod \"certified-operators-ctx2z\" (UID: \"2f458091-ee40-45af-b928-6470d5b6099a\") " pod="openshift-marketplace/certified-operators-ctx2z" Dec 01 10:15:55 crc kubenswrapper[4958]: I1201 10:15:55.553312 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f458091-ee40-45af-b928-6470d5b6099a-catalog-content\") pod \"certified-operators-ctx2z\" (UID: 
\"2f458091-ee40-45af-b928-6470d5b6099a\") " pod="openshift-marketplace/certified-operators-ctx2z" Dec 01 10:15:55 crc kubenswrapper[4958]: I1201 10:15:55.554118 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f458091-ee40-45af-b928-6470d5b6099a-utilities\") pod \"certified-operators-ctx2z\" (UID: \"2f458091-ee40-45af-b928-6470d5b6099a\") " pod="openshift-marketplace/certified-operators-ctx2z" Dec 01 10:15:55 crc kubenswrapper[4958]: I1201 10:15:55.575945 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpq88\" (UniqueName: \"kubernetes.io/projected/2f458091-ee40-45af-b928-6470d5b6099a-kube-api-access-wpq88\") pod \"certified-operators-ctx2z\" (UID: \"2f458091-ee40-45af-b928-6470d5b6099a\") " pod="openshift-marketplace/certified-operators-ctx2z" Dec 01 10:15:55 crc kubenswrapper[4958]: I1201 10:15:55.661570 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ctx2z" Dec 01 10:15:56 crc kubenswrapper[4958]: I1201 10:15:56.164369 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ctx2z"] Dec 01 10:15:56 crc kubenswrapper[4958]: I1201 10:15:56.258008 4958 generic.go:334] "Generic (PLEG): container finished" podID="80b9af95-d772-4612-b667-f59849b393d8" containerID="ccf47961f5b80d62b4234e21e74266f87db3b416f3cf829012eac6d48602cf02" exitCode=0 Dec 01 10:15:56 crc kubenswrapper[4958]: I1201 10:15:56.258087 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kz6p" event={"ID":"80b9af95-d772-4612-b667-f59849b393d8","Type":"ContainerDied","Data":"ccf47961f5b80d62b4234e21e74266f87db3b416f3cf829012eac6d48602cf02"} Dec 01 10:15:56 crc kubenswrapper[4958]: I1201 10:15:56.265683 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctx2z" event={"ID":"2f458091-ee40-45af-b928-6470d5b6099a","Type":"ContainerStarted","Data":"40ade14c0697c078100d9b7b15888ee13a395377a769b1d0460d53fe4ac459dc"} Dec 01 10:15:56 crc kubenswrapper[4958]: I1201 10:15:56.508212 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6kz6p" Dec 01 10:15:56 crc kubenswrapper[4958]: I1201 10:15:56.672094 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80b9af95-d772-4612-b667-f59849b393d8-catalog-content\") pod \"80b9af95-d772-4612-b667-f59849b393d8\" (UID: \"80b9af95-d772-4612-b667-f59849b393d8\") " Dec 01 10:15:56 crc kubenswrapper[4958]: I1201 10:15:56.672264 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7zx4\" (UniqueName: \"kubernetes.io/projected/80b9af95-d772-4612-b667-f59849b393d8-kube-api-access-b7zx4\") pod \"80b9af95-d772-4612-b667-f59849b393d8\" (UID: \"80b9af95-d772-4612-b667-f59849b393d8\") " Dec 01 10:15:56 crc kubenswrapper[4958]: I1201 10:15:56.672300 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80b9af95-d772-4612-b667-f59849b393d8-utilities\") pod \"80b9af95-d772-4612-b667-f59849b393d8\" (UID: \"80b9af95-d772-4612-b667-f59849b393d8\") " Dec 01 10:15:56 crc kubenswrapper[4958]: I1201 10:15:56.673294 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80b9af95-d772-4612-b667-f59849b393d8-utilities" (OuterVolumeSpecName: "utilities") pod "80b9af95-d772-4612-b667-f59849b393d8" (UID: "80b9af95-d772-4612-b667-f59849b393d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:15:56 crc kubenswrapper[4958]: I1201 10:15:56.678750 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80b9af95-d772-4612-b667-f59849b393d8-kube-api-access-b7zx4" (OuterVolumeSpecName: "kube-api-access-b7zx4") pod "80b9af95-d772-4612-b667-f59849b393d8" (UID: "80b9af95-d772-4612-b667-f59849b393d8"). InnerVolumeSpecName "kube-api-access-b7zx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:15:56 crc kubenswrapper[4958]: I1201 10:15:56.731049 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80b9af95-d772-4612-b667-f59849b393d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80b9af95-d772-4612-b667-f59849b393d8" (UID: "80b9af95-d772-4612-b667-f59849b393d8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:15:56 crc kubenswrapper[4958]: I1201 10:15:56.774106 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7zx4\" (UniqueName: \"kubernetes.io/projected/80b9af95-d772-4612-b667-f59849b393d8-kube-api-access-b7zx4\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:56 crc kubenswrapper[4958]: I1201 10:15:56.774148 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80b9af95-d772-4612-b667-f59849b393d8-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:56 crc kubenswrapper[4958]: I1201 10:15:56.774161 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80b9af95-d772-4612-b667-f59849b393d8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.198679 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-rxrcb" podUID="9ee9be51-cd77-41e9-9db3-ab6a64015288" containerName="console" containerID="cri-o://f05e77d91455a15c998803f001fcc12b9a0f41b98637e78cbb4c08cecf0b51ac" gracePeriod=15 Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.274014 4958 generic.go:334] "Generic (PLEG): container finished" podID="2f458091-ee40-45af-b928-6470d5b6099a" containerID="de3ab4f7237053144ca2ef657dabaabb43aa171a46a331f43ecc2161c28b5013" exitCode=0 Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.274112 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctx2z" event={"ID":"2f458091-ee40-45af-b928-6470d5b6099a","Type":"ContainerDied","Data":"de3ab4f7237053144ca2ef657dabaabb43aa171a46a331f43ecc2161c28b5013"} Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.278699 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kz6p" event={"ID":"80b9af95-d772-4612-b667-f59849b393d8","Type":"ContainerDied","Data":"75589b48570ed2e9f8cc5b14088cb53f90dac47d6a0979f772c5e87fea9fab2d"} Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.278779 4958 scope.go:117] "RemoveContainer" containerID="ccf47961f5b80d62b4234e21e74266f87db3b416f3cf829012eac6d48602cf02" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.279137 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6kz6p" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.306679 4958 scope.go:117] "RemoveContainer" containerID="24f587cdd9052ac4e48b9b75e4a384d1c232d049a52c5a8e11bf73abaa11c5f4" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.328200 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6kz6p"] Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.330364 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6kz6p"] Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.341631 4958 scope.go:117] "RemoveContainer" containerID="6365746b267e7c2d78d91475add44699020f46938a58dac1772fb522b04b1730" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.519689 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rxrcb_9ee9be51-cd77-41e9-9db3-ab6a64015288/console/0.log" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.519783 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rxrcb" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.689014 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-console-config\") pod \"9ee9be51-cd77-41e9-9db3-ab6a64015288\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.689140 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ee9be51-cd77-41e9-9db3-ab6a64015288-console-oauth-config\") pod \"9ee9be51-cd77-41e9-9db3-ab6a64015288\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.689226 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ee9be51-cd77-41e9-9db3-ab6a64015288-console-serving-cert\") pod \"9ee9be51-cd77-41e9-9db3-ab6a64015288\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.689267 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-trusted-ca-bundle\") pod \"9ee9be51-cd77-41e9-9db3-ab6a64015288\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.689375 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-oauth-serving-cert\") pod \"9ee9be51-cd77-41e9-9db3-ab6a64015288\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.689417 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xd2b\" (UniqueName: \"kubernetes.io/projected/9ee9be51-cd77-41e9-9db3-ab6a64015288-kube-api-access-4xd2b\") pod \"9ee9be51-cd77-41e9-9db3-ab6a64015288\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.690216 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9ee9be51-cd77-41e9-9db3-ab6a64015288" (UID: "9ee9be51-cd77-41e9-9db3-ab6a64015288"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.690275 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-console-config" (OuterVolumeSpecName: "console-config") pod "9ee9be51-cd77-41e9-9db3-ab6a64015288" (UID: "9ee9be51-cd77-41e9-9db3-ab6a64015288"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.690938 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-service-ca\") pod \"9ee9be51-cd77-41e9-9db3-ab6a64015288\" (UID: \"9ee9be51-cd77-41e9-9db3-ab6a64015288\") " Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.690962 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9ee9be51-cd77-41e9-9db3-ab6a64015288" (UID: "9ee9be51-cd77-41e9-9db3-ab6a64015288"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.691341 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-service-ca" (OuterVolumeSpecName: "service-ca") pod "9ee9be51-cd77-41e9-9db3-ab6a64015288" (UID: "9ee9be51-cd77-41e9-9db3-ab6a64015288"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.691814 4958 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.691869 4958 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.691884 4958 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.691901 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ee9be51-cd77-41e9-9db3-ab6a64015288-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.694408 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee9be51-cd77-41e9-9db3-ab6a64015288-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9ee9be51-cd77-41e9-9db3-ab6a64015288" (UID: "9ee9be51-cd77-41e9-9db3-ab6a64015288"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.695005 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee9be51-cd77-41e9-9db3-ab6a64015288-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9ee9be51-cd77-41e9-9db3-ab6a64015288" (UID: "9ee9be51-cd77-41e9-9db3-ab6a64015288"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.695213 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ee9be51-cd77-41e9-9db3-ab6a64015288-kube-api-access-4xd2b" (OuterVolumeSpecName: "kube-api-access-4xd2b") pod "9ee9be51-cd77-41e9-9db3-ab6a64015288" (UID: "9ee9be51-cd77-41e9-9db3-ab6a64015288"). InnerVolumeSpecName "kube-api-access-4xd2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.793799 4958 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ee9be51-cd77-41e9-9db3-ab6a64015288-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.793864 4958 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ee9be51-cd77-41e9-9db3-ab6a64015288-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.793876 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xd2b\" (UniqueName: \"kubernetes.io/projected/9ee9be51-cd77-41e9-9db3-ab6a64015288-kube-api-access-4xd2b\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:57 crc kubenswrapper[4958]: I1201 10:15:57.805501 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80b9af95-d772-4612-b667-f59849b393d8" path="/var/lib/kubelet/pods/80b9af95-d772-4612-b667-f59849b393d8/volumes" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.172144 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7"] Dec 01 10:15:58 crc kubenswrapper[4958]: E1201 10:15:58.172862 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee9be51-cd77-41e9-9db3-ab6a64015288" containerName="console" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.172880 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee9be51-cd77-41e9-9db3-ab6a64015288" containerName="console" Dec 01 10:15:58 crc kubenswrapper[4958]: E1201 10:15:58.172903 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80b9af95-d772-4612-b667-f59849b393d8" containerName="extract-content" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.172910 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b9af95-d772-4612-b667-f59849b393d8" containerName="extract-content" Dec 01 10:15:58 crc kubenswrapper[4958]: E1201 10:15:58.172922 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80b9af95-d772-4612-b667-f59849b393d8" containerName="registry-server" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.172928 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b9af95-d772-4612-b667-f59849b393d8" containerName="registry-server" Dec 01 10:15:58 crc kubenswrapper[4958]: E1201 10:15:58.172939 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80b9af95-d772-4612-b667-f59849b393d8" containerName="extract-utilities" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.172944 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b9af95-d772-4612-b667-f59849b393d8" containerName="extract-utilities" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.173058 4958 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9ee9be51-cd77-41e9-9db3-ab6a64015288" containerName="console" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.173072 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="80b9af95-d772-4612-b667-f59849b393d8" containerName="registry-server" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.174183 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.179253 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.186127 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7"] Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.201289 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b1e0-0e6a-407c-aa29-ca19874c86fd-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7\" (UID: \"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.201359 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7phd\" (UniqueName: \"kubernetes.io/projected/6fe5b1e0-0e6a-407c-aa29-ca19874c86fd-kube-api-access-j7phd\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7\" (UID: \"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.201389 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b1e0-0e6a-407c-aa29-ca19874c86fd-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7\" (UID: \"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.287295 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctx2z" event={"ID":"2f458091-ee40-45af-b928-6470d5b6099a","Type":"ContainerStarted","Data":"bb78a0af4beca9bbe6bea28e76b268b70d2254ede96b626111e0026deb01caeb"} Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.292920 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rxrcb_9ee9be51-cd77-41e9-9db3-ab6a64015288/console/0.log" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.293000 4958 generic.go:334] "Generic (PLEG): container finished" podID="9ee9be51-cd77-41e9-9db3-ab6a64015288" containerID="f05e77d91455a15c998803f001fcc12b9a0f41b98637e78cbb4c08cecf0b51ac" exitCode=2 Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.293109 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rxrcb" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.293154 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rxrcb" event={"ID":"9ee9be51-cd77-41e9-9db3-ab6a64015288","Type":"ContainerDied","Data":"f05e77d91455a15c998803f001fcc12b9a0f41b98637e78cbb4c08cecf0b51ac"} Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.293197 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rxrcb" event={"ID":"9ee9be51-cd77-41e9-9db3-ab6a64015288","Type":"ContainerDied","Data":"56d837e17e564ad76df7142bc82fce4a816f0817c1076710275c3985ad7c34e9"} Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.293220 4958 scope.go:117] "RemoveContainer" containerID="f05e77d91455a15c998803f001fcc12b9a0f41b98637e78cbb4c08cecf0b51ac" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.302331 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b1e0-0e6a-407c-aa29-ca19874c86fd-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7\" (UID: \"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.303188 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b1e0-0e6a-407c-aa29-ca19874c86fd-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7\" (UID: \"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.303365 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7phd\" (UniqueName: \"kubernetes.io/projected/6fe5b1e0-0e6a-407c-aa29-ca19874c86fd-kube-api-access-j7phd\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7\" (UID: \"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.303970 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b1e0-0e6a-407c-aa29-ca19874c86fd-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7\" (UID: \"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.304385 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b1e0-0e6a-407c-aa29-ca19874c86fd-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7\" (UID: \"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.320171 4958 scope.go:117] "RemoveContainer" containerID="f05e77d91455a15c998803f001fcc12b9a0f41b98637e78cbb4c08cecf0b51ac" Dec 01 10:15:58 crc kubenswrapper[4958]: E1201 10:15:58.320729 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f05e77d91455a15c998803f001fcc12b9a0f41b98637e78cbb4c08cecf0b51ac\": container with ID starting with f05e77d91455a15c998803f001fcc12b9a0f41b98637e78cbb4c08cecf0b51ac not found: ID does not exist" containerID="f05e77d91455a15c998803f001fcc12b9a0f41b98637e78cbb4c08cecf0b51ac" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.320772 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f05e77d91455a15c998803f001fcc12b9a0f41b98637e78cbb4c08cecf0b51ac"} err="failed to get container status \"f05e77d91455a15c998803f001fcc12b9a0f41b98637e78cbb4c08cecf0b51ac\": rpc error: code = NotFound desc = could not find container \"f05e77d91455a15c998803f001fcc12b9a0f41b98637e78cbb4c08cecf0b51ac\": container with ID starting with f05e77d91455a15c998803f001fcc12b9a0f41b98637e78cbb4c08cecf0b51ac not found: ID does not exist" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.338171 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7phd\" (UniqueName: \"kubernetes.io/projected/6fe5b1e0-0e6a-407c-aa29-ca19874c86fd-kube-api-access-j7phd\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7\" (UID: \"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.347707 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rxrcb"] Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.371315 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-rxrcb"] Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.489568 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7" Dec 01 10:15:58 crc kubenswrapper[4958]: I1201 10:15:58.691875 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7"] Dec 01 10:15:58 crc kubenswrapper[4958]: W1201 10:15:58.699065 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fe5b1e0_0e6a_407c_aa29_ca19874c86fd.slice/crio-01dfdff588b1861a770d16f8556e0f3725c9f2068a7844e1b2c616305a2302f6 WatchSource:0}: Error finding container 01dfdff588b1861a770d16f8556e0f3725c9f2068a7844e1b2c616305a2302f6: Status 404 returned error can't find the container with id 01dfdff588b1861a770d16f8556e0f3725c9f2068a7844e1b2c616305a2302f6 Dec 01 10:15:59 crc kubenswrapper[4958]: I1201 10:15:59.304910 4958 generic.go:334] "Generic (PLEG): container finished" podID="2f458091-ee40-45af-b928-6470d5b6099a" containerID="bb78a0af4beca9bbe6bea28e76b268b70d2254ede96b626111e0026deb01caeb" exitCode=0 Dec 01 10:15:59 crc kubenswrapper[4958]: I1201 10:15:59.305046 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctx2z" event={"ID":"2f458091-ee40-45af-b928-6470d5b6099a","Type":"ContainerDied","Data":"bb78a0af4beca9bbe6bea28e76b268b70d2254ede96b626111e0026deb01caeb"} Dec 01 10:15:59 crc kubenswrapper[4958]: I1201 10:15:59.311143 4958 generic.go:334] "Generic (PLEG): container finished" podID="6fe5b1e0-0e6a-407c-aa29-ca19874c86fd" containerID="0a21637e45539049af1a9f91bae5722cc6362b1de5656724297d49dcecf180f7" exitCode=0 Dec 01 10:15:59 crc kubenswrapper[4958]: I1201 10:15:59.311196 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7" event={"ID":"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd","Type":"ContainerDied","Data":"0a21637e45539049af1a9f91bae5722cc6362b1de5656724297d49dcecf180f7"} Dec 01 10:15:59 crc kubenswrapper[4958]: I1201 10:15:59.311225 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7" event={"ID":"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd","Type":"ContainerStarted","Data":"01dfdff588b1861a770d16f8556e0f3725c9f2068a7844e1b2c616305a2302f6"} Dec 01 10:15:59 crc kubenswrapper[4958]: I1201 10:15:59.806222 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ee9be51-cd77-41e9-9db3-ab6a64015288" path="/var/lib/kubelet/pods/9ee9be51-cd77-41e9-9db3-ab6a64015288/volumes" Dec 01 10:16:00 crc kubenswrapper[4958]: I1201 10:16:00.320395 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctx2z" event={"ID":"2f458091-ee40-45af-b928-6470d5b6099a","Type":"ContainerStarted","Data":"47f38390d171667b79363a929cdf68488734a9d04a26353e1e2ca604cc8c45d3"} Dec 01 10:16:01 crc kubenswrapper[4958]: I1201 10:16:01.328694 4958 generic.go:334] "Generic (PLEG): container finished" podID="6fe5b1e0-0e6a-407c-aa29-ca19874c86fd" containerID="ae85848dac422eb5df9e0f1c981f0768c5b3886a3a68d91dbb245d2bf58e79db" exitCode=0 Dec 01 10:16:01 crc kubenswrapper[4958]: I1201 10:16:01.328759 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7" 
event={"ID":"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd","Type":"ContainerDied","Data":"ae85848dac422eb5df9e0f1c981f0768c5b3886a3a68d91dbb245d2bf58e79db"} Dec 01 10:16:01 crc kubenswrapper[4958]: I1201 10:16:01.349629 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ctx2z" podStartSLOduration=3.920202678 podStartE2EDuration="6.349602028s" podCreationTimestamp="2025-12-01 10:15:55 +0000 UTC" firstStartedPulling="2025-12-01 10:15:57.276431003 +0000 UTC m=+1004.785220040" lastFinishedPulling="2025-12-01 10:15:59.705830353 +0000 UTC m=+1007.214619390" observedRunningTime="2025-12-01 10:16:00.344141126 +0000 UTC m=+1007.852930163" watchObservedRunningTime="2025-12-01 10:16:01.349602028 +0000 UTC m=+1008.858391065" Dec 01 10:16:02 crc kubenswrapper[4958]: I1201 10:16:02.337661 4958 generic.go:334] "Generic (PLEG): container finished" podID="6fe5b1e0-0e6a-407c-aa29-ca19874c86fd" containerID="205350f9276b7a342035b89c4dddc252a86ea051c176758bfb61d5a0b3023f6c" exitCode=0 Dec 01 10:16:02 crc kubenswrapper[4958]: I1201 10:16:02.337877 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7" event={"ID":"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd","Type":"ContainerDied","Data":"205350f9276b7a342035b89c4dddc252a86ea051c176758bfb61d5a0b3023f6c"} Dec 01 10:16:03 crc kubenswrapper[4958]: I1201 10:16:03.642005 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7" Dec 01 10:16:03 crc kubenswrapper[4958]: I1201 10:16:03.790174 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7phd\" (UniqueName: \"kubernetes.io/projected/6fe5b1e0-0e6a-407c-aa29-ca19874c86fd-kube-api-access-j7phd\") pod \"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd\" (UID: \"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd\") " Dec 01 10:16:03 crc kubenswrapper[4958]: I1201 10:16:03.790295 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b1e0-0e6a-407c-aa29-ca19874c86fd-util\") pod \"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd\" (UID: \"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd\") " Dec 01 10:16:03 crc kubenswrapper[4958]: I1201 10:16:03.790368 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b1e0-0e6a-407c-aa29-ca19874c86fd-bundle\") pod \"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd\" (UID: \"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd\") " Dec 01 10:16:03 crc kubenswrapper[4958]: I1201 10:16:03.791965 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fe5b1e0-0e6a-407c-aa29-ca19874c86fd-bundle" (OuterVolumeSpecName: "bundle") pod "6fe5b1e0-0e6a-407c-aa29-ca19874c86fd" (UID: "6fe5b1e0-0e6a-407c-aa29-ca19874c86fd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:16:03 crc kubenswrapper[4958]: I1201 10:16:03.815614 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fe5b1e0-0e6a-407c-aa29-ca19874c86fd-util" (OuterVolumeSpecName: "util") pod "6fe5b1e0-0e6a-407c-aa29-ca19874c86fd" (UID: "6fe5b1e0-0e6a-407c-aa29-ca19874c86fd"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:16:03 crc kubenswrapper[4958]: I1201 10:16:03.815776 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe5b1e0-0e6a-407c-aa29-ca19874c86fd-kube-api-access-j7phd" (OuterVolumeSpecName: "kube-api-access-j7phd") pod "6fe5b1e0-0e6a-407c-aa29-ca19874c86fd" (UID: "6fe5b1e0-0e6a-407c-aa29-ca19874c86fd"). InnerVolumeSpecName "kube-api-access-j7phd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:16:03 crc kubenswrapper[4958]: I1201 10:16:03.892426 4958 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b1e0-0e6a-407c-aa29-ca19874c86fd-util\") on node \"crc\" DevicePath \"\"" Dec 01 10:16:03 crc kubenswrapper[4958]: I1201 10:16:03.892492 4958 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b1e0-0e6a-407c-aa29-ca19874c86fd-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:16:03 crc kubenswrapper[4958]: I1201 10:16:03.892508 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7phd\" (UniqueName: \"kubernetes.io/projected/6fe5b1e0-0e6a-407c-aa29-ca19874c86fd-kube-api-access-j7phd\") on node \"crc\" DevicePath \"\"" Dec 01 10:16:04 crc kubenswrapper[4958]: I1201 10:16:04.352975 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7" event={"ID":"6fe5b1e0-0e6a-407c-aa29-ca19874c86fd","Type":"ContainerDied","Data":"01dfdff588b1861a770d16f8556e0f3725c9f2068a7844e1b2c616305a2302f6"} Dec 01 10:16:04 crc kubenswrapper[4958]: I1201 10:16:04.353325 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01dfdff588b1861a770d16f8556e0f3725c9f2068a7844e1b2c616305a2302f6" Dec 01 10:16:04 crc kubenswrapper[4958]: I1201 10:16:04.353410 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7" Dec 01 10:16:05 crc kubenswrapper[4958]: I1201 10:16:05.663260 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ctx2z" Dec 01 10:16:05 crc kubenswrapper[4958]: I1201 10:16:05.663343 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ctx2z" Dec 01 10:16:05 crc kubenswrapper[4958]: I1201 10:16:05.709077 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ctx2z" Dec 01 10:16:06 crc kubenswrapper[4958]: I1201 10:16:06.413964 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ctx2z" Dec 01 10:16:10 crc kubenswrapper[4958]: I1201 10:16:10.330396 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ctx2z"] Dec 01 10:16:10 crc kubenswrapper[4958]: I1201 10:16:10.331343 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ctx2z" podUID="2f458091-ee40-45af-b928-6470d5b6099a" containerName="registry-server" containerID="cri-o://47f38390d171667b79363a929cdf68488734a9d04a26353e1e2ca604cc8c45d3" gracePeriod=2 Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.336640 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ctx2z" Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.404140 4958 generic.go:334] "Generic (PLEG): container finished" podID="2f458091-ee40-45af-b928-6470d5b6099a" containerID="47f38390d171667b79363a929cdf68488734a9d04a26353e1e2ca604cc8c45d3" exitCode=0 Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.404270 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ctx2z" Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.404283 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctx2z" event={"ID":"2f458091-ee40-45af-b928-6470d5b6099a","Type":"ContainerDied","Data":"47f38390d171667b79363a929cdf68488734a9d04a26353e1e2ca604cc8c45d3"} Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.405119 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctx2z" event={"ID":"2f458091-ee40-45af-b928-6470d5b6099a","Type":"ContainerDied","Data":"40ade14c0697c078100d9b7b15888ee13a395377a769b1d0460d53fe4ac459dc"} Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.405214 4958 scope.go:117] "RemoveContainer" containerID="47f38390d171667b79363a929cdf68488734a9d04a26353e1e2ca604cc8c45d3" Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.430461 4958 scope.go:117] "RemoveContainer" containerID="bb78a0af4beca9bbe6bea28e76b268b70d2254ede96b626111e0026deb01caeb" Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.452998 4958 scope.go:117] "RemoveContainer" containerID="de3ab4f7237053144ca2ef657dabaabb43aa171a46a331f43ecc2161c28b5013" Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.470824 4958 scope.go:117] "RemoveContainer" containerID="47f38390d171667b79363a929cdf68488734a9d04a26353e1e2ca604cc8c45d3" Dec 01 10:16:11 crc kubenswrapper[4958]: E1201 10:16:11.471614 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f38390d171667b79363a929cdf68488734a9d04a26353e1e2ca604cc8c45d3\": container with ID starting with 47f38390d171667b79363a929cdf68488734a9d04a26353e1e2ca604cc8c45d3 not found: ID does not exist" containerID="47f38390d171667b79363a929cdf68488734a9d04a26353e1e2ca604cc8c45d3" Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.471690 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f38390d171667b79363a929cdf68488734a9d04a26353e1e2ca604cc8c45d3"} err="failed to get container status \"47f38390d171667b79363a929cdf68488734a9d04a26353e1e2ca604cc8c45d3\": rpc error: code = NotFound desc = could not find container \"47f38390d171667b79363a929cdf68488734a9d04a26353e1e2ca604cc8c45d3\": container with ID starting with 47f38390d171667b79363a929cdf68488734a9d04a26353e1e2ca604cc8c45d3 not found: ID does not exist" Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.471743 4958 scope.go:117] "RemoveContainer" containerID="bb78a0af4beca9bbe6bea28e76b268b70d2254ede96b626111e0026deb01caeb" Dec 01 10:16:11 crc kubenswrapper[4958]: E1201 10:16:11.472424 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb78a0af4beca9bbe6bea28e76b268b70d2254ede96b626111e0026deb01caeb\": container with ID starting with bb78a0af4beca9bbe6bea28e76b268b70d2254ede96b626111e0026deb01caeb not found: ID does not exist" 
containerID="bb78a0af4beca9bbe6bea28e76b268b70d2254ede96b626111e0026deb01caeb" Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.472455 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb78a0af4beca9bbe6bea28e76b268b70d2254ede96b626111e0026deb01caeb"} err="failed to get container status \"bb78a0af4beca9bbe6bea28e76b268b70d2254ede96b626111e0026deb01caeb\": rpc error: code = NotFound desc = could not find container \"bb78a0af4beca9bbe6bea28e76b268b70d2254ede96b626111e0026deb01caeb\": container with ID starting with bb78a0af4beca9bbe6bea28e76b268b70d2254ede96b626111e0026deb01caeb not found: ID does not exist" Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.472473 4958 scope.go:117] "RemoveContainer" containerID="de3ab4f7237053144ca2ef657dabaabb43aa171a46a331f43ecc2161c28b5013" Dec 01 10:16:11 crc kubenswrapper[4958]: E1201 10:16:11.472739 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de3ab4f7237053144ca2ef657dabaabb43aa171a46a331f43ecc2161c28b5013\": container with ID starting with de3ab4f7237053144ca2ef657dabaabb43aa171a46a331f43ecc2161c28b5013 not found: ID does not exist" containerID="de3ab4f7237053144ca2ef657dabaabb43aa171a46a331f43ecc2161c28b5013" Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.472763 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de3ab4f7237053144ca2ef657dabaabb43aa171a46a331f43ecc2161c28b5013"} err="failed to get container status \"de3ab4f7237053144ca2ef657dabaabb43aa171a46a331f43ecc2161c28b5013\": rpc error: code = NotFound desc = could not find container \"de3ab4f7237053144ca2ef657dabaabb43aa171a46a331f43ecc2161c28b5013\": container with ID starting with de3ab4f7237053144ca2ef657dabaabb43aa171a46a331f43ecc2161c28b5013 not found: ID does not exist" Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.508372 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f458091-ee40-45af-b928-6470d5b6099a-catalog-content\") pod \"2f458091-ee40-45af-b928-6470d5b6099a\" (UID: \"2f458091-ee40-45af-b928-6470d5b6099a\") " Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.508470 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f458091-ee40-45af-b928-6470d5b6099a-utilities\") pod \"2f458091-ee40-45af-b928-6470d5b6099a\" (UID: \"2f458091-ee40-45af-b928-6470d5b6099a\") " Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.508503 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpq88\" (UniqueName: \"kubernetes.io/projected/2f458091-ee40-45af-b928-6470d5b6099a-kube-api-access-wpq88\") pod \"2f458091-ee40-45af-b928-6470d5b6099a\" (UID: \"2f458091-ee40-45af-b928-6470d5b6099a\") " Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.509739 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f458091-ee40-45af-b928-6470d5b6099a-utilities" (OuterVolumeSpecName: "utilities") pod "2f458091-ee40-45af-b928-6470d5b6099a" (UID: "2f458091-ee40-45af-b928-6470d5b6099a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.516567 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f458091-ee40-45af-b928-6470d5b6099a-kube-api-access-wpq88" (OuterVolumeSpecName: "kube-api-access-wpq88") pod "2f458091-ee40-45af-b928-6470d5b6099a" (UID: "2f458091-ee40-45af-b928-6470d5b6099a"). InnerVolumeSpecName "kube-api-access-wpq88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.550772 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f458091-ee40-45af-b928-6470d5b6099a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f458091-ee40-45af-b928-6470d5b6099a" (UID: "2f458091-ee40-45af-b928-6470d5b6099a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.610157 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f458091-ee40-45af-b928-6470d5b6099a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.610203 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f458091-ee40-45af-b928-6470d5b6099a-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.610214 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpq88\" (UniqueName: \"kubernetes.io/projected/2f458091-ee40-45af-b928-6470d5b6099a-kube-api-access-wpq88\") on node \"crc\" DevicePath \"\"" Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.736619 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ctx2z"] Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.741540 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ctx2z"] Dec 01 10:16:11 crc kubenswrapper[4958]: I1201 10:16:11.805178 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f458091-ee40-45af-b928-6470d5b6099a" path="/var/lib/kubelet/pods/2f458091-ee40-45af-b928-6470d5b6099a/volumes" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.584283 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd"] Dec 01 10:16:15 crc kubenswrapper[4958]: E1201 10:16:15.584877 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f458091-ee40-45af-b928-6470d5b6099a" containerName="registry-server" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.584894 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f458091-ee40-45af-b928-6470d5b6099a" containerName="registry-server" Dec 01 10:16:15 crc kubenswrapper[4958]: E1201 10:16:15.584907 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe5b1e0-0e6a-407c-aa29-ca19874c86fd" containerName="util" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.584913 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe5b1e0-0e6a-407c-aa29-ca19874c86fd" containerName="util" Dec 01 10:16:15 crc kubenswrapper[4958]: E1201 10:16:15.584924 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f458091-ee40-45af-b928-6470d5b6099a" containerName="extract-content" Dec 01 10:16:15 
crc kubenswrapper[4958]: I1201 10:16:15.584933 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f458091-ee40-45af-b928-6470d5b6099a" containerName="extract-content" Dec 01 10:16:15 crc kubenswrapper[4958]: E1201 10:16:15.584943 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f458091-ee40-45af-b928-6470d5b6099a" containerName="extract-utilities" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.584949 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f458091-ee40-45af-b928-6470d5b6099a" containerName="extract-utilities" Dec 01 10:16:15 crc kubenswrapper[4958]: E1201 10:16:15.584959 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe5b1e0-0e6a-407c-aa29-ca19874c86fd" containerName="extract" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.584966 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe5b1e0-0e6a-407c-aa29-ca19874c86fd" containerName="extract" Dec 01 10:16:15 crc kubenswrapper[4958]: E1201 10:16:15.584984 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe5b1e0-0e6a-407c-aa29-ca19874c86fd" containerName="pull" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.584990 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe5b1e0-0e6a-407c-aa29-ca19874c86fd" containerName="pull" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.585123 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f458091-ee40-45af-b928-6470d5b6099a" containerName="registry-server" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.585136 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe5b1e0-0e6a-407c-aa29-ca19874c86fd" containerName="extract" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.585664 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.588616 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.588967 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.589221 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.589838 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-mzxmn" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.592550 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.610647 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd"] Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.675036 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1dc14960-b720-4b96-9635-b0ee7f2d10c4-webhook-cert\") pod \"metallb-operator-controller-manager-68d799f659-7xjvd\" (UID: \"1dc14960-b720-4b96-9635-b0ee7f2d10c4\") " pod="metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.675094 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1dc14960-b720-4b96-9635-b0ee7f2d10c4-apiservice-cert\") pod \"metallb-operator-controller-manager-68d799f659-7xjvd\" (UID: \"1dc14960-b720-4b96-9635-b0ee7f2d10c4\") " pod="metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.675168 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btwq7\" (UniqueName: \"kubernetes.io/projected/1dc14960-b720-4b96-9635-b0ee7f2d10c4-kube-api-access-btwq7\") pod \"metallb-operator-controller-manager-68d799f659-7xjvd\" (UID: \"1dc14960-b720-4b96-9635-b0ee7f2d10c4\") " pod="metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.776215 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btwq7\" (UniqueName: \"kubernetes.io/projected/1dc14960-b720-4b96-9635-b0ee7f2d10c4-kube-api-access-btwq7\") pod \"metallb-operator-controller-manager-68d799f659-7xjvd\" (UID: \"1dc14960-b720-4b96-9635-b0ee7f2d10c4\") " pod="metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.776320 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1dc14960-b720-4b96-9635-b0ee7f2d10c4-webhook-cert\") pod \"metallb-operator-controller-manager-68d799f659-7xjvd\" (UID: \"1dc14960-b720-4b96-9635-b0ee7f2d10c4\") " pod="metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.776347 4958 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1dc14960-b720-4b96-9635-b0ee7f2d10c4-apiservice-cert\") pod \"metallb-operator-controller-manager-68d799f659-7xjvd\" (UID: \"1dc14960-b720-4b96-9635-b0ee7f2d10c4\") " pod="metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.783388 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1dc14960-b720-4b96-9635-b0ee7f2d10c4-webhook-cert\") pod \"metallb-operator-controller-manager-68d799f659-7xjvd\" (UID: \"1dc14960-b720-4b96-9635-b0ee7f2d10c4\") " pod="metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.783407 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1dc14960-b720-4b96-9635-b0ee7f2d10c4-apiservice-cert\") pod \"metallb-operator-controller-manager-68d799f659-7xjvd\" (UID: \"1dc14960-b720-4b96-9635-b0ee7f2d10c4\") " pod="metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.796505 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btwq7\" (UniqueName: \"kubernetes.io/projected/1dc14960-b720-4b96-9635-b0ee7f2d10c4-kube-api-access-btwq7\") pod \"metallb-operator-controller-manager-68d799f659-7xjvd\" (UID: \"1dc14960-b720-4b96-9635-b0ee7f2d10c4\") " pod="metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd" Dec 01 10:16:15 crc kubenswrapper[4958]: I1201 10:16:15.918541 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd" Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 10:16:16.056496 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc"] Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 10:16:16.057659 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc" Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 10:16:16.061964 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 10:16:16.062059 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 10:16:16.061967 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-lks2w" Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 10:16:16.081018 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a36305cc-8703-4b92-8632-bf39b869d4e1-apiservice-cert\") pod \"metallb-operator-webhook-server-67ddcd596b-s74bc\" (UID: \"a36305cc-8703-4b92-8632-bf39b869d4e1\") " pod="metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc" Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 10:16:16.081080 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a36305cc-8703-4b92-8632-bf39b869d4e1-webhook-cert\") pod \"metallb-operator-webhook-server-67ddcd596b-s74bc\" (UID: \"a36305cc-8703-4b92-8632-bf39b869d4e1\") " pod="metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc" Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 10:16:16.081144 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck64c\" (UniqueName: \"kubernetes.io/projected/a36305cc-8703-4b92-8632-bf39b869d4e1-kube-api-access-ck64c\") pod \"metallb-operator-webhook-server-67ddcd596b-s74bc\" (UID: \"a36305cc-8703-4b92-8632-bf39b869d4e1\") " pod="metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc" Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 10:16:16.091861 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc"] Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 10:16:16.182699 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck64c\" (UniqueName: \"kubernetes.io/projected/a36305cc-8703-4b92-8632-bf39b869d4e1-kube-api-access-ck64c\") pod \"metallb-operator-webhook-server-67ddcd596b-s74bc\" (UID: \"a36305cc-8703-4b92-8632-bf39b869d4e1\") " pod="metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc" Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 10:16:16.183334 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a36305cc-8703-4b92-8632-bf39b869d4e1-apiservice-cert\") pod \"metallb-operator-webhook-server-67ddcd596b-s74bc\" (UID: \"a36305cc-8703-4b92-8632-bf39b869d4e1\") " pod="metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc" Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 10:16:16.184194 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a36305cc-8703-4b92-8632-bf39b869d4e1-webhook-cert\") pod \"metallb-operator-webhook-server-67ddcd596b-s74bc\" (UID: \"a36305cc-8703-4b92-8632-bf39b869d4e1\") " pod="metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc" Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 
10:16:16.189994 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a36305cc-8703-4b92-8632-bf39b869d4e1-webhook-cert\") pod \"metallb-operator-webhook-server-67ddcd596b-s74bc\" (UID: \"a36305cc-8703-4b92-8632-bf39b869d4e1\") " pod="metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc" Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 10:16:16.208813 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a36305cc-8703-4b92-8632-bf39b869d4e1-apiservice-cert\") pod \"metallb-operator-webhook-server-67ddcd596b-s74bc\" (UID: \"a36305cc-8703-4b92-8632-bf39b869d4e1\") " pod="metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc" Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 10:16:16.209016 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck64c\" (UniqueName: \"kubernetes.io/projected/a36305cc-8703-4b92-8632-bf39b869d4e1-kube-api-access-ck64c\") pod \"metallb-operator-webhook-server-67ddcd596b-s74bc\" (UID: \"a36305cc-8703-4b92-8632-bf39b869d4e1\") " pod="metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc" Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 10:16:16.268546 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd"] Dec 01 10:16:16 crc kubenswrapper[4958]: W1201 10:16:16.274995 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dc14960_b720_4b96_9635_b0ee7f2d10c4.slice/crio-b3bebbdb1ff2013ee83c18dae4f1696dc7adb72c032c0c6d41b7f465ecfbe1aa WatchSource:0}: Error finding container b3bebbdb1ff2013ee83c18dae4f1696dc7adb72c032c0c6d41b7f465ecfbe1aa: Status 404 returned error can't find the container with id b3bebbdb1ff2013ee83c18dae4f1696dc7adb72c032c0c6d41b7f465ecfbe1aa Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 10:16:16.390978 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc" Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 10:16:16.440564 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd" event={"ID":"1dc14960-b720-4b96-9635-b0ee7f2d10c4","Type":"ContainerStarted","Data":"b3bebbdb1ff2013ee83c18dae4f1696dc7adb72c032c0c6d41b7f465ecfbe1aa"} Dec 01 10:16:16 crc kubenswrapper[4958]: I1201 10:16:16.637473 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc"] Dec 01 10:16:16 crc kubenswrapper[4958]: W1201 10:16:16.645816 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda36305cc_8703_4b92_8632_bf39b869d4e1.slice/crio-7e77b5229f40a11c22999bae7dfa77c86503c45553a0cda358202093c134690c WatchSource:0}: Error finding container 7e77b5229f40a11c22999bae7dfa77c86503c45553a0cda358202093c134690c: Status 404 returned error can't find the container with id 7e77b5229f40a11c22999bae7dfa77c86503c45553a0cda358202093c134690c Dec 01 10:16:17 crc kubenswrapper[4958]: I1201 10:16:17.448323 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc" event={"ID":"a36305cc-8703-4b92-8632-bf39b869d4e1","Type":"ContainerStarted","Data":"7e77b5229f40a11c22999bae7dfa77c86503c45553a0cda358202093c134690c"} Dec 01 10:16:20 crc kubenswrapper[4958]: I1201 10:16:20.475221 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd" event={"ID":"1dc14960-b720-4b96-9635-b0ee7f2d10c4","Type":"ContainerStarted","Data":"4159bde9061630e32b350f17df9a96c56ed3ae7d16dd6cff16175fae7c8b77cb"} Dec 01 10:16:20 crc kubenswrapper[4958]: I1201 10:16:20.476032 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd" Dec 01 10:16:20 crc kubenswrapper[4958]: I1201 10:16:20.506153 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd" podStartSLOduration=2.248279465 podStartE2EDuration="5.506129592s" podCreationTimestamp="2025-12-01 10:16:15 +0000 UTC" firstStartedPulling="2025-12-01 10:16:16.281833978 +0000 UTC m=+1023.790623015" lastFinishedPulling="2025-12-01 10:16:19.539684105 +0000 UTC m=+1027.048473142" observedRunningTime="2025-12-01 10:16:20.503618051 +0000 UTC m=+1028.012407088" watchObservedRunningTime="2025-12-01 10:16:20.506129592 +0000 UTC m=+1028.014918629" Dec 01 10:16:22 crc kubenswrapper[4958]: I1201 10:16:22.489608 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc" event={"ID":"a36305cc-8703-4b92-8632-bf39b869d4e1","Type":"ContainerStarted","Data":"389b22ddeab12d7a84ff88449d1d1c015de3ab34c5ef060d04f63b0cbb13e254"} Dec 01 10:16:22 crc kubenswrapper[4958]: I1201 10:16:22.490157 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc" Dec 01 10:16:22 crc kubenswrapper[4958]: I1201 10:16:22.511932 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc" podStartSLOduration=1.334892908 podStartE2EDuration="6.511906753s" 
podCreationTimestamp="2025-12-01 10:16:16 +0000 UTC" firstStartedPulling="2025-12-01 10:16:16.658003182 +0000 UTC m=+1024.166792209" lastFinishedPulling="2025-12-01 10:16:21.835017017 +0000 UTC m=+1029.343806054" observedRunningTime="2025-12-01 10:16:22.508750064 +0000 UTC m=+1030.017539101" watchObservedRunningTime="2025-12-01 10:16:22.511906753 +0000 UTC m=+1030.020695790" Dec 01 10:16:36 crc kubenswrapper[4958]: I1201 10:16:36.399162 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-67ddcd596b-s74bc" Dec 01 10:16:54 crc kubenswrapper[4958]: I1201 10:16:54.560328 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lqfhn"] Dec 01 10:16:54 crc kubenswrapper[4958]: I1201 10:16:54.562794 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqfhn" Dec 01 10:16:54 crc kubenswrapper[4958]: I1201 10:16:54.585459 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a-utilities\") pod \"redhat-marketplace-lqfhn\" (UID: \"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a\") " pod="openshift-marketplace/redhat-marketplace-lqfhn" Dec 01 10:16:54 crc kubenswrapper[4958]: I1201 10:16:54.585521 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a-catalog-content\") pod \"redhat-marketplace-lqfhn\" (UID: \"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a\") " pod="openshift-marketplace/redhat-marketplace-lqfhn" Dec 01 10:16:54 crc kubenswrapper[4958]: I1201 10:16:54.585563 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxjw6\" (UniqueName: \"kubernetes.io/projected/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a-kube-api-access-gxjw6\") pod \"redhat-marketplace-lqfhn\" (UID: \"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a\") " pod="openshift-marketplace/redhat-marketplace-lqfhn" Dec 01 10:16:54 crc kubenswrapper[4958]: I1201 10:16:54.588701 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqfhn"] Dec 01 10:16:54 crc kubenswrapper[4958]: I1201 10:16:54.686679 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a-catalog-content\") pod \"redhat-marketplace-lqfhn\" (UID: \"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a\") " pod="openshift-marketplace/redhat-marketplace-lqfhn" Dec 01 10:16:54 crc kubenswrapper[4958]: I1201 10:16:54.686775 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxjw6\" (UniqueName: \"kubernetes.io/projected/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a-kube-api-access-gxjw6\") pod \"redhat-marketplace-lqfhn\" (UID: \"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a\") " pod="openshift-marketplace/redhat-marketplace-lqfhn" Dec 01 10:16:54 crc kubenswrapper[4958]: I1201 10:16:54.686859 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a-utilities\") pod \"redhat-marketplace-lqfhn\" (UID: \"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a\") " pod="openshift-marketplace/redhat-marketplace-lqfhn" Dec 01 
10:16:54 crc kubenswrapper[4958]: I1201 10:16:54.687510 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a-catalog-content\") pod \"redhat-marketplace-lqfhn\" (UID: \"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a\") " pod="openshift-marketplace/redhat-marketplace-lqfhn" Dec 01 10:16:54 crc kubenswrapper[4958]: I1201 10:16:54.687542 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a-utilities\") pod \"redhat-marketplace-lqfhn\" (UID: \"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a\") " pod="openshift-marketplace/redhat-marketplace-lqfhn" Dec 01 10:16:54 crc kubenswrapper[4958]: I1201 10:16:54.713956 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxjw6\" (UniqueName: \"kubernetes.io/projected/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a-kube-api-access-gxjw6\") pod \"redhat-marketplace-lqfhn\" (UID: \"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a\") " pod="openshift-marketplace/redhat-marketplace-lqfhn" Dec 01 10:16:54 crc kubenswrapper[4958]: I1201 10:16:54.885026 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqfhn" Dec 01 10:16:55 crc kubenswrapper[4958]: I1201 10:16:55.121187 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqfhn"] Dec 01 10:16:55 crc kubenswrapper[4958]: I1201 10:16:55.727158 4958 generic.go:334] "Generic (PLEG): container finished" podID="d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a" containerID="d9151832441646568fb4efc91ec86b46b8533b333301c9f29b8456abbee53eb8" exitCode=0 Dec 01 10:16:55 crc kubenswrapper[4958]: I1201 10:16:55.727284 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqfhn" event={"ID":"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a","Type":"ContainerDied","Data":"d9151832441646568fb4efc91ec86b46b8533b333301c9f29b8456abbee53eb8"} Dec 01 10:16:55 crc kubenswrapper[4958]: I1201 10:16:55.727714 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqfhn" event={"ID":"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a","Type":"ContainerStarted","Data":"7356f8e8d95275f0ab3f21cc375ca97393c6ae0898e86dbdd72333bdc0087b1b"} Dec 01 10:16:55 crc kubenswrapper[4958]: I1201 10:16:55.921795 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-68d799f659-7xjvd" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.683282 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-5gghz"] Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.686253 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.694978 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-clzbq"] Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.695986 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-clzbq" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.696934 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.697231 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.697433 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-sm2lg" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.703459 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-clzbq"] Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.706889 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.734582 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4deb5735-ce1d-4302-9dad-783a64a13380-frr-sockets\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.734638 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7hk8\" (UniqueName: \"kubernetes.io/projected/925d86c1-9ff9-4290-a32b-142cdd162300-kube-api-access-z7hk8\") pod \"frr-k8s-webhook-server-7fcb986d4-clzbq\" (UID: \"925d86c1-9ff9-4290-a32b-142cdd162300\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-clzbq" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.734678 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4deb5735-ce1d-4302-9dad-783a64a13380-frr-conf\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.734704 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4deb5735-ce1d-4302-9dad-783a64a13380-reloader\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.734733 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/925d86c1-9ff9-4290-a32b-142cdd162300-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-clzbq\" (UID: \"925d86c1-9ff9-4290-a32b-142cdd162300\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-clzbq" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.734758 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4deb5735-ce1d-4302-9dad-783a64a13380-metrics-certs\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.734774 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/4deb5735-ce1d-4302-9dad-783a64a13380-frr-startup\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.734793 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsc42\" (UniqueName: \"kubernetes.io/projected/4deb5735-ce1d-4302-9dad-783a64a13380-kube-api-access-zsc42\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.734823 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4deb5735-ce1d-4302-9dad-783a64a13380-metrics\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.742163 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqfhn" event={"ID":"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a","Type":"ContainerStarted","Data":"a3e9cd2b694623368f14bfa983057e320e3254eb0aa28ff5f72eb82df6db4824"} Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.810137 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-cd79p"] Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.811362 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-cd79p" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.813934 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-l7chh" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.814188 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.814360 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.814497 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.819073 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-vmp79"] Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.820221 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-vmp79" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.829421 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.836877 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-vmp79"] Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.849403 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4deb5735-ce1d-4302-9dad-783a64a13380-metrics-certs\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.849450 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4deb5735-ce1d-4302-9dad-783a64a13380-frr-startup\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.850429 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsc42\" (UniqueName: \"kubernetes.io/projected/4deb5735-ce1d-4302-9dad-783a64a13380-kube-api-access-zsc42\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.850463 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48822f3f-9799-45f5-9551-e2ac6966c560-metrics-certs\") pod \"speaker-cd79p\" (UID: \"48822f3f-9799-45f5-9551-e2ac6966c560\") " pod="metallb-system/speaker-cd79p" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.850496 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4deb5735-ce1d-4302-9dad-783a64a13380-metrics\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.850550 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4deb5735-ce1d-4302-9dad-783a64a13380-frr-sockets\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.850568 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sswxs\" (UniqueName: \"kubernetes.io/projected/48822f3f-9799-45f5-9551-e2ac6966c560-kube-api-access-sswxs\") pod \"speaker-cd79p\" (UID: \"48822f3f-9799-45f5-9551-e2ac6966c560\") " pod="metallb-system/speaker-cd79p" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.850591 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7hk8\" (UniqueName: \"kubernetes.io/projected/925d86c1-9ff9-4290-a32b-142cdd162300-kube-api-access-z7hk8\") pod \"frr-k8s-webhook-server-7fcb986d4-clzbq\" (UID: \"925d86c1-9ff9-4290-a32b-142cdd162300\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-clzbq" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.850621 4958 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/48822f3f-9799-45f5-9551-e2ac6966c560-memberlist\") pod \"speaker-cd79p\" (UID: \"48822f3f-9799-45f5-9551-e2ac6966c560\") " pod="metallb-system/speaker-cd79p" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.850934 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4deb5735-ce1d-4302-9dad-783a64a13380-metrics\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.850385 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4deb5735-ce1d-4302-9dad-783a64a13380-frr-startup\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.851469 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4deb5735-ce1d-4302-9dad-783a64a13380-frr-sockets\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.856287 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4deb5735-ce1d-4302-9dad-783a64a13380-frr-conf\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.856720 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/48822f3f-9799-45f5-9551-e2ac6966c560-metallb-excludel2\") pod \"speaker-cd79p\" (UID: \"48822f3f-9799-45f5-9551-e2ac6966c560\") " pod="metallb-system/speaker-cd79p" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.856752 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4deb5735-ce1d-4302-9dad-783a64a13380-reloader\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.856792 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/925d86c1-9ff9-4290-a32b-142cdd162300-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-clzbq\" (UID: \"925d86c1-9ff9-4290-a32b-142cdd162300\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-clzbq" Dec 01 10:16:56 crc kubenswrapper[4958]: E1201 10:16:56.856996 4958 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 01 10:16:56 crc kubenswrapper[4958]: E1201 10:16:56.857050 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/925d86c1-9ff9-4290-a32b-142cdd162300-cert podName:925d86c1-9ff9-4290-a32b-142cdd162300 nodeName:}" failed. No retries permitted until 2025-12-01 10:16:57.357031262 +0000 UTC m=+1064.865820299 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/925d86c1-9ff9-4290-a32b-142cdd162300-cert") pod "frr-k8s-webhook-server-7fcb986d4-clzbq" (UID: "925d86c1-9ff9-4290-a32b-142cdd162300") : secret "frr-k8s-webhook-server-cert" not found Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.856614 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4deb5735-ce1d-4302-9dad-783a64a13380-frr-conf\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.857265 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4deb5735-ce1d-4302-9dad-783a64a13380-reloader\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.863379 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4deb5735-ce1d-4302-9dad-783a64a13380-metrics-certs\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.872567 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7hk8\" (UniqueName: \"kubernetes.io/projected/925d86c1-9ff9-4290-a32b-142cdd162300-kube-api-access-z7hk8\") pod \"frr-k8s-webhook-server-7fcb986d4-clzbq\" (UID: \"925d86c1-9ff9-4290-a32b-142cdd162300\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-clzbq" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.882151 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsc42\" (UniqueName: \"kubernetes.io/projected/4deb5735-ce1d-4302-9dad-783a64a13380-kube-api-access-zsc42\") pod \"frr-k8s-5gghz\" (UID: \"4deb5735-ce1d-4302-9dad-783a64a13380\") " pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.958344 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sswxs\" (UniqueName: \"kubernetes.io/projected/48822f3f-9799-45f5-9551-e2ac6966c560-kube-api-access-sswxs\") pod \"speaker-cd79p\" (UID: \"48822f3f-9799-45f5-9551-e2ac6966c560\") " pod="metallb-system/speaker-cd79p" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.958637 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/48822f3f-9799-45f5-9551-e2ac6966c560-memberlist\") pod \"speaker-cd79p\" (UID: \"48822f3f-9799-45f5-9551-e2ac6966c560\") " pod="metallb-system/speaker-cd79p" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.958702 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7996247d-b6a6-4433-95e8-21490189c1dd-cert\") pod \"controller-f8648f98b-vmp79\" (UID: \"7996247d-b6a6-4433-95e8-21490189c1dd\") " pod="metallb-system/controller-f8648f98b-vmp79" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.958725 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbll7\" (UniqueName: \"kubernetes.io/projected/7996247d-b6a6-4433-95e8-21490189c1dd-kube-api-access-bbll7\") pod \"controller-f8648f98b-vmp79\" (UID: 
\"7996247d-b6a6-4433-95e8-21490189c1dd\") " pod="metallb-system/controller-f8648f98b-vmp79" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.958779 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/48822f3f-9799-45f5-9551-e2ac6966c560-metallb-excludel2\") pod \"speaker-cd79p\" (UID: \"48822f3f-9799-45f5-9551-e2ac6966c560\") " pod="metallb-system/speaker-cd79p" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.958898 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48822f3f-9799-45f5-9551-e2ac6966c560-metrics-certs\") pod \"speaker-cd79p\" (UID: \"48822f3f-9799-45f5-9551-e2ac6966c560\") " pod="metallb-system/speaker-cd79p" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.958939 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7996247d-b6a6-4433-95e8-21490189c1dd-metrics-certs\") pod \"controller-f8648f98b-vmp79\" (UID: \"7996247d-b6a6-4433-95e8-21490189c1dd\") " pod="metallb-system/controller-f8648f98b-vmp79" Dec 01 10:16:56 crc kubenswrapper[4958]: E1201 10:16:56.958944 4958 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 10:16:56 crc kubenswrapper[4958]: E1201 10:16:56.959055 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48822f3f-9799-45f5-9551-e2ac6966c560-memberlist podName:48822f3f-9799-45f5-9551-e2ac6966c560 nodeName:}" failed. No retries permitted until 2025-12-01 10:16:57.459025819 +0000 UTC m=+1064.967814856 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/48822f3f-9799-45f5-9551-e2ac6966c560-memberlist") pod "speaker-cd79p" (UID: "48822f3f-9799-45f5-9551-e2ac6966c560") : secret "metallb-memberlist" not found Dec 01 10:16:56 crc kubenswrapper[4958]: E1201 10:16:56.959253 4958 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 01 10:16:56 crc kubenswrapper[4958]: E1201 10:16:56.959349 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48822f3f-9799-45f5-9551-e2ac6966c560-metrics-certs podName:48822f3f-9799-45f5-9551-e2ac6966c560 nodeName:}" failed. No retries permitted until 2025-12-01 10:16:57.459326098 +0000 UTC m=+1064.968115135 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48822f3f-9799-45f5-9551-e2ac6966c560-metrics-certs") pod "speaker-cd79p" (UID: "48822f3f-9799-45f5-9551-e2ac6966c560") : secret "speaker-certs-secret" not found Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.959975 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/48822f3f-9799-45f5-9551-e2ac6966c560-metallb-excludel2\") pod \"speaker-cd79p\" (UID: \"48822f3f-9799-45f5-9551-e2ac6966c560\") " pod="metallb-system/speaker-cd79p" Dec 01 10:16:56 crc kubenswrapper[4958]: I1201 10:16:56.984300 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sswxs\" (UniqueName: \"kubernetes.io/projected/48822f3f-9799-45f5-9551-e2ac6966c560-kube-api-access-sswxs\") pod \"speaker-cd79p\" (UID: \"48822f3f-9799-45f5-9551-e2ac6966c560\") " pod="metallb-system/speaker-cd79p" Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.012725 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5gghz" Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.060145 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7996247d-b6a6-4433-95e8-21490189c1dd-cert\") pod \"controller-f8648f98b-vmp79\" (UID: \"7996247d-b6a6-4433-95e8-21490189c1dd\") " pod="metallb-system/controller-f8648f98b-vmp79" Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.060228 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbll7\" (UniqueName: \"kubernetes.io/projected/7996247d-b6a6-4433-95e8-21490189c1dd-kube-api-access-bbll7\") pod \"controller-f8648f98b-vmp79\" (UID: \"7996247d-b6a6-4433-95e8-21490189c1dd\") " pod="metallb-system/controller-f8648f98b-vmp79" Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.060316 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7996247d-b6a6-4433-95e8-21490189c1dd-metrics-certs\") pod \"controller-f8648f98b-vmp79\" (UID: \"7996247d-b6a6-4433-95e8-21490189c1dd\") " pod="metallb-system/controller-f8648f98b-vmp79" Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.066765 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7996247d-b6a6-4433-95e8-21490189c1dd-cert\") pod \"controller-f8648f98b-vmp79\" (UID: \"7996247d-b6a6-4433-95e8-21490189c1dd\") " pod="metallb-system/controller-f8648f98b-vmp79" Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.066826 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7996247d-b6a6-4433-95e8-21490189c1dd-metrics-certs\") pod \"controller-f8648f98b-vmp79\" (UID: \"7996247d-b6a6-4433-95e8-21490189c1dd\") " pod="metallb-system/controller-f8648f98b-vmp79" Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.082734 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbll7\" (UniqueName: \"kubernetes.io/projected/7996247d-b6a6-4433-95e8-21490189c1dd-kube-api-access-bbll7\") pod \"controller-f8648f98b-vmp79\" (UID: \"7996247d-b6a6-4433-95e8-21490189c1dd\") " pod="metallb-system/controller-f8648f98b-vmp79" Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.150024 4958 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-vmp79" Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.389704 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/925d86c1-9ff9-4290-a32b-142cdd162300-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-clzbq\" (UID: \"925d86c1-9ff9-4290-a32b-142cdd162300\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-clzbq" Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.396878 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/925d86c1-9ff9-4290-a32b-142cdd162300-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-clzbq\" (UID: \"925d86c1-9ff9-4290-a32b-142cdd162300\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-clzbq" Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.424251 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-vmp79"] Dec 01 10:16:57 crc kubenswrapper[4958]: W1201 10:16:57.431586 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7996247d_b6a6_4433_95e8_21490189c1dd.slice/crio-8d29022084591712cd21a8ae8c5ea60da1bc6fdb4645fb79d821a9e66e2e4042 WatchSource:0}: Error finding container 8d29022084591712cd21a8ae8c5ea60da1bc6fdb4645fb79d821a9e66e2e4042: Status 404 returned error can't find the container with id 8d29022084591712cd21a8ae8c5ea60da1bc6fdb4645fb79d821a9e66e2e4042 Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.492141 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/48822f3f-9799-45f5-9551-e2ac6966c560-memberlist\") pod \"speaker-cd79p\" (UID: \"48822f3f-9799-45f5-9551-e2ac6966c560\") " pod="metallb-system/speaker-cd79p" Dec 01 10:16:57 crc kubenswrapper[4958]: E1201 10:16:57.492331 4958 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 10:16:57 crc kubenswrapper[4958]: E1201 10:16:57.492462 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48822f3f-9799-45f5-9551-e2ac6966c560-memberlist podName:48822f3f-9799-45f5-9551-e2ac6966c560 nodeName:}" failed. No retries permitted until 2025-12-01 10:16:58.492434724 +0000 UTC m=+1066.001223921 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/48822f3f-9799-45f5-9551-e2ac6966c560-memberlist") pod "speaker-cd79p" (UID: "48822f3f-9799-45f5-9551-e2ac6966c560") : secret "metallb-memberlist" not found Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.492348 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48822f3f-9799-45f5-9551-e2ac6966c560-metrics-certs\") pod \"speaker-cd79p\" (UID: \"48822f3f-9799-45f5-9551-e2ac6966c560\") " pod="metallb-system/speaker-cd79p" Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.499616 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48822f3f-9799-45f5-9551-e2ac6966c560-metrics-certs\") pod \"speaker-cd79p\" (UID: \"48822f3f-9799-45f5-9551-e2ac6966c560\") " pod="metallb-system/speaker-cd79p" Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.622905 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-clzbq" Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.755926 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-vmp79" event={"ID":"7996247d-b6a6-4433-95e8-21490189c1dd","Type":"ContainerStarted","Data":"41822c35f5a877b4d485514adb0c5c14d36e17e6729ceabe43776b6e06e974e5"} Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.755985 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-vmp79" event={"ID":"7996247d-b6a6-4433-95e8-21490189c1dd","Type":"ContainerStarted","Data":"a23f574bcd1410f72c25eb8fe04cc8f846536ec379319896a627176ec460a7c0"} Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.756000 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-vmp79" event={"ID":"7996247d-b6a6-4433-95e8-21490189c1dd","Type":"ContainerStarted","Data":"8d29022084591712cd21a8ae8c5ea60da1bc6fdb4645fb79d821a9e66e2e4042"} Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.756109 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-vmp79" Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.757548 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5gghz" event={"ID":"4deb5735-ce1d-4302-9dad-783a64a13380","Type":"ContainerStarted","Data":"1795ef2862b11233585fad000edf3349d2099af15b25d404b29839bb913bc08e"} Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.761126 4958 generic.go:334] "Generic (PLEG): container finished" podID="d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a" containerID="a3e9cd2b694623368f14bfa983057e320e3254eb0aa28ff5f72eb82df6db4824" exitCode=0 Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.761188 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqfhn" event={"ID":"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a","Type":"ContainerDied","Data":"a3e9cd2b694623368f14bfa983057e320e3254eb0aa28ff5f72eb82df6db4824"} Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.788340 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-vmp79" podStartSLOduration=1.78831709 podStartE2EDuration="1.78831709s" podCreationTimestamp="2025-12-01 10:16:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:16:57.780210872 +0000 UTC m=+1065.288999909" watchObservedRunningTime="2025-12-01 10:16:57.78831709 +0000 UTC m=+1065.297106127" Dec 01 10:16:57 crc kubenswrapper[4958]: I1201 10:16:57.892930 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-clzbq"] Dec 01 10:16:58 crc kubenswrapper[4958]: I1201 10:16:58.518452 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/48822f3f-9799-45f5-9551-e2ac6966c560-memberlist\") pod \"speaker-cd79p\" (UID: \"48822f3f-9799-45f5-9551-e2ac6966c560\") " pod="metallb-system/speaker-cd79p" Dec 01 10:16:58 crc kubenswrapper[4958]: I1201 10:16:58.525186 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/48822f3f-9799-45f5-9551-e2ac6966c560-memberlist\") pod \"speaker-cd79p\" (UID: \"48822f3f-9799-45f5-9551-e2ac6966c560\") " pod="metallb-system/speaker-cd79p" Dec 01 
10:16:58 crc kubenswrapper[4958]: I1201 10:16:58.631003 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-cd79p" Dec 01 10:16:58 crc kubenswrapper[4958]: W1201 10:16:58.655106 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48822f3f_9799_45f5_9551_e2ac6966c560.slice/crio-815820bb8d1d3e557849e003052e9b4899ec21e38d5751e489945727d240ccc2 WatchSource:0}: Error finding container 815820bb8d1d3e557849e003052e9b4899ec21e38d5751e489945727d240ccc2: Status 404 returned error can't find the container with id 815820bb8d1d3e557849e003052e9b4899ec21e38d5751e489945727d240ccc2 Dec 01 10:16:58 crc kubenswrapper[4958]: I1201 10:16:58.780718 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-clzbq" event={"ID":"925d86c1-9ff9-4290-a32b-142cdd162300","Type":"ContainerStarted","Data":"7f8be08541a372cea4aa4d395d4e8d32526d92b277befa46074946bc3084df24"} Dec 01 10:16:58 crc kubenswrapper[4958]: I1201 10:16:58.781796 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cd79p" event={"ID":"48822f3f-9799-45f5-9551-e2ac6966c560","Type":"ContainerStarted","Data":"815820bb8d1d3e557849e003052e9b4899ec21e38d5751e489945727d240ccc2"} Dec 01 10:16:59 crc kubenswrapper[4958]: I1201 10:16:59.816395 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqfhn" event={"ID":"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a","Type":"ContainerStarted","Data":"ea0f5d00bfe28a2ec3518fc5ec392d2d2e1e0c7ce6485ca990492dfef852f6f1"} Dec 01 10:16:59 crc kubenswrapper[4958]: I1201 10:16:59.829817 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cd79p" event={"ID":"48822f3f-9799-45f5-9551-e2ac6966c560","Type":"ContainerStarted","Data":"74233307409f97595a37cba41eef6427ce251d640b324460349b071c4e120ead"} Dec 01 10:16:59 crc kubenswrapper[4958]: I1201 10:16:59.830139 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cd79p" event={"ID":"48822f3f-9799-45f5-9551-e2ac6966c560","Type":"ContainerStarted","Data":"ce48c7306fc940f212b069b716b89ee94519d35b23729f5d44639b8c6e7bdd7c"} Dec 01 10:16:59 crc kubenswrapper[4958]: I1201 10:16:59.830222 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-cd79p" Dec 01 10:16:59 crc kubenswrapper[4958]: I1201 10:16:59.856062 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lqfhn" podStartSLOduration=2.248100105 podStartE2EDuration="5.856033853s" podCreationTimestamp="2025-12-01 10:16:54 +0000 UTC" firstStartedPulling="2025-12-01 10:16:55.729525648 +0000 UTC m=+1063.238314685" lastFinishedPulling="2025-12-01 10:16:59.337459396 +0000 UTC m=+1066.846248433" observedRunningTime="2025-12-01 10:16:59.85234352 +0000 UTC m=+1067.361132557" watchObservedRunningTime="2025-12-01 10:16:59.856033853 +0000 UTC m=+1067.364822890" Dec 01 10:16:59 crc kubenswrapper[4958]: I1201 10:16:59.890962 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-cd79p" podStartSLOduration=3.8909344839999997 podStartE2EDuration="3.890934484s" podCreationTimestamp="2025-12-01 10:16:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:16:59.886758567 +0000 UTC 
m=+1067.395547604" watchObservedRunningTime="2025-12-01 10:16:59.890934484 +0000 UTC m=+1067.399723521" Dec 01 10:17:04 crc kubenswrapper[4958]: I1201 10:17:04.885977 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lqfhn" Dec 01 10:17:04 crc kubenswrapper[4958]: I1201 10:17:04.886641 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lqfhn" Dec 01 10:17:04 crc kubenswrapper[4958]: I1201 10:17:04.972972 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lqfhn" Dec 01 10:17:05 crc kubenswrapper[4958]: I1201 10:17:05.954583 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lqfhn" Dec 01 10:17:06 crc kubenswrapper[4958]: I1201 10:17:06.009551 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqfhn"] Dec 01 10:17:06 crc kubenswrapper[4958]: I1201 10:17:06.899384 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-clzbq" event={"ID":"925d86c1-9ff9-4290-a32b-142cdd162300","Type":"ContainerStarted","Data":"5cd133371f1995437e9021a146d29ffb2a1ac3f9e21c7c8912576268253c74af"} Dec 01 10:17:06 crc kubenswrapper[4958]: I1201 10:17:06.899991 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-clzbq" Dec 01 10:17:06 crc kubenswrapper[4958]: I1201 10:17:06.901239 4958 generic.go:334] "Generic (PLEG): container finished" podID="4deb5735-ce1d-4302-9dad-783a64a13380" containerID="d80d33cffb43c625815ad68e0cddb4cb1ad5a5dd2ae3274c65fe9efdb3820399" exitCode=0 Dec 01 10:17:06 crc kubenswrapper[4958]: I1201 10:17:06.901401 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5gghz" event={"ID":"4deb5735-ce1d-4302-9dad-783a64a13380","Type":"ContainerDied","Data":"d80d33cffb43c625815ad68e0cddb4cb1ad5a5dd2ae3274c65fe9efdb3820399"} Dec 01 10:17:06 crc kubenswrapper[4958]: I1201 10:17:06.923128 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-clzbq" podStartSLOduration=2.127848854 podStartE2EDuration="10.923103336s" podCreationTimestamp="2025-12-01 10:16:56 +0000 UTC" firstStartedPulling="2025-12-01 10:16:57.888889797 +0000 UTC m=+1065.397678844" lastFinishedPulling="2025-12-01 10:17:06.684144279 +0000 UTC m=+1074.192933326" observedRunningTime="2025-12-01 10:17:06.922180221 +0000 UTC m=+1074.430969268" watchObservedRunningTime="2025-12-01 10:17:06.923103336 +0000 UTC m=+1074.431892393" Dec 01 10:17:07 crc kubenswrapper[4958]: I1201 10:17:07.155521 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-vmp79" Dec 01 10:17:07 crc kubenswrapper[4958]: I1201 10:17:07.910219 4958 generic.go:334] "Generic (PLEG): container finished" podID="4deb5735-ce1d-4302-9dad-783a64a13380" containerID="1aa4d60f58733adca5e0646a3f28e3a90449bb7699c8cfc2fe0f9ef43ec11f78" exitCode=0 Dec 01 10:17:07 crc kubenswrapper[4958]: I1201 10:17:07.910346 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5gghz" event={"ID":"4deb5735-ce1d-4302-9dad-783a64a13380","Type":"ContainerDied","Data":"1aa4d60f58733adca5e0646a3f28e3a90449bb7699c8cfc2fe0f9ef43ec11f78"} Dec 01 10:17:07 crc kubenswrapper[4958]: I1201 
10:17:07.911183 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lqfhn" podUID="d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a" containerName="registry-server" containerID="cri-o://ea0f5d00bfe28a2ec3518fc5ec392d2d2e1e0c7ce6485ca990492dfef852f6f1" gracePeriod=2 Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.290155 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqfhn" Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.408567 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a-utilities\") pod \"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a\" (UID: \"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a\") " Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.408728 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxjw6\" (UniqueName: \"kubernetes.io/projected/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a-kube-api-access-gxjw6\") pod \"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a\" (UID: \"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a\") " Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.408803 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a-catalog-content\") pod \"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a\" (UID: \"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a\") " Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.410483 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a-utilities" (OuterVolumeSpecName: "utilities") pod "d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a" (UID: "d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.424214 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a-kube-api-access-gxjw6" (OuterVolumeSpecName: "kube-api-access-gxjw6") pod "d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a" (UID: "d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a"). InnerVolumeSpecName "kube-api-access-gxjw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.432796 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a" (UID: "d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.510479 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxjw6\" (UniqueName: \"kubernetes.io/projected/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a-kube-api-access-gxjw6\") on node \"crc\" DevicePath \"\"" Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.510530 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.510559 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.634826 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-cd79p" Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.920973 4958 generic.go:334] "Generic (PLEG): container finished" podID="4deb5735-ce1d-4302-9dad-783a64a13380" containerID="157f364c8a3f9ee9ed3f1c7b0f517994a0a3f75c177534e79ec087ede1ccc185" exitCode=0 Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.921444 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5gghz" event={"ID":"4deb5735-ce1d-4302-9dad-783a64a13380","Type":"ContainerDied","Data":"157f364c8a3f9ee9ed3f1c7b0f517994a0a3f75c177534e79ec087ede1ccc185"} Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.924341 4958 generic.go:334] "Generic (PLEG): container finished" podID="d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a" containerID="ea0f5d00bfe28a2ec3518fc5ec392d2d2e1e0c7ce6485ca990492dfef852f6f1" exitCode=0 Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.924385 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqfhn" event={"ID":"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a","Type":"ContainerDied","Data":"ea0f5d00bfe28a2ec3518fc5ec392d2d2e1e0c7ce6485ca990492dfef852f6f1"} Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.924427 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqfhn" Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.924451 4958 scope.go:117] "RemoveContainer" containerID="ea0f5d00bfe28a2ec3518fc5ec392d2d2e1e0c7ce6485ca990492dfef852f6f1" Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.924433 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqfhn" event={"ID":"d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a","Type":"ContainerDied","Data":"7356f8e8d95275f0ab3f21cc375ca97393c6ae0898e86dbdd72333bdc0087b1b"} Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.973470 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqfhn"] Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.975867 4958 scope.go:117] "RemoveContainer" containerID="a3e9cd2b694623368f14bfa983057e320e3254eb0aa28ff5f72eb82df6db4824" Dec 01 10:17:08 crc kubenswrapper[4958]: I1201 10:17:08.978277 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqfhn"] Dec 01 10:17:09 crc kubenswrapper[4958]: I1201 10:17:09.015019 4958 scope.go:117] "RemoveContainer" containerID="d9151832441646568fb4efc91ec86b46b8533b333301c9f29b8456abbee53eb8" Dec 01 10:17:09 crc kubenswrapper[4958]: I1201 10:17:09.041815 4958 scope.go:117] "RemoveContainer" containerID="ea0f5d00bfe28a2ec3518fc5ec392d2d2e1e0c7ce6485ca990492dfef852f6f1" Dec 01 10:17:09 crc kubenswrapper[4958]: E1201 10:17:09.042675 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea0f5d00bfe28a2ec3518fc5ec392d2d2e1e0c7ce6485ca990492dfef852f6f1\": container with ID starting with ea0f5d00bfe28a2ec3518fc5ec392d2d2e1e0c7ce6485ca990492dfef852f6f1 not found: ID does not exist" containerID="ea0f5d00bfe28a2ec3518fc5ec392d2d2e1e0c7ce6485ca990492dfef852f6f1" Dec 01 10:17:09 crc kubenswrapper[4958]: I1201 10:17:09.042735 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0f5d00bfe28a2ec3518fc5ec392d2d2e1e0c7ce6485ca990492dfef852f6f1"} err="failed to get container status \"ea0f5d00bfe28a2ec3518fc5ec392d2d2e1e0c7ce6485ca990492dfef852f6f1\": rpc error: code = NotFound desc = could not find container \"ea0f5d00bfe28a2ec3518fc5ec392d2d2e1e0c7ce6485ca990492dfef852f6f1\": container with ID starting with ea0f5d00bfe28a2ec3518fc5ec392d2d2e1e0c7ce6485ca990492dfef852f6f1 not found: ID does not exist" Dec 01 10:17:09 crc kubenswrapper[4958]: I1201 10:17:09.042771 4958 scope.go:117] "RemoveContainer" containerID="a3e9cd2b694623368f14bfa983057e320e3254eb0aa28ff5f72eb82df6db4824" Dec 01 10:17:09 crc kubenswrapper[4958]: E1201 10:17:09.043163 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3e9cd2b694623368f14bfa983057e320e3254eb0aa28ff5f72eb82df6db4824\": container with ID starting with a3e9cd2b694623368f14bfa983057e320e3254eb0aa28ff5f72eb82df6db4824 not found: ID does not exist" containerID="a3e9cd2b694623368f14bfa983057e320e3254eb0aa28ff5f72eb82df6db4824" Dec 01 10:17:09 crc kubenswrapper[4958]: I1201 10:17:09.043210 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3e9cd2b694623368f14bfa983057e320e3254eb0aa28ff5f72eb82df6db4824"} err="failed to get container status \"a3e9cd2b694623368f14bfa983057e320e3254eb0aa28ff5f72eb82df6db4824\": rpc error: code = NotFound desc = could not find 
container \"a3e9cd2b694623368f14bfa983057e320e3254eb0aa28ff5f72eb82df6db4824\": container with ID starting with a3e9cd2b694623368f14bfa983057e320e3254eb0aa28ff5f72eb82df6db4824 not found: ID does not exist" Dec 01 10:17:09 crc kubenswrapper[4958]: I1201 10:17:09.043240 4958 scope.go:117] "RemoveContainer" containerID="d9151832441646568fb4efc91ec86b46b8533b333301c9f29b8456abbee53eb8" Dec 01 10:17:09 crc kubenswrapper[4958]: E1201 10:17:09.043482 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9151832441646568fb4efc91ec86b46b8533b333301c9f29b8456abbee53eb8\": container with ID starting with d9151832441646568fb4efc91ec86b46b8533b333301c9f29b8456abbee53eb8 not found: ID does not exist" containerID="d9151832441646568fb4efc91ec86b46b8533b333301c9f29b8456abbee53eb8" Dec 01 10:17:09 crc kubenswrapper[4958]: I1201 10:17:09.043511 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9151832441646568fb4efc91ec86b46b8533b333301c9f29b8456abbee53eb8"} err="failed to get container status \"d9151832441646568fb4efc91ec86b46b8533b333301c9f29b8456abbee53eb8\": rpc error: code = NotFound desc = could not find container \"d9151832441646568fb4efc91ec86b46b8533b333301c9f29b8456abbee53eb8\": container with ID starting with d9151832441646568fb4efc91ec86b46b8533b333301c9f29b8456abbee53eb8 not found: ID does not exist" Dec 01 10:17:09 crc kubenswrapper[4958]: I1201 10:17:09.820219 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a" path="/var/lib/kubelet/pods/d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a/volumes" Dec 01 10:17:09 crc kubenswrapper[4958]: I1201 10:17:09.944949 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5gghz" event={"ID":"4deb5735-ce1d-4302-9dad-783a64a13380","Type":"ContainerStarted","Data":"5b667a2fefc8c6af56eb4a60f72edb77e90263b2a72b42bfde25a5ed90e3800d"} Dec 01 10:17:09 crc kubenswrapper[4958]: I1201 10:17:09.945010 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5gghz" event={"ID":"4deb5735-ce1d-4302-9dad-783a64a13380","Type":"ContainerStarted","Data":"84c1e0e2cc6a9f033508bac9b3312ae9a747abf5b3ba259f225b66f41b67dfc1"} Dec 01 10:17:09 crc kubenswrapper[4958]: I1201 10:17:09.945065 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5gghz" event={"ID":"4deb5735-ce1d-4302-9dad-783a64a13380","Type":"ContainerStarted","Data":"08d07689a2e24f37ec7065f9071d63b2f626d4d6dfa9ad633f7f097411512732"} Dec 01 10:17:09 crc kubenswrapper[4958]: I1201 10:17:09.945077 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5gghz" event={"ID":"4deb5735-ce1d-4302-9dad-783a64a13380","Type":"ContainerStarted","Data":"b7cc298fda1313a2c441116bb7ff0e919992fd1a03421429bf988b7e09ea728c"} Dec 01 10:17:09 crc kubenswrapper[4958]: I1201 10:17:09.945088 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5gghz" event={"ID":"4deb5735-ce1d-4302-9dad-783a64a13380","Type":"ContainerStarted","Data":"58afe59a3f67dff547dac06ac730c80795360ec8c206fee2c1e0595b845a9056"} Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.262414 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q"] Dec 01 10:17:10 crc kubenswrapper[4958]: E1201 10:17:10.263120 4958 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a" containerName="extract-utilities" Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.263137 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a" containerName="extract-utilities" Dec 01 10:17:10 crc kubenswrapper[4958]: E1201 10:17:10.263151 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a" containerName="registry-server" Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.263157 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a" containerName="registry-server" Dec 01 10:17:10 crc kubenswrapper[4958]: E1201 10:17:10.263179 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a" containerName="extract-content" Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.263188 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a" containerName="extract-content" Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.263302 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2dbaa2a-b1c7-44e0-8585-7d0fac50e36a" containerName="registry-server" Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.264311 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q" Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.267382 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.289897 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q"] Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.346704 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e520358-0182-4324-865d-7bc6a261c2e4-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q\" (UID: \"9e520358-0182-4324-865d-7bc6a261c2e4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q" Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.347075 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd4d7\" (UniqueName: \"kubernetes.io/projected/9e520358-0182-4324-865d-7bc6a261c2e4-kube-api-access-fd4d7\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q\" (UID: \"9e520358-0182-4324-865d-7bc6a261c2e4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q" Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.347463 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e520358-0182-4324-865d-7bc6a261c2e4-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q\" (UID: \"9e520358-0182-4324-865d-7bc6a261c2e4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q" Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.449359 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd4d7\" (UniqueName: 
\"kubernetes.io/projected/9e520358-0182-4324-865d-7bc6a261c2e4-kube-api-access-fd4d7\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q\" (UID: \"9e520358-0182-4324-865d-7bc6a261c2e4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q" Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.449496 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e520358-0182-4324-865d-7bc6a261c2e4-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q\" (UID: \"9e520358-0182-4324-865d-7bc6a261c2e4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q" Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.449524 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e520358-0182-4324-865d-7bc6a261c2e4-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q\" (UID: \"9e520358-0182-4324-865d-7bc6a261c2e4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q" Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.450087 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e520358-0182-4324-865d-7bc6a261c2e4-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q\" (UID: \"9e520358-0182-4324-865d-7bc6a261c2e4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q" Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.450196 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e520358-0182-4324-865d-7bc6a261c2e4-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q\" (UID: \"9e520358-0182-4324-865d-7bc6a261c2e4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q" Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.484433 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd4d7\" (UniqueName: \"kubernetes.io/projected/9e520358-0182-4324-865d-7bc6a261c2e4-kube-api-access-fd4d7\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q\" (UID: \"9e520358-0182-4324-865d-7bc6a261c2e4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q" Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.582030 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q" Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.962444 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5gghz" event={"ID":"4deb5735-ce1d-4302-9dad-783a64a13380","Type":"ContainerStarted","Data":"618735d068cae968eb93c6893ff78dfd49348638dca9f94431c314c70ed552e1"} Dec 01 10:17:10 crc kubenswrapper[4958]: I1201 10:17:10.962928 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-5gghz" Dec 01 10:17:11 crc kubenswrapper[4958]: I1201 10:17:11.009254 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-5gghz" podStartSLOduration=5.530881753 podStartE2EDuration="15.009226656s" podCreationTimestamp="2025-12-01 10:16:56 +0000 UTC" firstStartedPulling="2025-12-01 10:16:57.164333151 +0000 UTC m=+1064.673122188" lastFinishedPulling="2025-12-01 10:17:06.642678044 +0000 UTC m=+1074.151467091" observedRunningTime="2025-12-01 10:17:11.00829661 +0000 UTC m=+1078.517085647" watchObservedRunningTime="2025-12-01 10:17:11.009226656 +0000 UTC m=+1078.518015703" Dec 01 10:17:11 crc kubenswrapper[4958]: I1201 10:17:11.046552 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q"] Dec 01 10:17:11 crc kubenswrapper[4958]: W1201 10:17:11.050959 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e520358_0182_4324_865d_7bc6a261c2e4.slice/crio-a684d5890a77136dc0727b929eafc8f369235a97c345fef1a25eb364e5a0bafa WatchSource:0}: Error finding container a684d5890a77136dc0727b929eafc8f369235a97c345fef1a25eb364e5a0bafa: Status 404 returned error can't find the container with id a684d5890a77136dc0727b929eafc8f369235a97c345fef1a25eb364e5a0bafa Dec 01 10:17:11 crc kubenswrapper[4958]: I1201 10:17:11.970711 4958 generic.go:334] "Generic (PLEG): container finished" podID="9e520358-0182-4324-865d-7bc6a261c2e4" containerID="b4517ed744d1117834c3f6aa41efc22727729fdcf92d2462810500d48bcf85b4" exitCode=0 Dec 01 10:17:11 crc kubenswrapper[4958]: I1201 10:17:11.970779 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q" event={"ID":"9e520358-0182-4324-865d-7bc6a261c2e4","Type":"ContainerDied","Data":"b4517ed744d1117834c3f6aa41efc22727729fdcf92d2462810500d48bcf85b4"} Dec 01 10:17:11 crc kubenswrapper[4958]: I1201 10:17:11.971159 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q" event={"ID":"9e520358-0182-4324-865d-7bc6a261c2e4","Type":"ContainerStarted","Data":"a684d5890a77136dc0727b929eafc8f369235a97c345fef1a25eb364e5a0bafa"} Dec 01 10:17:12 crc kubenswrapper[4958]: I1201 10:17:12.013865 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-5gghz" Dec 01 10:17:12 crc kubenswrapper[4958]: I1201 10:17:12.059144 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-5gghz" Dec 01 10:17:15 crc kubenswrapper[4958]: I1201 10:17:15.998384 4958 generic.go:334] "Generic (PLEG): container finished" podID="9e520358-0182-4324-865d-7bc6a261c2e4" containerID="90c6f2e6dc06f13523e1cefb630bacd5222e6075b8d7316f8413f5e31578e778" exitCode=0 Dec 01 10:17:15 crc 
kubenswrapper[4958]: I1201 10:17:15.998465 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q" event={"ID":"9e520358-0182-4324-865d-7bc6a261c2e4","Type":"ContainerDied","Data":"90c6f2e6dc06f13523e1cefb630bacd5222e6075b8d7316f8413f5e31578e778"} Dec 01 10:17:17 crc kubenswrapper[4958]: I1201 10:17:17.009893 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q" event={"ID":"9e520358-0182-4324-865d-7bc6a261c2e4","Type":"ContainerDied","Data":"9d75b2b4b43f8dc005bb00a9b8a35deb31a97f221427eee66feb1caa8aa6916a"} Dec 01 10:17:17 crc kubenswrapper[4958]: I1201 10:17:17.009821 4958 generic.go:334] "Generic (PLEG): container finished" podID="9e520358-0182-4324-865d-7bc6a261c2e4" containerID="9d75b2b4b43f8dc005bb00a9b8a35deb31a97f221427eee66feb1caa8aa6916a" exitCode=0 Dec 01 10:17:17 crc kubenswrapper[4958]: I1201 10:17:17.631448 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-clzbq" Dec 01 10:17:18 crc kubenswrapper[4958]: I1201 10:17:18.290776 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q" Dec 01 10:17:18 crc kubenswrapper[4958]: I1201 10:17:18.389268 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e520358-0182-4324-865d-7bc6a261c2e4-util\") pod \"9e520358-0182-4324-865d-7bc6a261c2e4\" (UID: \"9e520358-0182-4324-865d-7bc6a261c2e4\") " Dec 01 10:17:18 crc kubenswrapper[4958]: I1201 10:17:18.389763 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd4d7\" (UniqueName: \"kubernetes.io/projected/9e520358-0182-4324-865d-7bc6a261c2e4-kube-api-access-fd4d7\") pod \"9e520358-0182-4324-865d-7bc6a261c2e4\" (UID: \"9e520358-0182-4324-865d-7bc6a261c2e4\") " Dec 01 10:17:18 crc kubenswrapper[4958]: I1201 10:17:18.389835 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e520358-0182-4324-865d-7bc6a261c2e4-bundle\") pod \"9e520358-0182-4324-865d-7bc6a261c2e4\" (UID: \"9e520358-0182-4324-865d-7bc6a261c2e4\") " Dec 01 10:17:18 crc kubenswrapper[4958]: I1201 10:17:18.391513 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e520358-0182-4324-865d-7bc6a261c2e4-bundle" (OuterVolumeSpecName: "bundle") pod "9e520358-0182-4324-865d-7bc6a261c2e4" (UID: "9e520358-0182-4324-865d-7bc6a261c2e4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:17:18 crc kubenswrapper[4958]: I1201 10:17:18.401178 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e520358-0182-4324-865d-7bc6a261c2e4-util" (OuterVolumeSpecName: "util") pod "9e520358-0182-4324-865d-7bc6a261c2e4" (UID: "9e520358-0182-4324-865d-7bc6a261c2e4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:17:18 crc kubenswrapper[4958]: I1201 10:17:18.403039 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e520358-0182-4324-865d-7bc6a261c2e4-kube-api-access-fd4d7" (OuterVolumeSpecName: "kube-api-access-fd4d7") pod "9e520358-0182-4324-865d-7bc6a261c2e4" (UID: "9e520358-0182-4324-865d-7bc6a261c2e4"). InnerVolumeSpecName "kube-api-access-fd4d7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:17:18 crc kubenswrapper[4958]: I1201 10:17:18.491971 4958 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e520358-0182-4324-865d-7bc6a261c2e4-util\") on node \"crc\" DevicePath \"\"" Dec 01 10:17:18 crc kubenswrapper[4958]: I1201 10:17:18.492030 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd4d7\" (UniqueName: \"kubernetes.io/projected/9e520358-0182-4324-865d-7bc6a261c2e4-kube-api-access-fd4d7\") on node \"crc\" DevicePath \"\"" Dec 01 10:17:18 crc kubenswrapper[4958]: I1201 10:17:18.492050 4958 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e520358-0182-4324-865d-7bc6a261c2e4-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:17:19 crc kubenswrapper[4958]: I1201 10:17:19.029697 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q" event={"ID":"9e520358-0182-4324-865d-7bc6a261c2e4","Type":"ContainerDied","Data":"a684d5890a77136dc0727b929eafc8f369235a97c345fef1a25eb364e5a0bafa"} Dec 01 10:17:19 crc kubenswrapper[4958]: I1201 10:17:19.029762 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a684d5890a77136dc0727b929eafc8f369235a97c345fef1a25eb364e5a0bafa" Dec 01 10:17:19 crc kubenswrapper[4958]: I1201 10:17:19.029777 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q" Dec 01 10:17:24 crc kubenswrapper[4958]: I1201 10:17:24.014296 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-sqwj9"] Dec 01 10:17:24 crc kubenswrapper[4958]: E1201 10:17:24.014992 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e520358-0182-4324-865d-7bc6a261c2e4" containerName="util" Dec 01 10:17:24 crc kubenswrapper[4958]: I1201 10:17:24.015009 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e520358-0182-4324-865d-7bc6a261c2e4" containerName="util" Dec 01 10:17:24 crc kubenswrapper[4958]: E1201 10:17:24.015022 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e520358-0182-4324-865d-7bc6a261c2e4" containerName="pull" Dec 01 10:17:24 crc kubenswrapper[4958]: I1201 10:17:24.015028 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e520358-0182-4324-865d-7bc6a261c2e4" containerName="pull" Dec 01 10:17:24 crc kubenswrapper[4958]: E1201 10:17:24.015044 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e520358-0182-4324-865d-7bc6a261c2e4" containerName="extract" Dec 01 10:17:24 crc kubenswrapper[4958]: I1201 10:17:24.015051 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e520358-0182-4324-865d-7bc6a261c2e4" containerName="extract" Dec 01 10:17:24 crc kubenswrapper[4958]: I1201 10:17:24.015168 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e520358-0182-4324-865d-7bc6a261c2e4" containerName="extract" Dec 01 10:17:24 crc kubenswrapper[4958]: I1201 10:17:24.015688 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-sqwj9" Dec 01 10:17:24 crc kubenswrapper[4958]: I1201 10:17:24.021641 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 01 10:17:24 crc kubenswrapper[4958]: I1201 10:17:24.022141 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 01 10:17:24 crc kubenswrapper[4958]: I1201 10:17:24.022495 4958 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-8d75s" Dec 01 10:17:24 crc kubenswrapper[4958]: I1201 10:17:24.031333 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-sqwj9"] Dec 01 10:17:24 crc kubenswrapper[4958]: I1201 10:17:24.081659 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4222acd7-6068-4cb1-a0a2-b4bc3ba891d9-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-sqwj9\" (UID: \"4222acd7-6068-4cb1-a0a2-b4bc3ba891d9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-sqwj9" Dec 01 10:17:24 crc kubenswrapper[4958]: I1201 10:17:24.081768 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbxxq\" (UniqueName: \"kubernetes.io/projected/4222acd7-6068-4cb1-a0a2-b4bc3ba891d9-kube-api-access-dbxxq\") pod \"cert-manager-operator-controller-manager-64cf6dff88-sqwj9\" (UID: \"4222acd7-6068-4cb1-a0a2-b4bc3ba891d9\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-sqwj9" Dec 01 10:17:24 crc kubenswrapper[4958]: I1201 10:17:24.184330 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4222acd7-6068-4cb1-a0a2-b4bc3ba891d9-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-sqwj9\" (UID: \"4222acd7-6068-4cb1-a0a2-b4bc3ba891d9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-sqwj9" Dec 01 10:17:24 crc kubenswrapper[4958]: I1201 10:17:24.184425 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbxxq\" (UniqueName: \"kubernetes.io/projected/4222acd7-6068-4cb1-a0a2-b4bc3ba891d9-kube-api-access-dbxxq\") pod \"cert-manager-operator-controller-manager-64cf6dff88-sqwj9\" (UID: \"4222acd7-6068-4cb1-a0a2-b4bc3ba891d9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-sqwj9" Dec 01 10:17:24 crc kubenswrapper[4958]: I1201 10:17:24.185070 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4222acd7-6068-4cb1-a0a2-b4bc3ba891d9-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-sqwj9\" (UID: \"4222acd7-6068-4cb1-a0a2-b4bc3ba891d9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-sqwj9" Dec 01 10:17:24 crc kubenswrapper[4958]: I1201 10:17:24.207150 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbxxq\" (UniqueName: \"kubernetes.io/projected/4222acd7-6068-4cb1-a0a2-b4bc3ba891d9-kube-api-access-dbxxq\") pod \"cert-manager-operator-controller-manager-64cf6dff88-sqwj9\" (UID: \"4222acd7-6068-4cb1-a0a2-b4bc3ba891d9\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-sqwj9" Dec 01 10:17:24 crc kubenswrapper[4958]: I1201 10:17:24.335988 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-sqwj9" Dec 01 10:17:24 crc kubenswrapper[4958]: I1201 10:17:24.587080 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-sqwj9"] Dec 01 10:17:24 crc kubenswrapper[4958]: W1201 10:17:24.594619 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4222acd7_6068_4cb1_a0a2_b4bc3ba891d9.slice/crio-036bc6668f87c87be62bad2256339f6492b75fd5af21a19bd33a462140e29979 WatchSource:0}: Error finding container 036bc6668f87c87be62bad2256339f6492b75fd5af21a19bd33a462140e29979: Status 404 returned error can't find the container with id 036bc6668f87c87be62bad2256339f6492b75fd5af21a19bd33a462140e29979 Dec 01 10:17:25 crc kubenswrapper[4958]: I1201 10:17:25.065993 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-sqwj9" event={"ID":"4222acd7-6068-4cb1-a0a2-b4bc3ba891d9","Type":"ContainerStarted","Data":"036bc6668f87c87be62bad2256339f6492b75fd5af21a19bd33a462140e29979"} Dec 01 10:17:27 crc kubenswrapper[4958]: I1201 10:17:27.216376 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-5gghz" Dec 01 10:17:28 crc kubenswrapper[4958]: I1201 10:17:28.210824 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:17:28 crc kubenswrapper[4958]: I1201 10:17:28.210913 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:17:34 crc kubenswrapper[4958]: I1201 10:17:34.254801 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-sqwj9" event={"ID":"4222acd7-6068-4cb1-a0a2-b4bc3ba891d9","Type":"ContainerStarted","Data":"2ef7b4fb7f166b296ccc1c0d2ffd01368adcf31536f62cb5a578f5670bfc9cf5"} Dec 01 10:17:37 crc kubenswrapper[4958]: I1201 10:17:37.245404 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-sqwj9" podStartSLOduration=5.455714519 podStartE2EDuration="14.245373854s" podCreationTimestamp="2025-12-01 10:17:23 +0000 UTC" firstStartedPulling="2025-12-01 10:17:24.597550388 +0000 UTC m=+1092.106339425" lastFinishedPulling="2025-12-01 10:17:33.387209723 +0000 UTC m=+1100.895998760" observedRunningTime="2025-12-01 10:17:34.28220318 +0000 UTC m=+1101.790992217" watchObservedRunningTime="2025-12-01 10:17:37.245373854 +0000 UTC m=+1104.754162891" Dec 01 10:17:37 crc kubenswrapper[4958]: I1201 10:17:37.248284 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-26gvw"] Dec 01 10:17:37 crc kubenswrapper[4958]: I1201 10:17:37.249488 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-26gvw" Dec 01 10:17:37 crc kubenswrapper[4958]: I1201 10:17:37.252077 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 01 10:17:37 crc kubenswrapper[4958]: I1201 10:17:37.252591 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 01 10:17:37 crc kubenswrapper[4958]: I1201 10:17:37.252665 4958 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-cs6xc" Dec 01 10:17:37 crc kubenswrapper[4958]: I1201 10:17:37.261835 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-26gvw"] Dec 01 10:17:37 crc kubenswrapper[4958]: I1201 10:17:37.339819 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86303ba1-0515-43b8-92ec-14f433ad784c-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-26gvw\" (UID: \"86303ba1-0515-43b8-92ec-14f433ad784c\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-26gvw" Dec 01 10:17:37 crc kubenswrapper[4958]: I1201 10:17:37.340523 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shkcb\" (UniqueName: \"kubernetes.io/projected/86303ba1-0515-43b8-92ec-14f433ad784c-kube-api-access-shkcb\") pod \"cert-manager-webhook-f4fb5df64-26gvw\" (UID: \"86303ba1-0515-43b8-92ec-14f433ad784c\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-26gvw" Dec 01 10:17:37 crc kubenswrapper[4958]: I1201 10:17:37.441802 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86303ba1-0515-43b8-92ec-14f433ad784c-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-26gvw\" (UID: \"86303ba1-0515-43b8-92ec-14f433ad784c\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-26gvw" Dec 01 10:17:37 crc kubenswrapper[4958]: I1201 10:17:37.441926 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shkcb\" (UniqueName: \"kubernetes.io/projected/86303ba1-0515-43b8-92ec-14f433ad784c-kube-api-access-shkcb\") pod \"cert-manager-webhook-f4fb5df64-26gvw\" (UID: \"86303ba1-0515-43b8-92ec-14f433ad784c\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-26gvw" Dec 01 10:17:37 crc kubenswrapper[4958]: I1201 10:17:37.471171 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86303ba1-0515-43b8-92ec-14f433ad784c-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-26gvw\" (UID: \"86303ba1-0515-43b8-92ec-14f433ad784c\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-26gvw" Dec 01 10:17:37 crc kubenswrapper[4958]: I1201 10:17:37.472809 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shkcb\" (UniqueName: \"kubernetes.io/projected/86303ba1-0515-43b8-92ec-14f433ad784c-kube-api-access-shkcb\") pod \"cert-manager-webhook-f4fb5df64-26gvw\" (UID: \"86303ba1-0515-43b8-92ec-14f433ad784c\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-26gvw" Dec 01 10:17:37 crc kubenswrapper[4958]: I1201 10:17:37.573041 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-26gvw" Dec 01 10:17:38 crc kubenswrapper[4958]: I1201 10:17:38.054399 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-26gvw"] Dec 01 10:17:38 crc kubenswrapper[4958]: I1201 10:17:38.279451 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-26gvw" event={"ID":"86303ba1-0515-43b8-92ec-14f433ad784c","Type":"ContainerStarted","Data":"a93b6ab9558ec09dbe9b5c8a2a3ea0eeb32faa458afbbf06ce031e6f239b204d"} Dec 01 10:17:40 crc kubenswrapper[4958]: I1201 10:17:40.144926 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-88hgg"] Dec 01 10:17:40 crc kubenswrapper[4958]: I1201 10:17:40.146336 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-88hgg" Dec 01 10:17:40 crc kubenswrapper[4958]: I1201 10:17:40.149527 4958 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-wrj8z" Dec 01 10:17:40 crc kubenswrapper[4958]: I1201 10:17:40.156663 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-88hgg"] Dec 01 10:17:40 crc kubenswrapper[4958]: I1201 10:17:40.299775 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8cde77f-dd6f-47cf-9978-dd91ea966f6a-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-88hgg\" (UID: \"d8cde77f-dd6f-47cf-9978-dd91ea966f6a\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-88hgg" Dec 01 10:17:40 crc kubenswrapper[4958]: I1201 10:17:40.299879 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s5nd\" (UniqueName: \"kubernetes.io/projected/d8cde77f-dd6f-47cf-9978-dd91ea966f6a-kube-api-access-8s5nd\") pod \"cert-manager-cainjector-855d9ccff4-88hgg\" (UID: \"d8cde77f-dd6f-47cf-9978-dd91ea966f6a\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-88hgg" Dec 01 10:17:40 crc kubenswrapper[4958]: I1201 10:17:40.402172 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s5nd\" (UniqueName: \"kubernetes.io/projected/d8cde77f-dd6f-47cf-9978-dd91ea966f6a-kube-api-access-8s5nd\") pod \"cert-manager-cainjector-855d9ccff4-88hgg\" (UID: \"d8cde77f-dd6f-47cf-9978-dd91ea966f6a\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-88hgg" Dec 01 10:17:40 crc kubenswrapper[4958]: I1201 10:17:40.402326 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8cde77f-dd6f-47cf-9978-dd91ea966f6a-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-88hgg\" (UID: \"d8cde77f-dd6f-47cf-9978-dd91ea966f6a\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-88hgg" Dec 01 10:17:40 crc kubenswrapper[4958]: I1201 10:17:40.423877 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s5nd\" (UniqueName: \"kubernetes.io/projected/d8cde77f-dd6f-47cf-9978-dd91ea966f6a-kube-api-access-8s5nd\") pod \"cert-manager-cainjector-855d9ccff4-88hgg\" (UID: \"d8cde77f-dd6f-47cf-9978-dd91ea966f6a\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-88hgg" Dec 01 10:17:40 crc kubenswrapper[4958]: I1201 10:17:40.435056 4958 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8cde77f-dd6f-47cf-9978-dd91ea966f6a-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-88hgg\" (UID: \"d8cde77f-dd6f-47cf-9978-dd91ea966f6a\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-88hgg" Dec 01 10:17:40 crc kubenswrapper[4958]: I1201 10:17:40.471140 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-88hgg" Dec 01 10:17:40 crc kubenswrapper[4958]: I1201 10:17:40.805626 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-88hgg"] Dec 01 10:17:40 crc kubenswrapper[4958]: W1201 10:17:40.820463 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8cde77f_dd6f_47cf_9978_dd91ea966f6a.slice/crio-baca6049644cbf6e94f9f2877d0386e21bdbc7940e35f84520afde7cae5c74a7 WatchSource:0}: Error finding container baca6049644cbf6e94f9f2877d0386e21bdbc7940e35f84520afde7cae5c74a7: Status 404 returned error can't find the container with id baca6049644cbf6e94f9f2877d0386e21bdbc7940e35f84520afde7cae5c74a7 Dec 01 10:17:41 crc kubenswrapper[4958]: I1201 10:17:41.318347 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-88hgg" event={"ID":"d8cde77f-dd6f-47cf-9978-dd91ea966f6a","Type":"ContainerStarted","Data":"baca6049644cbf6e94f9f2877d0386e21bdbc7940e35f84520afde7cae5c74a7"} Dec 01 10:17:50 crc kubenswrapper[4958]: I1201 10:17:50.393574 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-26gvw" event={"ID":"86303ba1-0515-43b8-92ec-14f433ad784c","Type":"ContainerStarted","Data":"16723d55bc9551c4e8fdee05a1ae55ce58ba5605d61e533742c6ff4c6a779514"} Dec 01 10:17:50 crc kubenswrapper[4958]: I1201 10:17:50.394240 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-26gvw" Dec 01 10:17:50 crc kubenswrapper[4958]: I1201 10:17:50.396175 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-88hgg" event={"ID":"d8cde77f-dd6f-47cf-9978-dd91ea966f6a","Type":"ContainerStarted","Data":"86aace47c96203c6c6813315641e3aac2317bebc1dfb8a0a2e7e6becb0bd6ab7"} Dec 01 10:17:50 crc kubenswrapper[4958]: I1201 10:17:50.416893 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-26gvw" podStartSLOduration=1.627058639 podStartE2EDuration="13.41686714s" podCreationTimestamp="2025-12-01 10:17:37 +0000 UTC" firstStartedPulling="2025-12-01 10:17:38.064952222 +0000 UTC m=+1105.573741249" lastFinishedPulling="2025-12-01 10:17:49.854760713 +0000 UTC m=+1117.363549750" observedRunningTime="2025-12-01 10:17:50.416632083 +0000 UTC m=+1117.925421120" watchObservedRunningTime="2025-12-01 10:17:50.41686714 +0000 UTC m=+1117.925656177" Dec 01 10:17:50 crc kubenswrapper[4958]: I1201 10:17:50.442498 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-88hgg" podStartSLOduration=1.419347228 podStartE2EDuration="10.442468944s" podCreationTimestamp="2025-12-01 10:17:40 +0000 UTC" firstStartedPulling="2025-12-01 10:17:40.824285016 +0000 UTC m=+1108.333074053" lastFinishedPulling="2025-12-01 10:17:49.847406732 +0000 UTC m=+1117.356195769" 
observedRunningTime="2025-12-01 10:17:50.440457756 +0000 UTC m=+1117.949246803" watchObservedRunningTime="2025-12-01 10:17:50.442468944 +0000 UTC m=+1117.951257981" Dec 01 10:17:56 crc kubenswrapper[4958]: I1201 10:17:56.257813 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-2z2hp"] Dec 01 10:17:56 crc kubenswrapper[4958]: I1201 10:17:56.260093 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-2z2hp" Dec 01 10:17:56 crc kubenswrapper[4958]: I1201 10:17:56.263160 4958 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-plpm5" Dec 01 10:17:56 crc kubenswrapper[4958]: I1201 10:17:56.269291 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-2z2hp"] Dec 01 10:17:56 crc kubenswrapper[4958]: I1201 10:17:56.333431 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/431d1004-c19f-4715-aa26-98a7c8e1e63d-bound-sa-token\") pod \"cert-manager-86cb77c54b-2z2hp\" (UID: \"431d1004-c19f-4715-aa26-98a7c8e1e63d\") " pod="cert-manager/cert-manager-86cb77c54b-2z2hp" Dec 01 10:17:56 crc kubenswrapper[4958]: I1201 10:17:56.333547 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59n5d\" (UniqueName: \"kubernetes.io/projected/431d1004-c19f-4715-aa26-98a7c8e1e63d-kube-api-access-59n5d\") pod \"cert-manager-86cb77c54b-2z2hp\" (UID: \"431d1004-c19f-4715-aa26-98a7c8e1e63d\") " pod="cert-manager/cert-manager-86cb77c54b-2z2hp" Dec 01 10:17:56 crc kubenswrapper[4958]: I1201 10:17:56.434676 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59n5d\" (UniqueName: \"kubernetes.io/projected/431d1004-c19f-4715-aa26-98a7c8e1e63d-kube-api-access-59n5d\") pod \"cert-manager-86cb77c54b-2z2hp\" (UID: \"431d1004-c19f-4715-aa26-98a7c8e1e63d\") " pod="cert-manager/cert-manager-86cb77c54b-2z2hp" Dec 01 10:17:56 crc kubenswrapper[4958]: I1201 10:17:56.434807 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/431d1004-c19f-4715-aa26-98a7c8e1e63d-bound-sa-token\") pod \"cert-manager-86cb77c54b-2z2hp\" (UID: \"431d1004-c19f-4715-aa26-98a7c8e1e63d\") " pod="cert-manager/cert-manager-86cb77c54b-2z2hp" Dec 01 10:17:56 crc kubenswrapper[4958]: I1201 10:17:56.453891 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/431d1004-c19f-4715-aa26-98a7c8e1e63d-bound-sa-token\") pod \"cert-manager-86cb77c54b-2z2hp\" (UID: \"431d1004-c19f-4715-aa26-98a7c8e1e63d\") " pod="cert-manager/cert-manager-86cb77c54b-2z2hp" Dec 01 10:17:56 crc kubenswrapper[4958]: I1201 10:17:56.454192 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59n5d\" (UniqueName: \"kubernetes.io/projected/431d1004-c19f-4715-aa26-98a7c8e1e63d-kube-api-access-59n5d\") pod \"cert-manager-86cb77c54b-2z2hp\" (UID: \"431d1004-c19f-4715-aa26-98a7c8e1e63d\") " pod="cert-manager/cert-manager-86cb77c54b-2z2hp" Dec 01 10:17:56 crc kubenswrapper[4958]: I1201 10:17:56.584151 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-2z2hp" Dec 01 10:17:56 crc kubenswrapper[4958]: I1201 10:17:56.923393 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-2z2hp"] Dec 01 10:17:56 crc kubenswrapper[4958]: W1201 10:17:56.928060 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod431d1004_c19f_4715_aa26_98a7c8e1e63d.slice/crio-ef4aca545f5caf12fdc9dcfa9c3b37c9b33f3a74be8a0b926570986e95d27d79 WatchSource:0}: Error finding container ef4aca545f5caf12fdc9dcfa9c3b37c9b33f3a74be8a0b926570986e95d27d79: Status 404 returned error can't find the container with id ef4aca545f5caf12fdc9dcfa9c3b37c9b33f3a74be8a0b926570986e95d27d79 Dec 01 10:17:57 crc kubenswrapper[4958]: I1201 10:17:57.445313 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-2z2hp" event={"ID":"431d1004-c19f-4715-aa26-98a7c8e1e63d","Type":"ContainerStarted","Data":"43347b7a7166fff1261038a97af89029cc4f9ad92a3b1258dd489ad2eefb9c85"} Dec 01 10:17:57 crc kubenswrapper[4958]: I1201 10:17:57.445658 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-2z2hp" event={"ID":"431d1004-c19f-4715-aa26-98a7c8e1e63d","Type":"ContainerStarted","Data":"ef4aca545f5caf12fdc9dcfa9c3b37c9b33f3a74be8a0b926570986e95d27d79"} Dec 01 10:17:57 crc kubenswrapper[4958]: I1201 10:17:57.466127 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-2z2hp" podStartSLOduration=1.466096371 podStartE2EDuration="1.466096371s" podCreationTimestamp="2025-12-01 10:17:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:17:57.462654292 +0000 UTC m=+1124.971443329" watchObservedRunningTime="2025-12-01 10:17:57.466096371 +0000 UTC m=+1124.974885408" Dec 01 10:17:57 crc kubenswrapper[4958]: I1201 10:17:57.575595 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-26gvw" Dec 01 10:17:58 crc kubenswrapper[4958]: I1201 10:17:58.210777 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:17:58 crc kubenswrapper[4958]: I1201 10:17:58.210896 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:18:00 crc kubenswrapper[4958]: I1201 10:18:00.803569 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wzl2n"] Dec 01 10:18:00 crc kubenswrapper[4958]: I1201 10:18:00.804930 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wzl2n" Dec 01 10:18:00 crc kubenswrapper[4958]: I1201 10:18:00.810542 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 01 10:18:00 crc kubenswrapper[4958]: I1201 10:18:00.810921 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-shwq6" Dec 01 10:18:00 crc kubenswrapper[4958]: I1201 10:18:00.811099 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 01 10:18:00 crc kubenswrapper[4958]: I1201 10:18:00.820669 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wzl2n"] Dec 01 10:18:00 crc kubenswrapper[4958]: I1201 10:18:00.860485 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxg5p\" (UniqueName: \"kubernetes.io/projected/b0ad5e85-f64a-4435-8fa3-c062111c8bf9-kube-api-access-bxg5p\") pod \"openstack-operator-index-wzl2n\" (UID: \"b0ad5e85-f64a-4435-8fa3-c062111c8bf9\") " pod="openstack-operators/openstack-operator-index-wzl2n" Dec 01 10:18:00 crc kubenswrapper[4958]: I1201 10:18:00.961919 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxg5p\" (UniqueName: \"kubernetes.io/projected/b0ad5e85-f64a-4435-8fa3-c062111c8bf9-kube-api-access-bxg5p\") pod \"openstack-operator-index-wzl2n\" (UID: \"b0ad5e85-f64a-4435-8fa3-c062111c8bf9\") " pod="openstack-operators/openstack-operator-index-wzl2n" Dec 01 10:18:00 crc kubenswrapper[4958]: I1201 10:18:00.995670 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxg5p\" (UniqueName: \"kubernetes.io/projected/b0ad5e85-f64a-4435-8fa3-c062111c8bf9-kube-api-access-bxg5p\") pod \"openstack-operator-index-wzl2n\" (UID: \"b0ad5e85-f64a-4435-8fa3-c062111c8bf9\") " pod="openstack-operators/openstack-operator-index-wzl2n" Dec 01 10:18:01 crc kubenswrapper[4958]: I1201 10:18:01.164662 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wzl2n" Dec 01 10:18:01 crc kubenswrapper[4958]: I1201 10:18:01.859719 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wzl2n"] Dec 01 10:18:02 crc kubenswrapper[4958]: I1201 10:18:02.502568 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wzl2n" event={"ID":"b0ad5e85-f64a-4435-8fa3-c062111c8bf9","Type":"ContainerStarted","Data":"cbcfcad7ac715c5609dd3ef4de2a0354ecc64a213b5c5c463d01c3d088ed84ac"} Dec 01 10:18:03 crc kubenswrapper[4958]: I1201 10:18:03.578704 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wzl2n"] Dec 01 10:18:04 crc kubenswrapper[4958]: I1201 10:18:04.186772 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-p7jbk"] Dec 01 10:18:04 crc kubenswrapper[4958]: I1201 10:18:04.188327 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-p7jbk" Dec 01 10:18:04 crc kubenswrapper[4958]: I1201 10:18:04.226432 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p7jbk"] Dec 01 10:18:04 crc kubenswrapper[4958]: I1201 10:18:04.263808 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpnkc\" (UniqueName: \"kubernetes.io/projected/3433758a-1a2d-48eb-9d78-f21c8b1d9bbf-kube-api-access-vpnkc\") pod \"openstack-operator-index-p7jbk\" (UID: \"3433758a-1a2d-48eb-9d78-f21c8b1d9bbf\") " pod="openstack-operators/openstack-operator-index-p7jbk" Dec 01 10:18:04 crc kubenswrapper[4958]: I1201 10:18:04.365652 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpnkc\" (UniqueName: \"kubernetes.io/projected/3433758a-1a2d-48eb-9d78-f21c8b1d9bbf-kube-api-access-vpnkc\") pod \"openstack-operator-index-p7jbk\" (UID: \"3433758a-1a2d-48eb-9d78-f21c8b1d9bbf\") " pod="openstack-operators/openstack-operator-index-p7jbk" Dec 01 10:18:04 crc kubenswrapper[4958]: I1201 10:18:04.386742 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpnkc\" (UniqueName: \"kubernetes.io/projected/3433758a-1a2d-48eb-9d78-f21c8b1d9bbf-kube-api-access-vpnkc\") pod \"openstack-operator-index-p7jbk\" (UID: \"3433758a-1a2d-48eb-9d78-f21c8b1d9bbf\") " pod="openstack-operators/openstack-operator-index-p7jbk" Dec 01 10:18:04 crc kubenswrapper[4958]: I1201 10:18:04.509328 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-p7jbk" Dec 01 10:18:04 crc kubenswrapper[4958]: I1201 10:18:04.516727 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wzl2n" event={"ID":"b0ad5e85-f64a-4435-8fa3-c062111c8bf9","Type":"ContainerStarted","Data":"79ddbdd3a24b01ab53df40c970a2185f6ec1b48f816be36628c48ced08059956"} Dec 01 10:18:04 crc kubenswrapper[4958]: I1201 10:18:04.516922 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-wzl2n" podUID="b0ad5e85-f64a-4435-8fa3-c062111c8bf9" containerName="registry-server" containerID="cri-o://79ddbdd3a24b01ab53df40c970a2185f6ec1b48f816be36628c48ced08059956" gracePeriod=2 Dec 01 10:18:04 crc kubenswrapper[4958]: I1201 10:18:04.548983 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wzl2n" podStartSLOduration=2.524326486 podStartE2EDuration="4.548947914s" podCreationTimestamp="2025-12-01 10:18:00 +0000 UTC" firstStartedPulling="2025-12-01 10:18:01.868656117 +0000 UTC m=+1129.377445154" lastFinishedPulling="2025-12-01 10:18:03.893277545 +0000 UTC m=+1131.402066582" observedRunningTime="2025-12-01 10:18:04.537534627 +0000 UTC m=+1132.046323664" watchObservedRunningTime="2025-12-01 10:18:04.548947914 +0000 UTC m=+1132.057736951" Dec 01 10:18:04 crc kubenswrapper[4958]: I1201 10:18:04.742960 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p7jbk"] Dec 01 10:18:04 crc kubenswrapper[4958]: W1201 10:18:04.762082 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3433758a_1a2d_48eb_9d78_f21c8b1d9bbf.slice/crio-0fb5f58a12bcd464668d06522ef23910506a152cceca9d721a245b6306c36ea2 
WatchSource:0}: Error finding container 0fb5f58a12bcd464668d06522ef23910506a152cceca9d721a245b6306c36ea2: Status 404 returned error can't find the container with id 0fb5f58a12bcd464668d06522ef23910506a152cceca9d721a245b6306c36ea2 Dec 01 10:18:04 crc kubenswrapper[4958]: I1201 10:18:04.851456 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wzl2n" Dec 01 10:18:04 crc kubenswrapper[4958]: I1201 10:18:04.985453 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxg5p\" (UniqueName: \"kubernetes.io/projected/b0ad5e85-f64a-4435-8fa3-c062111c8bf9-kube-api-access-bxg5p\") pod \"b0ad5e85-f64a-4435-8fa3-c062111c8bf9\" (UID: \"b0ad5e85-f64a-4435-8fa3-c062111c8bf9\") " Dec 01 10:18:05 crc kubenswrapper[4958]: I1201 10:18:05.096356 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0ad5e85-f64a-4435-8fa3-c062111c8bf9-kube-api-access-bxg5p" (OuterVolumeSpecName: "kube-api-access-bxg5p") pod "b0ad5e85-f64a-4435-8fa3-c062111c8bf9" (UID: "b0ad5e85-f64a-4435-8fa3-c062111c8bf9"). InnerVolumeSpecName "kube-api-access-bxg5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:18:05 crc kubenswrapper[4958]: I1201 10:18:05.188182 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxg5p\" (UniqueName: \"kubernetes.io/projected/b0ad5e85-f64a-4435-8fa3-c062111c8bf9-kube-api-access-bxg5p\") on node \"crc\" DevicePath \"\"" Dec 01 10:18:05 crc kubenswrapper[4958]: I1201 10:18:05.524534 4958 generic.go:334] "Generic (PLEG): container finished" podID="b0ad5e85-f64a-4435-8fa3-c062111c8bf9" containerID="79ddbdd3a24b01ab53df40c970a2185f6ec1b48f816be36628c48ced08059956" exitCode=0 Dec 01 10:18:05 crc kubenswrapper[4958]: I1201 10:18:05.524596 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wzl2n" event={"ID":"b0ad5e85-f64a-4435-8fa3-c062111c8bf9","Type":"ContainerDied","Data":"79ddbdd3a24b01ab53df40c970a2185f6ec1b48f816be36628c48ced08059956"} Dec 01 10:18:05 crc kubenswrapper[4958]: I1201 10:18:05.524625 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wzl2n" Dec 01 10:18:05 crc kubenswrapper[4958]: I1201 10:18:05.524648 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wzl2n" event={"ID":"b0ad5e85-f64a-4435-8fa3-c062111c8bf9","Type":"ContainerDied","Data":"cbcfcad7ac715c5609dd3ef4de2a0354ecc64a213b5c5c463d01c3d088ed84ac"} Dec 01 10:18:05 crc kubenswrapper[4958]: I1201 10:18:05.524674 4958 scope.go:117] "RemoveContainer" containerID="79ddbdd3a24b01ab53df40c970a2185f6ec1b48f816be36628c48ced08059956" Dec 01 10:18:05 crc kubenswrapper[4958]: I1201 10:18:05.525910 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p7jbk" event={"ID":"3433758a-1a2d-48eb-9d78-f21c8b1d9bbf","Type":"ContainerStarted","Data":"0d1a44794a8b3244a522eae601e5e22a1b8fc7aad72d72cf1558202c29e5d048"} Dec 01 10:18:05 crc kubenswrapper[4958]: I1201 10:18:05.525945 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p7jbk" event={"ID":"3433758a-1a2d-48eb-9d78-f21c8b1d9bbf","Type":"ContainerStarted","Data":"0fb5f58a12bcd464668d06522ef23910506a152cceca9d721a245b6306c36ea2"} Dec 01 10:18:05 crc kubenswrapper[4958]: I1201 10:18:05.543151 4958 scope.go:117] "RemoveContainer" containerID="79ddbdd3a24b01ab53df40c970a2185f6ec1b48f816be36628c48ced08059956" Dec 01 10:18:05 crc kubenswrapper[4958]: E1201 10:18:05.543605 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79ddbdd3a24b01ab53df40c970a2185f6ec1b48f816be36628c48ced08059956\": container with ID starting with 79ddbdd3a24b01ab53df40c970a2185f6ec1b48f816be36628c48ced08059956 not found: ID does not exist" containerID="79ddbdd3a24b01ab53df40c970a2185f6ec1b48f816be36628c48ced08059956" Dec 01 10:18:05 crc kubenswrapper[4958]: I1201 10:18:05.543660 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79ddbdd3a24b01ab53df40c970a2185f6ec1b48f816be36628c48ced08059956"} err="failed to get container status \"79ddbdd3a24b01ab53df40c970a2185f6ec1b48f816be36628c48ced08059956\": rpc error: code = NotFound desc = could not find container \"79ddbdd3a24b01ab53df40c970a2185f6ec1b48f816be36628c48ced08059956\": container with ID starting with 79ddbdd3a24b01ab53df40c970a2185f6ec1b48f816be36628c48ced08059956 not found: ID does not exist" Dec 01 10:18:05 crc kubenswrapper[4958]: I1201 10:18:05.562543 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-p7jbk" podStartSLOduration=1.519641676 podStartE2EDuration="1.562515205s" podCreationTimestamp="2025-12-01 10:18:04 +0000 UTC" firstStartedPulling="2025-12-01 10:18:04.765433941 +0000 UTC m=+1132.274222978" lastFinishedPulling="2025-12-01 10:18:04.80830746 +0000 UTC m=+1132.317096507" observedRunningTime="2025-12-01 10:18:05.556266286 +0000 UTC m=+1133.065055323" watchObservedRunningTime="2025-12-01 10:18:05.562515205 +0000 UTC m=+1133.071304242" Dec 01 10:18:05 crc kubenswrapper[4958]: I1201 10:18:05.577199 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wzl2n"] Dec 01 10:18:05 crc kubenswrapper[4958]: I1201 10:18:05.582936 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-wzl2n"] Dec 01 10:18:05 crc kubenswrapper[4958]: I1201 10:18:05.807467 4958 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0ad5e85-f64a-4435-8fa3-c062111c8bf9" path="/var/lib/kubelet/pods/b0ad5e85-f64a-4435-8fa3-c062111c8bf9/volumes" Dec 01 10:18:14 crc kubenswrapper[4958]: I1201 10:18:14.509941 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-p7jbk" Dec 01 10:18:14 crc kubenswrapper[4958]: I1201 10:18:14.512253 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-p7jbk" Dec 01 10:18:14 crc kubenswrapper[4958]: I1201 10:18:14.538047 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-p7jbk" Dec 01 10:18:14 crc kubenswrapper[4958]: I1201 10:18:14.615700 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-p7jbk" Dec 01 10:18:16 crc kubenswrapper[4958]: I1201 10:18:16.246930 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd"] Dec 01 10:18:16 crc kubenswrapper[4958]: E1201 10:18:16.247321 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ad5e85-f64a-4435-8fa3-c062111c8bf9" containerName="registry-server" Dec 01 10:18:16 crc kubenswrapper[4958]: I1201 10:18:16.247338 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ad5e85-f64a-4435-8fa3-c062111c8bf9" containerName="registry-server" Dec 01 10:18:16 crc kubenswrapper[4958]: I1201 10:18:16.247487 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ad5e85-f64a-4435-8fa3-c062111c8bf9" containerName="registry-server" Dec 01 10:18:16 crc kubenswrapper[4958]: I1201 10:18:16.248503 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd" Dec 01 10:18:16 crc kubenswrapper[4958]: I1201 10:18:16.251072 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-8rz85" Dec 01 10:18:16 crc kubenswrapper[4958]: I1201 10:18:16.264769 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd"] Dec 01 10:18:16 crc kubenswrapper[4958]: I1201 10:18:16.371638 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96kgx\" (UniqueName: \"kubernetes.io/projected/059f6f09-716d-4f5e-9afd-bb9328f783cc-kube-api-access-96kgx\") pod \"d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd\" (UID: \"059f6f09-716d-4f5e-9afd-bb9328f783cc\") " pod="openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd" Dec 01 10:18:16 crc kubenswrapper[4958]: I1201 10:18:16.371737 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/059f6f09-716d-4f5e-9afd-bb9328f783cc-util\") pod \"d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd\" (UID: \"059f6f09-716d-4f5e-9afd-bb9328f783cc\") " pod="openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd" Dec 01 10:18:16 crc kubenswrapper[4958]: I1201 10:18:16.371816 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/059f6f09-716d-4f5e-9afd-bb9328f783cc-bundle\") pod \"d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd\" (UID: \"059f6f09-716d-4f5e-9afd-bb9328f783cc\") " pod="openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd" Dec 01 10:18:16 crc kubenswrapper[4958]: I1201 10:18:16.473626 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/059f6f09-716d-4f5e-9afd-bb9328f783cc-util\") pod \"d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd\" (UID: \"059f6f09-716d-4f5e-9afd-bb9328f783cc\") " pod="openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd" Dec 01 10:18:16 crc kubenswrapper[4958]: I1201 10:18:16.474175 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/059f6f09-716d-4f5e-9afd-bb9328f783cc-bundle\") pod \"d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd\" (UID: \"059f6f09-716d-4f5e-9afd-bb9328f783cc\") " pod="openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd" Dec 01 10:18:16 crc kubenswrapper[4958]: I1201 10:18:16.474716 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96kgx\" (UniqueName: \"kubernetes.io/projected/059f6f09-716d-4f5e-9afd-bb9328f783cc-kube-api-access-96kgx\") pod \"d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd\" (UID: \"059f6f09-716d-4f5e-9afd-bb9328f783cc\") " pod="openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd" Dec 01 10:18:16 crc kubenswrapper[4958]: I1201 10:18:16.474425 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/059f6f09-716d-4f5e-9afd-bb9328f783cc-util\") pod \"d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd\" (UID: \"059f6f09-716d-4f5e-9afd-bb9328f783cc\") " pod="openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd" Dec 01 10:18:16 crc kubenswrapper[4958]: I1201 10:18:16.474769 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/059f6f09-716d-4f5e-9afd-bb9328f783cc-bundle\") pod \"d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd\" (UID: \"059f6f09-716d-4f5e-9afd-bb9328f783cc\") " pod="openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd" Dec 01 10:18:16 crc kubenswrapper[4958]: I1201 10:18:16.494545 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96kgx\" (UniqueName: \"kubernetes.io/projected/059f6f09-716d-4f5e-9afd-bb9328f783cc-kube-api-access-96kgx\") pod \"d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd\" (UID: \"059f6f09-716d-4f5e-9afd-bb9328f783cc\") " pod="openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd" Dec 01 10:18:16 crc kubenswrapper[4958]: I1201 10:18:16.595426 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd" Dec 01 10:18:17 crc kubenswrapper[4958]: I1201 10:18:17.014702 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd"] Dec 01 10:18:17 crc kubenswrapper[4958]: W1201 10:18:17.024394 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod059f6f09_716d_4f5e_9afd_bb9328f783cc.slice/crio-9b9824831ef399dd8a9c003b0a50074676a7c9d107d7274a6450854bd35b3486 WatchSource:0}: Error finding container 9b9824831ef399dd8a9c003b0a50074676a7c9d107d7274a6450854bd35b3486: Status 404 returned error can't find the container with id 9b9824831ef399dd8a9c003b0a50074676a7c9d107d7274a6450854bd35b3486 Dec 01 10:18:17 crc kubenswrapper[4958]: I1201 10:18:17.611268 4958 generic.go:334] "Generic (PLEG): container finished" podID="059f6f09-716d-4f5e-9afd-bb9328f783cc" containerID="3cd23608d1257de16284e7aedd235abc3575d08b967519fa390d82a1b69e60ee" exitCode=0 Dec 01 10:18:17 crc kubenswrapper[4958]: I1201 10:18:17.611397 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd" event={"ID":"059f6f09-716d-4f5e-9afd-bb9328f783cc","Type":"ContainerDied","Data":"3cd23608d1257de16284e7aedd235abc3575d08b967519fa390d82a1b69e60ee"} Dec 01 10:18:17 crc kubenswrapper[4958]: I1201 10:18:17.613680 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd" event={"ID":"059f6f09-716d-4f5e-9afd-bb9328f783cc","Type":"ContainerStarted","Data":"9b9824831ef399dd8a9c003b0a50074676a7c9d107d7274a6450854bd35b3486"} Dec 01 10:18:20 crc kubenswrapper[4958]: I1201 10:18:20.645449 4958 generic.go:334] "Generic (PLEG): container finished" podID="059f6f09-716d-4f5e-9afd-bb9328f783cc" containerID="98b38ea41b37b4aa86554c99ea4a551a1ce56fb2f58f3593de88d87b7f3e3664" exitCode=0 Dec 01 10:18:20 crc kubenswrapper[4958]: I1201 10:18:20.646022 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd" event={"ID":"059f6f09-716d-4f5e-9afd-bb9328f783cc","Type":"ContainerDied","Data":"98b38ea41b37b4aa86554c99ea4a551a1ce56fb2f58f3593de88d87b7f3e3664"} Dec 01 10:18:21 crc kubenswrapper[4958]: I1201 10:18:21.658482 4958 generic.go:334] "Generic (PLEG): container finished" podID="059f6f09-716d-4f5e-9afd-bb9328f783cc" containerID="a31b24c5fbe715264cb5831e261e805e46de97f6b71ad787c6523373d73b32d6" exitCode=0 Dec 01 10:18:21 crc kubenswrapper[4958]: I1201 10:18:21.658568 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd" event={"ID":"059f6f09-716d-4f5e-9afd-bb9328f783cc","Type":"ContainerDied","Data":"a31b24c5fbe715264cb5831e261e805e46de97f6b71ad787c6523373d73b32d6"} Dec 01 10:18:22 crc kubenswrapper[4958]: I1201 10:18:22.920257 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd" Dec 01 10:18:23 crc kubenswrapper[4958]: I1201 10:18:23.029899 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96kgx\" (UniqueName: \"kubernetes.io/projected/059f6f09-716d-4f5e-9afd-bb9328f783cc-kube-api-access-96kgx\") pod \"059f6f09-716d-4f5e-9afd-bb9328f783cc\" (UID: \"059f6f09-716d-4f5e-9afd-bb9328f783cc\") " Dec 01 10:18:23 crc kubenswrapper[4958]: I1201 10:18:23.029997 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/059f6f09-716d-4f5e-9afd-bb9328f783cc-bundle\") pod \"059f6f09-716d-4f5e-9afd-bb9328f783cc\" (UID: \"059f6f09-716d-4f5e-9afd-bb9328f783cc\") " Dec 01 10:18:23 crc kubenswrapper[4958]: I1201 10:18:23.030201 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/059f6f09-716d-4f5e-9afd-bb9328f783cc-util\") pod \"059f6f09-716d-4f5e-9afd-bb9328f783cc\" (UID: \"059f6f09-716d-4f5e-9afd-bb9328f783cc\") " Dec 01 10:18:23 crc kubenswrapper[4958]: I1201 10:18:23.030956 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/059f6f09-716d-4f5e-9afd-bb9328f783cc-bundle" (OuterVolumeSpecName: "bundle") pod "059f6f09-716d-4f5e-9afd-bb9328f783cc" (UID: "059f6f09-716d-4f5e-9afd-bb9328f783cc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:18:23 crc kubenswrapper[4958]: I1201 10:18:23.039987 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/059f6f09-716d-4f5e-9afd-bb9328f783cc-kube-api-access-96kgx" (OuterVolumeSpecName: "kube-api-access-96kgx") pod "059f6f09-716d-4f5e-9afd-bb9328f783cc" (UID: "059f6f09-716d-4f5e-9afd-bb9328f783cc"). InnerVolumeSpecName "kube-api-access-96kgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:18:23 crc kubenswrapper[4958]: I1201 10:18:23.041437 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/059f6f09-716d-4f5e-9afd-bb9328f783cc-util" (OuterVolumeSpecName: "util") pod "059f6f09-716d-4f5e-9afd-bb9328f783cc" (UID: "059f6f09-716d-4f5e-9afd-bb9328f783cc"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:18:23 crc kubenswrapper[4958]: I1201 10:18:23.132024 4958 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/059f6f09-716d-4f5e-9afd-bb9328f783cc-util\") on node \"crc\" DevicePath \"\"" Dec 01 10:18:23 crc kubenswrapper[4958]: I1201 10:18:23.132069 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96kgx\" (UniqueName: \"kubernetes.io/projected/059f6f09-716d-4f5e-9afd-bb9328f783cc-kube-api-access-96kgx\") on node \"crc\" DevicePath \"\"" Dec 01 10:18:23 crc kubenswrapper[4958]: I1201 10:18:23.132083 4958 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/059f6f09-716d-4f5e-9afd-bb9328f783cc-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:18:23 crc kubenswrapper[4958]: I1201 10:18:23.675101 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd" event={"ID":"059f6f09-716d-4f5e-9afd-bb9328f783cc","Type":"ContainerDied","Data":"9b9824831ef399dd8a9c003b0a50074676a7c9d107d7274a6450854bd35b3486"} Dec 01 10:18:23 crc kubenswrapper[4958]: I1201 10:18:23.675169 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b9824831ef399dd8a9c003b0a50074676a7c9d107d7274a6450854bd35b3486" Dec 01 10:18:23 crc kubenswrapper[4958]: I1201 10:18:23.675195 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd" Dec 01 10:18:28 crc kubenswrapper[4958]: I1201 10:18:28.210212 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:18:28 crc kubenswrapper[4958]: I1201 10:18:28.210584 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:18:28 crc kubenswrapper[4958]: I1201 10:18:28.210663 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 10:18:28 crc kubenswrapper[4958]: I1201 10:18:28.211559 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"269d028bc69c92127b6b1ad5b3c8a371b2530bdef6d634e5295d699f9be45b99"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:18:28 crc kubenswrapper[4958]: I1201 10:18:28.211640 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://269d028bc69c92127b6b1ad5b3c8a371b2530bdef6d634e5295d699f9be45b99" gracePeriod=600 Dec 01 10:18:28 crc kubenswrapper[4958]: I1201 10:18:28.711609 4958 generic.go:334] "Generic (PLEG): container finished" 
podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="269d028bc69c92127b6b1ad5b3c8a371b2530bdef6d634e5295d699f9be45b99" exitCode=0 Dec 01 10:18:28 crc kubenswrapper[4958]: I1201 10:18:28.711656 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"269d028bc69c92127b6b1ad5b3c8a371b2530bdef6d634e5295d699f9be45b99"} Dec 01 10:18:28 crc kubenswrapper[4958]: I1201 10:18:28.712054 4958 scope.go:117] "RemoveContainer" containerID="5883cf8aac81784159b4994d61b5520f0d5f082b5d1aeaaa67bc8390bfe65ba5" Dec 01 10:18:29 crc kubenswrapper[4958]: I1201 10:18:29.406963 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5b9cbd897-lz77k"] Dec 01 10:18:29 crc kubenswrapper[4958]: E1201 10:18:29.407774 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059f6f09-716d-4f5e-9afd-bb9328f783cc" containerName="extract" Dec 01 10:18:29 crc kubenswrapper[4958]: I1201 10:18:29.407796 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="059f6f09-716d-4f5e-9afd-bb9328f783cc" containerName="extract" Dec 01 10:18:29 crc kubenswrapper[4958]: E1201 10:18:29.407830 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059f6f09-716d-4f5e-9afd-bb9328f783cc" containerName="pull" Dec 01 10:18:29 crc kubenswrapper[4958]: I1201 10:18:29.407859 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="059f6f09-716d-4f5e-9afd-bb9328f783cc" containerName="pull" Dec 01 10:18:29 crc kubenswrapper[4958]: E1201 10:18:29.407875 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059f6f09-716d-4f5e-9afd-bb9328f783cc" containerName="util" Dec 01 10:18:29 crc kubenswrapper[4958]: I1201 10:18:29.407882 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="059f6f09-716d-4f5e-9afd-bb9328f783cc" containerName="util" Dec 01 10:18:29 crc kubenswrapper[4958]: I1201 10:18:29.408037 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="059f6f09-716d-4f5e-9afd-bb9328f783cc" containerName="extract" Dec 01 10:18:29 crc kubenswrapper[4958]: I1201 10:18:29.409997 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5b9cbd897-lz77k" Dec 01 10:18:29 crc kubenswrapper[4958]: I1201 10:18:29.414585 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-z2g24" Dec 01 10:18:29 crc kubenswrapper[4958]: I1201 10:18:29.429207 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5b9cbd897-lz77k"] Dec 01 10:18:29 crc kubenswrapper[4958]: I1201 10:18:29.446402 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4nhp\" (UniqueName: \"kubernetes.io/projected/6163c2c1-4c3c-4a5b-8b6e-c09ad4b6095b-kube-api-access-s4nhp\") pod \"openstack-operator-controller-operator-5b9cbd897-lz77k\" (UID: \"6163c2c1-4c3c-4a5b-8b6e-c09ad4b6095b\") " pod="openstack-operators/openstack-operator-controller-operator-5b9cbd897-lz77k" Dec 01 10:18:29 crc kubenswrapper[4958]: I1201 10:18:29.547806 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4nhp\" (UniqueName: \"kubernetes.io/projected/6163c2c1-4c3c-4a5b-8b6e-c09ad4b6095b-kube-api-access-s4nhp\") pod \"openstack-operator-controller-operator-5b9cbd897-lz77k\" (UID: \"6163c2c1-4c3c-4a5b-8b6e-c09ad4b6095b\") " pod="openstack-operators/openstack-operator-controller-operator-5b9cbd897-lz77k" Dec 01 10:18:29 crc kubenswrapper[4958]: I1201 10:18:29.573274 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4nhp\" (UniqueName: \"kubernetes.io/projected/6163c2c1-4c3c-4a5b-8b6e-c09ad4b6095b-kube-api-access-s4nhp\") pod \"openstack-operator-controller-operator-5b9cbd897-lz77k\" (UID: \"6163c2c1-4c3c-4a5b-8b6e-c09ad4b6095b\") " pod="openstack-operators/openstack-operator-controller-operator-5b9cbd897-lz77k" Dec 01 10:18:29 crc kubenswrapper[4958]: I1201 10:18:29.721834 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"f09314f73af3b199ee4f78ab6cf71768969c699a4967f9d91c7d9dc73162183f"} Dec 01 10:18:29 crc kubenswrapper[4958]: I1201 10:18:29.727774 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5b9cbd897-lz77k" Dec 01 10:18:29 crc kubenswrapper[4958]: I1201 10:18:29.983975 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5b9cbd897-lz77k"] Dec 01 10:18:30 crc kubenswrapper[4958]: I1201 10:18:30.732634 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5b9cbd897-lz77k" event={"ID":"6163c2c1-4c3c-4a5b-8b6e-c09ad4b6095b","Type":"ContainerStarted","Data":"db2d345522059bb3771ab8e9fd122901e5a23b0d68692df72a04e39b49ca1d3e"} Dec 01 10:18:35 crc kubenswrapper[4958]: I1201 10:18:35.776880 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5b9cbd897-lz77k" event={"ID":"6163c2c1-4c3c-4a5b-8b6e-c09ad4b6095b","Type":"ContainerStarted","Data":"df4ac29c6ce7f2506ac1f4f60f64f003c8e8b41ea1982bc3296950cac2d75b93"} Dec 01 10:18:39 crc kubenswrapper[4958]: I1201 10:18:39.927668 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5b9cbd897-lz77k" event={"ID":"6163c2c1-4c3c-4a5b-8b6e-c09ad4b6095b","Type":"ContainerStarted","Data":"d97f3767d35d0a49c6d7b2fd6f1cf2af1ebc6dc7ffade3fce81e844464a949fd"} Dec 01 10:18:39 crc kubenswrapper[4958]: I1201 10:18:39.928345 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5b9cbd897-lz77k" Dec 01 10:18:39 crc kubenswrapper[4958]: I1201 10:18:39.965892 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5b9cbd897-lz77k" podStartSLOduration=2.029276193 podStartE2EDuration="10.965865526s" podCreationTimestamp="2025-12-01 10:18:29 +0000 UTC" firstStartedPulling="2025-12-01 10:18:29.994609438 +0000 UTC m=+1157.503398475" lastFinishedPulling="2025-12-01 10:18:38.931198771 +0000 UTC m=+1166.439987808" observedRunningTime="2025-12-01 10:18:39.959123983 +0000 UTC m=+1167.467913040" watchObservedRunningTime="2025-12-01 10:18:39.965865526 +0000 UTC m=+1167.474654563" Dec 01 10:18:40 crc kubenswrapper[4958]: I1201 10:18:40.936837 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5b9cbd897-lz77k" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.638267 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5bfbbb859d-6ksnd"] Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.640582 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-6ksnd" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.642912 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-ljr9g" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.647713 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-748967c98-hjrzq"] Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.651121 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-748967c98-hjrzq" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.671589 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5z4zw" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.671822 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-748967c98-hjrzq"] Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.681541 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5bfbbb859d-6ksnd"] Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.693187 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6788cc6d75-4bgbw"] Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.694864 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-4bgbw" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.701994 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-v2c6d" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.708017 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glgsm\" (UniqueName: \"kubernetes.io/projected/398be6e9-f887-4123-aca7-9c8a5bc68a04-kube-api-access-glgsm\") pod \"designate-operator-controller-manager-6788cc6d75-4bgbw\" (UID: \"398be6e9-f887-4123-aca7-9c8a5bc68a04\") " pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-4bgbw" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.708170 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmtnq\" (UniqueName: \"kubernetes.io/projected/a3144616-c667-4adf-a2b7-44be4eaa5e59-kube-api-access-pmtnq\") pod \"barbican-operator-controller-manager-5bfbbb859d-6ksnd\" (UID: \"a3144616-c667-4adf-a2b7-44be4eaa5e59\") " pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-6ksnd" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.708307 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nblrk\" (UniqueName: \"kubernetes.io/projected/a19ee5ba-4286-44c9-94fe-bf1e2b7ff03a-kube-api-access-nblrk\") pod \"cinder-operator-controller-manager-748967c98-hjrzq\" (UID: \"a19ee5ba-4286-44c9-94fe-bf1e2b7ff03a\") " pod="openstack-operators/cinder-operator-controller-manager-748967c98-hjrzq" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.710292 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-6bd966bbd4-plgqm"] Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.711802 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-plgqm" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.714274 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-crj8r" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.735546 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6788cc6d75-4bgbw"] Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.760068 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-6bd966bbd4-plgqm"] Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.821312 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glgsm\" (UniqueName: \"kubernetes.io/projected/398be6e9-f887-4123-aca7-9c8a5bc68a04-kube-api-access-glgsm\") pod \"designate-operator-controller-manager-6788cc6d75-4bgbw\" (UID: \"398be6e9-f887-4123-aca7-9c8a5bc68a04\") " pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-4bgbw" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.821885 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmtnq\" (UniqueName: \"kubernetes.io/projected/a3144616-c667-4adf-a2b7-44be4eaa5e59-kube-api-access-pmtnq\") pod \"barbican-operator-controller-manager-5bfbbb859d-6ksnd\" (UID: \"a3144616-c667-4adf-a2b7-44be4eaa5e59\") " pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-6ksnd" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.822066 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nblrk\" (UniqueName: \"kubernetes.io/projected/a19ee5ba-4286-44c9-94fe-bf1e2b7ff03a-kube-api-access-nblrk\") pod \"cinder-operator-controller-manager-748967c98-hjrzq\" (UID: \"a19ee5ba-4286-44c9-94fe-bf1e2b7ff03a\") " pod="openstack-operators/cinder-operator-controller-manager-748967c98-hjrzq" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.832881 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v5f8w"] Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.834237 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v5f8w" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.837597 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-dm8c4" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.855792 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-698d6fd7d6-zljjp"] Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.857380 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-zljjp" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.866083 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-vpwfm" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.884149 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nblrk\" (UniqueName: \"kubernetes.io/projected/a19ee5ba-4286-44c9-94fe-bf1e2b7ff03a-kube-api-access-nblrk\") pod \"cinder-operator-controller-manager-748967c98-hjrzq\" (UID: \"a19ee5ba-4286-44c9-94fe-bf1e2b7ff03a\") " pod="openstack-operators/cinder-operator-controller-manager-748967c98-hjrzq" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.889363 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glgsm\" (UniqueName: \"kubernetes.io/projected/398be6e9-f887-4123-aca7-9c8a5bc68a04-kube-api-access-glgsm\") pod \"designate-operator-controller-manager-6788cc6d75-4bgbw\" (UID: \"398be6e9-f887-4123-aca7-9c8a5bc68a04\") " pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-4bgbw" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.889988 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmtnq\" (UniqueName: \"kubernetes.io/projected/a3144616-c667-4adf-a2b7-44be4eaa5e59-kube-api-access-pmtnq\") pod \"barbican-operator-controller-manager-5bfbbb859d-6ksnd\" (UID: \"a3144616-c667-4adf-a2b7-44be4eaa5e59\") " pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-6ksnd" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.913607 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-698d6fd7d6-zljjp"] Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.924591 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svr2l\" (UniqueName: \"kubernetes.io/projected/dd689dc1-fc2e-4285-a863-030b9f0cc647-kube-api-access-svr2l\") pod \"glance-operator-controller-manager-6bd966bbd4-plgqm\" (UID: \"dd689dc1-fc2e-4285-a863-030b9f0cc647\") " pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-plgqm" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.938440 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v5f8w"] Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.978052 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-6ksnd" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.980475 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7"] Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.981790 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.987339 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-49kbn" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.987590 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.988544 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-748967c98-hjrzq" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.993920 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-54485f899-5ns97"] Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.995327 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-54485f899-5ns97" Dec 01 10:18:59 crc kubenswrapper[4958]: I1201 10:18:59.998910 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-t9f2t" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.015755 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.016183 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-4bgbw" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.034277 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hwpn\" (UniqueName: \"kubernetes.io/projected/47b8e3cf-440c-4d73-b791-2d94c06820f2-kube-api-access-9hwpn\") pod \"horizon-operator-controller-manager-7d5d9fd47f-v5f8w\" (UID: \"47b8e3cf-440c-4d73-b791-2d94c06820f2\") " pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v5f8w" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.034354 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg768\" (UniqueName: \"kubernetes.io/projected/5bf0bfe3-25ac-42de-9d86-afc6772c012b-kube-api-access-fg768\") pod \"ironic-operator-controller-manager-54485f899-5ns97\" (UID: \"5bf0bfe3-25ac-42de-9d86-afc6772c012b\") " pod="openstack-operators/ironic-operator-controller-manager-54485f899-5ns97" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.034611 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85ffr\" (UniqueName: \"kubernetes.io/projected/6c4b65c1-0872-4859-903b-46556eee593e-kube-api-access-85ffr\") pod \"infra-operator-controller-manager-577c5f6d94-cfww7\" (UID: \"6c4b65c1-0872-4859-903b-46556eee593e\") " pod="openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.034676 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svr2l\" (UniqueName: \"kubernetes.io/projected/dd689dc1-fc2e-4285-a863-030b9f0cc647-kube-api-access-svr2l\") pod \"glance-operator-controller-manager-6bd966bbd4-plgqm\" (UID: \"dd689dc1-fc2e-4285-a863-030b9f0cc647\") " 
pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-plgqm" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.034706 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qpvg\" (UniqueName: \"kubernetes.io/projected/84142ebc-764a-4000-90e2-f9c6588d9b43-kube-api-access-9qpvg\") pod \"heat-operator-controller-manager-698d6fd7d6-zljjp\" (UID: \"84142ebc-764a-4000-90e2-f9c6588d9b43\") " pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-zljjp" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.034770 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c4b65c1-0872-4859-903b-46556eee593e-cert\") pod \"infra-operator-controller-manager-577c5f6d94-cfww7\" (UID: \"6c4b65c1-0872-4859-903b-46556eee593e\") " pod="openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.048105 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-54485f899-5ns97"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.068938 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7d6f5d799-9ln88"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.070435 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-9ln88" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.079210 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-t5l6t" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.105042 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7d6f5d799-9ln88"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.110460 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svr2l\" (UniqueName: \"kubernetes.io/projected/dd689dc1-fc2e-4285-a863-030b9f0cc647-kube-api-access-svr2l\") pod \"glance-operator-controller-manager-6bd966bbd4-plgqm\" (UID: \"dd689dc1-fc2e-4285-a863-030b9f0cc647\") " pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-plgqm" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.126039 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64d7c556cd-xmcvh"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.127331 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-xmcvh" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.132866 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-qxgr2" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.150274 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85ffr\" (UniqueName: \"kubernetes.io/projected/6c4b65c1-0872-4859-903b-46556eee593e-kube-api-access-85ffr\") pod \"infra-operator-controller-manager-577c5f6d94-cfww7\" (UID: \"6c4b65c1-0872-4859-903b-46556eee593e\") " pod="openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.150332 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qpvg\" (UniqueName: \"kubernetes.io/projected/84142ebc-764a-4000-90e2-f9c6588d9b43-kube-api-access-9qpvg\") pod \"heat-operator-controller-manager-698d6fd7d6-zljjp\" (UID: \"84142ebc-764a-4000-90e2-f9c6588d9b43\") " pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-zljjp" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.150362 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt2nx\" (UniqueName: \"kubernetes.io/projected/d0c5b0b0-db70-485c-95b1-a63f980af637-kube-api-access-zt2nx\") pod \"mariadb-operator-controller-manager-64d7c556cd-xmcvh\" (UID: \"d0c5b0b0-db70-485c-95b1-a63f980af637\") " pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-xmcvh" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.150392 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c4b65c1-0872-4859-903b-46556eee593e-cert\") pod \"infra-operator-controller-manager-577c5f6d94-cfww7\" (UID: \"6c4b65c1-0872-4859-903b-46556eee593e\") " pod="openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.150418 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hwpn\" (UniqueName: \"kubernetes.io/projected/47b8e3cf-440c-4d73-b791-2d94c06820f2-kube-api-access-9hwpn\") pod \"horizon-operator-controller-manager-7d5d9fd47f-v5f8w\" (UID: \"47b8e3cf-440c-4d73-b791-2d94c06820f2\") " pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v5f8w" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.150456 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg768\" (UniqueName: \"kubernetes.io/projected/5bf0bfe3-25ac-42de-9d86-afc6772c012b-kube-api-access-fg768\") pod \"ironic-operator-controller-manager-54485f899-5ns97\" (UID: \"5bf0bfe3-25ac-42de-9d86-afc6772c012b\") " pod="openstack-operators/ironic-operator-controller-manager-54485f899-5ns97" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.150497 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b222h\" (UniqueName: \"kubernetes.io/projected/7b55e6b5-2b3c-4d13-b998-6c3c56210483-kube-api-access-b222h\") pod \"keystone-operator-controller-manager-7d6f5d799-9ln88\" (UID: \"7b55e6b5-2b3c-4d13-b998-6c3c56210483\") " pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-9ln88" Dec 01 10:19:00 crc 
kubenswrapper[4958]: E1201 10:19:00.151026 4958 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 10:19:00 crc kubenswrapper[4958]: E1201 10:19:00.151073 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c4b65c1-0872-4859-903b-46556eee593e-cert podName:6c4b65c1-0872-4859-903b-46556eee593e nodeName:}" failed. No retries permitted until 2025-12-01 10:19:00.651053005 +0000 UTC m=+1188.159842042 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6c4b65c1-0872-4859-903b-46556eee593e-cert") pod "infra-operator-controller-manager-577c5f6d94-cfww7" (UID: "6c4b65c1-0872-4859-903b-46556eee593e") : secret "infra-operator-webhook-server-cert" not found Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.169032 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-646fd589f9-csmzl"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.188132 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg768\" (UniqueName: \"kubernetes.io/projected/5bf0bfe3-25ac-42de-9d86-afc6772c012b-kube-api-access-fg768\") pod \"ironic-operator-controller-manager-54485f899-5ns97\" (UID: \"5bf0bfe3-25ac-42de-9d86-afc6772c012b\") " pod="openstack-operators/ironic-operator-controller-manager-54485f899-5ns97" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.190020 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-csmzl" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.192874 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-8d5tz" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.197810 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hwpn\" (UniqueName: \"kubernetes.io/projected/47b8e3cf-440c-4d73-b791-2d94c06820f2-kube-api-access-9hwpn\") pod \"horizon-operator-controller-manager-7d5d9fd47f-v5f8w\" (UID: \"47b8e3cf-440c-4d73-b791-2d94c06820f2\") " pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v5f8w" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.207325 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qpvg\" (UniqueName: \"kubernetes.io/projected/84142ebc-764a-4000-90e2-f9c6588d9b43-kube-api-access-9qpvg\") pod \"heat-operator-controller-manager-698d6fd7d6-zljjp\" (UID: \"84142ebc-764a-4000-90e2-f9c6588d9b43\") " pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-zljjp" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.213084 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64d7c556cd-xmcvh"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.213706 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85ffr\" (UniqueName: \"kubernetes.io/projected/6c4b65c1-0872-4859-903b-46556eee593e-kube-api-access-85ffr\") pod \"infra-operator-controller-manager-577c5f6d94-cfww7\" (UID: \"6c4b65c1-0872-4859-903b-46556eee593e\") " pod="openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.214909 4958 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v5f8w" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.229773 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-646fd589f9-csmzl"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.240408 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-zljjp" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.253096 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b222h\" (UniqueName: \"kubernetes.io/projected/7b55e6b5-2b3c-4d13-b998-6c3c56210483-kube-api-access-b222h\") pod \"keystone-operator-controller-manager-7d6f5d799-9ln88\" (UID: \"7b55e6b5-2b3c-4d13-b998-6c3c56210483\") " pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-9ln88" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.253186 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6nhx\" (UniqueName: \"kubernetes.io/projected/9dad08eb-34b4-4ea4-94bb-467e5c2df8df-kube-api-access-w6nhx\") pod \"manila-operator-controller-manager-646fd589f9-csmzl\" (UID: \"9dad08eb-34b4-4ea4-94bb-467e5c2df8df\") " pod="openstack-operators/manila-operator-controller-manager-646fd589f9-csmzl" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.253224 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt2nx\" (UniqueName: \"kubernetes.io/projected/d0c5b0b0-db70-485c-95b1-a63f980af637-kube-api-access-zt2nx\") pod \"mariadb-operator-controller-manager-64d7c556cd-xmcvh\" (UID: \"d0c5b0b0-db70-485c-95b1-a63f980af637\") " pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-xmcvh" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.259908 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-629vw"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.261920 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-629vw" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.271512 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-629vw"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.279765 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-79d658b66d-8wwrb"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.281447 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-8wwrb" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.281736 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-mtfsc" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.288015 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7979c68bc7-dmmb7"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.290509 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-mqbkv" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.301977 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-dmmb7" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.306654 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-f9l8l" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.306870 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79d658b66d-8wwrb"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.311181 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt2nx\" (UniqueName: \"kubernetes.io/projected/d0c5b0b0-db70-485c-95b1-a63f980af637-kube-api-access-zt2nx\") pod \"mariadb-operator-controller-manager-64d7c556cd-xmcvh\" (UID: \"d0c5b0b0-db70-485c-95b1-a63f980af637\") " pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-xmcvh" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.321998 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7979c68bc7-dmmb7"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.323575 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b222h\" (UniqueName: \"kubernetes.io/projected/7b55e6b5-2b3c-4d13-b998-6c3c56210483-kube-api-access-b222h\") pod \"keystone-operator-controller-manager-7d6f5d799-9ln88\" (UID: \"7b55e6b5-2b3c-4d13-b998-6c3c56210483\") " pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-9ln88" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.333134 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-cv8pp"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.337267 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-pkd25"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.337573 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-plgqm" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.339037 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-cv8pp" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.345259 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.453044 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-pkd25" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.459872 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbvpq\" (UniqueName: \"kubernetes.io/projected/759e8424-5b95-49fd-a80b-cb311b441b54-kube-api-access-lbvpq\") pod \"nova-operator-controller-manager-79d658b66d-8wwrb\" (UID: \"759e8424-5b95-49fd-a80b-cb311b441b54\") " pod="openstack-operators/nova-operator-controller-manager-79d658b66d-8wwrb" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.459944 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6nhx\" (UniqueName: \"kubernetes.io/projected/9dad08eb-34b4-4ea4-94bb-467e5c2df8df-kube-api-access-w6nhx\") pod \"manila-operator-controller-manager-646fd589f9-csmzl\" (UID: \"9dad08eb-34b4-4ea4-94bb-467e5c2df8df\") " pod="openstack-operators/manila-operator-controller-manager-646fd589f9-csmzl" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.460018 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94zjm\" (UniqueName: \"kubernetes.io/projected/66f0dd1b-3fde-4e90-bfbe-55e5f67b197b-kube-api-access-94zjm\") pod \"openstack-baremetal-operator-controller-manager-77868f484-cv8pp\" (UID: \"66f0dd1b-3fde-4e90-bfbe-55e5f67b197b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-cv8pp" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.460050 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfxx8\" (UniqueName: \"kubernetes.io/projected/de5b3e9c-1e86-4fd1-9342-df0ebcb684cb-kube-api-access-lfxx8\") pod \"octavia-operator-controller-manager-7979c68bc7-dmmb7\" (UID: \"de5b3e9c-1e86-4fd1-9342-df0ebcb684cb\") " pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-dmmb7" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.460082 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66f0dd1b-3fde-4e90-bfbe-55e5f67b197b-cert\") pod \"openstack-baremetal-operator-controller-manager-77868f484-cv8pp\" (UID: \"66f0dd1b-3fde-4e90-bfbe-55e5f67b197b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-cv8pp" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.460177 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnpwb\" (UniqueName: \"kubernetes.io/projected/a4b2002a-8ff1-4072-be81-9e4fd5bb2f1c-kube-api-access-pnpwb\") pod \"neutron-operator-controller-manager-6b6c55ffd5-629vw\" (UID: \"a4b2002a-8ff1-4072-be81-9e4fd5bb2f1c\") " pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-629vw" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.491248 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-54485f899-5ns97" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.502209 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-xkz9j" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.503177 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-5l4wl" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.554637 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-9ln88" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.613554 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-867d87977b-vthqj"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.652625 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66f0dd1b-3fde-4e90-bfbe-55e5f67b197b-cert\") pod \"openstack-baremetal-operator-controller-manager-77868f484-cv8pp\" (UID: \"66f0dd1b-3fde-4e90-bfbe-55e5f67b197b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-cv8pp" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.652693 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c4b65c1-0872-4859-903b-46556eee593e-cert\") pod \"infra-operator-controller-manager-577c5f6d94-cfww7\" (UID: \"6c4b65c1-0872-4859-903b-46556eee593e\") " pod="openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.653108 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnpwb\" (UniqueName: \"kubernetes.io/projected/a4b2002a-8ff1-4072-be81-9e4fd5bb2f1c-kube-api-access-pnpwb\") pod \"neutron-operator-controller-manager-6b6c55ffd5-629vw\" (UID: \"a4b2002a-8ff1-4072-be81-9e4fd5bb2f1c\") " pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-629vw" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.653262 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbvpq\" (UniqueName: \"kubernetes.io/projected/759e8424-5b95-49fd-a80b-cb311b441b54-kube-api-access-lbvpq\") pod \"nova-operator-controller-manager-79d658b66d-8wwrb\" (UID: \"759e8424-5b95-49fd-a80b-cb311b441b54\") " pod="openstack-operators/nova-operator-controller-manager-79d658b66d-8wwrb" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.653296 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4skkk\" (UniqueName: \"kubernetes.io/projected/3365704e-370d-4fe9-9d23-890ae1a593cc-kube-api-access-4skkk\") pod \"ovn-operator-controller-manager-5b67cfc8fb-pkd25\" (UID: \"3365704e-370d-4fe9-9d23-890ae1a593cc\") " pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-pkd25" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.653422 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94zjm\" (UniqueName: \"kubernetes.io/projected/66f0dd1b-3fde-4e90-bfbe-55e5f67b197b-kube-api-access-94zjm\") pod \"openstack-baremetal-operator-controller-manager-77868f484-cv8pp\" (UID: 
\"66f0dd1b-3fde-4e90-bfbe-55e5f67b197b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-cv8pp" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.653445 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfxx8\" (UniqueName: \"kubernetes.io/projected/de5b3e9c-1e86-4fd1-9342-df0ebcb684cb-kube-api-access-lfxx8\") pod \"octavia-operator-controller-manager-7979c68bc7-dmmb7\" (UID: \"de5b3e9c-1e86-4fd1-9342-df0ebcb684cb\") " pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-dmmb7" Dec 01 10:19:00 crc kubenswrapper[4958]: E1201 10:19:00.658534 4958 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 10:19:00 crc kubenswrapper[4958]: E1201 10:19:00.659704 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66f0dd1b-3fde-4e90-bfbe-55e5f67b197b-cert podName:66f0dd1b-3fde-4e90-bfbe-55e5f67b197b nodeName:}" failed. No retries permitted until 2025-12-01 10:19:01.158603257 +0000 UTC m=+1188.667392294 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66f0dd1b-3fde-4e90-bfbe-55e5f67b197b-cert") pod "openstack-baremetal-operator-controller-manager-77868f484-cv8pp" (UID: "66f0dd1b-3fde-4e90-bfbe-55e5f67b197b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.671398 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-867d87977b-vthqj" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.678722 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c4b65c1-0872-4859-903b-46556eee593e-cert\") pod \"infra-operator-controller-manager-577c5f6d94-cfww7\" (UID: \"6c4b65c1-0872-4859-903b-46556eee593e\") " pod="openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.695158 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-cc9f5bc5c-xtrpg"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.696613 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-pkd25"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.696738 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-xtrpg" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.704304 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.707079 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfxx8\" (UniqueName: \"kubernetes.io/projected/de5b3e9c-1e86-4fd1-9342-df0ebcb684cb-kube-api-access-lfxx8\") pod \"octavia-operator-controller-manager-7979c68bc7-dmmb7\" (UID: \"de5b3e9c-1e86-4fd1-9342-df0ebcb684cb\") " pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-dmmb7" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.709912 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6nhx\" (UniqueName: \"kubernetes.io/projected/9dad08eb-34b4-4ea4-94bb-467e5c2df8df-kube-api-access-w6nhx\") pod \"manila-operator-controller-manager-646fd589f9-csmzl\" (UID: \"9dad08eb-34b4-4ea4-94bb-467e5c2df8df\") " pod="openstack-operators/manila-operator-controller-manager-646fd589f9-csmzl" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.711326 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-ttgmd" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.712378 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-8ll88" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.740127 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-cv8pp"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.740221 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-867d87977b-vthqj"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.740241 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-cc9f5bc5c-xtrpg"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.756103 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb92h\" (UniqueName: \"kubernetes.io/projected/12acdf72-847b-4b01-a76a-29e0ba1958c3-kube-api-access-wb92h\") pod \"swift-operator-controller-manager-cc9f5bc5c-xtrpg\" (UID: \"12acdf72-847b-4b01-a76a-29e0ba1958c3\") " pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-xtrpg" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.756576 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4skkk\" (UniqueName: \"kubernetes.io/projected/3365704e-370d-4fe9-9d23-890ae1a593cc-kube-api-access-4skkk\") pod \"ovn-operator-controller-manager-5b67cfc8fb-pkd25\" (UID: \"3365704e-370d-4fe9-9d23-890ae1a593cc\") " pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-pkd25" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.756775 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s98b\" (UniqueName: \"kubernetes.io/projected/998f917b-9023-454a-b97c-3ff7948dda3a-kube-api-access-4s98b\") pod \"placement-operator-controller-manager-867d87977b-vthqj\" (UID: \"998f917b-9023-454a-b97c-3ff7948dda3a\") " pod="openstack-operators/placement-operator-controller-manager-867d87977b-vthqj" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.775969 4958 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/telemetry-operator-controller-manager-58487d9bf4-q928n"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.782491 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-q928n" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.784044 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-77db6bf9c-cqms2"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.785378 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-cqms2" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.786362 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-qvzvv" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.790179 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-knw89" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.798458 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58487d9bf4-q928n"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.809194 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-77db6bf9c-cqms2"] Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.815824 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnpwb\" (UniqueName: \"kubernetes.io/projected/a4b2002a-8ff1-4072-be81-9e4fd5bb2f1c-kube-api-access-pnpwb\") pod \"neutron-operator-controller-manager-6b6c55ffd5-629vw\" (UID: \"a4b2002a-8ff1-4072-be81-9e4fd5bb2f1c\") " pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-629vw" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.825911 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94zjm\" (UniqueName: \"kubernetes.io/projected/66f0dd1b-3fde-4e90-bfbe-55e5f67b197b-kube-api-access-94zjm\") pod \"openstack-baremetal-operator-controller-manager-77868f484-cv8pp\" (UID: \"66f0dd1b-3fde-4e90-bfbe-55e5f67b197b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-cv8pp" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.829807 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbvpq\" (UniqueName: \"kubernetes.io/projected/759e8424-5b95-49fd-a80b-cb311b441b54-kube-api-access-lbvpq\") pod \"nova-operator-controller-manager-79d658b66d-8wwrb\" (UID: \"759e8424-5b95-49fd-a80b-cb311b441b54\") " pod="openstack-operators/nova-operator-controller-manager-79d658b66d-8wwrb" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.830506 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4skkk\" (UniqueName: \"kubernetes.io/projected/3365704e-370d-4fe9-9d23-890ae1a593cc-kube-api-access-4skkk\") pod \"ovn-operator-controller-manager-5b67cfc8fb-pkd25\" (UID: \"3365704e-370d-4fe9-9d23-890ae1a593cc\") " pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-pkd25" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.840040 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-629vw" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.859910 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7dtf\" (UniqueName: \"kubernetes.io/projected/2c2c6b14-779c-4e4f-9b83-542ccf77286c-kube-api-access-z7dtf\") pod \"test-operator-controller-manager-77db6bf9c-cqms2\" (UID: \"2c2c6b14-779c-4e4f-9b83-542ccf77286c\") " pod="openstack-operators/test-operator-controller-manager-77db6bf9c-cqms2" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.859973 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f6gf\" (UniqueName: \"kubernetes.io/projected/7513d038-3642-4f27-a2df-c932dc6c9eaa-kube-api-access-5f6gf\") pod \"telemetry-operator-controller-manager-58487d9bf4-q928n\" (UID: \"7513d038-3642-4f27-a2df-c932dc6c9eaa\") " pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-q928n" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.860009 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s98b\" (UniqueName: \"kubernetes.io/projected/998f917b-9023-454a-b97c-3ff7948dda3a-kube-api-access-4s98b\") pod \"placement-operator-controller-manager-867d87977b-vthqj\" (UID: \"998f917b-9023-454a-b97c-3ff7948dda3a\") " pod="openstack-operators/placement-operator-controller-manager-867d87977b-vthqj" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.860139 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb92h\" (UniqueName: \"kubernetes.io/projected/12acdf72-847b-4b01-a76a-29e0ba1958c3-kube-api-access-wb92h\") pod \"swift-operator-controller-manager-cc9f5bc5c-xtrpg\" (UID: \"12acdf72-847b-4b01-a76a-29e0ba1958c3\") " pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-xtrpg" Dec 01 10:19:00 crc kubenswrapper[4958]: I1201 10:19:00.898196 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb92h\" (UniqueName: \"kubernetes.io/projected/12acdf72-847b-4b01-a76a-29e0ba1958c3-kube-api-access-wb92h\") pod \"swift-operator-controller-manager-cc9f5bc5c-xtrpg\" (UID: \"12acdf72-847b-4b01-a76a-29e0ba1958c3\") " pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-xtrpg" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:00.916675 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b56b8849f-mjksk"] Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:00.918142 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-mjksk" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:00.922328 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-mr9xj" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:00.925571 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s98b\" (UniqueName: \"kubernetes.io/projected/998f917b-9023-454a-b97c-3ff7948dda3a-kube-api-access-4s98b\") pod \"placement-operator-controller-manager-867d87977b-vthqj\" (UID: \"998f917b-9023-454a-b97c-3ff7948dda3a\") " pod="openstack-operators/placement-operator-controller-manager-867d87977b-vthqj" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:00.940415 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b56b8849f-mjksk"] Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:00.966174 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtl6r\" (UniqueName: \"kubernetes.io/projected/1c6ee3df-7b20-496f-9005-12a455dd54b9-kube-api-access-rtl6r\") pod \"watcher-operator-controller-manager-6b56b8849f-mjksk\" (UID: \"1c6ee3df-7b20-496f-9005-12a455dd54b9\") " pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-mjksk" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:00.966294 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7dtf\" (UniqueName: \"kubernetes.io/projected/2c2c6b14-779c-4e4f-9b83-542ccf77286c-kube-api-access-z7dtf\") pod \"test-operator-controller-manager-77db6bf9c-cqms2\" (UID: \"2c2c6b14-779c-4e4f-9b83-542ccf77286c\") " pod="openstack-operators/test-operator-controller-manager-77db6bf9c-cqms2" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:00.966335 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f6gf\" (UniqueName: \"kubernetes.io/projected/7513d038-3642-4f27-a2df-c932dc6c9eaa-kube-api-access-5f6gf\") pod \"telemetry-operator-controller-manager-58487d9bf4-q928n\" (UID: \"7513d038-3642-4f27-a2df-c932dc6c9eaa\") " pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-q928n" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:00.984836 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f6gf\" (UniqueName: \"kubernetes.io/projected/7513d038-3642-4f27-a2df-c932dc6c9eaa-kube-api-access-5f6gf\") pod \"telemetry-operator-controller-manager-58487d9bf4-q928n\" (UID: \"7513d038-3642-4f27-a2df-c932dc6c9eaa\") " pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-q928n" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:00.993536 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7dtf\" (UniqueName: \"kubernetes.io/projected/2c2c6b14-779c-4e4f-9b83-542ccf77286c-kube-api-access-z7dtf\") pod \"test-operator-controller-manager-77db6bf9c-cqms2\" (UID: \"2c2c6b14-779c-4e4f-9b83-542ccf77286c\") " pod="openstack-operators/test-operator-controller-manager-77db6bf9c-cqms2" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:00.993621 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6bb499f6df-kl28c"] Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:00.996894 4958 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6bb499f6df-kl28c"] Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:00.997009 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6bb499f6df-kl28c" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.002602 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5gn5f" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.002966 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.033549 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl"] Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.036577 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.038751 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-xjgbx" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.045073 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl"] Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.068678 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtl6r\" (UniqueName: \"kubernetes.io/projected/1c6ee3df-7b20-496f-9005-12a455dd54b9-kube-api-access-rtl6r\") pod \"watcher-operator-controller-manager-6b56b8849f-mjksk\" (UID: \"1c6ee3df-7b20-496f-9005-12a455dd54b9\") " pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-mjksk" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.077653 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-8wwrb" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.125888 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtl6r\" (UniqueName: \"kubernetes.io/projected/1c6ee3df-7b20-496f-9005-12a455dd54b9-kube-api-access-rtl6r\") pod \"watcher-operator-controller-manager-6b56b8849f-mjksk\" (UID: \"1c6ee3df-7b20-496f-9005-12a455dd54b9\") " pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-mjksk" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.170675 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66f0dd1b-3fde-4e90-bfbe-55e5f67b197b-cert\") pod \"openstack-baremetal-operator-controller-manager-77868f484-cv8pp\" (UID: \"66f0dd1b-3fde-4e90-bfbe-55e5f67b197b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-cv8pp" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.170840 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98tcv\" (UniqueName: \"kubernetes.io/projected/227e59f0-e574-4c6e-8349-8e1d648eb5f1-kube-api-access-98tcv\") pod \"openstack-operator-controller-manager-6bb499f6df-kl28c\" (UID: \"227e59f0-e574-4c6e-8349-8e1d648eb5f1\") " pod="openstack-operators/openstack-operator-controller-manager-6bb499f6df-kl28c" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.170888 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxhr2\" (UniqueName: \"kubernetes.io/projected/92fdc1a0-10c5-4ff4-860c-3faef0cc6c16-kube-api-access-mxhr2\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl\" (UID: \"92fdc1a0-10c5-4ff4-860c-3faef0cc6c16\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.170910 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/227e59f0-e574-4c6e-8349-8e1d648eb5f1-cert\") pod \"openstack-operator-controller-manager-6bb499f6df-kl28c\" (UID: \"227e59f0-e574-4c6e-8349-8e1d648eb5f1\") " pod="openstack-operators/openstack-operator-controller-manager-6bb499f6df-kl28c" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.261905 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66f0dd1b-3fde-4e90-bfbe-55e5f67b197b-cert\") pod \"openstack-baremetal-operator-controller-manager-77868f484-cv8pp\" (UID: \"66f0dd1b-3fde-4e90-bfbe-55e5f67b197b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-cv8pp" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.272825 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98tcv\" (UniqueName: \"kubernetes.io/projected/227e59f0-e574-4c6e-8349-8e1d648eb5f1-kube-api-access-98tcv\") pod \"openstack-operator-controller-manager-6bb499f6df-kl28c\" (UID: \"227e59f0-e574-4c6e-8349-8e1d648eb5f1\") " pod="openstack-operators/openstack-operator-controller-manager-6bb499f6df-kl28c" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.273834 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxhr2\" (UniqueName: 
\"kubernetes.io/projected/92fdc1a0-10c5-4ff4-860c-3faef0cc6c16-kube-api-access-mxhr2\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl\" (UID: \"92fdc1a0-10c5-4ff4-860c-3faef0cc6c16\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.273879 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/227e59f0-e574-4c6e-8349-8e1d648eb5f1-cert\") pod \"openstack-operator-controller-manager-6bb499f6df-kl28c\" (UID: \"227e59f0-e574-4c6e-8349-8e1d648eb5f1\") " pod="openstack-operators/openstack-operator-controller-manager-6bb499f6df-kl28c" Dec 01 10:19:01 crc kubenswrapper[4958]: E1201 10:19:01.274215 4958 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 10:19:01 crc kubenswrapper[4958]: E1201 10:19:01.274332 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/227e59f0-e574-4c6e-8349-8e1d648eb5f1-cert podName:227e59f0-e574-4c6e-8349-8e1d648eb5f1 nodeName:}" failed. No retries permitted until 2025-12-01 10:19:01.774306751 +0000 UTC m=+1189.283095788 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/227e59f0-e574-4c6e-8349-8e1d648eb5f1-cert") pod "openstack-operator-controller-manager-6bb499f6df-kl28c" (UID: "227e59f0-e574-4c6e-8349-8e1d648eb5f1") : secret "webhook-server-cert" not found Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.275070 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-csmzl" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.292814 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-dmmb7" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.307896 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98tcv\" (UniqueName: \"kubernetes.io/projected/227e59f0-e574-4c6e-8349-8e1d648eb5f1-kube-api-access-98tcv\") pod \"openstack-operator-controller-manager-6bb499f6df-kl28c\" (UID: \"227e59f0-e574-4c6e-8349-8e1d648eb5f1\") " pod="openstack-operators/openstack-operator-controller-manager-6bb499f6df-kl28c" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.308576 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxhr2\" (UniqueName: \"kubernetes.io/projected/92fdc1a0-10c5-4ff4-860c-3faef0cc6c16-kube-api-access-mxhr2\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl\" (UID: \"92fdc1a0-10c5-4ff4-860c-3faef0cc6c16\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.332459 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-cv8pp" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.396763 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6788cc6d75-4bgbw"] Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.438332 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5bfbbb859d-6ksnd"] Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.469601 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-pkd25" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.506954 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-xmcvh" Dec 01 10:19:01 crc kubenswrapper[4958]: W1201 10:19:01.524994 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3144616_c667_4adf_a2b7_44be4eaa5e59.slice/crio-75241af260bfc494b4e4fc61148cc82f14afc5d6be6203ce8e7d52dd5caea691 WatchSource:0}: Error finding container 75241af260bfc494b4e4fc61148cc82f14afc5d6be6203ce8e7d52dd5caea691: Status 404 returned error can't find the container with id 75241af260bfc494b4e4fc61148cc82f14afc5d6be6203ce8e7d52dd5caea691 Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.528070 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-xtrpg" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.554624 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-867d87977b-vthqj" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.579345 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-cqms2" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.599792 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-q928n" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.645215 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-mjksk" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.657785 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-4bgbw" event={"ID":"398be6e9-f887-4123-aca7-9c8a5bc68a04","Type":"ContainerStarted","Data":"b4ba3b778cecf8a18e4d9fff4feea145ff9014c3b31352c798a0c8773613cc97"} Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.662645 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-6ksnd" event={"ID":"a3144616-c667-4adf-a2b7-44be4eaa5e59","Type":"ContainerStarted","Data":"75241af260bfc494b4e4fc61148cc82f14afc5d6be6203ce8e7d52dd5caea691"} Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.689056 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.804808 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/227e59f0-e574-4c6e-8349-8e1d648eb5f1-cert\") pod \"openstack-operator-controller-manager-6bb499f6df-kl28c\" (UID: \"227e59f0-e574-4c6e-8349-8e1d648eb5f1\") " pod="openstack-operators/openstack-operator-controller-manager-6bb499f6df-kl28c" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.812193 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/227e59f0-e574-4c6e-8349-8e1d648eb5f1-cert\") pod \"openstack-operator-controller-manager-6bb499f6df-kl28c\" (UID: \"227e59f0-e574-4c6e-8349-8e1d648eb5f1\") " pod="openstack-operators/openstack-operator-controller-manager-6bb499f6df-kl28c" Dec 01 10:19:01 crc kubenswrapper[4958]: I1201 10:19:01.965530 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6bb499f6df-kl28c" Dec 01 10:19:02 crc kubenswrapper[4958]: I1201 10:19:02.374443 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-698d6fd7d6-zljjp"] Dec 01 10:19:02 crc kubenswrapper[4958]: I1201 10:19:02.678894 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-zljjp" event={"ID":"84142ebc-764a-4000-90e2-f9c6588d9b43","Type":"ContainerStarted","Data":"050514b2bffe5f3a63e2b7a203834d10ca69f2cfa0aad85d39f2f40831072d99"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.078248 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-6bd966bbd4-plgqm"] Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.091620 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-629vw"] Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.104084 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v5f8w"] Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.115166 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-748967c98-hjrzq"] Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.135960 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7d6f5d799-9ln88"] Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.148749 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-54485f899-5ns97"] Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.481983 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-cc9f5bc5c-xtrpg"] Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.489349 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-77db6bf9c-cqms2"] Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.497616 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58487d9bf4-q928n"] Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.515945 4958 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-646fd589f9-csmzl"] Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.535592 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b56b8849f-mjksk"] Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.548497 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-867d87977b-vthqj"] Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.548560 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7979c68bc7-dmmb7"] Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.548576 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-pkd25"] Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.558195 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7"] Dec 01 10:19:03 crc kubenswrapper[4958]: W1201 10:19:03.564996 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod759e8424_5b95_49fd_a80b_cb311b441b54.slice/crio-aab6ddc4f560c6aef2630c09ab08ef3ab590d4cb33a3ba5b748b1336f38b92e1 WatchSource:0}: Error finding container aab6ddc4f560c6aef2630c09ab08ef3ab590d4cb33a3ba5b748b1336f38b92e1: Status 404 returned error can't find the container with id aab6ddc4f560c6aef2630c09ab08ef3ab590d4cb33a3ba5b748b1336f38b92e1 Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.568592 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-cv8pp"] Dec 01 10:19:03 crc kubenswrapper[4958]: W1201 10:19:03.577444 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7513d038_3642_4f27_a2df_c932dc6c9eaa.slice/crio-194d5da66ddea631fe8c6def6858593565ed840456b2d6d94fb00fabfacd934f WatchSource:0}: Error finding container 194d5da66ddea631fe8c6def6858593565ed840456b2d6d94fb00fabfacd934f: Status 404 returned error can't find the container with id 194d5da66ddea631fe8c6def6858593565ed840456b2d6d94fb00fabfacd934f Dec 01 10:19:03 crc kubenswrapper[4958]: W1201 10:19:03.577706 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c6ee3df_7b20_496f_9005_12a455dd54b9.slice/crio-2fba7384a03c3fa53d8344b590fd143b1eab37d1b3a267dc7fc3cd654108c58b WatchSource:0}: Error finding container 2fba7384a03c3fa53d8344b590fd143b1eab37d1b3a267dc7fc3cd654108c58b: Status 404 returned error can't find the container with id 2fba7384a03c3fa53d8344b590fd143b1eab37d1b3a267dc7fc3cd654108c58b Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.583670 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79d658b66d-8wwrb"] Dec 01 10:19:03 crc kubenswrapper[4958]: E1201 10:19:03.592720 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7477e2fea70c83cfca71e1ece83bc6fdab55e890db711b0110817a5afd97c591,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5f6gf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58487d9bf4-q928n_openstack-operators(7513d038-3642-4f27-a2df-c932dc6c9eaa): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 10:19:03 crc kubenswrapper[4958]: E1201 10:19:03.598622 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2c837009de6475bc22534827c03df6d8649277b71f1c30de2087b6c52aafb326,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4skkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-5b67cfc8fb-pkd25_openstack-operators(3365704e-370d-4fe9-9d23-890ae1a593cc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 10:19:03 crc kubenswrapper[4958]: E1201 10:19:03.602346 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:1988aaf9cd245150cda123aaaa21718ccb552c47f1623b7d68804f13c47f2c6a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rtl6r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6b56b8849f-mjksk_openstack-operators(1c6ee3df-7b20-496f-9005-12a455dd54b9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 10:19:03 crc kubenswrapper[4958]: E1201 10:19:03.639646 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:6f630b256a17a0d40ec49bbf3bfbc65118e712cafea97fb0eee03dbc037d6bf8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-85ffr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
infra-operator-controller-manager-577c5f6d94-cfww7_openstack-operators(6c4b65c1-0872-4859-903b-46556eee593e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.640001 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl"] Dec 01 10:19:03 crc kubenswrapper[4958]: E1201 10:19:03.644123 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:2c4fe20e044dd8ea1f60f2f3f5e3844d932b4b79439835bd8771c73f16b38312,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zt2nx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-64d7c556cd-xmcvh_openstack-operators(d0c5b0b0-db70-485c-95b1-a63f980af637): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 10:19:03 crc kubenswrapper[4958]: E1201 10:19:03.644625 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mxhr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl_openstack-operators(92fdc1a0-10c5-4ff4-860c-3faef0cc6c16): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 10:19:03 crc kubenswrapper[4958]: E1201 10:19:03.646523 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl" podUID="92fdc1a0-10c5-4ff4-860c-3faef0cc6c16" Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.653365 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64d7c556cd-xmcvh"] Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.670053 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6bb499f6df-kl28c"] Dec 01 10:19:03 crc kubenswrapper[4958]: E1201 10:19:03.679228 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:d1fab4998e5f0faf94295eeaebfbf6801921d50497fbfc5331a888b207831486,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w6nhx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-646fd589f9-csmzl_openstack-operators(9dad08eb-34b4-4ea4-94bb-467e5c2df8df): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.696141 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6bb499f6df-kl28c" event={"ID":"227e59f0-e574-4c6e-8349-8e1d648eb5f1","Type":"ContainerStarted","Data":"d2a5bd4caa72121eabc4b1a8a8b7d7f4f4a7e9371697150a62de971714e978ba"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.698432 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-629vw" event={"ID":"a4b2002a-8ff1-4072-be81-9e4fd5bb2f1c","Type":"ContainerStarted","Data":"b9d885379f13e1920726b44acd198fc5a069ee83a3e16be6219b6fba269a7af9"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.704916 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-748967c98-hjrzq" event={"ID":"a19ee5ba-4286-44c9-94fe-bf1e2b7ff03a","Type":"ContainerStarted","Data":"30e79daa3112a4383be6570a1970dc4d29f3a4025390355e5672b738fdfa6ba9"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.706806 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl" event={"ID":"92fdc1a0-10c5-4ff4-860c-3faef0cc6c16","Type":"ContainerStarted","Data":"77d6ee58f266d2344dbe09139208af177914c906e7e8317db4f7ef91b11f2ee8"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.708993 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-cqms2" event={"ID":"2c2c6b14-779c-4e4f-9b83-542ccf77286c","Type":"ContainerStarted","Data":"1d6042273a05eb350f5fd670f302d8bcf47454b242f0cf4cccd1437f18d1f174"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.711076 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-9ln88" event={"ID":"7b55e6b5-2b3c-4d13-b998-6c3c56210483","Type":"ContainerStarted","Data":"60c3d14aa3e157cc479dadc7d6b00ea2555fd2c5f0f0e0548dcb83c514015e19"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.712380 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-csmzl" event={"ID":"9dad08eb-34b4-4ea4-94bb-467e5c2df8df","Type":"ContainerStarted","Data":"c6e5ca028d3c2dd9eb1be2924a97ef2d792e8f48dd359895ad7285e925d9c40f"} Dec 01 10:19:03 crc kubenswrapper[4958]: E1201 10:19:03.735511 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl" podUID="92fdc1a0-10c5-4ff4-860c-3faef0cc6c16" Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.737776 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-plgqm" event={"ID":"dd689dc1-fc2e-4285-a863-030b9f0cc647","Type":"ContainerStarted","Data":"64435d27c100105911dbf4390c881b7cc1f4bc037cc6ffea5eb885e49267eb9f"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.760221 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v5f8w" event={"ID":"47b8e3cf-440c-4d73-b791-2d94c06820f2","Type":"ContainerStarted","Data":"a609897422f5515b9cb833c11227dcc40fb3ab8b7f72d936c3428f5d19918bc1"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.769174 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-54485f899-5ns97" event={"ID":"5bf0bfe3-25ac-42de-9d86-afc6772c012b","Type":"ContainerStarted","Data":"fb2b0998abe686dd4261269bd3ff566292682be6c5b4bc8db271a72fb930e55b"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.858950 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-867d87977b-vthqj" event={"ID":"998f917b-9023-454a-b97c-3ff7948dda3a","Type":"ContainerStarted","Data":"95d34306b2bec6a9e32c6007e933ae69be89eea5c6fe9997234f62159b04e85d"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.859002 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-xmcvh" event={"ID":"d0c5b0b0-db70-485c-95b1-a63f980af637","Type":"ContainerStarted","Data":"462bdf4c0aa71a4a45ac08bb44e2e519bc56ffd8b6a911870e1849f989863c6d"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.859016 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-xtrpg" event={"ID":"12acdf72-847b-4b01-a76a-29e0ba1958c3","Type":"ContainerStarted","Data":"ac920b806a0274bb0b1ac855f05994e5ed999dba79935387d0ed0a7204201994"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.865050 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-mjksk" event={"ID":"1c6ee3df-7b20-496f-9005-12a455dd54b9","Type":"ContainerStarted","Data":"2fba7384a03c3fa53d8344b590fd143b1eab37d1b3a267dc7fc3cd654108c58b"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 
10:19:03.905350 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7" event={"ID":"6c4b65c1-0872-4859-903b-46556eee593e","Type":"ContainerStarted","Data":"44f4b0c60026c21c53e75d0d177569820acd85b82e57475b07d65835594636df"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.932257 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-q928n" event={"ID":"7513d038-3642-4f27-a2df-c932dc6c9eaa","Type":"ContainerStarted","Data":"194d5da66ddea631fe8c6def6858593565ed840456b2d6d94fb00fabfacd934f"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.954234 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-pkd25" event={"ID":"3365704e-370d-4fe9-9d23-890ae1a593cc","Type":"ContainerStarted","Data":"3eebf86b0f6ab5c8c9c38d99b6a319c84e64ae0b8a082765f9ec2b2195b8bcca"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.956249 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-cv8pp" event={"ID":"66f0dd1b-3fde-4e90-bfbe-55e5f67b197b","Type":"ContainerStarted","Data":"cd4c92c87cdfe64d9fb46af03bb99f16edcc4487dbfc24b082eec2061b33c965"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.957474 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-dmmb7" event={"ID":"de5b3e9c-1e86-4fd1-9342-df0ebcb684cb","Type":"ContainerStarted","Data":"9fafda04d411b858b1782c673a71b94917eeb916b333089509660c615262307d"} Dec 01 10:19:03 crc kubenswrapper[4958]: I1201 10:19:03.965113 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-8wwrb" event={"ID":"759e8424-5b95-49fd-a80b-cb311b441b54","Type":"ContainerStarted","Data":"aab6ddc4f560c6aef2630c09ab08ef3ab590d4cb33a3ba5b748b1336f38b92e1"} Dec 01 10:19:04 crc kubenswrapper[4958]: E1201 10:19:04.166353 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-mjksk" podUID="1c6ee3df-7b20-496f-9005-12a455dd54b9" Dec 01 10:19:04 crc kubenswrapper[4958]: E1201 10:19:04.178460 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-pkd25" podUID="3365704e-370d-4fe9-9d23-890ae1a593cc" Dec 01 10:19:04 crc kubenswrapper[4958]: E1201 10:19:04.199868 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7" podUID="6c4b65c1-0872-4859-903b-46556eee593e" Dec 01 10:19:04 crc kubenswrapper[4958]: E1201 10:19:04.200569 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-q928n" podUID="7513d038-3642-4f27-a2df-c932dc6c9eaa" Dec 01 10:19:04 crc kubenswrapper[4958]: E1201 10:19:04.273590 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-csmzl" podUID="9dad08eb-34b4-4ea4-94bb-467e5c2df8df" Dec 01 10:19:04 crc kubenswrapper[4958]: E1201 10:19:04.277304 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-xmcvh" podUID="d0c5b0b0-db70-485c-95b1-a63f980af637" Dec 01 10:19:05 crc kubenswrapper[4958]: I1201 10:19:05.013627 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6bb499f6df-kl28c" event={"ID":"227e59f0-e574-4c6e-8349-8e1d648eb5f1","Type":"ContainerStarted","Data":"b131c664bb4ec2a9aa7e5b63a476b622d52dddc6cf52bad7cfafd927b467a0f0"} Dec 01 10:19:05 crc kubenswrapper[4958]: I1201 10:19:05.014094 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6bb499f6df-kl28c" event={"ID":"227e59f0-e574-4c6e-8349-8e1d648eb5f1","Type":"ContainerStarted","Data":"e8cd7bca4d00fd89d96b442344541ed126cc255ee15ea2c1a8dc8703db508abf"} Dec 01 10:19:05 crc kubenswrapper[4958]: I1201 10:19:05.014150 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6bb499f6df-kl28c" Dec 01 10:19:05 crc kubenswrapper[4958]: I1201 10:19:05.025643 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7" event={"ID":"6c4b65c1-0872-4859-903b-46556eee593e","Type":"ContainerStarted","Data":"41e1aba2d50102afb73e92f37503636f241f5f354439e00fbadcd1076d0ef9d7"} Dec 01 10:19:05 crc kubenswrapper[4958]: E1201 10:19:05.055926 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:6f630b256a17a0d40ec49bbf3bfbc65118e712cafea97fb0eee03dbc037d6bf8\\\"\"" pod="openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7" podUID="6c4b65c1-0872-4859-903b-46556eee593e" Dec 01 10:19:05 crc kubenswrapper[4958]: I1201 10:19:05.059312 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6bb499f6df-kl28c" podStartSLOduration=5.059284131 podStartE2EDuration="5.059284131s" podCreationTimestamp="2025-12-01 10:19:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:19:05.054448183 +0000 UTC m=+1192.563237230" watchObservedRunningTime="2025-12-01 10:19:05.059284131 +0000 UTC m=+1192.568073168" Dec 01 10:19:05 crc kubenswrapper[4958]: I1201 10:19:05.071443 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-xmcvh" event={"ID":"d0c5b0b0-db70-485c-95b1-a63f980af637","Type":"ContainerStarted","Data":"c45e857955e090b0fc421ef5f146c8b39ec9b8d1e2894d097eaab6244a0513ed"} Dec 01 10:19:05 crc kubenswrapper[4958]: E1201 10:19:05.075911 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:2c4fe20e044dd8ea1f60f2f3f5e3844d932b4b79439835bd8771c73f16b38312\\\"\"" 
pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-xmcvh" podUID="d0c5b0b0-db70-485c-95b1-a63f980af637" Dec 01 10:19:05 crc kubenswrapper[4958]: I1201 10:19:05.078526 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-q928n" event={"ID":"7513d038-3642-4f27-a2df-c932dc6c9eaa","Type":"ContainerStarted","Data":"b3e4b46bf3987ee1f2545d452b7f1b8acc070a1361b60a55d8eec755eeed8ffa"} Dec 01 10:19:05 crc kubenswrapper[4958]: E1201 10:19:05.082871 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7477e2fea70c83cfca71e1ece83bc6fdab55e890db711b0110817a5afd97c591\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-q928n" podUID="7513d038-3642-4f27-a2df-c932dc6c9eaa" Dec 01 10:19:05 crc kubenswrapper[4958]: I1201 10:19:05.091828 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-pkd25" event={"ID":"3365704e-370d-4fe9-9d23-890ae1a593cc","Type":"ContainerStarted","Data":"2c1ac49cb3d02590211f4fa338bf23dbd2c1ddac978d6a302cea82d4476757e1"} Dec 01 10:19:05 crc kubenswrapper[4958]: E1201 10:19:05.098333 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2c837009de6475bc22534827c03df6d8649277b71f1c30de2087b6c52aafb326\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-pkd25" podUID="3365704e-370d-4fe9-9d23-890ae1a593cc" Dec 01 10:19:05 crc kubenswrapper[4958]: I1201 10:19:05.169743 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-mjksk" event={"ID":"1c6ee3df-7b20-496f-9005-12a455dd54b9","Type":"ContainerStarted","Data":"0f93c7733df1555aed2300077e12ffe7b78321688a12b5ed7f8b15cd856f30fa"} Dec 01 10:19:05 crc kubenswrapper[4958]: E1201 10:19:05.176896 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:1988aaf9cd245150cda123aaaa21718ccb552c47f1623b7d68804f13c47f2c6a\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-mjksk" podUID="1c6ee3df-7b20-496f-9005-12a455dd54b9" Dec 01 10:19:05 crc kubenswrapper[4958]: I1201 10:19:05.179792 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-csmzl" event={"ID":"9dad08eb-34b4-4ea4-94bb-467e5c2df8df","Type":"ContainerStarted","Data":"39ff6a57fdc169602aa1e66baacc54bd84308a6bf2c507986bebd1086b3e5aec"} Dec 01 10:19:05 crc kubenswrapper[4958]: E1201 10:19:05.182756 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:d1fab4998e5f0faf94295eeaebfbf6801921d50497fbfc5331a888b207831486\\\"\"" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-csmzl" podUID="9dad08eb-34b4-4ea4-94bb-467e5c2df8df" Dec 01 10:19:05 crc kubenswrapper[4958]: E1201 10:19:05.183051 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl" podUID="92fdc1a0-10c5-4ff4-860c-3faef0cc6c16" Dec 01 10:19:06 crc kubenswrapper[4958]: E1201 10:19:06.194479 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2c837009de6475bc22534827c03df6d8649277b71f1c30de2087b6c52aafb326\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-pkd25" podUID="3365704e-370d-4fe9-9d23-890ae1a593cc" Dec 01 10:19:06 crc kubenswrapper[4958]: E1201 10:19:06.194833 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7477e2fea70c83cfca71e1ece83bc6fdab55e890db711b0110817a5afd97c591\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-q928n" podUID="7513d038-3642-4f27-a2df-c932dc6c9eaa" Dec 01 10:19:06 crc kubenswrapper[4958]: E1201 10:19:06.207881 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:d1fab4998e5f0faf94295eeaebfbf6801921d50497fbfc5331a888b207831486\\\"\"" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-csmzl" podUID="9dad08eb-34b4-4ea4-94bb-467e5c2df8df" Dec 01 10:19:06 crc kubenswrapper[4958]: E1201 10:19:06.208023 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:6f630b256a17a0d40ec49bbf3bfbc65118e712cafea97fb0eee03dbc037d6bf8\\\"\"" pod="openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7" podUID="6c4b65c1-0872-4859-903b-46556eee593e" Dec 01 10:19:06 crc kubenswrapper[4958]: E1201 10:19:06.208098 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:2c4fe20e044dd8ea1f60f2f3f5e3844d932b4b79439835bd8771c73f16b38312\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-xmcvh" podUID="d0c5b0b0-db70-485c-95b1-a63f980af637" Dec 01 10:19:06 crc kubenswrapper[4958]: E1201 10:19:06.208156 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:1988aaf9cd245150cda123aaaa21718ccb552c47f1623b7d68804f13c47f2c6a\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-mjksk" podUID="1c6ee3df-7b20-496f-9005-12a455dd54b9" Dec 01 10:19:11 crc kubenswrapper[4958]: I1201 10:19:11.988001 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6bb499f6df-kl28c" Dec 01 10:19:17 crc kubenswrapper[4958]: E1201 10:19:17.932259 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc 
= copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:76ad3ddd8c89748b1d9a5f3a0b2f0f47494cdb62e2997610de7febcb12970635" Dec 01 10:19:17 crc kubenswrapper[4958]: E1201 10:19:17.933357 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:76ad3ddd8c89748b1d9a5f3a0b2f0f47494cdb62e2997610de7febcb12970635,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pnpwb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6b6c55ffd5-629vw_openstack-operators(a4b2002a-8ff1-4072-be81-9e4fd5bb2f1c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:19:18 crc kubenswrapper[4958]: E1201 10:19:18.418195 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:bc58f62c7171e9c9216fdeafbd170917b638e6c3f842031ee254f1389c57a09e" Dec 01 10:19:18 crc kubenswrapper[4958]: E1201 10:19:18.418472 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:bc58f62c7171e9c9216fdeafbd170917b638e6c3f842031ee254f1389c57a09e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wb92h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-cc9f5bc5c-xtrpg_openstack-operators(12acdf72-847b-4b01-a76a-29e0ba1958c3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:19:19 crc kubenswrapper[4958]: E1201 10:19:19.076386 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:f657fa5fddbe0d7cdf889002981a743e421cfbcfb396ec38013aa511596f45ef" Dec 01 10:19:19 crc kubenswrapper[4958]: E1201 10:19:19.076702 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:f657fa5fddbe0d7cdf889002981a743e421cfbcfb396ec38013aa511596f45ef,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lfxx8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7979c68bc7-dmmb7_openstack-operators(de5b3e9c-1e86-4fd1-9342-df0ebcb684cb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:19:19 crc kubenswrapper[4958]: E1201 10:19:19.659143 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:44c6dcec0d489a675c35e097d92729162bfc2a8cac62d7c8376943ef922e2651" Dec 01 10:19:19 crc kubenswrapper[4958]: E1201 10:19:19.659605 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:44c6dcec0d489a675c35e097d92729162bfc2a8cac62d7c8376943ef922e2651,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nblrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-748967c98-hjrzq_openstack-operators(a19ee5ba-4286-44c9-94fe-bf1e2b7ff03a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:19:20 crc kubenswrapper[4958]: E1201 10:19:20.988461 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:debe5d6d29a007374b270b0e114e69b2136eee61dabab8576baf4010c951edb9" Dec 01 10:19:20 crc kubenswrapper[4958]: E1201 10:19:20.988700 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:debe5d6d29a007374b270b0e114e69b2136eee61dabab8576baf4010c951edb9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lbvpq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-79d658b66d-8wwrb_openstack-operators(759e8424-5b95-49fd-a80b-cb311b441b54): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:19:22 crc kubenswrapper[4958]: E1201 10:19:22.082331 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:624b77b1b44f5e72a6c7d5910b04eb8070c499f83dcf364fb9dc5f2f8cb83c85" Dec 01 10:19:22 crc kubenswrapper[4958]: E1201 10:19:22.082945 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:624b77b1b44f5e72a6c7d5910b04eb8070c499f83dcf364fb9dc5f2f8cb83c85,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z7dtf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-77db6bf9c-cqms2_openstack-operators(2c2c6b14-779c-4e4f-9b83-542ccf77286c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:19:24 crc kubenswrapper[4958]: E1201 10:19:24.217083 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-629vw" podUID="a4b2002a-8ff1-4072-be81-9e4fd5bb2f1c" Dec 01 10:19:24 crc kubenswrapper[4958]: I1201 10:19:24.358649 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-629vw" event={"ID":"a4b2002a-8ff1-4072-be81-9e4fd5bb2f1c","Type":"ContainerStarted","Data":"a1932d36215cc40a6e25a21179c043c7f751e12b7d2e8741eeb79d8bf63ae351"} Dec 01 10:19:24 crc kubenswrapper[4958]: E1201 10:19:24.360274 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:76ad3ddd8c89748b1d9a5f3a0b2f0f47494cdb62e2997610de7febcb12970635\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-629vw" podUID="a4b2002a-8ff1-4072-be81-9e4fd5bb2f1c" Dec 01 10:19:25 crc kubenswrapper[4958]: E1201 10:19:25.373354 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:76ad3ddd8c89748b1d9a5f3a0b2f0f47494cdb62e2997610de7febcb12970635\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-629vw" podUID="a4b2002a-8ff1-4072-be81-9e4fd5bb2f1c" Dec 01 10:19:30 crc kubenswrapper[4958]: E1201 10:19:30.289979 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-748967c98-hjrzq" podUID="a19ee5ba-4286-44c9-94fe-bf1e2b7ff03a" Dec 01 10:19:30 crc kubenswrapper[4958]: E1201 10:19:30.393435 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-dmmb7" podUID="de5b3e9c-1e86-4fd1-9342-df0ebcb684cb" Dec 01 10:19:30 crc kubenswrapper[4958]: I1201 10:19:30.414154 4958 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-dmmb7" event={"ID":"de5b3e9c-1e86-4fd1-9342-df0ebcb684cb","Type":"ContainerStarted","Data":"59b6ffb9b48ffd9dcde69a759290de983c32a62135424636a117cc0891aeeda7"} Dec 01 10:19:30 crc kubenswrapper[4958]: I1201 10:19:30.421162 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-748967c98-hjrzq" event={"ID":"a19ee5ba-4286-44c9-94fe-bf1e2b7ff03a","Type":"ContainerStarted","Data":"2bfee9a64442da82cf95b2c9d55560fa801a03ec17268b392747320543eaac13"} Dec 01 10:19:30 crc kubenswrapper[4958]: E1201 10:19:30.492833 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-8wwrb" podUID="759e8424-5b95-49fd-a80b-cb311b441b54" Dec 01 10:19:30 crc kubenswrapper[4958]: E1201 10:19:30.515369 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-xtrpg" podUID="12acdf72-847b-4b01-a76a-29e0ba1958c3" Dec 01 10:19:30 crc kubenswrapper[4958]: E1201 10:19:30.620222 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-cqms2" podUID="2c2c6b14-779c-4e4f-9b83-542ccf77286c" Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.451475 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-9ln88" event={"ID":"7b55e6b5-2b3c-4d13-b998-6c3c56210483","Type":"ContainerStarted","Data":"5fa19af1c13560e8eb783aef7a02c641f63e910d025d39e7afeaeab9ae0387f2"} Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.457199 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-plgqm" event={"ID":"dd689dc1-fc2e-4285-a863-030b9f0cc647","Type":"ContainerStarted","Data":"c818368a803e2568d89bf75ce49cedcddee6a3dc179823718a44751187b7c16d"} Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.468878 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-xtrpg" event={"ID":"12acdf72-847b-4b01-a76a-29e0ba1958c3","Type":"ContainerStarted","Data":"5c5b189a12f514cdba35eb2e1b91a483b9af45b9319c3816314e2f38a780e799"} Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.486705 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-zljjp" event={"ID":"84142ebc-764a-4000-90e2-f9c6588d9b43","Type":"ContainerStarted","Data":"245b771ab9c72a86aacf484e977d4f1434ae37a92c8bd91fe5a9c772ee79c616"} Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.488642 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-mjksk" event={"ID":"1c6ee3df-7b20-496f-9005-12a455dd54b9","Type":"ContainerStarted","Data":"bb59c5161effca5377952230c0c02053a308adf99f737adbdeaf9a338b6442b9"} Dec 01 10:19:31 crc kubenswrapper[4958]: 
I1201 10:19:31.489694 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-mjksk" Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.650037 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl" event={"ID":"92fdc1a0-10c5-4ff4-860c-3faef0cc6c16","Type":"ContainerStarted","Data":"7e4e1a9b15e469cd4d23347cc438802a359d7e56780e5fcd4f61e29e675b1353"} Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.725425 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-cqms2" event={"ID":"2c2c6b14-779c-4e4f-9b83-542ccf77286c","Type":"ContainerStarted","Data":"ebfe9f5f826f5bc877b17117c8299f1a13c310860a0b89c637d56a016e10815e"} Dec 01 10:19:31 crc kubenswrapper[4958]: E1201 10:19:31.728442 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:624b77b1b44f5e72a6c7d5910b04eb8070c499f83dcf364fb9dc5f2f8cb83c85\\\"\"" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-cqms2" podUID="2c2c6b14-779c-4e4f-9b83-542ccf77286c" Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.745204 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-q928n" event={"ID":"7513d038-3642-4f27-a2df-c932dc6c9eaa","Type":"ContainerStarted","Data":"3b4ace8a20bbe0ed00a3774a3ddf57444bdff972dc319383ac54b0cd512b6376"} Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.746489 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-q928n" Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.749893 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-cv8pp" event={"ID":"66f0dd1b-3fde-4e90-bfbe-55e5f67b197b","Type":"ContainerStarted","Data":"ec53f6cac329a2d10554fa1fc49aac38ce8abfee313ec47c209d4317cf327fdb"} Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.769373 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-8wwrb" event={"ID":"759e8424-5b95-49fd-a80b-cb311b441b54","Type":"ContainerStarted","Data":"1e00ae45870bcdcd38bdf8c58158aabc3517ba8404247487bd15a287552eb919"} Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.787603 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v5f8w" event={"ID":"47b8e3cf-440c-4d73-b791-2d94c06820f2","Type":"ContainerStarted","Data":"a0e320b61a3fe27c66d3f81250ed39f1de63bd5f1b92937222993b24d5a4074c"} Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.793153 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-4bgbw" event={"ID":"398be6e9-f887-4123-aca7-9c8a5bc68a04","Type":"ContainerStarted","Data":"9215523ee3aafd5b6639afe3c37b9fd3ef72f465b44352e3a09a4c18e694170e"} Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.795755 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-xmcvh" 
event={"ID":"d0c5b0b0-db70-485c-95b1-a63f980af637","Type":"ContainerStarted","Data":"b3ffbe6512c7864991261bb23d2bded3ad0feaf7d4055b73f74133cec3708f70"} Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.796992 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-xmcvh" Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.830966 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-pkd25" event={"ID":"3365704e-370d-4fe9-9d23-890ae1a593cc","Type":"ContainerStarted","Data":"40d0a485c99a3587c03f92a2c32b6727c5b07cfb4de321869fda132b0c3fe4fd"} Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.831325 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-pkd25" Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.845583 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-mjksk" podStartSLOduration=5.345281461 podStartE2EDuration="31.845554659s" podCreationTimestamp="2025-12-01 10:19:00 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.601453153 +0000 UTC m=+1191.110242190" lastFinishedPulling="2025-12-01 10:19:30.101726351 +0000 UTC m=+1217.610515388" observedRunningTime="2025-12-01 10:19:31.835914842 +0000 UTC m=+1219.344703889" watchObservedRunningTime="2025-12-01 10:19:31.845554659 +0000 UTC m=+1219.354343696" Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.846756 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-54485f899-5ns97" event={"ID":"5bf0bfe3-25ac-42de-9d86-afc6772c012b","Type":"ContainerStarted","Data":"e6c0534e28ba22d08164d6b2474c16962888c57a03a3d53f3d7d5ff80a19d71a"} Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.859194 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-867d87977b-vthqj" event={"ID":"998f917b-9023-454a-b97c-3ff7948dda3a","Type":"ContainerStarted","Data":"956246dfe6fc160722812718e8f5b9c3e5994af76fc49435b2836a107dca13f5"} Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.874872 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-pkd25" podStartSLOduration=5.290334675 podStartE2EDuration="31.874830918s" podCreationTimestamp="2025-12-01 10:19:00 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.593057332 +0000 UTC m=+1191.101846369" lastFinishedPulling="2025-12-01 10:19:30.177553575 +0000 UTC m=+1217.686342612" observedRunningTime="2025-12-01 10:19:31.86827696 +0000 UTC m=+1219.377065997" watchObservedRunningTime="2025-12-01 10:19:31.874830918 +0000 UTC m=+1219.383619955" Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.876441 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-6ksnd" event={"ID":"a3144616-c667-4adf-a2b7-44be4eaa5e59","Type":"ContainerStarted","Data":"78edf0bb8b26506ff3a64a9fe681077f90e11f1d6dda3e32dd47955979e943a7"} Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.889203 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7" 
event={"ID":"6c4b65c1-0872-4859-903b-46556eee593e","Type":"ContainerStarted","Data":"dbaa9a5c43f47fae6988c970e9817db97eab25f53ca06fe06dc9d9ed165325a8"} Dec 01 10:19:31 crc kubenswrapper[4958]: I1201 10:19:31.892701 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7" Dec 01 10:19:32 crc kubenswrapper[4958]: I1201 10:19:32.320225 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-xmcvh" podStartSLOduration=6.831879302 podStartE2EDuration="33.320200957s" podCreationTimestamp="2025-12-01 10:18:59 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.643967692 +0000 UTC m=+1191.152756729" lastFinishedPulling="2025-12-01 10:19:30.132289347 +0000 UTC m=+1217.641078384" observedRunningTime="2025-12-01 10:19:32.300189064 +0000 UTC m=+1219.808978101" watchObservedRunningTime="2025-12-01 10:19:32.320200957 +0000 UTC m=+1219.828989994" Dec 01 10:19:32 crc kubenswrapper[4958]: I1201 10:19:32.671263 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-q928n" podStartSLOduration=6.015553478 podStartE2EDuration="32.671233312s" podCreationTimestamp="2025-12-01 10:19:00 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.591888749 +0000 UTC m=+1191.100677786" lastFinishedPulling="2025-12-01 10:19:30.247568583 +0000 UTC m=+1217.756357620" observedRunningTime="2025-12-01 10:19:32.667051922 +0000 UTC m=+1220.175840959" watchObservedRunningTime="2025-12-01 10:19:32.671233312 +0000 UTC m=+1220.180022349" Dec 01 10:19:32 crc kubenswrapper[4958]: I1201 10:19:32.997864 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-csmzl" event={"ID":"9dad08eb-34b4-4ea4-94bb-467e5c2df8df","Type":"ContainerStarted","Data":"5795563020ab47cd9f218f7524dd0c0b89a7dba340dba85b899bc4506ae85451"} Dec 01 10:19:32 crc kubenswrapper[4958]: I1201 10:19:32.998335 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-csmzl" Dec 01 10:19:33 crc kubenswrapper[4958]: I1201 10:19:33.079335 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl" podStartSLOduration=6.622099468 podStartE2EDuration="33.079313982s" podCreationTimestamp="2025-12-01 10:19:00 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.644510317 +0000 UTC m=+1191.153299354" lastFinishedPulling="2025-12-01 10:19:30.101724831 +0000 UTC m=+1217.610513868" observedRunningTime="2025-12-01 10:19:33.077258864 +0000 UTC m=+1220.586047901" watchObservedRunningTime="2025-12-01 10:19:33.079313982 +0000 UTC m=+1220.588103019" Dec 01 10:19:33 crc kubenswrapper[4958]: I1201 10:19:33.149812 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-csmzl" podStartSLOduration=7.727170511 podStartE2EDuration="34.149783023s" podCreationTimestamp="2025-12-01 10:18:59 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.679078488 +0000 UTC m=+1191.187867525" lastFinishedPulling="2025-12-01 10:19:30.101691 +0000 UTC m=+1217.610480037" observedRunningTime="2025-12-01 10:19:33.146600622 +0000 UTC m=+1220.655389669" watchObservedRunningTime="2025-12-01 10:19:33.149783023 +0000 UTC 
m=+1220.658572060" Dec 01 10:19:33 crc kubenswrapper[4958]: I1201 10:19:33.274176 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7" podStartSLOduration=7.764349467 podStartE2EDuration="34.274094947s" podCreationTimestamp="2025-12-01 10:18:59 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.639126193 +0000 UTC m=+1191.147915230" lastFinishedPulling="2025-12-01 10:19:30.148871673 +0000 UTC m=+1217.657660710" observedRunningTime="2025-12-01 10:19:33.269615359 +0000 UTC m=+1220.778404396" watchObservedRunningTime="2025-12-01 10:19:33.274094947 +0000 UTC m=+1220.782883984" Dec 01 10:19:34 crc kubenswrapper[4958]: I1201 10:19:34.016467 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-54485f899-5ns97" event={"ID":"5bf0bfe3-25ac-42de-9d86-afc6772c012b","Type":"ContainerStarted","Data":"1ded21d61eed8788ff93ea824252fd96cb506411742cbb4202bfe8577f0cc022"} Dec 01 10:19:34 crc kubenswrapper[4958]: I1201 10:19:34.018299 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-54485f899-5ns97" Dec 01 10:19:34 crc kubenswrapper[4958]: I1201 10:19:34.022531 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-6ksnd" event={"ID":"a3144616-c667-4adf-a2b7-44be4eaa5e59","Type":"ContainerStarted","Data":"5050a222a68baa40f65e18042ed5c8147e397d839ac05b8207b441fc2a0bcf93"} Dec 01 10:19:34 crc kubenswrapper[4958]: I1201 10:19:34.023146 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-6ksnd" Dec 01 10:19:34 crc kubenswrapper[4958]: I1201 10:19:34.043467 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-cv8pp" event={"ID":"66f0dd1b-3fde-4e90-bfbe-55e5f67b197b","Type":"ContainerStarted","Data":"b1d434952516e294a70c97f062b2701b372455756c4594b02b83db863579d9be"} Dec 01 10:19:34 crc kubenswrapper[4958]: I1201 10:19:34.044439 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-cv8pp" Dec 01 10:19:34 crc kubenswrapper[4958]: I1201 10:19:34.069084 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-xtrpg" event={"ID":"12acdf72-847b-4b01-a76a-29e0ba1958c3","Type":"ContainerStarted","Data":"c133250fe5ad59accb2df99203e75ca3fdb6688685e4c61b686d8f4271f56761"} Dec 01 10:19:34 crc kubenswrapper[4958]: I1201 10:19:34.070275 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-xtrpg" Dec 01 10:19:34 crc kubenswrapper[4958]: I1201 10:19:34.090553 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-748967c98-hjrzq" event={"ID":"a19ee5ba-4286-44c9-94fe-bf1e2b7ff03a","Type":"ContainerStarted","Data":"37c08a6cee87be44635cb80688474fb1a563f7dc96c1d44724d61c5298058d71"} Dec 01 10:19:34 crc kubenswrapper[4958]: I1201 10:19:34.090602 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-748967c98-hjrzq" Dec 01 10:19:34 crc kubenswrapper[4958]: I1201 10:19:34.208935 
4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-6ksnd" podStartSLOduration=13.278234829 podStartE2EDuration="35.20890329s" podCreationTimestamp="2025-12-01 10:18:59 +0000 UTC" firstStartedPulling="2025-12-01 10:19:01.536412306 +0000 UTC m=+1189.045201343" lastFinishedPulling="2025-12-01 10:19:23.467080767 +0000 UTC m=+1210.975869804" observedRunningTime="2025-12-01 10:19:34.197952496 +0000 UTC m=+1221.706741543" watchObservedRunningTime="2025-12-01 10:19:34.20890329 +0000 UTC m=+1221.717692327" Dec 01 10:19:34 crc kubenswrapper[4958]: I1201 10:19:34.209400 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-54485f899-5ns97" podStartSLOduration=12.813969947 podStartE2EDuration="35.209390924s" podCreationTimestamp="2025-12-01 10:18:59 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.129933633 +0000 UTC m=+1190.638722670" lastFinishedPulling="2025-12-01 10:19:25.52535461 +0000 UTC m=+1213.034143647" observedRunningTime="2025-12-01 10:19:34.147111208 +0000 UTC m=+1221.655900245" watchObservedRunningTime="2025-12-01 10:19:34.209390924 +0000 UTC m=+1221.718179961" Dec 01 10:19:34 crc kubenswrapper[4958]: I1201 10:19:34.559409 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-xtrpg" podStartSLOduration=5.960512912 podStartE2EDuration="34.558935826s" podCreationTimestamp="2025-12-01 10:19:00 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.514600013 +0000 UTC m=+1191.023389050" lastFinishedPulling="2025-12-01 10:19:32.113022927 +0000 UTC m=+1219.621811964" observedRunningTime="2025-12-01 10:19:34.455338705 +0000 UTC m=+1221.964127762" watchObservedRunningTime="2025-12-01 10:19:34.558935826 +0000 UTC m=+1222.067724873" Dec 01 10:19:34 crc kubenswrapper[4958]: I1201 10:19:34.766290 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-748967c98-hjrzq" podStartSLOduration=7.743350805 podStartE2EDuration="35.76626496s" podCreationTimestamp="2025-12-01 10:18:59 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.126731212 +0000 UTC m=+1190.635520239" lastFinishedPulling="2025-12-01 10:19:31.149645357 +0000 UTC m=+1218.658434394" observedRunningTime="2025-12-01 10:19:34.553241422 +0000 UTC m=+1222.062030459" watchObservedRunningTime="2025-12-01 10:19:34.76626496 +0000 UTC m=+1222.275053997" Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.168444 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-9ln88" event={"ID":"7b55e6b5-2b3c-4d13-b998-6c3c56210483","Type":"ContainerStarted","Data":"0ba96dcead505e116d80d1768942dde5e8e92fa9c2bea6796753cc4f39ea1e2b"} Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.170942 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-9ln88" Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.174675 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-8wwrb" event={"ID":"759e8424-5b95-49fd-a80b-cb311b441b54","Type":"ContainerStarted","Data":"1de8d0814838b7b38b7c30a76c32973d4d5fbc9cd40add8acb5c6c270ee1b7f4"} Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.174741 4958 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-8wwrb" Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.175793 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-9ln88" Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.178191 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-plgqm" event={"ID":"dd689dc1-fc2e-4285-a863-030b9f0cc647","Type":"ContainerStarted","Data":"e5e37420f4f36162cf76a16131757cda41d87ac3941aae6da50c85b971228262"} Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.180607 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-plgqm" Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.185079 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v5f8w" event={"ID":"47b8e3cf-440c-4d73-b791-2d94c06820f2","Type":"ContainerStarted","Data":"72c24e5b882369d60f1cd2a4c0aece0e8af5ca4617d5a1a4d501a50a18f21b02"} Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.185834 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v5f8w" Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.188708 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v5f8w" Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.189316 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-867d87977b-vthqj" event={"ID":"998f917b-9023-454a-b97c-3ff7948dda3a","Type":"ContainerStarted","Data":"d0b6189ce1ca39d54262b7f41b0a329831c9609147cbf80018d9ef10c6885f57"} Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.191729 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-867d87977b-vthqj" Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.326133 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-867d87977b-vthqj" Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.327660 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-9ln88" podStartSLOduration=13.928097331 podStartE2EDuration="36.327632425s" podCreationTimestamp="2025-12-01 10:18:59 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.129000047 +0000 UTC m=+1190.637789084" lastFinishedPulling="2025-12-01 10:19:25.528535141 +0000 UTC m=+1213.037324178" observedRunningTime="2025-12-01 10:19:35.327048108 +0000 UTC m=+1222.835837145" watchObservedRunningTime="2025-12-01 10:19:35.327632425 +0000 UTC m=+1222.836421462" Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.334698 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-4bgbw" event={"ID":"398be6e9-f887-4123-aca7-9c8a5bc68a04","Type":"ContainerStarted","Data":"8aef9c1ecdad93b3b524846abf93a0ff5fcd45321e532debf83c96cc151790f8"} Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.334774 4958 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-4bgbw" Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.344256 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-cv8pp" podStartSLOduration=15.083024304 podStartE2EDuration="36.34422785s" podCreationTimestamp="2025-12-01 10:18:59 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.570797244 +0000 UTC m=+1191.079586271" lastFinishedPulling="2025-12-01 10:19:24.83200078 +0000 UTC m=+1212.340789817" observedRunningTime="2025-12-01 10:19:34.77045267 +0000 UTC m=+1222.279241707" watchObservedRunningTime="2025-12-01 10:19:35.34422785 +0000 UTC m=+1222.853016887" Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.425921 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-4bgbw" Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.455370 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-8wwrb" podStartSLOduration=7.596631979 podStartE2EDuration="36.455314806s" podCreationTimestamp="2025-12-01 10:18:59 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.571387631 +0000 UTC m=+1191.080176668" lastFinishedPulling="2025-12-01 10:19:32.430070458 +0000 UTC m=+1219.938859495" observedRunningTime="2025-12-01 10:19:35.445283158 +0000 UTC m=+1222.954072195" watchObservedRunningTime="2025-12-01 10:19:35.455314806 +0000 UTC m=+1222.964103843" Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.459315 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-zljjp" event={"ID":"84142ebc-764a-4000-90e2-f9c6588d9b43","Type":"ContainerStarted","Data":"825cee3cb61ffb5be18fb6d3c8885c389cfbf7c7e9372e7ed403d297d3be2e3d"} Dec 01 10:19:35 crc kubenswrapper[4958]: I1201 10:19:35.460583 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-zljjp" Dec 01 10:19:36 crc kubenswrapper[4958]: I1201 10:19:35.476336 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-zljjp" Dec 01 10:19:36 crc kubenswrapper[4958]: I1201 10:19:35.871674 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-dmmb7" event={"ID":"de5b3e9c-1e86-4fd1-9342-df0ebcb684cb","Type":"ContainerStarted","Data":"364dbe02e58cbf5560341f38440d07554ca77f0c5cf867990f3ab3e2f1d1a039"} Dec 01 10:19:36 crc kubenswrapper[4958]: I1201 10:19:35.871733 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-dmmb7" Dec 01 10:19:36 crc kubenswrapper[4958]: I1201 10:19:36.804311 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-867d87977b-vthqj" podStartSLOduration=14.845115574 podStartE2EDuration="36.804274092s" podCreationTimestamp="2025-12-01 10:19:00 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.566551882 +0000 UTC m=+1191.075340929" lastFinishedPulling="2025-12-01 10:19:25.52571042 +0000 UTC m=+1213.034499447" observedRunningTime="2025-12-01 
10:19:36.788265033 +0000 UTC m=+1224.297054080" watchObservedRunningTime="2025-12-01 10:19:36.804274092 +0000 UTC m=+1224.313063129" Dec 01 10:19:36 crc kubenswrapper[4958]: I1201 10:19:36.811469 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-plgqm" podUID="dd689dc1-fc2e-4285-a863-030b9f0cc647" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.75:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 10:19:36 crc kubenswrapper[4958]: I1201 10:19:36.829647 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-6ksnd" Dec 01 10:19:36 crc kubenswrapper[4958]: I1201 10:19:36.829987 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-54485f899-5ns97" Dec 01 10:19:36 crc kubenswrapper[4958]: I1201 10:19:36.830160 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-cv8pp" Dec 01 10:19:36 crc kubenswrapper[4958]: I1201 10:19:36.877272 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-cqms2" event={"ID":"2c2c6b14-779c-4e4f-9b83-542ccf77286c","Type":"ContainerStarted","Data":"27b2fdcc05dbc6b7a8f787dbbfc70048804130f6eca0883c1db85a3d76168e0b"} Dec 01 10:19:36 crc kubenswrapper[4958]: I1201 10:19:36.921609 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-plgqm" Dec 01 10:19:36 crc kubenswrapper[4958]: I1201 10:19:36.971663 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-plgqm" podStartSLOduration=14.957456833 podStartE2EDuration="37.971633561s" podCreationTimestamp="2025-12-01 10:18:59 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.091979995 +0000 UTC m=+1190.600769032" lastFinishedPulling="2025-12-01 10:19:26.106156723 +0000 UTC m=+1213.614945760" observedRunningTime="2025-12-01 10:19:36.956336022 +0000 UTC m=+1224.465125059" watchObservedRunningTime="2025-12-01 10:19:36.971633561 +0000 UTC m=+1224.480422608" Dec 01 10:19:37 crc kubenswrapper[4958]: I1201 10:19:37.058110 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v5f8w" podStartSLOduration=16.335534145 podStartE2EDuration="38.058078639s" podCreationTimestamp="2025-12-01 10:18:59 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.110124926 +0000 UTC m=+1190.618913963" lastFinishedPulling="2025-12-01 10:19:24.83266942 +0000 UTC m=+1212.341458457" observedRunningTime="2025-12-01 10:19:37.008031205 +0000 UTC m=+1224.516820242" watchObservedRunningTime="2025-12-01 10:19:37.058078639 +0000 UTC m=+1224.566867676" Dec 01 10:19:37 crc kubenswrapper[4958]: I1201 10:19:37.112998 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-zljjp" podStartSLOduration=14.443089025999999 podStartE2EDuration="38.112968083s" podCreationTimestamp="2025-12-01 10:18:59 +0000 UTC" firstStartedPulling="2025-12-01 10:19:02.436166283 +0000 UTC m=+1189.944955320" lastFinishedPulling="2025-12-01 
10:19:26.10604534 +0000 UTC m=+1213.614834377" observedRunningTime="2025-12-01 10:19:37.097398257 +0000 UTC m=+1224.606187294" watchObservedRunningTime="2025-12-01 10:19:37.112968083 +0000 UTC m=+1224.621757120" Dec 01 10:19:37 crc kubenswrapper[4958]: I1201 10:19:37.211203 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-4bgbw" podStartSLOduration=14.905477144 podStartE2EDuration="38.211181379s" podCreationTimestamp="2025-12-01 10:18:59 +0000 UTC" firstStartedPulling="2025-12-01 10:19:01.526258194 +0000 UTC m=+1189.035047231" lastFinishedPulling="2025-12-01 10:19:24.831962429 +0000 UTC m=+1212.340751466" observedRunningTime="2025-12-01 10:19:37.206535656 +0000 UTC m=+1224.715324683" watchObservedRunningTime="2025-12-01 10:19:37.211181379 +0000 UTC m=+1224.719970416" Dec 01 10:19:37 crc kubenswrapper[4958]: I1201 10:19:37.457378 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-dmmb7" podStartSLOduration=10.873945864 podStartE2EDuration="38.457352107s" podCreationTimestamp="2025-12-01 10:18:59 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.568463317 +0000 UTC m=+1191.077252354" lastFinishedPulling="2025-12-01 10:19:31.15186956 +0000 UTC m=+1218.660658597" observedRunningTime="2025-12-01 10:19:37.432300939 +0000 UTC m=+1224.941089996" watchObservedRunningTime="2025-12-01 10:19:37.457352107 +0000 UTC m=+1224.966141144" Dec 01 10:19:38 crc kubenswrapper[4958]: I1201 10:19:37.906936 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-cqms2" Dec 01 10:19:38 crc kubenswrapper[4958]: I1201 10:19:38.255190 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-cqms2" podStartSLOduration=7.625937659 podStartE2EDuration="38.25515568s" podCreationTimestamp="2025-12-01 10:19:00 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.561784605 +0000 UTC m=+1191.070573642" lastFinishedPulling="2025-12-01 10:19:34.191002626 +0000 UTC m=+1221.699791663" observedRunningTime="2025-12-01 10:19:38.235983889 +0000 UTC m=+1225.744772936" watchObservedRunningTime="2025-12-01 10:19:38.25515568 +0000 UTC m=+1225.763944717" Dec 01 10:19:38 crc kubenswrapper[4958]: I1201 10:19:38.812308 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:19:39 crc kubenswrapper[4958]: I1201 10:19:39.949928 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-629vw" event={"ID":"a4b2002a-8ff1-4072-be81-9e4fd5bb2f1c","Type":"ContainerStarted","Data":"d289151b71f2f6702219ae5c873c6a3664aef72884f0ef5300a09f6075dadd65"} Dec 01 10:19:39 crc kubenswrapper[4958]: I1201 10:19:39.950592 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-629vw" Dec 01 10:19:39 crc kubenswrapper[4958]: I1201 10:19:39.975564 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-629vw" podStartSLOduration=4.639508674 podStartE2EDuration="40.975526027s" podCreationTimestamp="2025-12-01 10:18:59 +0000 UTC" firstStartedPulling="2025-12-01 10:19:03.091259415 +0000 UTC m=+1190.600048452" 
lastFinishedPulling="2025-12-01 10:19:39.427276768 +0000 UTC m=+1226.936065805" observedRunningTime="2025-12-01 10:19:39.96795726 +0000 UTC m=+1227.476746307" watchObservedRunningTime="2025-12-01 10:19:39.975526027 +0000 UTC m=+1227.484315064" Dec 01 10:19:40 crc kubenswrapper[4958]: I1201 10:19:40.020359 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-748967c98-hjrzq" Dec 01 10:19:41 crc kubenswrapper[4958]: I1201 10:19:41.161414 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-8wwrb" Dec 01 10:19:41 crc kubenswrapper[4958]: I1201 10:19:41.309251 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-csmzl" Dec 01 10:19:41 crc kubenswrapper[4958]: I1201 10:19:41.314494 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-dmmb7" Dec 01 10:19:41 crc kubenswrapper[4958]: I1201 10:19:41.377473 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-577c5f6d94-cfww7" Dec 01 10:19:41 crc kubenswrapper[4958]: I1201 10:19:41.472719 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-pkd25" Dec 01 10:19:41 crc kubenswrapper[4958]: I1201 10:19:41.586471 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-xtrpg" Dec 01 10:19:41 crc kubenswrapper[4958]: I1201 10:19:41.588389 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-cqms2" Dec 01 10:19:41 crc kubenswrapper[4958]: I1201 10:19:41.588822 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-xmcvh" Dec 01 10:19:41 crc kubenswrapper[4958]: I1201 10:19:41.607819 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-q928n" Dec 01 10:19:41 crc kubenswrapper[4958]: I1201 10:19:41.653316 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-mjksk" Dec 01 10:19:42 crc kubenswrapper[4958]: I1201 10:19:42.626893 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-867d87977b-vthqj" podUID="998f917b-9023-454a-b97c-3ff7948dda3a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.88:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 10:19:50 crc kubenswrapper[4958]: I1201 10:19:50.845620 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-629vw" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.394510 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggtpt"] Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.400894 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ggtpt" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.408367 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.409209 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.409218 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-tj6rc" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.409334 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.416001 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggtpt"] Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.486661 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8lv6l"] Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.489507 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-8lv6l" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.493014 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.517372 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8lv6l"] Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.564442 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a1dc705-3ccc-475f-83bf-42ecaddbe2f2-config\") pod \"dnsmasq-dns-675f4bcbfc-ggtpt\" (UID: \"8a1dc705-3ccc-475f-83bf-42ecaddbe2f2\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggtpt" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.564541 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ecc4faf-d7fd-4649-8db0-9767644607b8-config\") pod \"dnsmasq-dns-78dd6ddcc-8lv6l\" (UID: \"0ecc4faf-d7fd-4649-8db0-9767644607b8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8lv6l" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.564572 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ecc4faf-d7fd-4649-8db0-9767644607b8-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-8lv6l\" (UID: \"0ecc4faf-d7fd-4649-8db0-9767644607b8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8lv6l" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.564589 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btsqv\" (UniqueName: \"kubernetes.io/projected/0ecc4faf-d7fd-4649-8db0-9767644607b8-kube-api-access-btsqv\") pod \"dnsmasq-dns-78dd6ddcc-8lv6l\" (UID: \"0ecc4faf-d7fd-4649-8db0-9767644607b8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8lv6l" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.564626 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2xnr\" (UniqueName: \"kubernetes.io/projected/8a1dc705-3ccc-475f-83bf-42ecaddbe2f2-kube-api-access-t2xnr\") pod \"dnsmasq-dns-675f4bcbfc-ggtpt\" (UID: \"8a1dc705-3ccc-475f-83bf-42ecaddbe2f2\") " 
pod="openstack/dnsmasq-dns-675f4bcbfc-ggtpt" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.665908 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ecc4faf-d7fd-4649-8db0-9767644607b8-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-8lv6l\" (UID: \"0ecc4faf-d7fd-4649-8db0-9767644607b8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8lv6l" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.666348 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btsqv\" (UniqueName: \"kubernetes.io/projected/0ecc4faf-d7fd-4649-8db0-9767644607b8-kube-api-access-btsqv\") pod \"dnsmasq-dns-78dd6ddcc-8lv6l\" (UID: \"0ecc4faf-d7fd-4649-8db0-9767644607b8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8lv6l" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.666382 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2xnr\" (UniqueName: \"kubernetes.io/projected/8a1dc705-3ccc-475f-83bf-42ecaddbe2f2-kube-api-access-t2xnr\") pod \"dnsmasq-dns-675f4bcbfc-ggtpt\" (UID: \"8a1dc705-3ccc-475f-83bf-42ecaddbe2f2\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggtpt" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.666450 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a1dc705-3ccc-475f-83bf-42ecaddbe2f2-config\") pod \"dnsmasq-dns-675f4bcbfc-ggtpt\" (UID: \"8a1dc705-3ccc-475f-83bf-42ecaddbe2f2\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggtpt" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.666512 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ecc4faf-d7fd-4649-8db0-9767644607b8-config\") pod \"dnsmasq-dns-78dd6ddcc-8lv6l\" (UID: \"0ecc4faf-d7fd-4649-8db0-9767644607b8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8lv6l" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.667528 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ecc4faf-d7fd-4649-8db0-9767644607b8-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-8lv6l\" (UID: \"0ecc4faf-d7fd-4649-8db0-9767644607b8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8lv6l" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.667567 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ecc4faf-d7fd-4649-8db0-9767644607b8-config\") pod \"dnsmasq-dns-78dd6ddcc-8lv6l\" (UID: \"0ecc4faf-d7fd-4649-8db0-9767644607b8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8lv6l" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.667896 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a1dc705-3ccc-475f-83bf-42ecaddbe2f2-config\") pod \"dnsmasq-dns-675f4bcbfc-ggtpt\" (UID: \"8a1dc705-3ccc-475f-83bf-42ecaddbe2f2\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggtpt" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.692360 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2xnr\" (UniqueName: \"kubernetes.io/projected/8a1dc705-3ccc-475f-83bf-42ecaddbe2f2-kube-api-access-t2xnr\") pod \"dnsmasq-dns-675f4bcbfc-ggtpt\" (UID: \"8a1dc705-3ccc-475f-83bf-42ecaddbe2f2\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ggtpt" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.695570 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-btsqv\" (UniqueName: \"kubernetes.io/projected/0ecc4faf-d7fd-4649-8db0-9767644607b8-kube-api-access-btsqv\") pod \"dnsmasq-dns-78dd6ddcc-8lv6l\" (UID: \"0ecc4faf-d7fd-4649-8db0-9767644607b8\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8lv6l" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.728756 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ggtpt" Dec 01 10:20:05 crc kubenswrapper[4958]: I1201 10:20:05.824398 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-8lv6l" Dec 01 10:20:06 crc kubenswrapper[4958]: I1201 10:20:06.635095 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggtpt"] Dec 01 10:20:06 crc kubenswrapper[4958]: I1201 10:20:06.683737 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8lv6l"] Dec 01 10:20:07 crc kubenswrapper[4958]: I1201 10:20:07.336609 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-ggtpt" event={"ID":"8a1dc705-3ccc-475f-83bf-42ecaddbe2f2","Type":"ContainerStarted","Data":"7340afbd0711a30643f84443b4ab0d6d821477e5e3aa5d204226244112cc2e46"} Dec 01 10:20:07 crc kubenswrapper[4958]: I1201 10:20:07.349057 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-8lv6l" event={"ID":"0ecc4faf-d7fd-4649-8db0-9767644607b8","Type":"ContainerStarted","Data":"bfd135f65137a0fb0a97215b299ab372bdd60a0fc30d1c57599bb597590611c3"} Dec 01 10:20:07 crc kubenswrapper[4958]: I1201 10:20:07.516742 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggtpt"] Dec 01 10:20:07 crc kubenswrapper[4958]: I1201 10:20:07.551274 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-npvcx"] Dec 01 10:20:07 crc kubenswrapper[4958]: I1201 10:20:07.553256 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-npvcx" Dec 01 10:20:07 crc kubenswrapper[4958]: I1201 10:20:07.575401 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-npvcx"] Dec 01 10:20:07 crc kubenswrapper[4958]: I1201 10:20:07.631121 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn4d2\" (UniqueName: \"kubernetes.io/projected/184bdef8-fe56-400a-8e89-57d5cee36495-kube-api-access-dn4d2\") pod \"dnsmasq-dns-5ccc8479f9-npvcx\" (UID: \"184bdef8-fe56-400a-8e89-57d5cee36495\") " pod="openstack/dnsmasq-dns-5ccc8479f9-npvcx" Dec 01 10:20:07 crc kubenswrapper[4958]: I1201 10:20:07.631198 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/184bdef8-fe56-400a-8e89-57d5cee36495-config\") pod \"dnsmasq-dns-5ccc8479f9-npvcx\" (UID: \"184bdef8-fe56-400a-8e89-57d5cee36495\") " pod="openstack/dnsmasq-dns-5ccc8479f9-npvcx" Dec 01 10:20:07 crc kubenswrapper[4958]: I1201 10:20:07.631304 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/184bdef8-fe56-400a-8e89-57d5cee36495-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-npvcx\" (UID: \"184bdef8-fe56-400a-8e89-57d5cee36495\") " pod="openstack/dnsmasq-dns-5ccc8479f9-npvcx" Dec 01 10:20:07 crc kubenswrapper[4958]: I1201 10:20:07.743749 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/184bdef8-fe56-400a-8e89-57d5cee36495-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-npvcx\" (UID: \"184bdef8-fe56-400a-8e89-57d5cee36495\") " pod="openstack/dnsmasq-dns-5ccc8479f9-npvcx" Dec 01 10:20:07 crc kubenswrapper[4958]: I1201 10:20:07.743899 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn4d2\" (UniqueName: \"kubernetes.io/projected/184bdef8-fe56-400a-8e89-57d5cee36495-kube-api-access-dn4d2\") pod \"dnsmasq-dns-5ccc8479f9-npvcx\" (UID: \"184bdef8-fe56-400a-8e89-57d5cee36495\") " pod="openstack/dnsmasq-dns-5ccc8479f9-npvcx" Dec 01 10:20:07 crc kubenswrapper[4958]: I1201 10:20:07.743954 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/184bdef8-fe56-400a-8e89-57d5cee36495-config\") pod \"dnsmasq-dns-5ccc8479f9-npvcx\" (UID: \"184bdef8-fe56-400a-8e89-57d5cee36495\") " pod="openstack/dnsmasq-dns-5ccc8479f9-npvcx" Dec 01 10:20:07 crc kubenswrapper[4958]: I1201 10:20:07.745395 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/184bdef8-fe56-400a-8e89-57d5cee36495-config\") pod \"dnsmasq-dns-5ccc8479f9-npvcx\" (UID: \"184bdef8-fe56-400a-8e89-57d5cee36495\") " pod="openstack/dnsmasq-dns-5ccc8479f9-npvcx" Dec 01 10:20:07 crc kubenswrapper[4958]: I1201 10:20:07.745690 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/184bdef8-fe56-400a-8e89-57d5cee36495-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-npvcx\" (UID: \"184bdef8-fe56-400a-8e89-57d5cee36495\") " pod="openstack/dnsmasq-dns-5ccc8479f9-npvcx" Dec 01 10:20:07 crc kubenswrapper[4958]: I1201 10:20:07.987322 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn4d2\" (UniqueName: 
\"kubernetes.io/projected/184bdef8-fe56-400a-8e89-57d5cee36495-kube-api-access-dn4d2\") pod \"dnsmasq-dns-5ccc8479f9-npvcx\" (UID: \"184bdef8-fe56-400a-8e89-57d5cee36495\") " pod="openstack/dnsmasq-dns-5ccc8479f9-npvcx" Dec 01 10:20:08 crc kubenswrapper[4958]: I1201 10:20:08.342046 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-npvcx" Dec 01 10:20:08 crc kubenswrapper[4958]: I1201 10:20:08.859513 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8lv6l"] Dec 01 10:20:08 crc kubenswrapper[4958]: I1201 10:20:08.949570 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-28pmt"] Dec 01 10:20:08 crc kubenswrapper[4958]: I1201 10:20:08.951465 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-28pmt" Dec 01 10:20:08 crc kubenswrapper[4958]: I1201 10:20:08.956235 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-28pmt"] Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.026739 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.037751 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.052682 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.053193 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.053331 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.053440 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.053658 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.053686 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.053994 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ptgrp" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.054243 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.078127 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fadae37-e01e-4d98-8d72-b30ab5da2341-config\") pod \"dnsmasq-dns-57d769cc4f-28pmt\" (UID: \"7fadae37-e01e-4d98-8d72-b30ab5da2341\") " pod="openstack/dnsmasq-dns-57d769cc4f-28pmt" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.078196 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fadae37-e01e-4d98-8d72-b30ab5da2341-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-28pmt\" (UID: \"7fadae37-e01e-4d98-8d72-b30ab5da2341\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-28pmt" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.078244 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmldq\" (UniqueName: \"kubernetes.io/projected/7fadae37-e01e-4d98-8d72-b30ab5da2341-kube-api-access-hmldq\") pod \"dnsmasq-dns-57d769cc4f-28pmt\" (UID: \"7fadae37-e01e-4d98-8d72-b30ab5da2341\") " pod="openstack/dnsmasq-dns-57d769cc4f-28pmt" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.181979 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.182418 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.182473 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.182507 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.182533 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.182555 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.182573 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.182612 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fadae37-e01e-4d98-8d72-b30ab5da2341-config\") pod \"dnsmasq-dns-57d769cc4f-28pmt\" (UID: 
\"7fadae37-e01e-4d98-8d72-b30ab5da2341\") " pod="openstack/dnsmasq-dns-57d769cc4f-28pmt" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.182637 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.182670 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fadae37-e01e-4d98-8d72-b30ab5da2341-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-28pmt\" (UID: \"7fadae37-e01e-4d98-8d72-b30ab5da2341\") " pod="openstack/dnsmasq-dns-57d769cc4f-28pmt" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.182706 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmldq\" (UniqueName: \"kubernetes.io/projected/7fadae37-e01e-4d98-8d72-b30ab5da2341-kube-api-access-hmldq\") pod \"dnsmasq-dns-57d769cc4f-28pmt\" (UID: \"7fadae37-e01e-4d98-8d72-b30ab5da2341\") " pod="openstack/dnsmasq-dns-57d769cc4f-28pmt" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.182730 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.182751 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.182770 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw5mk\" (UniqueName: \"kubernetes.io/projected/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-kube-api-access-jw5mk\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.185644 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fadae37-e01e-4d98-8d72-b30ab5da2341-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-28pmt\" (UID: \"7fadae37-e01e-4d98-8d72-b30ab5da2341\") " pod="openstack/dnsmasq-dns-57d769cc4f-28pmt" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.194099 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fadae37-e01e-4d98-8d72-b30ab5da2341-config\") pod \"dnsmasq-dns-57d769cc4f-28pmt\" (UID: \"7fadae37-e01e-4d98-8d72-b30ab5da2341\") " pod="openstack/dnsmasq-dns-57d769cc4f-28pmt" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.220936 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmldq\" (UniqueName: \"kubernetes.io/projected/7fadae37-e01e-4d98-8d72-b30ab5da2341-kube-api-access-hmldq\") pod \"dnsmasq-dns-57d769cc4f-28pmt\" (UID: \"7fadae37-e01e-4d98-8d72-b30ab5da2341\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-28pmt" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.281460 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-28pmt" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.283802 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.283902 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.283945 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.283970 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw5mk\" (UniqueName: \"kubernetes.io/projected/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-kube-api-access-jw5mk\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.284010 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.284046 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.284099 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.284139 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.284176 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.284208 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.564186 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.566720 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.573272 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.578502 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.579089 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.579767 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.595805 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.597307 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.601780 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.618608 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.654494 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.720615 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw5mk\" (UniqueName: \"kubernetes.io/projected/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-kube-api-access-jw5mk\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.825748 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.981210 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-npvcx"] Dec 01 10:20:09 crc kubenswrapper[4958]: I1201 10:20:09.984197 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: W1201 10:20:10.016704 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod184bdef8_fe56_400a_8e89_57d5cee36495.slice/crio-46d803f1b7ab42cb04ee8f8549c93cda1ba360858ba38c8cd96576e46749144d WatchSource:0}: Error finding container 46d803f1b7ab42cb04ee8f8549c93cda1ba360858ba38c8cd96576e46749144d: Status 404 returned error can't find the container with id 46d803f1b7ab42cb04ee8f8549c93cda1ba360858ba38c8cd96576e46749144d Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.114570 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.116586 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.119171 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.119887 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-dbv4b" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.119958 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.119957 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.120120 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.120136 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.122481 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.139113 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.434796 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-28pmt"] Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.504396 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.504773 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.504801 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.504868 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.504895 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2bbc\" (UniqueName: \"kubernetes.io/projected/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-kube-api-access-g2bbc\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc 
kubenswrapper[4958]: I1201 10:20:10.504932 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.504985 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.505147 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-config-data\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.505235 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.505269 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.507044 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.633169 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.633275 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.633308 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2bbc\" (UniqueName: \"kubernetes.io/projected/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-kube-api-access-g2bbc\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.633333 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.633402 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.633430 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-config-data\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.633450 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.633476 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.633560 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.633605 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.633633 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.634384 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.636688 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.637682 4958 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.638130 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.638804 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-config-data\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.644041 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.645127 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.647228 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.676261 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.686661 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.695381 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2bbc\" (UniqueName: \"kubernetes.io/projected/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-kube-api-access-g2bbc\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.705281 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc 
kubenswrapper[4958]: I1201 10:20:10.746452 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.822121 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-28pmt" event={"ID":"7fadae37-e01e-4d98-8d72-b30ab5da2341","Type":"ContainerStarted","Data":"64ba44959cef7f233df07406d95a823391bacfc9c1f018fc219e8db434dafdd5"} Dec 01 10:20:10 crc kubenswrapper[4958]: I1201 10:20:10.823925 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-npvcx" event={"ID":"184bdef8-fe56-400a-8e89-57d5cee36495","Type":"ContainerStarted","Data":"46d803f1b7ab42cb04ee8f8549c93cda1ba360858ba38c8cd96576e46749144d"} Dec 01 10:20:11 crc kubenswrapper[4958]: I1201 10:20:11.196867 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 10:20:11 crc kubenswrapper[4958]: I1201 10:20:11.662807 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.060694 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4fccd607-3bfb-4593-a6de-6a0fc52b34ea","Type":"ContainerStarted","Data":"b24aff2f4c8339a977a2885fd8d2bee815fddbb82c358417bca3aa703d17065f"} Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.060748 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0","Type":"ContainerStarted","Data":"cb063671ef5138a217b59e52fc226022685432b2509b16829b1a0976c6491c51"} Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.356772 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.359855 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.364926 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.365104 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-dbjnn" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.365260 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.365453 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.365862 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.377826 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.379272 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.551780 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46e6b589-937c-42c8-8004-49e39813d622-operator-scripts\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.551897 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/46e6b589-937c-42c8-8004-49e39813d622-secrets\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.551984 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75px8\" (UniqueName: \"kubernetes.io/projected/46e6b589-937c-42c8-8004-49e39813d622-kube-api-access-75px8\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.552024 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/46e6b589-937c-42c8-8004-49e39813d622-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.552057 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/46e6b589-937c-42c8-8004-49e39813d622-kolla-config\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.552115 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc 
kubenswrapper[4958]: I1201 10:20:12.552154 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/46e6b589-937c-42c8-8004-49e39813d622-config-data-generated\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.552184 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e6b589-937c-42c8-8004-49e39813d622-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.552231 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/46e6b589-937c-42c8-8004-49e39813d622-config-data-default\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.654597 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/46e6b589-937c-42c8-8004-49e39813d622-kolla-config\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.654690 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.654740 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/46e6b589-937c-42c8-8004-49e39813d622-config-data-generated\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.654767 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e6b589-937c-42c8-8004-49e39813d622-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.654807 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/46e6b589-937c-42c8-8004-49e39813d622-config-data-default\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.654893 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46e6b589-937c-42c8-8004-49e39813d622-operator-scripts\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.654974 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" 
(UniqueName: \"kubernetes.io/secret/46e6b589-937c-42c8-8004-49e39813d622-secrets\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.655071 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75px8\" (UniqueName: \"kubernetes.io/projected/46e6b589-937c-42c8-8004-49e39813d622-kube-api-access-75px8\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.655117 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/46e6b589-937c-42c8-8004-49e39813d622-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.658625 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/46e6b589-937c-42c8-8004-49e39813d622-config-data-generated\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.659907 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/46e6b589-937c-42c8-8004-49e39813d622-kolla-config\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.662254 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/46e6b589-937c-42c8-8004-49e39813d622-config-data-default\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.667328 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46e6b589-937c-42c8-8004-49e39813d622-operator-scripts\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.677899 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.694623 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e6b589-937c-42c8-8004-49e39813d622-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.866254 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/46e6b589-937c-42c8-8004-49e39813d622-secrets\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: 
I1201 10:20:12.874558 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/46e6b589-937c-42c8-8004-49e39813d622-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.896108 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75px8\" (UniqueName: \"kubernetes.io/projected/46e6b589-937c-42c8-8004-49e39813d622-kube-api-access-75px8\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.963197 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.969357 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.974003 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 01 10:20:12 crc kubenswrapper[4958]: I1201 10:20:12.975780 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-rw6ff" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.007329 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.033932 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.094009 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " pod="openstack/openstack-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.104437 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03af271e-0af4-4681-a7b6-31b207d21143-kolla-config\") pod \"memcached-0\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") " pod="openstack/memcached-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.104539 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/03af271e-0af4-4681-a7b6-31b207d21143-memcached-tls-certs\") pod \"memcached-0\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") " pod="openstack/memcached-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.104579 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03af271e-0af4-4681-a7b6-31b207d21143-combined-ca-bundle\") pod \"memcached-0\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") " pod="openstack/memcached-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.104605 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2rnm\" (UniqueName: \"kubernetes.io/projected/03af271e-0af4-4681-a7b6-31b207d21143-kube-api-access-d2rnm\") pod \"memcached-0\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") " pod="openstack/memcached-0" Dec 01 10:20:13 crc 
kubenswrapper[4958]: I1201 10:20:13.104632 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03af271e-0af4-4681-a7b6-31b207d21143-config-data\") pod \"memcached-0\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") " pod="openstack/memcached-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.207642 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03af271e-0af4-4681-a7b6-31b207d21143-kolla-config\") pod \"memcached-0\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") " pod="openstack/memcached-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.207721 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/03af271e-0af4-4681-a7b6-31b207d21143-memcached-tls-certs\") pod \"memcached-0\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") " pod="openstack/memcached-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.207749 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03af271e-0af4-4681-a7b6-31b207d21143-combined-ca-bundle\") pod \"memcached-0\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") " pod="openstack/memcached-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.207770 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2rnm\" (UniqueName: \"kubernetes.io/projected/03af271e-0af4-4681-a7b6-31b207d21143-kube-api-access-d2rnm\") pod \"memcached-0\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") " pod="openstack/memcached-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.207797 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03af271e-0af4-4681-a7b6-31b207d21143-config-data\") pod \"memcached-0\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") " pod="openstack/memcached-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.209039 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03af271e-0af4-4681-a7b6-31b207d21143-config-data\") pod \"memcached-0\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") " pod="openstack/memcached-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.209878 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03af271e-0af4-4681-a7b6-31b207d21143-kolla-config\") pod \"memcached-0\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") " pod="openstack/memcached-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.215014 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/03af271e-0af4-4681-a7b6-31b207d21143-memcached-tls-certs\") pod \"memcached-0\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") " pod="openstack/memcached-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.262315 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2rnm\" (UniqueName: \"kubernetes.io/projected/03af271e-0af4-4681-a7b6-31b207d21143-kube-api-access-d2rnm\") pod \"memcached-0\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") " pod="openstack/memcached-0" Dec 01 10:20:13 
crc kubenswrapper[4958]: I1201 10:20:13.302594 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.319113 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03af271e-0af4-4681-a7b6-31b207d21143-combined-ca-bundle\") pod \"memcached-0\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") " pod="openstack/memcached-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.367096 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.374074 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.378237 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.378688 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-c66tb" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.379032 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.379184 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.379414 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.379829 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.543501 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.543572 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvf6z\" (UniqueName: \"kubernetes.io/projected/bed7f6be-9254-406b-9ed4-3fff3b2eb531-kube-api-access-cvf6z\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.543635 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bed7f6be-9254-406b-9ed4-3fff3b2eb531-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.543672 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bed7f6be-9254-406b-9ed4-3fff3b2eb531-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.543714 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/bed7f6be-9254-406b-9ed4-3fff3b2eb531-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.543743 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bed7f6be-9254-406b-9ed4-3fff3b2eb531-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.543771 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bed7f6be-9254-406b-9ed4-3fff3b2eb531-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.543793 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bed7f6be-9254-406b-9ed4-3fff3b2eb531-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.543824 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bed7f6be-9254-406b-9ed4-3fff3b2eb531-config-data-generated\") pod 
\"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.645251 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.645338 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvf6z\" (UniqueName: \"kubernetes.io/projected/bed7f6be-9254-406b-9ed4-3fff3b2eb531-kube-api-access-cvf6z\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.645380 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bed7f6be-9254-406b-9ed4-3fff3b2eb531-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.645413 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bed7f6be-9254-406b-9ed4-3fff3b2eb531-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.645601 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.646776 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/bed7f6be-9254-406b-9ed4-3fff3b2eb531-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.646902 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bed7f6be-9254-406b-9ed4-3fff3b2eb531-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.646947 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bed7f6be-9254-406b-9ed4-3fff3b2eb531-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.646984 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bed7f6be-9254-406b-9ed4-3fff3b2eb531-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" 
Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.647083 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bed7f6be-9254-406b-9ed4-3fff3b2eb531-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.647317 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bed7f6be-9254-406b-9ed4-3fff3b2eb531-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.647938 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bed7f6be-9254-406b-9ed4-3fff3b2eb531-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.649429 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bed7f6be-9254-406b-9ed4-3fff3b2eb531-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.650218 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bed7f6be-9254-406b-9ed4-3fff3b2eb531-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.654047 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bed7f6be-9254-406b-9ed4-3fff3b2eb531-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.660943 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/bed7f6be-9254-406b-9ed4-3fff3b2eb531-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.669753 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bed7f6be-9254-406b-9ed4-3fff3b2eb531-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.706048 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvf6z\" (UniqueName: \"kubernetes.io/projected/bed7f6be-9254-406b-9ed4-3fff3b2eb531-kube-api-access-cvf6z\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:13 crc kubenswrapper[4958]: I1201 10:20:13.755594 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:14 crc kubenswrapper[4958]: I1201 10:20:14.012566 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-c66tb" Dec 01 10:20:14 crc kubenswrapper[4958]: I1201 10:20:14.022149 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 10:20:14 crc kubenswrapper[4958]: I1201 10:20:14.758110 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 01 10:20:14 crc kubenswrapper[4958]: I1201 10:20:14.854285 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 10:20:14 crc kubenswrapper[4958]: W1201 10:20:14.917164 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46e6b589_937c_42c8_8004_49e39813d622.slice/crio-759868a2b94ce22deec852fe7f7bb39d12c949b87b6fcbb91e5bc25c27c75d38 WatchSource:0}: Error finding container 759868a2b94ce22deec852fe7f7bb39d12c949b87b6fcbb91e5bc25c27c75d38: Status 404 returned error can't find the container with id 759868a2b94ce22deec852fe7f7bb39d12c949b87b6fcbb91e5bc25c27c75d38 Dec 01 10:20:14 crc kubenswrapper[4958]: I1201 10:20:14.925819 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:20:14 crc kubenswrapper[4958]: I1201 10:20:14.929532 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 10:20:14 crc kubenswrapper[4958]: I1201 10:20:14.935896 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-fslbb" Dec 01 10:20:14 crc kubenswrapper[4958]: I1201 10:20:14.954412 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:20:14 crc kubenswrapper[4958]: I1201 10:20:14.965236 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdz7t\" (UniqueName: \"kubernetes.io/projected/e0248403-94d5-4bec-9ef1-af83490d3a0e-kube-api-access-sdz7t\") pod \"kube-state-metrics-0\" (UID: \"e0248403-94d5-4bec-9ef1-af83490d3a0e\") " pod="openstack/kube-state-metrics-0" Dec 01 10:20:15 crc kubenswrapper[4958]: I1201 10:20:15.069208 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdz7t\" (UniqueName: \"kubernetes.io/projected/e0248403-94d5-4bec-9ef1-af83490d3a0e-kube-api-access-sdz7t\") pod \"kube-state-metrics-0\" (UID: \"e0248403-94d5-4bec-9ef1-af83490d3a0e\") " pod="openstack/kube-state-metrics-0" Dec 01 10:20:15 crc kubenswrapper[4958]: I1201 10:20:15.325471 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdz7t\" (UniqueName: \"kubernetes.io/projected/e0248403-94d5-4bec-9ef1-af83490d3a0e-kube-api-access-sdz7t\") pod \"kube-state-metrics-0\" (UID: \"e0248403-94d5-4bec-9ef1-af83490d3a0e\") " pod="openstack/kube-state-metrics-0" Dec 01 10:20:15 crc kubenswrapper[4958]: I1201 10:20:15.405421 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"03af271e-0af4-4681-a7b6-31b207d21143","Type":"ContainerStarted","Data":"b4104edfeb976d45810f49a7db34aabba86b48b811bd03a90c03d79c8d64492c"} 
Dec 01 10:20:15 crc kubenswrapper[4958]: I1201 10:20:15.934335 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 10:20:16 crc kubenswrapper[4958]: I1201 10:20:16.040873 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 10:20:16 crc kubenswrapper[4958]: I1201 10:20:16.040941 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"46e6b589-937c-42c8-8004-49e39813d622","Type":"ContainerStarted","Data":"759868a2b94ce22deec852fe7f7bb39d12c949b87b6fcbb91e5bc25c27c75d38"} Dec 01 10:20:17 crc kubenswrapper[4958]: I1201 10:20:17.555703 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bed7f6be-9254-406b-9ed4-3fff3b2eb531","Type":"ContainerStarted","Data":"7ec911488c4a87ce3d94ef54cc404b05a46dcde3559bf6c90f6dd81d8a7b1ee2"} Dec 01 10:20:17 crc kubenswrapper[4958]: I1201 10:20:17.888111 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:20:17 crc kubenswrapper[4958]: W1201 10:20:17.897459 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0248403_94d5_4bec_9ef1_af83490d3a0e.slice/crio-49b377a4f9220eb6e34747df1280f6e8b08b3191f1540a45f23595ae4961d698 WatchSource:0}: Error finding container 49b377a4f9220eb6e34747df1280f6e8b08b3191f1540a45f23595ae4961d698: Status 404 returned error can't find the container with id 49b377a4f9220eb6e34747df1280f6e8b08b3191f1540a45f23595ae4961d698 Dec 01 10:20:18 crc kubenswrapper[4958]: I1201 10:20:18.615619 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e0248403-94d5-4bec-9ef1-af83490d3a0e","Type":"ContainerStarted","Data":"49b377a4f9220eb6e34747df1280f6e8b08b3191f1540a45f23595ae4961d698"} Dec 01 10:20:18 crc kubenswrapper[4958]: I1201 10:20:18.892899 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-d9qck"] Dec 01 10:20:18 crc kubenswrapper[4958]: I1201 10:20:18.894293 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-d9qck" Dec 01 10:20:18 crc kubenswrapper[4958]: I1201 10:20:18.919599 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-d9qck"] Dec 01 10:20:18 crc kubenswrapper[4958]: I1201 10:20:18.921402 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d13e880d-3817-4df9-8477-82349d7979b9-scripts\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:18 crc kubenswrapper[4958]: I1201 10:20:18.921448 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d13e880d-3817-4df9-8477-82349d7979b9-var-run\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:18 crc kubenswrapper[4958]: I1201 10:20:18.921510 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d13e880d-3817-4df9-8477-82349d7979b9-ovn-controller-tls-certs\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:18 crc kubenswrapper[4958]: I1201 10:20:18.921547 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d13e880d-3817-4df9-8477-82349d7979b9-var-log-ovn\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:18 crc kubenswrapper[4958]: I1201 10:20:18.921626 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4677l\" (UniqueName: \"kubernetes.io/projected/d13e880d-3817-4df9-8477-82349d7979b9-kube-api-access-4677l\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:18 crc kubenswrapper[4958]: I1201 10:20:18.921679 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13e880d-3817-4df9-8477-82349d7979b9-combined-ca-bundle\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:18 crc kubenswrapper[4958]: I1201 10:20:18.921719 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d13e880d-3817-4df9-8477-82349d7979b9-var-run-ovn\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:18 crc kubenswrapper[4958]: I1201 10:20:18.929562 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-d8jnd" Dec 01 10:20:18 crc kubenswrapper[4958]: I1201 10:20:18.929856 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 01 10:20:18 crc kubenswrapper[4958]: I1201 10:20:18.937737 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.034347 4958 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d13e880d-3817-4df9-8477-82349d7979b9-var-run-ovn\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.034431 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d13e880d-3817-4df9-8477-82349d7979b9-scripts\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.034476 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d13e880d-3817-4df9-8477-82349d7979b9-var-run\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.034544 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d13e880d-3817-4df9-8477-82349d7979b9-ovn-controller-tls-certs\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.034568 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d13e880d-3817-4df9-8477-82349d7979b9-var-log-ovn\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.034616 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4677l\" (UniqueName: \"kubernetes.io/projected/d13e880d-3817-4df9-8477-82349d7979b9-kube-api-access-4677l\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.034670 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13e880d-3817-4df9-8477-82349d7979b9-combined-ca-bundle\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.037594 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d13e880d-3817-4df9-8477-82349d7979b9-var-run\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.037744 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d13e880d-3817-4df9-8477-82349d7979b9-var-run-ovn\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.037742 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d13e880d-3817-4df9-8477-82349d7979b9-var-log-ovn\") pod \"ovn-controller-d9qck\" (UID: 
\"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.040542 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d13e880d-3817-4df9-8477-82349d7979b9-scripts\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.058218 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d13e880d-3817-4df9-8477-82349d7979b9-ovn-controller-tls-certs\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.073649 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13e880d-3817-4df9-8477-82349d7979b9-combined-ca-bundle\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.099284 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4677l\" (UniqueName: \"kubernetes.io/projected/d13e880d-3817-4df9-8477-82349d7979b9-kube-api-access-4677l\") pod \"ovn-controller-d9qck\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " pod="openstack/ovn-controller-d9qck" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.131195 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-xr8kd"] Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.149268 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-xr8kd"] Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.149431 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.239068 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhxws\" (UniqueName: \"kubernetes.io/projected/c01e3885-db48-42db-aa00-ca08c6839dbd-kube-api-access-fhxws\") pod \"ovn-controller-ovs-xr8kd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.239160 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c01e3885-db48-42db-aa00-ca08c6839dbd-scripts\") pod \"ovn-controller-ovs-xr8kd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.240899 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-var-log\") pod \"ovn-controller-ovs-xr8kd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.240939 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-etc-ovs\") pod \"ovn-controller-ovs-xr8kd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.240959 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-var-run\") pod \"ovn-controller-ovs-xr8kd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.241003 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-var-lib\") pod \"ovn-controller-ovs-xr8kd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.309477 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-d9qck" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.344897 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhxws\" (UniqueName: \"kubernetes.io/projected/c01e3885-db48-42db-aa00-ca08c6839dbd-kube-api-access-fhxws\") pod \"ovn-controller-ovs-xr8kd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.345877 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c01e3885-db48-42db-aa00-ca08c6839dbd-scripts\") pod \"ovn-controller-ovs-xr8kd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.345995 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-var-log\") pod \"ovn-controller-ovs-xr8kd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.346158 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-etc-ovs\") pod \"ovn-controller-ovs-xr8kd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.346929 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-var-log\") pod \"ovn-controller-ovs-xr8kd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.347585 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-var-run\") pod \"ovn-controller-ovs-xr8kd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.347777 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-etc-ovs\") pod \"ovn-controller-ovs-xr8kd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.348322 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-var-run\") pod \"ovn-controller-ovs-xr8kd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.348472 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-var-lib\") pod \"ovn-controller-ovs-xr8kd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.349014 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-var-lib\") pod \"ovn-controller-ovs-xr8kd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.353417 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c01e3885-db48-42db-aa00-ca08c6839dbd-scripts\") pod \"ovn-controller-ovs-xr8kd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.388831 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhxws\" (UniqueName: \"kubernetes.io/projected/c01e3885-db48-42db-aa00-ca08c6839dbd-kube-api-access-fhxws\") pod \"ovn-controller-ovs-xr8kd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:19 crc kubenswrapper[4958]: I1201 10:20:19.515418 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.350165 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.352449 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.364634 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-sp6dm" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.364812 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.365019 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.365108 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.365279 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.387328 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.595274 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/032af861-fde8-4b7a-929b-2ec7f5871474-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.597139 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/032af861-fde8-4b7a-929b-2ec7f5871474-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.597278 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032af861-fde8-4b7a-929b-2ec7f5871474-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: 
\"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.597308 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.597359 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/032af861-fde8-4b7a-929b-2ec7f5871474-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.597448 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/032af861-fde8-4b7a-929b-2ec7f5871474-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.597530 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/032af861-fde8-4b7a-929b-2ec7f5871474-config\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.597631 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkpr5\" (UniqueName: \"kubernetes.io/projected/032af861-fde8-4b7a-929b-2ec7f5871474-kube-api-access-bkpr5\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.702244 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/032af861-fde8-4b7a-929b-2ec7f5871474-config\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.702339 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkpr5\" (UniqueName: \"kubernetes.io/projected/032af861-fde8-4b7a-929b-2ec7f5871474-kube-api-access-bkpr5\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.702371 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/032af861-fde8-4b7a-929b-2ec7f5871474-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.702420 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/032af861-fde8-4b7a-929b-2ec7f5871474-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.702467 4958 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032af861-fde8-4b7a-929b-2ec7f5871474-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.702491 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.702511 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/032af861-fde8-4b7a-929b-2ec7f5871474-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.702544 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/032af861-fde8-4b7a-929b-2ec7f5871474-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.703137 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/032af861-fde8-4b7a-929b-2ec7f5871474-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.705542 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.705733 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/032af861-fde8-4b7a-929b-2ec7f5871474-config\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.705864 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/032af861-fde8-4b7a-929b-2ec7f5871474-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.724892 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032af861-fde8-4b7a-929b-2ec7f5871474-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.724963 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/032af861-fde8-4b7a-929b-2ec7f5871474-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: 
I1201 10:20:21.727592 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkpr5\" (UniqueName: \"kubernetes.io/projected/032af861-fde8-4b7a-929b-2ec7f5871474-kube-api-access-bkpr5\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.728394 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/032af861-fde8-4b7a-929b-2ec7f5871474-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:21 crc kubenswrapper[4958]: I1201 10:20:21.873868 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:22 crc kubenswrapper[4958]: I1201 10:20:21.999751 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 10:20:22 crc kubenswrapper[4958]: I1201 10:20:22.268350 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-d9qck"] Dec 01 10:20:22 crc kubenswrapper[4958]: I1201 10:20:22.315370 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-xr8kd"] Dec 01 10:20:22 crc kubenswrapper[4958]: I1201 10:20:22.707505 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 10:20:22 crc kubenswrapper[4958]: I1201 10:20:22.715518 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:22 crc kubenswrapper[4958]: I1201 10:20:22.719481 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 01 10:20:22 crc kubenswrapper[4958]: I1201 10:20:22.720744 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 01 10:20:22 crc kubenswrapper[4958]: I1201 10:20:22.721315 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-dt48c" Dec 01 10:20:22 crc kubenswrapper[4958]: I1201 10:20:22.721502 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 01 10:20:22 crc kubenswrapper[4958]: I1201 10:20:22.728207 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 10:20:22 crc kubenswrapper[4958]: I1201 10:20:22.916220 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/debf8fae-f15c-4b09-b185-3f47c7e0491b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:22 crc kubenswrapper[4958]: I1201 10:20:22.916317 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/debf8fae-f15c-4b09-b185-3f47c7e0491b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:22 crc kubenswrapper[4958]: I1201 10:20:22.916386 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:22 crc kubenswrapper[4958]: I1201 10:20:22.916440 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debf8fae-f15c-4b09-b185-3f47c7e0491b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:22 crc kubenswrapper[4958]: I1201 10:20:22.916464 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/debf8fae-f15c-4b09-b185-3f47c7e0491b-config\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:22 crc kubenswrapper[4958]: I1201 10:20:22.916521 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/debf8fae-f15c-4b09-b185-3f47c7e0491b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:22 crc kubenswrapper[4958]: I1201 10:20:22.916625 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/debf8fae-f15c-4b09-b185-3f47c7e0491b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " 
pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:22 crc kubenswrapper[4958]: I1201 10:20:22.916685 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zfjp\" (UniqueName: \"kubernetes.io/projected/debf8fae-f15c-4b09-b185-3f47c7e0491b-kube-api-access-7zfjp\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:23 crc kubenswrapper[4958]: I1201 10:20:23.018511 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/debf8fae-f15c-4b09-b185-3f47c7e0491b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:23 crc kubenswrapper[4958]: I1201 10:20:23.018582 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/debf8fae-f15c-4b09-b185-3f47c7e0491b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:23 crc kubenswrapper[4958]: I1201 10:20:23.018632 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debf8fae-f15c-4b09-b185-3f47c7e0491b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:23 crc kubenswrapper[4958]: I1201 10:20:23.018655 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/debf8fae-f15c-4b09-b185-3f47c7e0491b-config\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:23 crc kubenswrapper[4958]: I1201 10:20:23.018673 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/debf8fae-f15c-4b09-b185-3f47c7e0491b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:23 crc kubenswrapper[4958]: I1201 10:20:23.018729 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/debf8fae-f15c-4b09-b185-3f47c7e0491b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:23 crc kubenswrapper[4958]: I1201 10:20:23.018769 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zfjp\" (UniqueName: \"kubernetes.io/projected/debf8fae-f15c-4b09-b185-3f47c7e0491b-kube-api-access-7zfjp\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:23 crc kubenswrapper[4958]: I1201 10:20:23.019821 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/debf8fae-f15c-4b09-b185-3f47c7e0491b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:23 crc kubenswrapper[4958]: I1201 10:20:23.027623 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/debf8fae-f15c-4b09-b185-3f47c7e0491b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:23 crc kubenswrapper[4958]: I1201 10:20:23.039213 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/debf8fae-f15c-4b09-b185-3f47c7e0491b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:23 crc kubenswrapper[4958]: I1201 10:20:23.039829 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debf8fae-f15c-4b09-b185-3f47c7e0491b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:23 crc kubenswrapper[4958]: I1201 10:20:23.043789 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/debf8fae-f15c-4b09-b185-3f47c7e0491b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:23 crc kubenswrapper[4958]: I1201 10:20:23.045917 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/debf8fae-f15c-4b09-b185-3f47c7e0491b-config\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:23 crc kubenswrapper[4958]: I1201 10:20:23.058016 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zfjp\" (UniqueName: \"kubernetes.io/projected/debf8fae-f15c-4b09-b185-3f47c7e0491b-kube-api-access-7zfjp\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:23 crc kubenswrapper[4958]: I1201 10:20:23.120609 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:23 crc kubenswrapper[4958]: I1201 10:20:23.121111 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:23 crc kubenswrapper[4958]: I1201 10:20:23.148973 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:23 crc kubenswrapper[4958]: I1201 10:20:23.361099 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 10:20:25 crc kubenswrapper[4958]: W1201 10:20:25.981427 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc01e3885_db48_42db_aa00_ca08c6839dbd.slice/crio-424361937c57d12afb3581265890f622e71beeb3b40a6737a81c81ebb023671d WatchSource:0}: Error finding container 424361937c57d12afb3581265890f622e71beeb3b40a6737a81c81ebb023671d: Status 404 returned error can't find the container with id 424361937c57d12afb3581265890f622e71beeb3b40a6737a81c81ebb023671d Dec 01 10:20:26 crc kubenswrapper[4958]: I1201 10:20:26.017402 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xr8kd" event={"ID":"c01e3885-db48-42db-aa00-ca08c6839dbd","Type":"ContainerStarted","Data":"424361937c57d12afb3581265890f622e71beeb3b40a6737a81c81ebb023671d"} Dec 01 10:20:26 crc kubenswrapper[4958]: W1201 10:20:26.020297 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd13e880d_3817_4df9_8477_82349d7979b9.slice/crio-6a491b43154cbdc222a2f41fbd71eb15ca75f7155e91f5ceae0b2424285a1200 WatchSource:0}: Error finding container 6a491b43154cbdc222a2f41fbd71eb15ca75f7155e91f5ceae0b2424285a1200: Status 404 returned error can't find the container with id 6a491b43154cbdc222a2f41fbd71eb15ca75f7155e91f5ceae0b2424285a1200 Dec 01 10:20:27 crc kubenswrapper[4958]: I1201 10:20:27.029145 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-d9qck" event={"ID":"d13e880d-3817-4df9-8477-82349d7979b9","Type":"ContainerStarted","Data":"6a491b43154cbdc222a2f41fbd71eb15ca75f7155e91f5ceae0b2424285a1200"} Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.113463 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6fscs"] Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.115598 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.126212 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6fscs"] Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.127223 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.155036 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7296e8da-30a1-4c69-978f-3411bda327f7-combined-ca-bundle\") pod \"ovn-controller-metrics-6fscs\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.155121 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7296e8da-30a1-4c69-978f-3411bda327f7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6fscs\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.155263 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7296e8da-30a1-4c69-978f-3411bda327f7-config\") pod \"ovn-controller-metrics-6fscs\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.155495 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7296e8da-30a1-4c69-978f-3411bda327f7-ovs-rundir\") pod \"ovn-controller-metrics-6fscs\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.155534 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7296e8da-30a1-4c69-978f-3411bda327f7-ovn-rundir\") pod \"ovn-controller-metrics-6fscs\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.155591 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-846sn\" (UniqueName: \"kubernetes.io/projected/7296e8da-30a1-4c69-978f-3411bda327f7-kube-api-access-846sn\") pod \"ovn-controller-metrics-6fscs\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.257475 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-846sn\" (UniqueName: \"kubernetes.io/projected/7296e8da-30a1-4c69-978f-3411bda327f7-kube-api-access-846sn\") pod \"ovn-controller-metrics-6fscs\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.257591 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7296e8da-30a1-4c69-978f-3411bda327f7-combined-ca-bundle\") pod 
\"ovn-controller-metrics-6fscs\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.257624 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7296e8da-30a1-4c69-978f-3411bda327f7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6fscs\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.257646 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7296e8da-30a1-4c69-978f-3411bda327f7-config\") pod \"ovn-controller-metrics-6fscs\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.257721 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7296e8da-30a1-4c69-978f-3411bda327f7-ovs-rundir\") pod \"ovn-controller-metrics-6fscs\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.257759 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7296e8da-30a1-4c69-978f-3411bda327f7-ovn-rundir\") pod \"ovn-controller-metrics-6fscs\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.258162 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7296e8da-30a1-4c69-978f-3411bda327f7-ovn-rundir\") pod \"ovn-controller-metrics-6fscs\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.259313 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7296e8da-30a1-4c69-978f-3411bda327f7-config\") pod \"ovn-controller-metrics-6fscs\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.259377 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7296e8da-30a1-4c69-978f-3411bda327f7-ovs-rundir\") pod \"ovn-controller-metrics-6fscs\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.273209 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7296e8da-30a1-4c69-978f-3411bda327f7-combined-ca-bundle\") pod \"ovn-controller-metrics-6fscs\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.274905 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7296e8da-30a1-4c69-978f-3411bda327f7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6fscs\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " 
pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.278501 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-846sn\" (UniqueName: \"kubernetes.io/projected/7296e8da-30a1-4c69-978f-3411bda327f7-kube-api-access-846sn\") pod \"ovn-controller-metrics-6fscs\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:41 crc kubenswrapper[4958]: I1201 10:20:41.455781 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:20:42 crc kubenswrapper[4958]: E1201 10:20:42.600308 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 01 10:20:42 crc kubenswrapper[4958]: E1201 10:20:42.601088 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g2bbc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(4fccd607-3bfb-4593-a6de-6a0fc52b34ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:20:42 crc kubenswrapper[4958]: E1201 10:20:42.602277 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="4fccd607-3bfb-4593-a6de-6a0fc52b34ea" Dec 01 10:20:43 crc kubenswrapper[4958]: E1201 10:20:43.285498 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="4fccd607-3bfb-4593-a6de-6a0fc52b34ea" Dec 01 10:20:43 crc kubenswrapper[4958]: E1201 10:20:43.534124 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Dec 01 10:20:43 crc kubenswrapper[4958]: E1201 10:20:43.534439 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n58dhbh58bhdfh56ch98h67fh77h5bch548hbch9dh5b9h58fh54bh666h89h8ch59chffhc5h575h55bh8h598h655hcfh5ddh588h659h68bh88q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d2rnm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(03af271e-0af4-4681-a7b6-31b207d21143): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:20:43 crc kubenswrapper[4958]: E1201 10:20:43.536519 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="03af271e-0af4-4681-a7b6-31b207d21143" Dec 01 10:20:44 crc kubenswrapper[4958]: E1201 10:20:44.307369 4958 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="03af271e-0af4-4681-a7b6-31b207d21143" Dec 01 10:20:45 crc kubenswrapper[4958]: E1201 10:20:45.457425 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 01 10:20:45 crc kubenswrapper[4958]: E1201 10:20:45.458211 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:DB_ROOT_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:DbRootPassword,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:secrets,ReadOnly:true,MountPath:/var/lib/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-75px8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(46e6b589-937c-42c8-8004-49e39813d622): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:20:45 crc kubenswrapper[4958]: E1201 10:20:45.459698 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context 
canceled\"" pod="openstack/openstack-galera-0" podUID="46e6b589-937c-42c8-8004-49e39813d622" Dec 01 10:20:45 crc kubenswrapper[4958]: E1201 10:20:45.463501 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 01 10:20:45 crc kubenswrapper[4958]: E1201 10:20:45.463778 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jw5mk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(e19ffea8-2e96-4cff-a2ec-40646aaa4cc0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:20:45 crc kubenswrapper[4958]: E1201 10:20:45.465173 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context 
canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" Dec 01 10:20:45 crc kubenswrapper[4958]: E1201 10:20:45.471525 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 01 10:20:45 crc kubenswrapper[4958]: E1201 10:20:45.471727 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:DB_ROOT_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:DbRootPassword,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:secrets,ReadOnly:true,MountPath:/var/lib/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cvf6z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(bed7f6be-9254-406b-9ed4-3fff3b2eb531): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:20:45 crc kubenswrapper[4958]: E1201 10:20:45.473933 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="bed7f6be-9254-406b-9ed4-3fff3b2eb531" Dec 01 10:20:46 crc kubenswrapper[4958]: E1201 10:20:46.317759 4958 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="46e6b589-937c-42c8-8004-49e39813d622" Dec 01 10:20:46 crc kubenswrapper[4958]: E1201 10:20:46.317753 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" Dec 01 10:20:46 crc kubenswrapper[4958]: E1201 10:20:46.317812 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="bed7f6be-9254-406b-9ed4-3fff3b2eb531" Dec 01 10:20:49 crc kubenswrapper[4958]: E1201 10:20:49.905877 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified" Dec 01 10:20:49 crc kubenswrapper[4958]: E1201 10:20:49.906461 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:ovsdb-server-init,Image:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,Command:[/usr/local/bin/container-scripts/init-ovsdb-server.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7bhd9h647h5chd8h54ch59dh5b9h75h5f4h57h659hdbh57bh59fh5b9h57h64dh647hcfh5fh67h9dh6fh686h654hb9h5d5h574h66ch589h79q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-ovs,ReadOnly:false,MountPath:/etc/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log,ReadOnly:false,MountPath:/var/log/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib,ReadOnly:false,MountPath:/var/lib/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fhxws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ovn-controller-ovs-xr8kd_openstack(c01e3885-db48-42db-aa00-ca08c6839dbd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:20:49 crc kubenswrapper[4958]: E1201 10:20:49.908280 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-ovs-xr8kd" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" Dec 01 10:20:50 crc kubenswrapper[4958]: E1201 10:20:50.388764 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified\\\"\"" pod="openstack/ovn-controller-ovs-xr8kd" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" Dec 01 10:20:58 crc kubenswrapper[4958]: I1201 10:20:58.213304 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:20:58 crc kubenswrapper[4958]: I1201 10:20:58.214298 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:20:58 crc kubenswrapper[4958]: E1201 10:20:58.330538 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Dec 01 10:20:58 crc kubenswrapper[4958]: E1201 10:20:58.331273 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key 
--ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7bhd9h647h5chd8h54ch59dh5b9h75h5f4h57h659hdbh57bh59fh5b9h57h64dh647hcfh5fh67h9dh6fh686h654hb9h5d5h574h66ch589h79q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4677l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-d9qck_openstack(d13e880d-3817-4df9-8477-82349d7979b9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:20:58 crc kubenswrapper[4958]: E1201 10:20:58.333111 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-d9qck" podUID="d13e880d-3817-4df9-8477-82349d7979b9" Dec 01 10:20:58 crc kubenswrapper[4958]: I1201 10:20:58.641320 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 10:20:58 crc kubenswrapper[4958]: I1201 10:20:58.863111 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 10:20:59 crc kubenswrapper[4958]: E1201 10:20:59.148593 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-d9qck" podUID="d13e880d-3817-4df9-8477-82349d7979b9" Dec 01 10:20:59 crc kubenswrapper[4958]: E1201 10:20:59.153094 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 10:20:59 crc kubenswrapper[4958]: E1201 10:20:59.153342 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dn4d2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-npvcx_openstack(184bdef8-fe56-400a-8e89-57d5cee36495): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Dec 01 10:20:59 crc kubenswrapper[4958]: E1201 10:20:59.154570 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-npvcx" podUID="184bdef8-fe56-400a-8e89-57d5cee36495" Dec 01 10:20:59 crc kubenswrapper[4958]: E1201 10:20:59.209156 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 10:20:59 crc kubenswrapper[4958]: E1201 10:20:59.209397 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hmldq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-28pmt_openstack(7fadae37-e01e-4d98-8d72-b30ab5da2341): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:20:59 crc kubenswrapper[4958]: E1201 10:20:59.222575 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-28pmt" podUID="7fadae37-e01e-4d98-8d72-b30ab5da2341" Dec 01 10:20:59 crc kubenswrapper[4958]: E1201 10:20:59.241006 4958 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 10:20:59 crc kubenswrapper[4958]: E1201 10:20:59.241240 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2xnr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-ggtpt_openstack(8a1dc705-3ccc-475f-83bf-42ecaddbe2f2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:20:59 crc kubenswrapper[4958]: E1201 10:20:59.242503 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-ggtpt" podUID="8a1dc705-3ccc-475f-83bf-42ecaddbe2f2" Dec 01 10:20:59 crc kubenswrapper[4958]: E1201 10:20:59.243509 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 10:20:59 crc kubenswrapper[4958]: E1201 10:20:59.244570 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btsqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-8lv6l_openstack(0ecc4faf-d7fd-4649-8db0-9767644607b8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:20:59 crc kubenswrapper[4958]: E1201 10:20:59.246028 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-8lv6l" podUID="0ecc4faf-d7fd-4649-8db0-9767644607b8" Dec 01 10:20:59 crc kubenswrapper[4958]: W1201 10:20:59.260686 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod032af861_fde8_4b7a_929b_2ec7f5871474.slice/crio-9342c765205105f9f18fc905ac3548f8db04b2aeda86e2163143ee2f096f04cd WatchSource:0}: Error finding container 9342c765205105f9f18fc905ac3548f8db04b2aeda86e2163143ee2f096f04cd: Status 404 returned error can't find the container with id 9342c765205105f9f18fc905ac3548f8db04b2aeda86e2163143ee2f096f04cd Dec 01 10:20:59 crc kubenswrapper[4958]: W1201 10:20:59.262689 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddebf8fae_f15c_4b09_b185_3f47c7e0491b.slice/crio-ecbba6144a3f24846eefb26247c695838d6f0606ba3501a3acdab236ef9721e9 WatchSource:0}: Error finding container ecbba6144a3f24846eefb26247c695838d6f0606ba3501a3acdab236ef9721e9: Status 404 returned error can't find the container with id ecbba6144a3f24846eefb26247c695838d6f0606ba3501a3acdab236ef9721e9 Dec 01 10:20:59 crc kubenswrapper[4958]: I1201 10:20:59.768991 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6fscs"] Dec 01 10:20:59 crc kubenswrapper[4958]: W1201 
10:20:59.968511 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7296e8da_30a1_4c69_978f_3411bda327f7.slice/crio-5cc5f608f30fbd2b227f47bf9a0a321b3c93c5fafd8963393a306b10caf4383b WatchSource:0}: Error finding container 5cc5f608f30fbd2b227f47bf9a0a321b3c93c5fafd8963393a306b10caf4383b: Status 404 returned error can't find the container with id 5cc5f608f30fbd2b227f47bf9a0a321b3c93c5fafd8963393a306b10caf4383b
Dec 01 10:20:59 crc kubenswrapper[4958]: E1201 10:20:59.971172 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0"
Dec 01 10:20:59 crc kubenswrapper[4958]: E1201 10:20:59.971265 4958 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0"
Dec 01 10:20:59 crc kubenswrapper[4958]: E1201 10:20:59.971504 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sdz7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(e0248403-94d5-4bec-9ef1-af83490d3a0e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 01 10:20:59 crc kubenswrapper[4958]: E1201 10:20:59.972756 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="e0248403-94d5-4bec-9ef1-af83490d3a0e"
Dec 01 10:21:00 crc kubenswrapper[4958]: I1201 10:21:00.143045 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"03af271e-0af4-4681-a7b6-31b207d21143","Type":"ContainerStarted","Data":"4703ea3b8368f81ab5256cdde59bdad7b14ea5d5b61cedec5360686c336d5bb4"}
Dec 01 10:21:00 crc kubenswrapper[4958]: I1201 10:21:00.143683 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Dec 01 10:21:00 crc kubenswrapper[4958]: I1201 10:21:00.144779 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"032af861-fde8-4b7a-929b-2ec7f5871474","Type":"ContainerStarted","Data":"9342c765205105f9f18fc905ac3548f8db04b2aeda86e2163143ee2f096f04cd"}
Dec 01 10:21:00 crc kubenswrapper[4958]: I1201 10:21:00.146329 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"debf8fae-f15c-4b09-b185-3f47c7e0491b","Type":"ContainerStarted","Data":"ecbba6144a3f24846eefb26247c695838d6f0606ba3501a3acdab236ef9721e9"}
Dec 01 10:21:00 crc kubenswrapper[4958]: I1201 10:21:00.147517 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6fscs" event={"ID":"7296e8da-30a1-4c69-978f-3411bda327f7","Type":"ContainerStarted","Data":"5cc5f608f30fbd2b227f47bf9a0a321b3c93c5fafd8963393a306b10caf4383b"}
Dec 01 10:21:00 crc kubenswrapper[4958]: E1201 10:21:00.149865 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-npvcx" podUID="184bdef8-fe56-400a-8e89-57d5cee36495"
Dec 01 10:21:00 crc kubenswrapper[4958]: E1201 10:21:00.150502 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-28pmt" podUID="7fadae37-e01e-4d98-8d72-b30ab5da2341"
Dec 01 10:21:00 crc kubenswrapper[4958]: E1201 10:21:00.150582 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="e0248403-94d5-4bec-9ef1-af83490d3a0e"
Dec 01 10:21:00 crc kubenswrapper[4958]: I1201 10:21:00.184083 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.466786612 podStartE2EDuration="48.184002057s" podCreationTimestamp="2025-12-01 10:20:12 +0000 UTC" firstStartedPulling="2025-12-01 10:20:14.815835853 +0000 UTC m=+1262.324624890" lastFinishedPulling="2025-12-01 10:20:59.533051298 +0000 UTC m=+1307.041840335" observedRunningTime="2025-12-01 10:21:00.178164528 +0000 UTC m=+1307.686953565" watchObservedRunningTime="2025-12-01 10:21:00.184002057 +0000 UTC m=+1307.692791094"
Dec 01 10:21:00 crc kubenswrapper[4958]: I1201 10:21:00.809453 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ggtpt"
Dec 01 10:21:00 crc kubenswrapper[4958]: I1201 10:21:00.820500 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-8lv6l"
Dec 01 10:21:00 crc kubenswrapper[4958]: I1201 10:21:00.959789 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ecc4faf-d7fd-4649-8db0-9767644607b8-config\") pod \"0ecc4faf-d7fd-4649-8db0-9767644607b8\" (UID: \"0ecc4faf-d7fd-4649-8db0-9767644607b8\") "
Dec 01 10:21:00 crc kubenswrapper[4958]: I1201 10:21:00.959938 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a1dc705-3ccc-475f-83bf-42ecaddbe2f2-config\") pod \"8a1dc705-3ccc-475f-83bf-42ecaddbe2f2\" (UID: \"8a1dc705-3ccc-475f-83bf-42ecaddbe2f2\") "
Dec 01 10:21:00 crc kubenswrapper[4958]: I1201 10:21:00.960031 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btsqv\" (UniqueName: \"kubernetes.io/projected/0ecc4faf-d7fd-4649-8db0-9767644607b8-kube-api-access-btsqv\") pod \"0ecc4faf-d7fd-4649-8db0-9767644607b8\" (UID: \"0ecc4faf-d7fd-4649-8db0-9767644607b8\") "
Dec 01 10:21:00 crc kubenswrapper[4958]: I1201 10:21:00.960054 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2xnr\" (UniqueName: \"kubernetes.io/projected/8a1dc705-3ccc-475f-83bf-42ecaddbe2f2-kube-api-access-t2xnr\") pod \"8a1dc705-3ccc-475f-83bf-42ecaddbe2f2\" (UID: \"8a1dc705-3ccc-475f-83bf-42ecaddbe2f2\") "
Dec 01 10:21:00 crc kubenswrapper[4958]: I1201 10:21:00.960110 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ecc4faf-d7fd-4649-8db0-9767644607b8-dns-svc\") pod \"0ecc4faf-d7fd-4649-8db0-9767644607b8\" (UID: \"0ecc4faf-d7fd-4649-8db0-9767644607b8\") "
Dec 01 10:21:00 crc kubenswrapper[4958]: I1201 10:21:00.962373 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ecc4faf-d7fd-4649-8db0-9767644607b8-config" (OuterVolumeSpecName: "config") pod "0ecc4faf-d7fd-4649-8db0-9767644607b8" (UID: "0ecc4faf-d7fd-4649-8db0-9767644607b8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:21:00 crc kubenswrapper[4958]: I1201 10:21:00.963892 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ecc4faf-d7fd-4649-8db0-9767644607b8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ecc4faf-d7fd-4649-8db0-9767644607b8" (UID: "0ecc4faf-d7fd-4649-8db0-9767644607b8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:21:00 crc kubenswrapper[4958]: I1201 10:21:00.964145 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a1dc705-3ccc-475f-83bf-42ecaddbe2f2-config" (OuterVolumeSpecName: "config") pod "8a1dc705-3ccc-475f-83bf-42ecaddbe2f2" (UID: "8a1dc705-3ccc-475f-83bf-42ecaddbe2f2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:21:01 crc kubenswrapper[4958]: I1201 10:21:01.062479 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ecc4faf-d7fd-4649-8db0-9767644607b8-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 01 10:21:01 crc kubenswrapper[4958]: I1201 10:21:01.062534 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ecc4faf-d7fd-4649-8db0-9767644607b8-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:21:01 crc kubenswrapper[4958]: I1201 10:21:01.062547 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a1dc705-3ccc-475f-83bf-42ecaddbe2f2-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:21:01 crc kubenswrapper[4958]: I1201 10:21:01.064222 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a1dc705-3ccc-475f-83bf-42ecaddbe2f2-kube-api-access-t2xnr" (OuterVolumeSpecName: "kube-api-access-t2xnr") pod "8a1dc705-3ccc-475f-83bf-42ecaddbe2f2" (UID: "8a1dc705-3ccc-475f-83bf-42ecaddbe2f2"). InnerVolumeSpecName "kube-api-access-t2xnr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:21:01 crc kubenswrapper[4958]: I1201 10:21:01.065635 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ecc4faf-d7fd-4649-8db0-9767644607b8-kube-api-access-btsqv" (OuterVolumeSpecName: "kube-api-access-btsqv") pod "0ecc4faf-d7fd-4649-8db0-9767644607b8" (UID: "0ecc4faf-d7fd-4649-8db0-9767644607b8"). InnerVolumeSpecName "kube-api-access-btsqv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:21:01 crc kubenswrapper[4958]: I1201 10:21:01.158890 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-ggtpt" event={"ID":"8a1dc705-3ccc-475f-83bf-42ecaddbe2f2","Type":"ContainerDied","Data":"7340afbd0711a30643f84443b4ab0d6d821477e5e3aa5d204226244112cc2e46"}
Dec 01 10:21:01 crc kubenswrapper[4958]: I1201 10:21:01.159433 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ggtpt"
Dec 01 10:21:01 crc kubenswrapper[4958]: I1201 10:21:01.164883 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btsqv\" (UniqueName: \"kubernetes.io/projected/0ecc4faf-d7fd-4649-8db0-9767644607b8-kube-api-access-btsqv\") on node \"crc\" DevicePath \"\""
Dec 01 10:21:01 crc kubenswrapper[4958]: I1201 10:21:01.164953 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2xnr\" (UniqueName: \"kubernetes.io/projected/8a1dc705-3ccc-475f-83bf-42ecaddbe2f2-kube-api-access-t2xnr\") on node \"crc\" DevicePath \"\""
Dec 01 10:21:01 crc kubenswrapper[4958]: I1201 10:21:01.169147 4958 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-8lv6l"
Dec 01 10:21:01 crc kubenswrapper[4958]: I1201 10:21:01.176879 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-8lv6l" event={"ID":"0ecc4faf-d7fd-4649-8db0-9767644607b8","Type":"ContainerDied","Data":"bfd135f65137a0fb0a97215b299ab372bdd60a0fc30d1c57599bb597590611c3"}
Dec 01 10:21:01 crc kubenswrapper[4958]: I1201 10:21:01.589060 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggtpt"]
Dec 01 10:21:01 crc kubenswrapper[4958]: I1201 10:21:01.598202 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ggtpt"]
Dec 01 10:21:01 crc kubenswrapper[4958]: I1201 10:21:01.632674 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8lv6l"]
Dec 01 10:21:01 crc kubenswrapper[4958]: I1201 10:21:01.640857 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8lv6l"]
Dec 01 10:21:01 crc kubenswrapper[4958]: I1201 10:21:01.815048 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ecc4faf-d7fd-4649-8db0-9767644607b8" path="/var/lib/kubelet/pods/0ecc4faf-d7fd-4649-8db0-9767644607b8/volumes"
Dec 01 10:21:01 crc kubenswrapper[4958]: I1201 10:21:01.815898 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a1dc705-3ccc-475f-83bf-42ecaddbe2f2" path="/var/lib/kubelet/pods/8a1dc705-3ccc-475f-83bf-42ecaddbe2f2/volumes"
Dec 01 10:21:02 crc kubenswrapper[4958]: I1201 10:21:02.178444 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"46e6b589-937c-42c8-8004-49e39813d622","Type":"ContainerStarted","Data":"5eb2b4e830c413eddfc09e39969c0cdacdce600216e750fd9ece78362926b862"}
Dec 01 10:21:02 crc kubenswrapper[4958]: I1201 10:21:02.181345 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"debf8fae-f15c-4b09-b185-3f47c7e0491b","Type":"ContainerStarted","Data":"892484689c2af5ec876095062b833d60c2fdb08de4c3c04e7eb3ecefcffbab27"}
Dec 01 10:21:02 crc kubenswrapper[4958]: I1201 10:21:02.183675 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4fccd607-3bfb-4593-a6de-6a0fc52b34ea","Type":"ContainerStarted","Data":"a87bd53133187b0ab3e009cb4dad6caf4c75502b45000066abe1770b7f35fabb"}
Dec 01 10:21:02 crc kubenswrapper[4958]: I1201 10:21:02.185148 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bed7f6be-9254-406b-9ed4-3fff3b2eb531","Type":"ContainerStarted","Data":"fe80751034968323a5dbd3a1f0b90c563022a0a7aefe1996053e0ce6fa5bd001"}
Dec 01 10:21:03 crc kubenswrapper[4958]: I1201 10:21:03.194963 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0","Type":"ContainerStarted","Data":"b1abd27e2c2d388e6a6c1efd167095fea30180701b7db54c428f7e52cebd08ba"}
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.242948 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"032af861-fde8-4b7a-929b-2ec7f5871474","Type":"ContainerStarted","Data":"7785733222ee379e13b2e08924fc0b23055d37143a5248373e74dc7a5b530c74"}
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.245809 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"032af861-fde8-4b7a-929b-2ec7f5871474","Type":"ContainerStarted","Data":"9f86a4fe20ec5f769cd7eb9953f1018672b10ce2765e2dc5eb42d2f7a1374f18"}
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.246081 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"46e6b589-937c-42c8-8004-49e39813d622","Type":"ContainerDied","Data":"5eb2b4e830c413eddfc09e39969c0cdacdce600216e750fd9ece78362926b862"}
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.245806 4958 generic.go:334] "Generic (PLEG): container finished" podID="46e6b589-937c-42c8-8004-49e39813d622" containerID="5eb2b4e830c413eddfc09e39969c0cdacdce600216e750fd9ece78362926b862" exitCode=0
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.250165 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"debf8fae-f15c-4b09-b185-3f47c7e0491b","Type":"ContainerStarted","Data":"d6ab967c227e3d5e9e63f9b31bc7367d580da34af75c1d75d3010bf7ca6e1148"}
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.252753 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6fscs" event={"ID":"7296e8da-30a1-4c69-978f-3411bda327f7","Type":"ContainerStarted","Data":"d39d333c260ccb07db8a021dde9b4a9a746995835c4970968df5c849398cd2e8"}
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.255089 4958 generic.go:334] "Generic (PLEG): container finished" podID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerID="78762812ba18d4b5f9cf919b541fba37920d013edeed6a8ec69eb0d463f33083" exitCode=0
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.255200 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xr8kd" event={"ID":"c01e3885-db48-42db-aa00-ca08c6839dbd","Type":"ContainerDied","Data":"78762812ba18d4b5f9cf919b541fba37920d013edeed6a8ec69eb0d463f33083"}
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.259357 4958 generic.go:334] "Generic (PLEG): container finished" podID="bed7f6be-9254-406b-9ed4-3fff3b2eb531" containerID="fe80751034968323a5dbd3a1f0b90c563022a0a7aefe1996053e0ce6fa5bd001" exitCode=0
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.259425 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bed7f6be-9254-406b-9ed4-3fff3b2eb531","Type":"ContainerDied","Data":"fe80751034968323a5dbd3a1f0b90c563022a0a7aefe1996053e0ce6fa5bd001"}
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.283272 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=40.718404632 podStartE2EDuration="45.283223549s" podCreationTimestamp="2025-12-01 10:20:20 +0000 UTC" firstStartedPulling="2025-12-01 10:20:59.278745978 +0000 UTC m=+1306.787535015" lastFinishedPulling="2025-12-01 10:21:03.843564895 +0000 UTC m=+1311.352353932" observedRunningTime="2025-12-01 10:21:05.269591985 +0000 UTC m=+1312.778381042" watchObservedRunningTime="2025-12-01 10:21:05.283223549 +0000 UTC m=+1312.792012586"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.304672 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6fscs" podStartSLOduration=20.345485681 podStartE2EDuration="24.304650247s" podCreationTimestamp="2025-12-01 10:20:41 +0000 UTC" firstStartedPulling="2025-12-01 10:20:59.970106653 +0000 UTC m=+1307.478895690" lastFinishedPulling="2025-12-01 10:21:03.929271219 +0000 UTC m=+1311.438060256" observedRunningTime="2025-12-01 10:21:05.298029646 +0000 UTC m=+1312.806818683" watchObservedRunningTime="2025-12-01 10:21:05.304650247 +0000 UTC m=+1312.813439284"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.362664 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.429005 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=39.580608414 podStartE2EDuration="44.428958955s" podCreationTimestamp="2025-12-01 10:20:21 +0000 UTC" firstStartedPulling="2025-12-01 10:20:59.276417001 +0000 UTC m=+1306.785206038" lastFinishedPulling="2025-12-01 10:21:04.124767542 +0000 UTC m=+1311.633556579" observedRunningTime="2025-12-01 10:21:05.357601465 +0000 UTC m=+1312.866390502" watchObservedRunningTime="2025-12-01 10:21:05.428958955 +0000 UTC m=+1312.937747982"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.473756 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.641349 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-npvcx"]
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.696890 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-l7vhb"]
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.699187 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.709910 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.715756 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ec4703-353d-4ffa-8bc5-1969c62a4299-config\") pod \"dnsmasq-dns-7fd796d7df-l7vhb\" (UID: \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\") " pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.715891 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2ec4703-353d-4ffa-8bc5-1969c62a4299-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-l7vhb\" (UID: \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\") " pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.715949 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k255w\" (UniqueName: \"kubernetes.io/projected/c2ec4703-353d-4ffa-8bc5-1969c62a4299-kube-api-access-k255w\") pod \"dnsmasq-dns-7fd796d7df-l7vhb\" (UID: \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\") " pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.716023 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2ec4703-353d-4ffa-8bc5-1969c62a4299-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-l7vhb\" (UID: \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\") " pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.740938 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-l7vhb"]
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.820659 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2ec4703-353d-4ffa-8bc5-1969c62a4299-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-l7vhb\" (UID: \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\") " pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.821402 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ec4703-353d-4ffa-8bc5-1969c62a4299-config\") pod \"dnsmasq-dns-7fd796d7df-l7vhb\" (UID: \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\") " pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.821539 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2ec4703-353d-4ffa-8bc5-1969c62a4299-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-l7vhb\" (UID: \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\") " pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.821653 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k255w\" (UniqueName: \"kubernetes.io/projected/c2ec4703-353d-4ffa-8bc5-1969c62a4299-kube-api-access-k255w\") pod \"dnsmasq-dns-7fd796d7df-l7vhb\" (UID: \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\") " pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.822017 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2ec4703-353d-4ffa-8bc5-1969c62a4299-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-l7vhb\" (UID: \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\") " pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.822701 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ec4703-353d-4ffa-8bc5-1969c62a4299-config\") pod \"dnsmasq-dns-7fd796d7df-l7vhb\" (UID: \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\") " pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.823715 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2ec4703-353d-4ffa-8bc5-1969c62a4299-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-l7vhb\" (UID: \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\") " pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.853591 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k255w\" (UniqueName: \"kubernetes.io/projected/c2ec4703-353d-4ffa-8bc5-1969c62a4299-kube-api-access-k255w\") pod \"dnsmasq-dns-7fd796d7df-l7vhb\" (UID: \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\") " pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.901446 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-28pmt"]
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.959532 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zlpt2"]
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.966357 4958 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.973474 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Dec 01 10:21:05 crc kubenswrapper[4958]: I1201 10:21:05.987271 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zlpt2"]
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.032176 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-config\") pod \"dnsmasq-dns-86db49b7ff-zlpt2\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.032708 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zlpt2\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.032914 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zlpt2\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.033091 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zlpt2\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.033219 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8cmn\" (UniqueName: \"kubernetes.io/projected/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-kube-api-access-n8cmn\") pod \"dnsmasq-dns-86db49b7ff-zlpt2\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.061178 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.135418 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zlpt2\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.135499 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zlpt2\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.135529 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8cmn\" (UniqueName: \"kubernetes.io/projected/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-kube-api-access-n8cmn\") pod \"dnsmasq-dns-86db49b7ff-zlpt2\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.135590 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-config\") pod \"dnsmasq-dns-86db49b7ff-zlpt2\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.135622 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zlpt2\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.137148 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zlpt2\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.138034 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zlpt2\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.138650 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zlpt2\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.139318 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-config\") pod \"dnsmasq-dns-86db49b7ff-zlpt2\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.179498 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8cmn\" (UniqueName: \"kubernetes.io/projected/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-kube-api-access-n8cmn\") pod \"dnsmasq-dns-86db49b7ff-zlpt2\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.180731 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-npvcx"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.237736 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn4d2\" (UniqueName: \"kubernetes.io/projected/184bdef8-fe56-400a-8e89-57d5cee36495-kube-api-access-dn4d2\") pod \"184bdef8-fe56-400a-8e89-57d5cee36495\" (UID: \"184bdef8-fe56-400a-8e89-57d5cee36495\") "
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.237890 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/184bdef8-fe56-400a-8e89-57d5cee36495-config\") pod \"184bdef8-fe56-400a-8e89-57d5cee36495\" (UID: \"184bdef8-fe56-400a-8e89-57d5cee36495\") "
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.237940 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/184bdef8-fe56-400a-8e89-57d5cee36495-dns-svc\") pod \"184bdef8-fe56-400a-8e89-57d5cee36495\" (UID: \"184bdef8-fe56-400a-8e89-57d5cee36495\") "
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.238778 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/184bdef8-fe56-400a-8e89-57d5cee36495-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "184bdef8-fe56-400a-8e89-57d5cee36495" (UID: "184bdef8-fe56-400a-8e89-57d5cee36495"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.238815 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/184bdef8-fe56-400a-8e89-57d5cee36495-config" (OuterVolumeSpecName: "config") pod "184bdef8-fe56-400a-8e89-57d5cee36495" (UID: "184bdef8-fe56-400a-8e89-57d5cee36495"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.242104 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/184bdef8-fe56-400a-8e89-57d5cee36495-kube-api-access-dn4d2" (OuterVolumeSpecName: "kube-api-access-dn4d2") pod "184bdef8-fe56-400a-8e89-57d5cee36495" (UID: "184bdef8-fe56-400a-8e89-57d5cee36495"). InnerVolumeSpecName "kube-api-access-dn4d2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.272664 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-28pmt"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.275498 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xr8kd" event={"ID":"c01e3885-db48-42db-aa00-ca08c6839dbd","Type":"ContainerStarted","Data":"d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c"}
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.277419 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-28pmt" event={"ID":"7fadae37-e01e-4d98-8d72-b30ab5da2341","Type":"ContainerDied","Data":"64ba44959cef7f233df07406d95a823391bacfc9c1f018fc219e8db434dafdd5"}
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.277658 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-28pmt"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.281998 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bed7f6be-9254-406b-9ed4-3fff3b2eb531","Type":"ContainerStarted","Data":"9558003a0f0dbb9592dcb7cf64ce6596694eda3a98ea3abc65fe71be651100a5"}
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.284038 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-npvcx" event={"ID":"184bdef8-fe56-400a-8e89-57d5cee36495","Type":"ContainerDied","Data":"46d803f1b7ab42cb04ee8f8549c93cda1ba360858ba38c8cd96576e46749144d"}
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.284114 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-npvcx"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.301920 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"46e6b589-937c-42c8-8004-49e39813d622","Type":"ContainerStarted","Data":"2ad714bd2234df6b38c1a86549393ee27d29b1dd419f67e7614dbc60f77966dc"}
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.302624 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.337122 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371982.517683 podStartE2EDuration="54.337093607s" podCreationTimestamp="2025-12-01 10:20:12 +0000 UTC" firstStartedPulling="2025-12-01 10:20:16.096082146 +0000 UTC m=+1263.604871183" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:21:06.332734492 +0000 UTC m=+1313.841523549" watchObservedRunningTime="2025-12-01 10:21:06.337093607 +0000 UTC m=+1313.845882644"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.337515 4958 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.341575 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn4d2\" (UniqueName: \"kubernetes.io/projected/184bdef8-fe56-400a-8e89-57d5cee36495-kube-api-access-dn4d2\") on node \"crc\" DevicePath \"\""
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.341633 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/184bdef8-fe56-400a-8e89-57d5cee36495-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.341646 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/184bdef8-fe56-400a-8e89-57d5cee36495-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.379424 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.388519 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.984941296 podStartE2EDuration="55.388491711s" podCreationTimestamp="2025-12-01 10:20:11 +0000 UTC" firstStartedPulling="2025-12-01 10:20:14.937918857 +0000 UTC m=+1262.446707894" lastFinishedPulling="2025-12-01 10:21:00.341469272 +0000 UTC m=+1307.850258309" observedRunningTime="2025-12-01 10:21:06.381397576 +0000 UTC m=+1313.890186613" watchObservedRunningTime="2025-12-01 10:21:06.388491711 +0000 UTC m=+1313.897280748"
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.437134 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-npvcx"]
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.443782 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmldq\" (UniqueName: \"kubernetes.io/projected/7fadae37-e01e-4d98-8d72-b30ab5da2341-kube-api-access-hmldq\") pod \"7fadae37-e01e-4d98-8d72-b30ab5da2341\" (UID: \"7fadae37-e01e-4d98-8d72-b30ab5da2341\") "
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.443909 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fadae37-e01e-4d98-8d72-b30ab5da2341-dns-svc\") pod \"7fadae37-e01e-4d98-8d72-b30ab5da2341\" (UID: \"7fadae37-e01e-4d98-8d72-b30ab5da2341\") "
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.444027 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fadae37-e01e-4d98-8d72-b30ab5da2341-config\") pod \"7fadae37-e01e-4d98-8d72-b30ab5da2341\" (UID: \"7fadae37-e01e-4d98-8d72-b30ab5da2341\") "
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.445174 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fadae37-e01e-4d98-8d72-b30ab5da2341-config" (OuterVolumeSpecName: "config") pod "7fadae37-e01e-4d98-8d72-b30ab5da2341" (UID: "7fadae37-e01e-4d98-8d72-b30ab5da2341"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.446157 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fadae37-e01e-4d98-8d72-b30ab5da2341-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.463280 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-npvcx"]
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.496565 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fadae37-e01e-4d98-8d72-b30ab5da2341-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7fadae37-e01e-4d98-8d72-b30ab5da2341" (UID: "7fadae37-e01e-4d98-8d72-b30ab5da2341"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.506730 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fadae37-e01e-4d98-8d72-b30ab5da2341-kube-api-access-hmldq" (OuterVolumeSpecName: "kube-api-access-hmldq") pod "7fadae37-e01e-4d98-8d72-b30ab5da2341" (UID: "7fadae37-e01e-4d98-8d72-b30ab5da2341"). InnerVolumeSpecName "kube-api-access-hmldq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.548864 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fadae37-e01e-4d98-8d72-b30ab5da2341-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.548903 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmldq\" (UniqueName: \"kubernetes.io/projected/7fadae37-e01e-4d98-8d72-b30ab5da2341-kube-api-access-hmldq\") on node \"crc\" DevicePath \"\""
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.713984 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-28pmt"]
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.721902 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-28pmt"]
Dec 01 10:21:06 crc kubenswrapper[4958]: I1201 10:21:06.729350 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-l7vhb"]
Dec 01 10:21:07 crc kubenswrapper[4958]: I1201 10:21:07.001282 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Dec 01 10:21:07 crc kubenswrapper[4958]: I1201 10:21:07.001556 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Dec 01 10:21:07 crc kubenswrapper[4958]: I1201 10:21:07.002424 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zlpt2"]
Dec 01 10:21:07 crc kubenswrapper[4958]: I1201 10:21:07.057634 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Dec 01 10:21:07 crc kubenswrapper[4958]: I1201 10:21:07.310991 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb" event={"ID":"c2ec4703-353d-4ffa-8bc5-1969c62a4299","Type":"ContainerStarted","Data":"1775cef57d3dec0d82f8062f1da80d13134519f0d96eb6e24b2289e9c7eafd65"}
Dec 01 10:21:07 crc kubenswrapper[4958]: I1201 10:21:07.314540 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2" event={"ID":"92f066d8-4bfd-4cf8-acfb-ec74eea4d711","Type":"ContainerStarted","Data":"de5773a9add3b1ad44fcc5a45521af4a9b7c121e1e1b95203a43285622b6a807"}
Dec 01 10:21:07 crc kubenswrapper[4958]: I1201 10:21:07.317946 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xr8kd" event={"ID":"c01e3885-db48-42db-aa00-ca08c6839dbd","Type":"ContainerStarted","Data":"dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3"}
Dec 01 10:21:07 crc kubenswrapper[4958]: I1201 10:21:07.353254 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-xr8kd" podStartSLOduration=9.971593681 podStartE2EDuration="48.353226607s" podCreationTimestamp="2025-12-01 10:20:19 +0000 UTC" firstStartedPulling="2025-12-01 10:20:25.98857793 +0000 UTC m=+1273.497366957" lastFinishedPulling="2025-12-01 10:21:04.370210846 +0000 UTC m=+1311.878999883" observedRunningTime="2025-12-01 10:21:07.348226613 +0000 UTC m=+1314.857015670" watchObservedRunningTime="2025-12-01 10:21:07.353226607 +0000 UTC m=+1314.862015644"
Dec 01 10:21:07 crc kubenswrapper[4958]: I1201 10:21:07.809755 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="184bdef8-fe56-400a-8e89-57d5cee36495" path="/var/lib/kubelet/pods/184bdef8-fe56-400a-8e89-57d5cee36495/volumes"
Dec 01 10:21:07 crc kubenswrapper[4958]: I1201 10:21:07.810213 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fadae37-e01e-4d98-8d72-b30ab5da2341" path="/var/lib/kubelet/pods/7fadae37-e01e-4d98-8d72-b30ab5da2341/volumes"
Dec 01 10:21:08 crc kubenswrapper[4958]: I1201 10:21:08.333758 4958 generic.go:334] "Generic (PLEG): container finished" podID="c2ec4703-353d-4ffa-8bc5-1969c62a4299" containerID="a9abc6790ca7f6da6349c5c862d9f2d49bda9315cc349e55217a2508a621612f" exitCode=0
Dec 01 10:21:08 crc kubenswrapper[4958]: I1201 10:21:08.333930 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb" event={"ID":"c2ec4703-353d-4ffa-8bc5-1969c62a4299","Type":"ContainerDied","Data":"a9abc6790ca7f6da6349c5c862d9f2d49bda9315cc349e55217a2508a621612f"}
Dec 01 10:21:08 crc kubenswrapper[4958]: I1201 10:21:08.339026 4958 generic.go:334] "Generic (PLEG): container finished" podID="92f066d8-4bfd-4cf8-acfb-ec74eea4d711" containerID="b10444e917805d8fe8442d87ea4e187d6abeab662bb369fd4f542afe9f9c9838" exitCode=0
Dec 01 10:21:08 crc kubenswrapper[4958]: I1201 10:21:08.339091 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2" event={"ID":"92f066d8-4bfd-4cf8-acfb-ec74eea4d711","Type":"ContainerDied","Data":"b10444e917805d8fe8442d87ea4e187d6abeab662bb369fd4f542afe9f9c9838"}
Dec 01 10:21:08 crc kubenswrapper[4958]: I1201 10:21:08.339709 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-xr8kd"
Dec 01 10:21:08 crc kubenswrapper[4958]: I1201 10:21:08.339770 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-xr8kd"
Dec 01 10:21:08 crc kubenswrapper[4958]: I1201 10:21:08.381614 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Dec 01 10:21:09 crc kubenswrapper[4958]: I1201 10:21:09.349720 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2" event={"ID":"92f066d8-4bfd-4cf8-acfb-ec74eea4d711","Type":"ContainerStarted","Data":"e8c7aebdc268c0b777b9bca63baa44d4ee305059a3c0a42260f0ed48f251c544"}
Dec 01 10:21:09 crc kubenswrapper[4958]: I1201 10:21:09.350267 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2"
Dec 01 10:21:09 crc kubenswrapper[4958]: I1201 10:21:09.354044 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb" event={"ID":"c2ec4703-353d-4ffa-8bc5-1969c62a4299","Type":"ContainerStarted","Data":"392456488b31ff9738b61605fcb06ffdb84ca20ae7953ccf8652801347bc5cf1"}
Dec 01 10:21:09 crc kubenswrapper[4958]: I1201 10:21:09.354228 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb"
Dec 01 10:21:09 crc kubenswrapper[4958]: I1201 10:21:09.397546 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2" podStartSLOduration=3.75755543 podStartE2EDuration="4.397507832s" podCreationTimestamp="2025-12-01 10:21:05 +0000 UTC" firstStartedPulling="2025-12-01 10:21:07.006814478 +0000 UTC m=+1314.515603515" lastFinishedPulling="2025-12-01 10:21:07.64676688 +0000 UTC m=+1315.155555917" observedRunningTime="2025-12-01 10:21:09.37734748 +0000 UTC m=+1316.886136517" watchObservedRunningTime="2025-12-01 10:21:09.397507832 +0000 UTC m=+1316.906296869"
Dec 01 10:21:09 crc kubenswrapper[4958]: I1201 10:21:09.403656 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb" podStartSLOduration=3.480999368 podStartE2EDuration="4.403634299s" podCreationTimestamp="2025-12-01 10:21:05 +0000 UTC" firstStartedPulling="2025-12-01 10:21:06.733621663 +0000 UTC m=+1314.242410700" lastFinishedPulling="2025-12-01 10:21:07.656256594 +0000 UTC m=+1315.165045631" observedRunningTime="2025-12-01 10:21:09.396129682 +0000 UTC m=+1316.904918719" watchObservedRunningTime="2025-12-01 10:21:09.403634299 +0000 UTC m=+1316.912423336"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.046635 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.415150 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.417085 4958 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.420838 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.421232 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.421516 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.423921 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-2cp6t"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.431838 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.507586 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-config\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.507691 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-scripts\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.507757 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.507862 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.507894 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.507957 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6df4\" (UniqueName: \"kubernetes.io/projected/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-kube-api-access-v6df4\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.508220 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.610207 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.610304 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-config\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.610355 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-scripts\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.610398 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.610464 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.610489 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.611296 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.611503 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-config\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.611718 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6df4\" (UniqueName: \"kubernetes.io/projected/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-kube-api-access-v6df4\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.612187 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-scripts\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.618422 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.622750 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.626734 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.630641 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6df4\" (UniqueName: \"kubernetes.io/projected/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-kube-api-access-v6df4\") pod \"ovn-northd-0\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " pod="openstack/ovn-northd-0"
Dec 01 10:21:12 crc kubenswrapper[4958]: I1201 10:21:12.743296 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Dec 01 10:21:13 crc kubenswrapper[4958]: I1201 10:21:13.303714 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Dec 01 10:21:13 crc kubenswrapper[4958]: I1201 10:21:13.304135 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Dec 01 10:21:13 crc kubenswrapper[4958]: I1201 10:21:13.402608 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Dec 01 10:21:13 crc kubenswrapper[4958]: I1201 10:21:13.437597 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Dec 01 10:21:13 crc kubenswrapper[4958]: W1201 10:21:13.441190 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69ae2c66_9d6d_4bc0_b3fe_ee729225e85f.slice/crio-9bd9d65cd1a5ac8d4faaa84e09d80739d7830091d82598bf1dba2eee246b3ce6 WatchSource:0}: Error finding container 9bd9d65cd1a5ac8d4faaa84e09d80739d7830091d82598bf1dba2eee246b3ce6: Status 404 returned error can't find the container with id 9bd9d65cd1a5ac8d4faaa84e09d80739d7830091d82598bf1dba2eee246b3ce6
Dec 01 10:21:13 crc kubenswrapper[4958]: I1201 10:21:13.467988 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Dec 01 10:21:14 crc kubenswrapper[4958]: I1201 10:21:14.023784 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Dec 01 10:21:14 crc kubenswrapper[4958]: I1201 10:21:14.024076 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Dec 01 10:21:14 crc kubenswrapper[4958]: I1201 10:21:14.173067 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Dec 01 10:21:14 crc kubenswrapper[4958]: I1201 10:21:14.424895 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-d9qck" event={"ID":"d13e880d-3817-4df9-8477-82349d7979b9","Type":"ContainerStarted","Data":"fa4030b1ae32540c59bee557a9b70c49c21c0ef4ba7c627b01f2454936ca8d86"}
Dec 01 10:21:14 crc kubenswrapper[4958]: I1201 10:21:14.426547 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-d9qck"
Dec 01 10:21:14 crc kubenswrapper[4958]: I1201 10:21:14.430716 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e0248403-94d5-4bec-9ef1-af83490d3a0e","Type":"ContainerStarted","Data":"dfc9bb1038666dfe92ff786ea1156854bbc06c6602a0a174a585d2b6f8fc8d23"}
Dec 01 10:21:14 crc kubenswrapper[4958]: I1201 10:21:14.431559 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Dec 01 10:21:14 crc kubenswrapper[4958]: I1201 10:21:14.434270 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f","Type":"ContainerStarted","Data":"9bd9d65cd1a5ac8d4faaa84e09d80739d7830091d82598bf1dba2eee246b3ce6"}
Dec 01 10:21:14 crc kubenswrapper[4958]: I1201 10:21:14.458394 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-d9qck" podStartSLOduration=8.933059182000001 podStartE2EDuration="56.458346307s" podCreationTimestamp="2025-12-01 10:20:18 +0000 UTC" firstStartedPulling="2025-12-01 10:20:26.024121726 +0000 UTC m=+1273.532910763" lastFinishedPulling="2025-12-01 10:21:13.549408851 +0000 UTC m=+1321.058197888" observedRunningTime="2025-12-01 10:21:14.452410175 +0000 UTC m=+1321.961199212" watchObservedRunningTime="2025-12-01 10:21:14.458346307 +0000 UTC m=+1321.967135344"
Dec 01 10:21:14 crc kubenswrapper[4958]: I1201 10:21:14.472952 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=5.110665049 podStartE2EDuration="1m0.472814644s" podCreationTimestamp="2025-12-01 10:20:14 +0000 UTC" firstStartedPulling="2025-12-01 10:20:17.9126728 +0000 UTC m=+1265.421461837" lastFinishedPulling="2025-12-01 10:21:13.274822395 +0000 UTC m=+1320.783611432" observedRunningTime="2025-12-01 10:21:14.471503776 +0000 UTC m=+1321.980292823" watchObservedRunningTime="2025-12-01 10:21:14.472814644 +0000 UTC m=+1321.981603681"
Dec 01 10:21:14 crc kubenswrapper[4958]: I1201 10:21:14.517733 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.266472 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-l7vhb"]
Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.267288 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb" podUID="c2ec4703-353d-4ffa-8bc5-1969c62a4299" containerName="dnsmasq-dns" containerID="cri-o://392456488b31ff9738b61605fcb06ffdb84ca20ae7953ccf8652801347bc5cf1" gracePeriod=10
Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.281148 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb"
Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.315935 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-qs6kc"]
Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.317988 4958 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.346899 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qs6kc"] Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.405760 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-qs6kc\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.405819 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-config\") pod \"dnsmasq-dns-698758b865-qs6kc\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.405877 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-dns-svc\") pod \"dnsmasq-dns-698758b865-qs6kc\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.405915 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g625d\" (UniqueName: \"kubernetes.io/projected/ee13749a-e25a-439d-8da4-151757afbe82-kube-api-access-g625d\") pod \"dnsmasq-dns-698758b865-qs6kc\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.405944 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-qs6kc\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.445827 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f","Type":"ContainerStarted","Data":"14dbc21dd417bde9c33ac37159b79364136213cf4be75108690ff918465fc7cc"} Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.445944 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f","Type":"ContainerStarted","Data":"4ae78c274c3294e9773e57bf2eec9477b3e77f5156d4259b5f42c1b6a1c6dd6a"} Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.447208 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.452335 4958 generic.go:334] "Generic (PLEG): container finished" podID="c2ec4703-353d-4ffa-8bc5-1969c62a4299" containerID="392456488b31ff9738b61605fcb06ffdb84ca20ae7953ccf8652801347bc5cf1" exitCode=0 Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.452996 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb" 
event={"ID":"c2ec4703-353d-4ffa-8bc5-1969c62a4299","Type":"ContainerDied","Data":"392456488b31ff9738b61605fcb06ffdb84ca20ae7953ccf8652801347bc5cf1"} Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.506015 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.41726834 podStartE2EDuration="3.505975954s" podCreationTimestamp="2025-12-01 10:21:12 +0000 UTC" firstStartedPulling="2025-12-01 10:21:13.450926009 +0000 UTC m=+1320.959715046" lastFinishedPulling="2025-12-01 10:21:14.539633623 +0000 UTC m=+1322.048422660" observedRunningTime="2025-12-01 10:21:15.490558419 +0000 UTC m=+1322.999347446" watchObservedRunningTime="2025-12-01 10:21:15.505975954 +0000 UTC m=+1323.014764991" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.510327 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g625d\" (UniqueName: \"kubernetes.io/projected/ee13749a-e25a-439d-8da4-151757afbe82-kube-api-access-g625d\") pod \"dnsmasq-dns-698758b865-qs6kc\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.510389 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-qs6kc\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.510539 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-qs6kc\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.510592 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-config\") pod \"dnsmasq-dns-698758b865-qs6kc\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.510748 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-dns-svc\") pod \"dnsmasq-dns-698758b865-qs6kc\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.514440 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-dns-svc\") pod \"dnsmasq-dns-698758b865-qs6kc\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.515826 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-qs6kc\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.516037 4958 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-config\") pod \"dnsmasq-dns-698758b865-qs6kc\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.516915 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-qs6kc\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.551559 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g625d\" (UniqueName: \"kubernetes.io/projected/ee13749a-e25a-439d-8da4-151757afbe82-kube-api-access-g625d\") pod \"dnsmasq-dns-698758b865-qs6kc\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.727226 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:15 crc kubenswrapper[4958]: I1201 10:21:15.875689 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.032927 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2ec4703-353d-4ffa-8bc5-1969c62a4299-dns-svc\") pod \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\" (UID: \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\") " Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.033046 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2ec4703-353d-4ffa-8bc5-1969c62a4299-ovsdbserver-nb\") pod \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\" (UID: \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\") " Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.033273 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ec4703-353d-4ffa-8bc5-1969c62a4299-config\") pod \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\" (UID: \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\") " Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.033328 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k255w\" (UniqueName: \"kubernetes.io/projected/c2ec4703-353d-4ffa-8bc5-1969c62a4299-kube-api-access-k255w\") pod \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\" (UID: \"c2ec4703-353d-4ffa-8bc5-1969c62a4299\") " Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.075172 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ec4703-353d-4ffa-8bc5-1969c62a4299-kube-api-access-k255w" (OuterVolumeSpecName: "kube-api-access-k255w") pod "c2ec4703-353d-4ffa-8bc5-1969c62a4299" (UID: "c2ec4703-353d-4ffa-8bc5-1969c62a4299"). InnerVolumeSpecName "kube-api-access-k255w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.091709 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ec4703-353d-4ffa-8bc5-1969c62a4299-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c2ec4703-353d-4ffa-8bc5-1969c62a4299" (UID: "c2ec4703-353d-4ffa-8bc5-1969c62a4299"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.098547 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ec4703-353d-4ffa-8bc5-1969c62a4299-config" (OuterVolumeSpecName: "config") pod "c2ec4703-353d-4ffa-8bc5-1969c62a4299" (UID: "c2ec4703-353d-4ffa-8bc5-1969c62a4299"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.136456 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ec4703-353d-4ffa-8bc5-1969c62a4299-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.136495 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k255w\" (UniqueName: \"kubernetes.io/projected/c2ec4703-353d-4ffa-8bc5-1969c62a4299-kube-api-access-k255w\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.136517 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2ec4703-353d-4ffa-8bc5-1969c62a4299-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.139028 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ec4703-353d-4ffa-8bc5-1969c62a4299-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c2ec4703-353d-4ffa-8bc5-1969c62a4299" (UID: "c2ec4703-353d-4ffa-8bc5-1969c62a4299"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.239631 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2ec4703-353d-4ffa-8bc5-1969c62a4299-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.318736 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qs6kc"] Dec 01 10:21:16 crc kubenswrapper[4958]: W1201 10:21:16.328784 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee13749a_e25a_439d_8da4_151757afbe82.slice/crio-28be34d84a8be3a62dd54c32b8ccebe657b144ad714d6da0ab68968d48421a24 WatchSource:0}: Error finding container 28be34d84a8be3a62dd54c32b8ccebe657b144ad714d6da0ab68968d48421a24: Status 404 returned error can't find the container with id 28be34d84a8be3a62dd54c32b8ccebe657b144ad714d6da0ab68968d48421a24 Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.340071 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.414138 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 01 10:21:16 crc kubenswrapper[4958]: E1201 10:21:16.443914 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ec4703-353d-4ffa-8bc5-1969c62a4299" containerName="dnsmasq-dns" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.443968 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ec4703-353d-4ffa-8bc5-1969c62a4299" containerName="dnsmasq-dns" Dec 01 10:21:16 crc kubenswrapper[4958]: E1201 10:21:16.443981 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ec4703-353d-4ffa-8bc5-1969c62a4299" containerName="init" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.443987 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ec4703-353d-4ffa-8bc5-1969c62a4299" containerName="init" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.444326 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2ec4703-353d-4ffa-8bc5-1969c62a4299" containerName="dnsmasq-dns" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.451089 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.454146 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.454568 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-26jvm" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.461026 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.478024 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.479523 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb" event={"ID":"c2ec4703-353d-4ffa-8bc5-1969c62a4299","Type":"ContainerDied","Data":"1775cef57d3dec0d82f8062f1da80d13134519f0d96eb6e24b2289e9c7eafd65"} Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.479623 4958 scope.go:117] "RemoveContainer" containerID="392456488b31ff9738b61605fcb06ffdb84ca20ae7953ccf8652801347bc5cf1" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.479798 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-l7vhb" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.490389 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.502411 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qs6kc" event={"ID":"ee13749a-e25a-439d-8da4-151757afbe82","Type":"ContainerStarted","Data":"28be34d84a8be3a62dd54c32b8ccebe657b144ad714d6da0ab68968d48421a24"} Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.548546 4958 scope.go:117] "RemoveContainer" containerID="a9abc6790ca7f6da6349c5c862d9f2d49bda9315cc349e55217a2508a621612f" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.549367 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.549682 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/13dcdb95-560c-4cef-90d8-5716e9bccf57-cache\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.549713 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn54v\" (UniqueName: \"kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-kube-api-access-fn54v\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.549744 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/13dcdb95-560c-4cef-90d8-5716e9bccf57-lock\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 
10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.549772 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.560417 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-l7vhb"] Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.569990 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-l7vhb"] Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.652052 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/13dcdb95-560c-4cef-90d8-5716e9bccf57-cache\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.652463 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn54v\" (UniqueName: \"kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-kube-api-access-fn54v\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.652495 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/13dcdb95-560c-4cef-90d8-5716e9bccf57-lock\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.652522 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.652701 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:16 crc kubenswrapper[4958]: E1201 10:21:16.652979 4958 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.652990 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/13dcdb95-560c-4cef-90d8-5716e9bccf57-cache\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:16 crc kubenswrapper[4958]: E1201 10:21:16.653003 4958 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 10:21:16 crc kubenswrapper[4958]: E1201 10:21:16.653209 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift podName:13dcdb95-560c-4cef-90d8-5716e9bccf57 nodeName:}" failed. No retries permitted until 2025-12-01 10:21:17.153134686 +0000 UTC m=+1324.661923723 (durationBeforeRetry 500ms). 
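Note: the repeated 'configmap "swift-ring-files" not found' errors above are an ordering gap, not a mount bug: the projected etc-swift volume for swift-storage-0 references ConfigMap openstack/swift-ring-files, which does not exist yet; it is presumably published once the swift-ring-rebalance job created at 10:21:17 has built the rings. Below is a minimal client-go sketch of the dependency check that keeps failing; the helper name and the fixed one-second poll are illustrative (kubelet backs off instead), not kubelet's actual code.

package main

import (
	"context"
	"fmt"
	"time"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// waitForConfigMap polls until the named ConfigMap exists, mirroring the
// check behind the 'configmap "swift-ring-files" not found' retries above.
func waitForConfigMap(ctx context.Context, cs kubernetes.Interface, ns, name string) error {
	for {
		_, err := cs.CoreV1().ConfigMaps(ns).Get(ctx, name, metav1.GetOptions{})
		if err == nil {
			return nil // ConfigMap exists; the projected volume can be built
		}
		if !apierrors.IsNotFound(err) {
			return err // a real API error, not just "not created yet"
		}
		select {
		case <-ctx.Done():
			return ctx.Err()
		case <-time.After(time.Second): // illustrative fixed poll
		}
	}
}

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	if err := waitForConfigMap(context.Background(), cs, "openstack", "swift-ring-files"); err != nil {
		panic(err)
	}
	fmt.Println("swift-ring-files present; swift-storage-0 can mount etc-swift")
}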
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift") pod "swift-storage-0" (UID: "13dcdb95-560c-4cef-90d8-5716e9bccf57") : configmap "swift-ring-files" not found Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.653630 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/13dcdb95-560c-4cef-90d8-5716e9bccf57-lock\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.654105 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.680975 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:16 crc kubenswrapper[4958]: I1201 10:21:16.684212 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn54v\" (UniqueName: \"kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-kube-api-access-fn54v\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.034134 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-jkdpw"] Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.036289 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.039503 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.039738 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.042678 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.049873 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jkdpw"] Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.170684 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-etc-swift\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.170791 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-combined-ca-bundle\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.170931 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.170997 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk44r\" (UniqueName: \"kubernetes.io/projected/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-kube-api-access-jk44r\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.171034 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-dispersionconf\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.171137 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-scripts\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.171177 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-swiftconf\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.171219 
4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-ring-data-devices\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: E1201 10:21:17.171362 4958 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 10:21:17 crc kubenswrapper[4958]: E1201 10:21:17.171429 4958 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 10:21:17 crc kubenswrapper[4958]: E1201 10:21:17.171537 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift podName:13dcdb95-560c-4cef-90d8-5716e9bccf57 nodeName:}" failed. No retries permitted until 2025-12-01 10:21:18.171499128 +0000 UTC m=+1325.680288185 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift") pod "swift-storage-0" (UID: "13dcdb95-560c-4cef-90d8-5716e9bccf57") : configmap "swift-ring-files" not found Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.272753 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-dispersionconf\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.272891 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-scripts\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.272920 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-swiftconf\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.272946 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-ring-data-devices\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.272982 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-etc-swift\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.273016 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-combined-ca-bundle\") pod \"swift-ring-rebalance-jkdpw\" (UID: 
\"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.273076 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk44r\" (UniqueName: \"kubernetes.io/projected/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-kube-api-access-jk44r\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.274161 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-etc-swift\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.274489 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-scripts\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.274624 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-ring-data-devices\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.278549 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-swiftconf\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.279338 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-dispersionconf\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.280574 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-combined-ca-bundle\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.293056 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk44r\" (UniqueName: \"kubernetes.io/projected/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-kube-api-access-jk44r\") pod \"swift-ring-rebalance-jkdpw\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.355367 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.571574 4958 generic.go:334] "Generic (PLEG): container finished" podID="ee13749a-e25a-439d-8da4-151757afbe82" containerID="1780ad2c68c6bed13e8f4a981f509569f90328850865c3a3d0e0ac2efc955cd1" exitCode=0 Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.573745 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qs6kc" event={"ID":"ee13749a-e25a-439d-8da4-151757afbe82","Type":"ContainerDied","Data":"1780ad2c68c6bed13e8f4a981f509569f90328850865c3a3d0e0ac2efc955cd1"} Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.828504 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2ec4703-353d-4ffa-8bc5-1969c62a4299" path="/var/lib/kubelet/pods/c2ec4703-353d-4ffa-8bc5-1969c62a4299/volumes" Dec 01 10:21:17 crc kubenswrapper[4958]: I1201 10:21:17.982400 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jkdpw"] Dec 01 10:21:17 crc kubenswrapper[4958]: W1201 10:21:17.984311 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6404e65b_16c6_4bd0_a4fe_26ad44d722c6.slice/crio-4514be35a35e7bac11bf9f97d94ffa13f629236d3c044fb38935dafdda4d53b6 WatchSource:0}: Error finding container 4514be35a35e7bac11bf9f97d94ffa13f629236d3c044fb38935dafdda4d53b6: Status 404 returned error can't find the container with id 4514be35a35e7bac11bf9f97d94ffa13f629236d3c044fb38935dafdda4d53b6 Dec 01 10:21:18 crc kubenswrapper[4958]: I1201 10:21:18.192090 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:18 crc kubenswrapper[4958]: E1201 10:21:18.192396 4958 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 10:21:18 crc kubenswrapper[4958]: E1201 10:21:18.192442 4958 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 10:21:18 crc kubenswrapper[4958]: E1201 10:21:18.192525 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift podName:13dcdb95-560c-4cef-90d8-5716e9bccf57 nodeName:}" failed. No retries permitted until 2025-12-01 10:21:20.192499108 +0000 UTC m=+1327.701288145 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift") pod "swift-storage-0" (UID: "13dcdb95-560c-4cef-90d8-5716e9bccf57") : configmap "swift-ring-files" not found Dec 01 10:21:18 crc kubenswrapper[4958]: I1201 10:21:18.581095 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jkdpw" event={"ID":"6404e65b-16c6-4bd0-a4fe-26ad44d722c6","Type":"ContainerStarted","Data":"4514be35a35e7bac11bf9f97d94ffa13f629236d3c044fb38935dafdda4d53b6"} Dec 01 10:21:18 crc kubenswrapper[4958]: I1201 10:21:18.583482 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qs6kc" event={"ID":"ee13749a-e25a-439d-8da4-151757afbe82","Type":"ContainerStarted","Data":"c77aa12b5c87710bd0d9b31aec69c271fb2357ed323a268969ec00abc58208e7"} Dec 01 10:21:18 crc kubenswrapper[4958]: I1201 10:21:18.583790 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:18 crc kubenswrapper[4958]: I1201 10:21:18.608464 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-qs6kc" podStartSLOduration=3.608436624 podStartE2EDuration="3.608436624s" podCreationTimestamp="2025-12-01 10:21:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:21:18.605105797 +0000 UTC m=+1326.113894834" watchObservedRunningTime="2025-12-01 10:21:18.608436624 +0000 UTC m=+1326.117225661" Dec 01 10:21:18 crc kubenswrapper[4958]: I1201 10:21:18.766250 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-9ccsx"] Dec 01 10:21:18 crc kubenswrapper[4958]: I1201 10:21:18.767756 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9ccsx" Dec 01 10:21:18 crc kubenswrapper[4958]: I1201 10:21:18.793020 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9ccsx"] Dec 01 10:21:18 crc kubenswrapper[4958]: I1201 10:21:18.911498 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ccsr\" (UniqueName: \"kubernetes.io/projected/dc87cf0c-982e-44d5-adc8-2e9827faa501-kube-api-access-4ccsr\") pod \"glance-db-create-9ccsx\" (UID: \"dc87cf0c-982e-44d5-adc8-2e9827faa501\") " pod="openstack/glance-db-create-9ccsx" Dec 01 10:21:19 crc kubenswrapper[4958]: I1201 10:21:19.013413 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ccsr\" (UniqueName: \"kubernetes.io/projected/dc87cf0c-982e-44d5-adc8-2e9827faa501-kube-api-access-4ccsr\") pod \"glance-db-create-9ccsx\" (UID: \"dc87cf0c-982e-44d5-adc8-2e9827faa501\") " pod="openstack/glance-db-create-9ccsx" Dec 01 10:21:19 crc kubenswrapper[4958]: I1201 10:21:19.045036 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ccsr\" (UniqueName: \"kubernetes.io/projected/dc87cf0c-982e-44d5-adc8-2e9827faa501-kube-api-access-4ccsr\") pod \"glance-db-create-9ccsx\" (UID: \"dc87cf0c-982e-44d5-adc8-2e9827faa501\") " pod="openstack/glance-db-create-9ccsx" Dec 01 10:21:19 crc kubenswrapper[4958]: I1201 10:21:19.094260 4958 util.go:30] "No sandbox for pod can be found. 
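Note: in the dnsmasq-dns-698758b865-qs6kc latency entry above, firstStartedPulling and lastFinishedPulling are "0001-01-01 00:00:00 +0000 UTC", which is Go's zero time.Time: no image pull was recorded (the image was already cached), so podStartSLOduration equals podStartE2EDuration exactly. A two-line illustration:

package main

import (
	"fmt"
	"time"
)

func main() {
	var never time.Time        // zero value: no pull was ever recorded
	fmt.Println(never)         // 0001-01-01 00:00:00 +0000 UTC
	fmt.Println(never.IsZero()) // true
}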
Need to start a new one" pod="openstack/glance-db-create-9ccsx" Dec 01 10:21:19 crc kubenswrapper[4958]: I1201 10:21:19.696805 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9ccsx"] Dec 01 10:21:20 crc kubenswrapper[4958]: I1201 10:21:20.263347 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:20 crc kubenswrapper[4958]: E1201 10:21:20.263723 4958 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 10:21:20 crc kubenswrapper[4958]: E1201 10:21:20.263982 4958 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 10:21:20 crc kubenswrapper[4958]: E1201 10:21:20.264044 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift podName:13dcdb95-560c-4cef-90d8-5716e9bccf57 nodeName:}" failed. No retries permitted until 2025-12-01 10:21:24.264024079 +0000 UTC m=+1331.772813126 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift") pod "swift-storage-0" (UID: "13dcdb95-560c-4cef-90d8-5716e9bccf57") : configmap "swift-ring-files" not found Dec 01 10:21:22 crc kubenswrapper[4958]: I1201 10:21:22.573925 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-sx6jg"] Dec 01 10:21:22 crc kubenswrapper[4958]: I1201 10:21:22.575794 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sx6jg" Dec 01 10:21:22 crc kubenswrapper[4958]: I1201 10:21:22.648254 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-sx6jg"] Dec 01 10:21:22 crc kubenswrapper[4958]: I1201 10:21:22.738317 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmxql\" (UniqueName: \"kubernetes.io/projected/311e4fde-9406-4820-bb28-3988a095f565-kube-api-access-kmxql\") pod \"keystone-db-create-sx6jg\" (UID: \"311e4fde-9406-4820-bb28-3988a095f565\") " pod="openstack/keystone-db-create-sx6jg" Dec 01 10:21:22 crc kubenswrapper[4958]: I1201 10:21:22.841925 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmxql\" (UniqueName: \"kubernetes.io/projected/311e4fde-9406-4820-bb28-3988a095f565-kube-api-access-kmxql\") pod \"keystone-db-create-sx6jg\" (UID: \"311e4fde-9406-4820-bb28-3988a095f565\") " pod="openstack/keystone-db-create-sx6jg" Dec 01 10:21:22 crc kubenswrapper[4958]: I1201 10:21:22.864828 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmxql\" (UniqueName: \"kubernetes.io/projected/311e4fde-9406-4820-bb28-3988a095f565-kube-api-access-kmxql\") pod \"keystone-db-create-sx6jg\" (UID: \"311e4fde-9406-4820-bb28-3988a095f565\") " pod="openstack/keystone-db-create-sx6jg" Dec 01 10:21:22 crc kubenswrapper[4958]: I1201 10:21:22.950424 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-sx6jg" Dec 01 10:21:23 crc kubenswrapper[4958]: I1201 10:21:23.150029 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-d7bmj"] Dec 01 10:21:23 crc kubenswrapper[4958]: I1201 10:21:23.151773 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-d7bmj" Dec 01 10:21:23 crc kubenswrapper[4958]: I1201 10:21:23.167694 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-d7bmj"] Dec 01 10:21:23 crc kubenswrapper[4958]: I1201 10:21:23.250945 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvzc2\" (UniqueName: \"kubernetes.io/projected/f31601d4-2e94-4471-9216-e0769fddfb89-kube-api-access-xvzc2\") pod \"placement-db-create-d7bmj\" (UID: \"f31601d4-2e94-4471-9216-e0769fddfb89\") " pod="openstack/placement-db-create-d7bmj" Dec 01 10:21:23 crc kubenswrapper[4958]: I1201 10:21:23.352419 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvzc2\" (UniqueName: \"kubernetes.io/projected/f31601d4-2e94-4471-9216-e0769fddfb89-kube-api-access-xvzc2\") pod \"placement-db-create-d7bmj\" (UID: \"f31601d4-2e94-4471-9216-e0769fddfb89\") " pod="openstack/placement-db-create-d7bmj" Dec 01 10:21:23 crc kubenswrapper[4958]: I1201 10:21:23.374145 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvzc2\" (UniqueName: \"kubernetes.io/projected/f31601d4-2e94-4471-9216-e0769fddfb89-kube-api-access-xvzc2\") pod \"placement-db-create-d7bmj\" (UID: \"f31601d4-2e94-4471-9216-e0769fddfb89\") " pod="openstack/placement-db-create-d7bmj" Dec 01 10:21:23 crc kubenswrapper[4958]: I1201 10:21:23.482116 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-d7bmj" Dec 01 10:21:24 crc kubenswrapper[4958]: I1201 10:21:24.298364 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:24 crc kubenswrapper[4958]: E1201 10:21:24.298770 4958 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 10:21:24 crc kubenswrapper[4958]: E1201 10:21:24.299021 4958 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 10:21:24 crc kubenswrapper[4958]: E1201 10:21:24.299126 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift podName:13dcdb95-560c-4cef-90d8-5716e9bccf57 nodeName:}" failed. No retries permitted until 2025-12-01 10:21:32.299091596 +0000 UTC m=+1339.807880643 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift") pod "swift-storage-0" (UID: "13dcdb95-560c-4cef-90d8-5716e9bccf57") : configmap "swift-ring-files" not found Dec 01 10:21:24 crc kubenswrapper[4958]: W1201 10:21:24.448760 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc87cf0c_982e_44d5_adc8_2e9827faa501.slice/crio-3710de40f9727ad6fbf8b27eeee199b935428b4dd09c226c64e34477f45335f5 WatchSource:0}: Error finding container 3710de40f9727ad6fbf8b27eeee199b935428b4dd09c226c64e34477f45335f5: Status 404 returned error can't find the container with id 3710de40f9727ad6fbf8b27eeee199b935428b4dd09c226c64e34477f45335f5 Dec 01 10:21:24 crc kubenswrapper[4958]: I1201 10:21:24.667538 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9ccsx" event={"ID":"dc87cf0c-982e-44d5-adc8-2e9827faa501","Type":"ContainerStarted","Data":"3710de40f9727ad6fbf8b27eeee199b935428b4dd09c226c64e34477f45335f5"} Dec 01 10:21:24 crc kubenswrapper[4958]: W1201 10:21:24.994388 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod311e4fde_9406_4820_bb28_3988a095f565.slice/crio-ade88677ec9aef18b45fd559b56e4aa7aac7eb4d2e3c6ff12d03be811ab581fd WatchSource:0}: Error finding container ade88677ec9aef18b45fd559b56e4aa7aac7eb4d2e3c6ff12d03be811ab581fd: Status 404 returned error can't find the container with id ade88677ec9aef18b45fd559b56e4aa7aac7eb4d2e3c6ff12d03be811ab581fd Dec 01 10:21:24 crc kubenswrapper[4958]: I1201 10:21:24.996769 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-sx6jg"] Dec 01 10:21:25 crc kubenswrapper[4958]: I1201 10:21:25.150999 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-d7bmj"] Dec 01 10:21:25 crc kubenswrapper[4958]: W1201 10:21:25.159410 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf31601d4_2e94_4471_9216_e0769fddfb89.slice/crio-b172dc5aa0c7ab4c12d1d6548aec11de678da827e0a3fd28963d1b45ef48132d WatchSource:0}: Error finding container b172dc5aa0c7ab4c12d1d6548aec11de678da827e0a3fd28963d1b45ef48132d: Status 404 returned error can't find the container with id b172dc5aa0c7ab4c12d1d6548aec11de678da827e0a3fd28963d1b45ef48132d Dec 01 10:21:25 crc kubenswrapper[4958]: I1201 10:21:25.686295 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jkdpw" event={"ID":"6404e65b-16c6-4bd0-a4fe-26ad44d722c6","Type":"ContainerStarted","Data":"1398aa8836f87ff8e2050f7a48826330e2d07470c0e7bec124b310a45ec5904e"} Dec 01 10:21:25 crc kubenswrapper[4958]: I1201 10:21:25.689507 4958 generic.go:334] "Generic (PLEG): container finished" podID="dc87cf0c-982e-44d5-adc8-2e9827faa501" containerID="c5e27fb4d7e81848fd212dbd23876fccd05db89724a17e1dceaecd9899ecd408" exitCode=0 Dec 01 10:21:25 crc kubenswrapper[4958]: I1201 10:21:25.689578 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9ccsx" event={"ID":"dc87cf0c-982e-44d5-adc8-2e9827faa501","Type":"ContainerDied","Data":"c5e27fb4d7e81848fd212dbd23876fccd05db89724a17e1dceaecd9899ecd408"} Dec 01 10:21:25 crc kubenswrapper[4958]: I1201 10:21:25.692888 4958 generic.go:334] "Generic (PLEG): container finished" podID="311e4fde-9406-4820-bb28-3988a095f565" 
containerID="deb1c57b48cdb68fd86978602ee1e6c2e55a1fe310735f3a32fa71f717d847f5" exitCode=0 Dec 01 10:21:25 crc kubenswrapper[4958]: I1201 10:21:25.693035 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sx6jg" event={"ID":"311e4fde-9406-4820-bb28-3988a095f565","Type":"ContainerDied","Data":"deb1c57b48cdb68fd86978602ee1e6c2e55a1fe310735f3a32fa71f717d847f5"} Dec 01 10:21:25 crc kubenswrapper[4958]: I1201 10:21:25.693075 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sx6jg" event={"ID":"311e4fde-9406-4820-bb28-3988a095f565","Type":"ContainerStarted","Data":"ade88677ec9aef18b45fd559b56e4aa7aac7eb4d2e3c6ff12d03be811ab581fd"} Dec 01 10:21:25 crc kubenswrapper[4958]: I1201 10:21:25.696464 4958 generic.go:334] "Generic (PLEG): container finished" podID="f31601d4-2e94-4471-9216-e0769fddfb89" containerID="ca54c1a38a9a153d513651af777d3c608aa78a98d64d4cff903ae8750c918ff9" exitCode=0 Dec 01 10:21:25 crc kubenswrapper[4958]: I1201 10:21:25.696504 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-d7bmj" event={"ID":"f31601d4-2e94-4471-9216-e0769fddfb89","Type":"ContainerDied","Data":"ca54c1a38a9a153d513651af777d3c608aa78a98d64d4cff903ae8750c918ff9"} Dec 01 10:21:25 crc kubenswrapper[4958]: I1201 10:21:25.696603 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-d7bmj" event={"ID":"f31601d4-2e94-4471-9216-e0769fddfb89","Type":"ContainerStarted","Data":"b172dc5aa0c7ab4c12d1d6548aec11de678da827e0a3fd28963d1b45ef48132d"} Dec 01 10:21:25 crc kubenswrapper[4958]: I1201 10:21:25.730351 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:21:25 crc kubenswrapper[4958]: I1201 10:21:25.748350 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-jkdpw" podStartSLOduration=2.204247051 podStartE2EDuration="8.748316927s" podCreationTimestamp="2025-12-01 10:21:17 +0000 UTC" firstStartedPulling="2025-12-01 10:21:17.986959995 +0000 UTC m=+1325.495749032" lastFinishedPulling="2025-12-01 10:21:24.531029861 +0000 UTC m=+1332.039818908" observedRunningTime="2025-12-01 10:21:25.712727889 +0000 UTC m=+1333.221516946" watchObservedRunningTime="2025-12-01 10:21:25.748316927 +0000 UTC m=+1333.257105964" Dec 01 10:21:25 crc kubenswrapper[4958]: I1201 10:21:25.873002 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zlpt2"] Dec 01 10:21:25 crc kubenswrapper[4958]: I1201 10:21:25.873636 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2" podUID="92f066d8-4bfd-4cf8-acfb-ec74eea4d711" containerName="dnsmasq-dns" containerID="cri-o://e8c7aebdc268c0b777b9bca63baa44d4ee305059a3c0a42260f0ed48f251c544" gracePeriod=10 Dec 01 10:21:25 crc kubenswrapper[4958]: I1201 10:21:25.946967 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.426908 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2" Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.551731 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-dns-svc\") pod \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.551782 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-config\") pod \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.551847 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8cmn\" (UniqueName: \"kubernetes.io/projected/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-kube-api-access-n8cmn\") pod \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.552095 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-ovsdbserver-sb\") pod \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.552122 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-ovsdbserver-nb\") pod \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\" (UID: \"92f066d8-4bfd-4cf8-acfb-ec74eea4d711\") " Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.560516 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-kube-api-access-n8cmn" (OuterVolumeSpecName: "kube-api-access-n8cmn") pod "92f066d8-4bfd-4cf8-acfb-ec74eea4d711" (UID: "92f066d8-4bfd-4cf8-acfb-ec74eea4d711"). InnerVolumeSpecName "kube-api-access-n8cmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.602035 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "92f066d8-4bfd-4cf8-acfb-ec74eea4d711" (UID: "92f066d8-4bfd-4cf8-acfb-ec74eea4d711"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.606305 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "92f066d8-4bfd-4cf8-acfb-ec74eea4d711" (UID: "92f066d8-4bfd-4cf8-acfb-ec74eea4d711"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.609988 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-config" (OuterVolumeSpecName: "config") pod "92f066d8-4bfd-4cf8-acfb-ec74eea4d711" (UID: "92f066d8-4bfd-4cf8-acfb-ec74eea4d711"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.613369 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "92f066d8-4bfd-4cf8-acfb-ec74eea4d711" (UID: "92f066d8-4bfd-4cf8-acfb-ec74eea4d711"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.654213 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.654443 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.654525 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.654588 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.654686 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8cmn\" (UniqueName: \"kubernetes.io/projected/92f066d8-4bfd-4cf8-acfb-ec74eea4d711-kube-api-access-n8cmn\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.708066 4958 generic.go:334] "Generic (PLEG): container finished" podID="92f066d8-4bfd-4cf8-acfb-ec74eea4d711" containerID="e8c7aebdc268c0b777b9bca63baa44d4ee305059a3c0a42260f0ed48f251c544" exitCode=0 Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.708111 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2" event={"ID":"92f066d8-4bfd-4cf8-acfb-ec74eea4d711","Type":"ContainerDied","Data":"e8c7aebdc268c0b777b9bca63baa44d4ee305059a3c0a42260f0ed48f251c544"} Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.708176 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2" event={"ID":"92f066d8-4bfd-4cf8-acfb-ec74eea4d711","Type":"ContainerDied","Data":"de5773a9add3b1ad44fcc5a45521af4a9b7c121e1e1b95203a43285622b6a807"} Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.708198 4958 scope.go:117] "RemoveContainer" containerID="e8c7aebdc268c0b777b9bca63baa44d4ee305059a3c0a42260f0ed48f251c544" Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.709791 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2" Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.745865 4958 scope.go:117] "RemoveContainer" containerID="b10444e917805d8fe8442d87ea4e187d6abeab662bb369fd4f542afe9f9c9838" Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.770711 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zlpt2"] Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.781390 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zlpt2"] Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.784016 4958 scope.go:117] "RemoveContainer" containerID="e8c7aebdc268c0b777b9bca63baa44d4ee305059a3c0a42260f0ed48f251c544" Dec 01 10:21:26 crc kubenswrapper[4958]: E1201 10:21:26.785090 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8c7aebdc268c0b777b9bca63baa44d4ee305059a3c0a42260f0ed48f251c544\": container with ID starting with e8c7aebdc268c0b777b9bca63baa44d4ee305059a3c0a42260f0ed48f251c544 not found: ID does not exist" containerID="e8c7aebdc268c0b777b9bca63baa44d4ee305059a3c0a42260f0ed48f251c544" Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.785205 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8c7aebdc268c0b777b9bca63baa44d4ee305059a3c0a42260f0ed48f251c544"} err="failed to get container status \"e8c7aebdc268c0b777b9bca63baa44d4ee305059a3c0a42260f0ed48f251c544\": rpc error: code = NotFound desc = could not find container \"e8c7aebdc268c0b777b9bca63baa44d4ee305059a3c0a42260f0ed48f251c544\": container with ID starting with e8c7aebdc268c0b777b9bca63baa44d4ee305059a3c0a42260f0ed48f251c544 not found: ID does not exist" Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.785632 4958 scope.go:117] "RemoveContainer" containerID="b10444e917805d8fe8442d87ea4e187d6abeab662bb369fd4f542afe9f9c9838" Dec 01 10:21:26 crc kubenswrapper[4958]: E1201 10:21:26.787624 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b10444e917805d8fe8442d87ea4e187d6abeab662bb369fd4f542afe9f9c9838\": container with ID starting with b10444e917805d8fe8442d87ea4e187d6abeab662bb369fd4f542afe9f9c9838 not found: ID does not exist" containerID="b10444e917805d8fe8442d87ea4e187d6abeab662bb369fd4f542afe9f9c9838" Dec 01 10:21:26 crc kubenswrapper[4958]: I1201 10:21:26.787669 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b10444e917805d8fe8442d87ea4e187d6abeab662bb369fd4f542afe9f9c9838"} err="failed to get container status \"b10444e917805d8fe8442d87ea4e187d6abeab662bb369fd4f542afe9f9c9838\": rpc error: code = NotFound desc = could not find container \"b10444e917805d8fe8442d87ea4e187d6abeab662bb369fd4f542afe9f9c9838\": container with ID starting with b10444e917805d8fe8442d87ea4e187d6abeab662bb369fd4f542afe9f9c9838 not found: ID does not exist" Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.244054 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sx6jg" Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.251196 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-d7bmj" Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.256861 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9ccsx" Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.371069 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmxql\" (UniqueName: \"kubernetes.io/projected/311e4fde-9406-4820-bb28-3988a095f565-kube-api-access-kmxql\") pod \"311e4fde-9406-4820-bb28-3988a095f565\" (UID: \"311e4fde-9406-4820-bb28-3988a095f565\") " Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.371162 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvzc2\" (UniqueName: \"kubernetes.io/projected/f31601d4-2e94-4471-9216-e0769fddfb89-kube-api-access-xvzc2\") pod \"f31601d4-2e94-4471-9216-e0769fddfb89\" (UID: \"f31601d4-2e94-4471-9216-e0769fddfb89\") " Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.371370 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ccsr\" (UniqueName: \"kubernetes.io/projected/dc87cf0c-982e-44d5-adc8-2e9827faa501-kube-api-access-4ccsr\") pod \"dc87cf0c-982e-44d5-adc8-2e9827faa501\" (UID: \"dc87cf0c-982e-44d5-adc8-2e9827faa501\") " Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.375852 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/311e4fde-9406-4820-bb28-3988a095f565-kube-api-access-kmxql" (OuterVolumeSpecName: "kube-api-access-kmxql") pod "311e4fde-9406-4820-bb28-3988a095f565" (UID: "311e4fde-9406-4820-bb28-3988a095f565"). InnerVolumeSpecName "kube-api-access-kmxql". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.378301 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f31601d4-2e94-4471-9216-e0769fddfb89-kube-api-access-xvzc2" (OuterVolumeSpecName: "kube-api-access-xvzc2") pod "f31601d4-2e94-4471-9216-e0769fddfb89" (UID: "f31601d4-2e94-4471-9216-e0769fddfb89"). InnerVolumeSpecName "kube-api-access-xvzc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.378386 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc87cf0c-982e-44d5-adc8-2e9827faa501-kube-api-access-4ccsr" (OuterVolumeSpecName: "kube-api-access-4ccsr") pod "dc87cf0c-982e-44d5-adc8-2e9827faa501" (UID: "dc87cf0c-982e-44d5-adc8-2e9827faa501"). InnerVolumeSpecName "kube-api-access-4ccsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.473843 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ccsr\" (UniqueName: \"kubernetes.io/projected/dc87cf0c-982e-44d5-adc8-2e9827faa501-kube-api-access-4ccsr\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.473915 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmxql\" (UniqueName: \"kubernetes.io/projected/311e4fde-9406-4820-bb28-3988a095f565-kube-api-access-kmxql\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.473929 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvzc2\" (UniqueName: \"kubernetes.io/projected/f31601d4-2e94-4471-9216-e0769fddfb89-kube-api-access-xvzc2\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.717537 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9ccsx" event={"ID":"dc87cf0c-982e-44d5-adc8-2e9827faa501","Type":"ContainerDied","Data":"3710de40f9727ad6fbf8b27eeee199b935428b4dd09c226c64e34477f45335f5"} Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.717580 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9ccsx" Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.717615 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3710de40f9727ad6fbf8b27eeee199b935428b4dd09c226c64e34477f45335f5" Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.719329 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sx6jg" event={"ID":"311e4fde-9406-4820-bb28-3988a095f565","Type":"ContainerDied","Data":"ade88677ec9aef18b45fd559b56e4aa7aac7eb4d2e3c6ff12d03be811ab581fd"} Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.719390 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ade88677ec9aef18b45fd559b56e4aa7aac7eb4d2e3c6ff12d03be811ab581fd" Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.719394 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sx6jg" Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.720978 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-d7bmj" event={"ID":"f31601d4-2e94-4471-9216-e0769fddfb89","Type":"ContainerDied","Data":"b172dc5aa0c7ab4c12d1d6548aec11de678da827e0a3fd28963d1b45ef48132d"} Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.721015 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-d7bmj" Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.721011 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b172dc5aa0c7ab4c12d1d6548aec11de678da827e0a3fd28963d1b45ef48132d" Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.891884 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92f066d8-4bfd-4cf8-acfb-ec74eea4d711" path="/var/lib/kubelet/pods/92f066d8-4bfd-4cf8-acfb-ec74eea4d711/volumes" Dec 01 10:21:27 crc kubenswrapper[4958]: I1201 10:21:27.893493 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 01 10:21:28 crc kubenswrapper[4958]: I1201 10:21:28.210410 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:21:28 crc kubenswrapper[4958]: I1201 10:21:28.210519 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:21:31 crc kubenswrapper[4958]: I1201 10:21:31.339281 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-zlpt2" podUID="92f066d8-4bfd-4cf8-acfb-ec74eea4d711" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout" Dec 01 10:21:32 crc kubenswrapper[4958]: I1201 10:21:32.335311 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:32 crc kubenswrapper[4958]: E1201 10:21:32.335760 4958 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 10:21:32 crc kubenswrapper[4958]: E1201 10:21:32.335824 4958 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 10:21:32 crc kubenswrapper[4958]: E1201 10:21:32.335981 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift podName:13dcdb95-560c-4cef-90d8-5716e9bccf57 nodeName:}" failed. No retries permitted until 2025-12-01 10:21:48.335949191 +0000 UTC m=+1355.844738238 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift") pod "swift-storage-0" (UID: "13dcdb95-560c-4cef-90d8-5716e9bccf57") : configmap "swift-ring-files" not found Dec 01 10:21:32 crc kubenswrapper[4958]: I1201 10:21:32.858811 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-99a7-account-create-c9sr8"] Dec 01 10:21:32 crc kubenswrapper[4958]: E1201 10:21:32.859406 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31601d4-2e94-4471-9216-e0769fddfb89" containerName="mariadb-database-create" Dec 01 10:21:32 crc kubenswrapper[4958]: I1201 10:21:32.859431 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31601d4-2e94-4471-9216-e0769fddfb89" containerName="mariadb-database-create" Dec 01 10:21:32 crc kubenswrapper[4958]: E1201 10:21:32.859463 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc87cf0c-982e-44d5-adc8-2e9827faa501" containerName="mariadb-database-create" Dec 01 10:21:32 crc kubenswrapper[4958]: I1201 10:21:32.859472 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc87cf0c-982e-44d5-adc8-2e9827faa501" containerName="mariadb-database-create" Dec 01 10:21:32 crc kubenswrapper[4958]: E1201 10:21:32.859486 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="311e4fde-9406-4820-bb28-3988a095f565" containerName="mariadb-database-create" Dec 01 10:21:32 crc kubenswrapper[4958]: I1201 10:21:32.859495 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="311e4fde-9406-4820-bb28-3988a095f565" containerName="mariadb-database-create" Dec 01 10:21:32 crc kubenswrapper[4958]: E1201 10:21:32.859513 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92f066d8-4bfd-4cf8-acfb-ec74eea4d711" containerName="init" Dec 01 10:21:32 crc kubenswrapper[4958]: I1201 10:21:32.859520 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f066d8-4bfd-4cf8-acfb-ec74eea4d711" containerName="init" Dec 01 10:21:32 crc kubenswrapper[4958]: E1201 10:21:32.859530 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92f066d8-4bfd-4cf8-acfb-ec74eea4d711" containerName="dnsmasq-dns" Dec 01 10:21:32 crc kubenswrapper[4958]: I1201 10:21:32.859537 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f066d8-4bfd-4cf8-acfb-ec74eea4d711" containerName="dnsmasq-dns" Dec 01 10:21:32 crc kubenswrapper[4958]: I1201 10:21:32.859956 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="92f066d8-4bfd-4cf8-acfb-ec74eea4d711" containerName="dnsmasq-dns" Dec 01 10:21:32 crc kubenswrapper[4958]: I1201 10:21:32.859997 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc87cf0c-982e-44d5-adc8-2e9827faa501" containerName="mariadb-database-create" Dec 01 10:21:32 crc kubenswrapper[4958]: I1201 10:21:32.860012 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31601d4-2e94-4471-9216-e0769fddfb89" containerName="mariadb-database-create" Dec 01 10:21:32 crc kubenswrapper[4958]: I1201 10:21:32.860038 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="311e4fde-9406-4820-bb28-3988a095f565" containerName="mariadb-database-create" Dec 01 10:21:32 crc kubenswrapper[4958]: I1201 10:21:32.861090 4958 util.go:30] "No sandbox for pod can be found. 
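NOTE: the cpu_manager/state_mem/memory_manager triples above fire when a new pod is admitted: before computing assignments, the resource managers drop entries for containers whose pods are gone (the just-completed db-create jobs and the deleted dnsmasq pod). The E-level severity is cosmetic; the shape of the cleanup is just map hygiene (a sketch, names mine):

    type containerKey struct{ podUID, container string }

    // removeStaleState drops recorded CPU/memory assignments for containers
    // that are no longer in the admitted pod set.
    func removeStaleState(assignments map[containerKey]string, alive func(containerKey) bool) {
        for k := range assignments {
            if !alive(k) {
                delete(assignments, k) // "Deleted CPUSet assignment" / "RemoveStaleState removing state"
            }
        }
    }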
Need to start a new one" pod="openstack/keystone-99a7-account-create-c9sr8" Dec 01 10:21:32 crc kubenswrapper[4958]: I1201 10:21:32.865197 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 01 10:21:32 crc kubenswrapper[4958]: I1201 10:21:32.875563 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-99a7-account-create-c9sr8"] Dec 01 10:21:32 crc kubenswrapper[4958]: I1201 10:21:32.896564 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfc4d\" (UniqueName: \"kubernetes.io/projected/6d2854e0-7961-48f8-adb0-2118d1685a38-kube-api-access-kfc4d\") pod \"keystone-99a7-account-create-c9sr8\" (UID: \"6d2854e0-7961-48f8-adb0-2118d1685a38\") " pod="openstack/keystone-99a7-account-create-c9sr8" Dec 01 10:21:33 crc kubenswrapper[4958]: I1201 10:21:32.999786 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfc4d\" (UniqueName: \"kubernetes.io/projected/6d2854e0-7961-48f8-adb0-2118d1685a38-kube-api-access-kfc4d\") pod \"keystone-99a7-account-create-c9sr8\" (UID: \"6d2854e0-7961-48f8-adb0-2118d1685a38\") " pod="openstack/keystone-99a7-account-create-c9sr8" Dec 01 10:21:33 crc kubenswrapper[4958]: I1201 10:21:33.022319 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfc4d\" (UniqueName: \"kubernetes.io/projected/6d2854e0-7961-48f8-adb0-2118d1685a38-kube-api-access-kfc4d\") pod \"keystone-99a7-account-create-c9sr8\" (UID: \"6d2854e0-7961-48f8-adb0-2118d1685a38\") " pod="openstack/keystone-99a7-account-create-c9sr8" Dec 01 10:21:33 crc kubenswrapper[4958]: I1201 10:21:33.185044 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-99a7-account-create-c9sr8" Dec 01 10:21:33 crc kubenswrapper[4958]: I1201 10:21:33.691507 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-99a7-account-create-c9sr8"] Dec 01 10:21:33 crc kubenswrapper[4958]: I1201 10:21:33.815672 4958 generic.go:334] "Generic (PLEG): container finished" podID="4fccd607-3bfb-4593-a6de-6a0fc52b34ea" containerID="a87bd53133187b0ab3e009cb4dad6caf4c75502b45000066abe1770b7f35fabb" exitCode=0 Dec 01 10:21:33 crc kubenswrapper[4958]: I1201 10:21:33.815749 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4fccd607-3bfb-4593-a6de-6a0fc52b34ea","Type":"ContainerDied","Data":"a87bd53133187b0ab3e009cb4dad6caf4c75502b45000066abe1770b7f35fabb"} Dec 01 10:21:33 crc kubenswrapper[4958]: I1201 10:21:33.820399 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-99a7-account-create-c9sr8" event={"ID":"6d2854e0-7961-48f8-adb0-2118d1685a38","Type":"ContainerStarted","Data":"d4a396ca78178b183b6fe7d1eff41a1222a17ecc477f33aa18a71bcf50398d04"} Dec 01 10:21:33 crc kubenswrapper[4958]: I1201 10:21:33.823491 4958 generic.go:334] "Generic (PLEG): container finished" podID="6404e65b-16c6-4bd0-a4fe-26ad44d722c6" containerID="1398aa8836f87ff8e2050f7a48826330e2d07470c0e7bec124b310a45ec5904e" exitCode=0 Dec 01 10:21:33 crc kubenswrapper[4958]: I1201 10:21:33.823604 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jkdpw" event={"ID":"6404e65b-16c6-4bd0-a4fe-26ad44d722c6","Type":"ContainerDied","Data":"1398aa8836f87ff8e2050f7a48826330e2d07470c0e7bec124b310a45ec5904e"} Dec 01 10:21:34 crc kubenswrapper[4958]: I1201 10:21:34.833675 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4fccd607-3bfb-4593-a6de-6a0fc52b34ea","Type":"ContainerStarted","Data":"79456c806f5f410ac0967e765960f7c80fe55e590cd1656e2879adefde292303"} Dec 01 10:21:34 crc kubenswrapper[4958]: I1201 10:21:34.834775 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 01 10:21:34 crc kubenswrapper[4958]: I1201 10:21:34.835377 4958 generic.go:334] "Generic (PLEG): container finished" podID="6d2854e0-7961-48f8-adb0-2118d1685a38" containerID="78e6e5e6b60b39e59394d92a770491f46e85b9abae994ce35591f156c97b948a" exitCode=0 Dec 01 10:21:34 crc kubenswrapper[4958]: I1201 10:21:34.835451 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-99a7-account-create-c9sr8" event={"ID":"6d2854e0-7961-48f8-adb0-2118d1685a38","Type":"ContainerDied","Data":"78e6e5e6b60b39e59394d92a770491f46e85b9abae994ce35591f156c97b948a"} Dec 01 10:21:34 crc kubenswrapper[4958]: I1201 10:21:34.839470 4958 generic.go:334] "Generic (PLEG): container finished" podID="e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" containerID="b1abd27e2c2d388e6a6c1efd167095fea30180701b7db54c428f7e52cebd08ba" exitCode=0 Dec 01 10:21:34 crc kubenswrapper[4958]: I1201 10:21:34.839558 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0","Type":"ContainerDied","Data":"b1abd27e2c2d388e6a6c1efd167095fea30180701b7db54c428f7e52cebd08ba"} Dec 01 10:21:34 crc kubenswrapper[4958]: I1201 10:21:34.872511 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.027217982 podStartE2EDuration="1m25.872475474s" 
podCreationTimestamp="2025-12-01 10:20:09 +0000 UTC" firstStartedPulling="2025-12-01 10:20:11.685549881 +0000 UTC m=+1259.194338918" lastFinishedPulling="2025-12-01 10:20:59.530807373 +0000 UTC m=+1307.039596410" observedRunningTime="2025-12-01 10:21:34.863817294 +0000 UTC m=+1342.372606361" watchObservedRunningTime="2025-12-01 10:21:34.872475474 +0000 UTC m=+1342.381264521" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.250259 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.429216 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-ring-data-devices\") pod \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.429303 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-swiftconf\") pod \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.429333 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-scripts\") pod \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.429386 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-etc-swift\") pod \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.429427 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk44r\" (UniqueName: \"kubernetes.io/projected/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-kube-api-access-jk44r\") pod \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.429451 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-dispersionconf\") pod \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.429523 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-combined-ca-bundle\") pod \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\" (UID: \"6404e65b-16c6-4bd0-a4fe-26ad44d722c6\") " Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.429919 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6404e65b-16c6-4bd0-a4fe-26ad44d722c6" (UID: "6404e65b-16c6-4bd0-a4fe-26ad44d722c6"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.430815 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6404e65b-16c6-4bd0-a4fe-26ad44d722c6" (UID: "6404e65b-16c6-4bd0-a4fe-26ad44d722c6"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.438765 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6404e65b-16c6-4bd0-a4fe-26ad44d722c6" (UID: "6404e65b-16c6-4bd0-a4fe-26ad44d722c6"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.442927 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-kube-api-access-jk44r" (OuterVolumeSpecName: "kube-api-access-jk44r") pod "6404e65b-16c6-4bd0-a4fe-26ad44d722c6" (UID: "6404e65b-16c6-4bd0-a4fe-26ad44d722c6"). InnerVolumeSpecName "kube-api-access-jk44r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.459109 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6404e65b-16c6-4bd0-a4fe-26ad44d722c6" (UID: "6404e65b-16c6-4bd0-a4fe-26ad44d722c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.464657 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-scripts" (OuterVolumeSpecName: "scripts") pod "6404e65b-16c6-4bd0-a4fe-26ad44d722c6" (UID: "6404e65b-16c6-4bd0-a4fe-26ad44d722c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.464938 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6404e65b-16c6-4bd0-a4fe-26ad44d722c6" (UID: "6404e65b-16c6-4bd0-a4fe-26ad44d722c6"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.530449 4958 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.530513 4958 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.530528 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.530539 4958 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.530552 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk44r\" (UniqueName: \"kubernetes.io/projected/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-kube-api-access-jk44r\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.530566 4958 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.530575 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6404e65b-16c6-4bd0-a4fe-26ad44d722c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.851261 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0","Type":"ContainerStarted","Data":"f28e0a79ecd1735ef6ab0d156677914d027c13581eb5f43efb38a35773f7c70d"} Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.851932 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.852988 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jkdpw" event={"ID":"6404e65b-16c6-4bd0-a4fe-26ad44d722c6","Type":"ContainerDied","Data":"4514be35a35e7bac11bf9f97d94ffa13f629236d3c044fb38935dafdda4d53b6"} Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.853038 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jkdpw" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.853047 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4514be35a35e7bac11bf9f97d94ffa13f629236d3c044fb38935dafdda4d53b6" Dec 01 10:21:35 crc kubenswrapper[4958]: I1201 10:21:35.887996 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371947.966818 podStartE2EDuration="1m28.887958105s" podCreationTimestamp="2025-12-01 10:20:07 +0000 UTC" firstStartedPulling="2025-12-01 10:20:11.223196426 +0000 UTC m=+1258.731985463" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:21:35.876900416 +0000 UTC m=+1343.385689453" watchObservedRunningTime="2025-12-01 10:21:35.887958105 +0000 UTC m=+1343.396747152" Dec 01 10:21:36 crc kubenswrapper[4958]: I1201 10:21:36.307422 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-99a7-account-create-c9sr8" Dec 01 10:21:36 crc kubenswrapper[4958]: I1201 10:21:36.448578 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfc4d\" (UniqueName: \"kubernetes.io/projected/6d2854e0-7961-48f8-adb0-2118d1685a38-kube-api-access-kfc4d\") pod \"6d2854e0-7961-48f8-adb0-2118d1685a38\" (UID: \"6d2854e0-7961-48f8-adb0-2118d1685a38\") " Dec 01 10:21:36 crc kubenswrapper[4958]: I1201 10:21:36.455695 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d2854e0-7961-48f8-adb0-2118d1685a38-kube-api-access-kfc4d" (OuterVolumeSpecName: "kube-api-access-kfc4d") pod "6d2854e0-7961-48f8-adb0-2118d1685a38" (UID: "6d2854e0-7961-48f8-adb0-2118d1685a38"). InnerVolumeSpecName "kube-api-access-kfc4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:21:36 crc kubenswrapper[4958]: I1201 10:21:36.552014 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfc4d\" (UniqueName: \"kubernetes.io/projected/6d2854e0-7961-48f8-adb0-2118d1685a38-kube-api-access-kfc4d\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:36 crc kubenswrapper[4958]: I1201 10:21:36.864486 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-99a7-account-create-c9sr8" event={"ID":"6d2854e0-7961-48f8-adb0-2118d1685a38","Type":"ContainerDied","Data":"d4a396ca78178b183b6fe7d1eff41a1222a17ecc477f33aa18a71bcf50398d04"} Dec 01 10:21:36 crc kubenswrapper[4958]: I1201 10:21:36.864542 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4a396ca78178b183b6fe7d1eff41a1222a17ecc477f33aa18a71bcf50398d04" Dec 01 10:21:36 crc kubenswrapper[4958]: I1201 10:21:36.864520 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-99a7-account-create-c9sr8" Dec 01 10:21:38 crc kubenswrapper[4958]: I1201 10:21:38.728021 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-a8f6-account-create-5r7c8"] Dec 01 10:21:38 crc kubenswrapper[4958]: E1201 10:21:38.729002 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6404e65b-16c6-4bd0-a4fe-26ad44d722c6" containerName="swift-ring-rebalance" Dec 01 10:21:38 crc kubenswrapper[4958]: I1201 10:21:38.729019 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6404e65b-16c6-4bd0-a4fe-26ad44d722c6" containerName="swift-ring-rebalance" Dec 01 10:21:38 crc kubenswrapper[4958]: E1201 10:21:38.729035 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d2854e0-7961-48f8-adb0-2118d1685a38" containerName="mariadb-account-create" Dec 01 10:21:38 crc kubenswrapper[4958]: I1201 10:21:38.729041 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d2854e0-7961-48f8-adb0-2118d1685a38" containerName="mariadb-account-create" Dec 01 10:21:38 crc kubenswrapper[4958]: I1201 10:21:38.729242 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="6404e65b-16c6-4bd0-a4fe-26ad44d722c6" containerName="swift-ring-rebalance" Dec 01 10:21:38 crc kubenswrapper[4958]: I1201 10:21:38.729259 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d2854e0-7961-48f8-adb0-2118d1685a38" containerName="mariadb-account-create" Dec 01 10:21:38 crc kubenswrapper[4958]: I1201 10:21:38.730011 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a8f6-account-create-5r7c8" Dec 01 10:21:38 crc kubenswrapper[4958]: I1201 10:21:38.732306 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 01 10:21:38 crc kubenswrapper[4958]: I1201 10:21:38.746759 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a8f6-account-create-5r7c8"] Dec 01 10:21:38 crc kubenswrapper[4958]: I1201 10:21:38.897898 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx2dm\" (UniqueName: \"kubernetes.io/projected/d3c5a32d-3e78-4ea7-9cb2-ca200f02e692-kube-api-access-dx2dm\") pod \"glance-a8f6-account-create-5r7c8\" (UID: \"d3c5a32d-3e78-4ea7-9cb2-ca200f02e692\") " pod="openstack/glance-a8f6-account-create-5r7c8" Dec 01 10:21:39 crc kubenswrapper[4958]: I1201 10:21:39.000000 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx2dm\" (UniqueName: \"kubernetes.io/projected/d3c5a32d-3e78-4ea7-9cb2-ca200f02e692-kube-api-access-dx2dm\") pod \"glance-a8f6-account-create-5r7c8\" (UID: \"d3c5a32d-3e78-4ea7-9cb2-ca200f02e692\") " pod="openstack/glance-a8f6-account-create-5r7c8" Dec 01 10:21:39 crc kubenswrapper[4958]: I1201 10:21:39.030283 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx2dm\" (UniqueName: \"kubernetes.io/projected/d3c5a32d-3e78-4ea7-9cb2-ca200f02e692-kube-api-access-dx2dm\") pod \"glance-a8f6-account-create-5r7c8\" (UID: \"d3c5a32d-3e78-4ea7-9cb2-ca200f02e692\") " pod="openstack/glance-a8f6-account-create-5r7c8" Dec 01 10:21:39 crc kubenswrapper[4958]: I1201 10:21:39.055810 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a8f6-account-create-5r7c8" Dec 01 10:21:39 crc kubenswrapper[4958]: I1201 10:21:39.507387 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a8f6-account-create-5r7c8"] Dec 01 10:21:39 crc kubenswrapper[4958]: I1201 10:21:39.608981 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:21:39 crc kubenswrapper[4958]: I1201 10:21:39.613748 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:21:39 crc kubenswrapper[4958]: I1201 10:21:39.896919 4958 generic.go:334] "Generic (PLEG): container finished" podID="d3c5a32d-3e78-4ea7-9cb2-ca200f02e692" containerID="dbbea99c45871ffc560c5fecdb1ada0ff900af7e88a825fd2a8b7541c66b703d" exitCode=0 Dec 01 10:21:39 crc kubenswrapper[4958]: I1201 10:21:39.897038 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a8f6-account-create-5r7c8" event={"ID":"d3c5a32d-3e78-4ea7-9cb2-ca200f02e692","Type":"ContainerDied","Data":"dbbea99c45871ffc560c5fecdb1ada0ff900af7e88a825fd2a8b7541c66b703d"} Dec 01 10:21:39 crc kubenswrapper[4958]: I1201 10:21:39.897509 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a8f6-account-create-5r7c8" event={"ID":"d3c5a32d-3e78-4ea7-9cb2-ca200f02e692","Type":"ContainerStarted","Data":"d7037f93e2de18a232ec8ab24338be7e58f0f56bd92ad1fa59df3ff668203950"} Dec 01 10:21:39 crc kubenswrapper[4958]: I1201 10:21:39.983688 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-d9qck-config-tdpkf"] Dec 01 10:21:39 crc kubenswrapper[4958]: I1201 10:21:39.985386 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:39 crc kubenswrapper[4958]: I1201 10:21:39.988593 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 01 10:21:39 crc kubenswrapper[4958]: I1201 10:21:39.997417 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-d9qck-config-tdpkf"] Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.030093 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/08d0af64-a5a3-4192-b697-133e73d9c935-var-run-ovn\") pod \"ovn-controller-d9qck-config-tdpkf\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.031064 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/08d0af64-a5a3-4192-b697-133e73d9c935-additional-scripts\") pod \"ovn-controller-d9qck-config-tdpkf\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.031216 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08d0af64-a5a3-4192-b697-133e73d9c935-scripts\") pod \"ovn-controller-d9qck-config-tdpkf\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.031458 4958 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/08d0af64-a5a3-4192-b697-133e73d9c935-var-log-ovn\") pod \"ovn-controller-d9qck-config-tdpkf\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.031689 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/08d0af64-a5a3-4192-b697-133e73d9c935-var-run\") pod \"ovn-controller-d9qck-config-tdpkf\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.031935 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttn77\" (UniqueName: \"kubernetes.io/projected/08d0af64-a5a3-4192-b697-133e73d9c935-kube-api-access-ttn77\") pod \"ovn-controller-d9qck-config-tdpkf\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.133843 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/08d0af64-a5a3-4192-b697-133e73d9c935-var-run\") pod \"ovn-controller-d9qck-config-tdpkf\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.133949 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttn77\" (UniqueName: \"kubernetes.io/projected/08d0af64-a5a3-4192-b697-133e73d9c935-kube-api-access-ttn77\") pod \"ovn-controller-d9qck-config-tdpkf\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.134040 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/08d0af64-a5a3-4192-b697-133e73d9c935-var-run-ovn\") pod \"ovn-controller-d9qck-config-tdpkf\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.134071 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/08d0af64-a5a3-4192-b697-133e73d9c935-additional-scripts\") pod \"ovn-controller-d9qck-config-tdpkf\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.134105 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08d0af64-a5a3-4192-b697-133e73d9c935-scripts\") pod \"ovn-controller-d9qck-config-tdpkf\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.134170 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/08d0af64-a5a3-4192-b697-133e73d9c935-var-log-ovn\") pod \"ovn-controller-d9qck-config-tdpkf\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc 
kubenswrapper[4958]: I1201 10:21:40.134222 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/08d0af64-a5a3-4192-b697-133e73d9c935-var-run\") pod \"ovn-controller-d9qck-config-tdpkf\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.134308 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/08d0af64-a5a3-4192-b697-133e73d9c935-var-log-ovn\") pod \"ovn-controller-d9qck-config-tdpkf\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.134368 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/08d0af64-a5a3-4192-b697-133e73d9c935-var-run-ovn\") pod \"ovn-controller-d9qck-config-tdpkf\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.135082 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/08d0af64-a5a3-4192-b697-133e73d9c935-additional-scripts\") pod \"ovn-controller-d9qck-config-tdpkf\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.136605 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08d0af64-a5a3-4192-b697-133e73d9c935-scripts\") pod \"ovn-controller-d9qck-config-tdpkf\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.158103 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttn77\" (UniqueName: \"kubernetes.io/projected/08d0af64-a5a3-4192-b697-133e73d9c935-kube-api-access-ttn77\") pod \"ovn-controller-d9qck-config-tdpkf\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.305203 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.621665 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-d9qck-config-tdpkf"] Dec 01 10:21:40 crc kubenswrapper[4958]: I1201 10:21:40.909944 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-d9qck-config-tdpkf" event={"ID":"08d0af64-a5a3-4192-b697-133e73d9c935","Type":"ContainerStarted","Data":"adb7d985752e2887e5eb2d7e212b8adbb5f41123490c30493d3f3c0faaf75fc4"} Dec 01 10:21:41 crc kubenswrapper[4958]: I1201 10:21:41.436803 4958 util.go:48] "No ready sandbox for pod can be found. 
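NOTE: the MountVolume.SetUp timings above quietly show the relative cost of the volume types, assuming the started/succeeded pairs correspond 1:1: the host-path mounts complete within a fraction of a millisecond of being started (10:21:40.133843 → .134222), the configmaps take ~1-2.5ms (.135082, .136605), and the projected kube-api-access-ttn77 volume takes ~24ms (.133949 → .158103), plausibly because it has to mint a service-account token via the API. The deltas read straight off the timestamps:

    start, _ := time.Parse(time.StampMicro, "Dec  1 10:21:40.133949")
    done, _ := time.Parse(time.StampMicro, "Dec  1 10:21:40.158103")
    fmt.Println(done.Sub(start)) // 24.154ms for the projected token volume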
Need to start a new one" pod="openstack/glance-a8f6-account-create-5r7c8" Dec 01 10:21:41 crc kubenswrapper[4958]: I1201 10:21:41.507617 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx2dm\" (UniqueName: \"kubernetes.io/projected/d3c5a32d-3e78-4ea7-9cb2-ca200f02e692-kube-api-access-dx2dm\") pod \"d3c5a32d-3e78-4ea7-9cb2-ca200f02e692\" (UID: \"d3c5a32d-3e78-4ea7-9cb2-ca200f02e692\") " Dec 01 10:21:41 crc kubenswrapper[4958]: I1201 10:21:41.516476 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3c5a32d-3e78-4ea7-9cb2-ca200f02e692-kube-api-access-dx2dm" (OuterVolumeSpecName: "kube-api-access-dx2dm") pod "d3c5a32d-3e78-4ea7-9cb2-ca200f02e692" (UID: "d3c5a32d-3e78-4ea7-9cb2-ca200f02e692"). InnerVolumeSpecName "kube-api-access-dx2dm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:21:41 crc kubenswrapper[4958]: I1201 10:21:41.609943 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx2dm\" (UniqueName: \"kubernetes.io/projected/d3c5a32d-3e78-4ea7-9cb2-ca200f02e692-kube-api-access-dx2dm\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:41 crc kubenswrapper[4958]: I1201 10:21:41.931882 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a8f6-account-create-5r7c8" event={"ID":"d3c5a32d-3e78-4ea7-9cb2-ca200f02e692","Type":"ContainerDied","Data":"d7037f93e2de18a232ec8ab24338be7e58f0f56bd92ad1fa59df3ff668203950"} Dec 01 10:21:41 crc kubenswrapper[4958]: I1201 10:21:41.931943 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7037f93e2de18a232ec8ab24338be7e58f0f56bd92ad1fa59df3ff668203950" Dec 01 10:21:41 crc kubenswrapper[4958]: I1201 10:21:41.932025 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a8f6-account-create-5r7c8" Dec 01 10:21:41 crc kubenswrapper[4958]: I1201 10:21:41.938409 4958 generic.go:334] "Generic (PLEG): container finished" podID="08d0af64-a5a3-4192-b697-133e73d9c935" containerID="430f0f91f844715880d343270142d34691e2b25ebdb7c3450f79ba078ccef680" exitCode=0 Dec 01 10:21:41 crc kubenswrapper[4958]: I1201 10:21:41.938475 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-d9qck-config-tdpkf" event={"ID":"08d0af64-a5a3-4192-b697-133e73d9c935","Type":"ContainerDied","Data":"430f0f91f844715880d343270142d34691e2b25ebdb7c3450f79ba078ccef680"} Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.282470 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-ea59-account-create-tkkkr"] Dec 01 10:21:43 crc kubenswrapper[4958]: E1201 10:21:43.283361 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c5a32d-3e78-4ea7-9cb2-ca200f02e692" containerName="mariadb-account-create" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.283375 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c5a32d-3e78-4ea7-9cb2-ca200f02e692" containerName="mariadb-account-create" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.283571 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3c5a32d-3e78-4ea7-9cb2-ca200f02e692" containerName="mariadb-account-create" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.284272 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-ea59-account-create-tkkkr" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.288219 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.299478 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ea59-account-create-tkkkr"] Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.353361 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdbht\" (UniqueName: \"kubernetes.io/projected/814bf37e-29e7-410e-b300-4e1e14ad1d31-kube-api-access-tdbht\") pod \"placement-ea59-account-create-tkkkr\" (UID: \"814bf37e-29e7-410e-b300-4e1e14ad1d31\") " pod="openstack/placement-ea59-account-create-tkkkr" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.394692 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.454584 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/08d0af64-a5a3-4192-b697-133e73d9c935-var-run-ovn\") pod \"08d0af64-a5a3-4192-b697-133e73d9c935\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.454652 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08d0af64-a5a3-4192-b697-133e73d9c935-scripts\") pod \"08d0af64-a5a3-4192-b697-133e73d9c935\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.454794 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/08d0af64-a5a3-4192-b697-133e73d9c935-var-run\") pod \"08d0af64-a5a3-4192-b697-133e73d9c935\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.454890 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttn77\" (UniqueName: \"kubernetes.io/projected/08d0af64-a5a3-4192-b697-133e73d9c935-kube-api-access-ttn77\") pod \"08d0af64-a5a3-4192-b697-133e73d9c935\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.454901 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08d0af64-a5a3-4192-b697-133e73d9c935-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "08d0af64-a5a3-4192-b697-133e73d9c935" (UID: "08d0af64-a5a3-4192-b697-133e73d9c935"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.454928 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/08d0af64-a5a3-4192-b697-133e73d9c935-var-log-ovn\") pod \"08d0af64-a5a3-4192-b697-133e73d9c935\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.454988 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08d0af64-a5a3-4192-b697-133e73d9c935-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "08d0af64-a5a3-4192-b697-133e73d9c935" (UID: "08d0af64-a5a3-4192-b697-133e73d9c935"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.455114 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/08d0af64-a5a3-4192-b697-133e73d9c935-additional-scripts\") pod \"08d0af64-a5a3-4192-b697-133e73d9c935\" (UID: \"08d0af64-a5a3-4192-b697-133e73d9c935\") " Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.455687 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdbht\" (UniqueName: \"kubernetes.io/projected/814bf37e-29e7-410e-b300-4e1e14ad1d31-kube-api-access-tdbht\") pod \"placement-ea59-account-create-tkkkr\" (UID: \"814bf37e-29e7-410e-b300-4e1e14ad1d31\") " pod="openstack/placement-ea59-account-create-tkkkr" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.456203 4958 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/08d0af64-a5a3-4192-b697-133e73d9c935-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.456217 4958 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/08d0af64-a5a3-4192-b697-133e73d9c935-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.456272 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08d0af64-a5a3-4192-b697-133e73d9c935-scripts" (OuterVolumeSpecName: "scripts") pod "08d0af64-a5a3-4192-b697-133e73d9c935" (UID: "08d0af64-a5a3-4192-b697-133e73d9c935"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.456308 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08d0af64-a5a3-4192-b697-133e73d9c935-var-run" (OuterVolumeSpecName: "var-run") pod "08d0af64-a5a3-4192-b697-133e73d9c935" (UID: "08d0af64-a5a3-4192-b697-133e73d9c935"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.457348 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08d0af64-a5a3-4192-b697-133e73d9c935-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "08d0af64-a5a3-4192-b697-133e73d9c935" (UID: "08d0af64-a5a3-4192-b697-133e73d9c935"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.464014 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08d0af64-a5a3-4192-b697-133e73d9c935-kube-api-access-ttn77" (OuterVolumeSpecName: "kube-api-access-ttn77") pod "08d0af64-a5a3-4192-b697-133e73d9c935" (UID: "08d0af64-a5a3-4192-b697-133e73d9c935"). InnerVolumeSpecName "kube-api-access-ttn77". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.475158 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdbht\" (UniqueName: \"kubernetes.io/projected/814bf37e-29e7-410e-b300-4e1e14ad1d31-kube-api-access-tdbht\") pod \"placement-ea59-account-create-tkkkr\" (UID: \"814bf37e-29e7-410e-b300-4e1e14ad1d31\") " pod="openstack/placement-ea59-account-create-tkkkr" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.558385 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttn77\" (UniqueName: \"kubernetes.io/projected/08d0af64-a5a3-4192-b697-133e73d9c935-kube-api-access-ttn77\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.558457 4958 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/08d0af64-a5a3-4192-b697-133e73d9c935-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.558472 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08d0af64-a5a3-4192-b697-133e73d9c935-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.558491 4958 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/08d0af64-a5a3-4192-b697-133e73d9c935-var-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.692640 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ea59-account-create-tkkkr" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.971816 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-d9qck-config-tdpkf" event={"ID":"08d0af64-a5a3-4192-b697-133e73d9c935","Type":"ContainerDied","Data":"adb7d985752e2887e5eb2d7e212b8adbb5f41123490c30493d3f3c0faaf75fc4"} Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.972269 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adb7d985752e2887e5eb2d7e212b8adbb5f41123490c30493d3f3c0faaf75fc4" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.972291 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-ksr7n"] Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.972836 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-d9qck-config-tdpkf" Dec 01 10:21:43 crc kubenswrapper[4958]: E1201 10:21:43.974032 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d0af64-a5a3-4192-b697-133e73d9c935" containerName="ovn-config" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.974049 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d0af64-a5a3-4192-b697-133e73d9c935" containerName="ovn-config" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.974316 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="08d0af64-a5a3-4192-b697-133e73d9c935" containerName="ovn-config" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.975201 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ksr7n" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.983343 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ksr7n"] Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.989293 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 01 10:21:43 crc kubenswrapper[4958]: I1201 10:21:43.989874 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-smrjw" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.070245 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b69312c4-bb8d-4272-bd60-77f355f83f25-combined-ca-bundle\") pod \"glance-db-sync-ksr7n\" (UID: \"b69312c4-bb8d-4272-bd60-77f355f83f25\") " pod="openstack/glance-db-sync-ksr7n" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.070352 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4wh9\" (UniqueName: \"kubernetes.io/projected/b69312c4-bb8d-4272-bd60-77f355f83f25-kube-api-access-t4wh9\") pod \"glance-db-sync-ksr7n\" (UID: \"b69312c4-bb8d-4272-bd60-77f355f83f25\") " pod="openstack/glance-db-sync-ksr7n" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.070453 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b69312c4-bb8d-4272-bd60-77f355f83f25-db-sync-config-data\") pod \"glance-db-sync-ksr7n\" (UID: \"b69312c4-bb8d-4272-bd60-77f355f83f25\") " pod="openstack/glance-db-sync-ksr7n" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.070496 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b69312c4-bb8d-4272-bd60-77f355f83f25-config-data\") pod \"glance-db-sync-ksr7n\" (UID: \"b69312c4-bb8d-4272-bd60-77f355f83f25\") " pod="openstack/glance-db-sync-ksr7n" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.172565 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b69312c4-bb8d-4272-bd60-77f355f83f25-combined-ca-bundle\") pod \"glance-db-sync-ksr7n\" (UID: \"b69312c4-bb8d-4272-bd60-77f355f83f25\") " pod="openstack/glance-db-sync-ksr7n" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.172657 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4wh9\" (UniqueName: 
\"kubernetes.io/projected/b69312c4-bb8d-4272-bd60-77f355f83f25-kube-api-access-t4wh9\") pod \"glance-db-sync-ksr7n\" (UID: \"b69312c4-bb8d-4272-bd60-77f355f83f25\") " pod="openstack/glance-db-sync-ksr7n" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.172709 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b69312c4-bb8d-4272-bd60-77f355f83f25-db-sync-config-data\") pod \"glance-db-sync-ksr7n\" (UID: \"b69312c4-bb8d-4272-bd60-77f355f83f25\") " pod="openstack/glance-db-sync-ksr7n" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.172747 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b69312c4-bb8d-4272-bd60-77f355f83f25-config-data\") pod \"glance-db-sync-ksr7n\" (UID: \"b69312c4-bb8d-4272-bd60-77f355f83f25\") " pod="openstack/glance-db-sync-ksr7n" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.178867 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b69312c4-bb8d-4272-bd60-77f355f83f25-db-sync-config-data\") pod \"glance-db-sync-ksr7n\" (UID: \"b69312c4-bb8d-4272-bd60-77f355f83f25\") " pod="openstack/glance-db-sync-ksr7n" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.179935 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b69312c4-bb8d-4272-bd60-77f355f83f25-config-data\") pod \"glance-db-sync-ksr7n\" (UID: \"b69312c4-bb8d-4272-bd60-77f355f83f25\") " pod="openstack/glance-db-sync-ksr7n" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.182900 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b69312c4-bb8d-4272-bd60-77f355f83f25-combined-ca-bundle\") pod \"glance-db-sync-ksr7n\" (UID: \"b69312c4-bb8d-4272-bd60-77f355f83f25\") " pod="openstack/glance-db-sync-ksr7n" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.193818 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4wh9\" (UniqueName: \"kubernetes.io/projected/b69312c4-bb8d-4272-bd60-77f355f83f25-kube-api-access-t4wh9\") pod \"glance-db-sync-ksr7n\" (UID: \"b69312c4-bb8d-4272-bd60-77f355f83f25\") " pod="openstack/glance-db-sync-ksr7n" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.246394 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ea59-account-create-tkkkr"] Dec 01 10:21:44 crc kubenswrapper[4958]: W1201 10:21:44.248035 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod814bf37e_29e7_410e_b300_4e1e14ad1d31.slice/crio-0d5e27f96e3e3b6564fbada36d173ac2e4141487af35d26bb7449dbb30fb0aa2 WatchSource:0}: Error finding container 0d5e27f96e3e3b6564fbada36d173ac2e4141487af35d26bb7449dbb30fb0aa2: Status 404 returned error can't find the container with id 0d5e27f96e3e3b6564fbada36d173ac2e4141487af35d26bb7449dbb30fb0aa2 Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.313371 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ksr7n" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.427805 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-d9qck" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.592383 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-d9qck-config-tdpkf"] Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.613817 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-d9qck-config-tdpkf"] Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.694643 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-d9qck-config-z5vkn"] Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.696513 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.700820 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.714435 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-d9qck-config-z5vkn"] Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.790439 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-scripts\") pod \"ovn-controller-d9qck-config-z5vkn\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.790523 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-additional-scripts\") pod \"ovn-controller-d9qck-config-z5vkn\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.790547 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtlww\" (UniqueName: \"kubernetes.io/projected/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-kube-api-access-qtlww\") pod \"ovn-controller-d9qck-config-z5vkn\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.790579 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-var-run\") pod \"ovn-controller-d9qck-config-z5vkn\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.790640 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-var-run-ovn\") pod \"ovn-controller-d9qck-config-z5vkn\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.790755 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-var-log-ovn\") pod \"ovn-controller-d9qck-config-z5vkn\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.892532 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-var-log-ovn\") pod \"ovn-controller-d9qck-config-z5vkn\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.892631 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-scripts\") pod \"ovn-controller-d9qck-config-z5vkn\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.892699 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-additional-scripts\") pod \"ovn-controller-d9qck-config-z5vkn\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.892730 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtlww\" (UniqueName: \"kubernetes.io/projected/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-kube-api-access-qtlww\") pod \"ovn-controller-d9qck-config-z5vkn\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.892813 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-var-run\") pod \"ovn-controller-d9qck-config-z5vkn\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.893035 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-var-log-ovn\") pod \"ovn-controller-d9qck-config-z5vkn\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.893092 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-var-run\") pod \"ovn-controller-d9qck-config-z5vkn\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.893466 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-var-run-ovn\") pod \"ovn-controller-d9qck-config-z5vkn\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.893632 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-var-run-ovn\") pod \"ovn-controller-d9qck-config-z5vkn\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.894005 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-additional-scripts\") pod \"ovn-controller-d9qck-config-z5vkn\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.895634 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-scripts\") pod \"ovn-controller-d9qck-config-z5vkn\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.937946 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtlww\" (UniqueName: \"kubernetes.io/projected/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-kube-api-access-qtlww\") pod \"ovn-controller-d9qck-config-z5vkn\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.985088 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ea59-account-create-tkkkr" event={"ID":"814bf37e-29e7-410e-b300-4e1e14ad1d31","Type":"ContainerDied","Data":"d934e39a11b02d3b7b7d7d1ff2ed4bef1dea9bdfc1beaa6858543bf4d3d6b1f1"} Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.985051 4958 generic.go:334] "Generic (PLEG): container finished" podID="814bf37e-29e7-410e-b300-4e1e14ad1d31" containerID="d934e39a11b02d3b7b7d7d1ff2ed4bef1dea9bdfc1beaa6858543bf4d3d6b1f1" exitCode=0 Dec 01 10:21:44 crc kubenswrapper[4958]: I1201 10:21:44.985208 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ea59-account-create-tkkkr" event={"ID":"814bf37e-29e7-410e-b300-4e1e14ad1d31","Type":"ContainerStarted","Data":"0d5e27f96e3e3b6564fbada36d173ac2e4141487af35d26bb7449dbb30fb0aa2"} Dec 01 10:21:45 crc kubenswrapper[4958]: I1201 10:21:45.093893 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ksr7n"] Dec 01 10:21:45 crc kubenswrapper[4958]: W1201 10:21:45.097631 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb69312c4_bb8d_4272_bd60_77f355f83f25.slice/crio-eebc851ba7683e27e014381221709d9a02fafb7903b574d7fd16564705acebda WatchSource:0}: Error finding container eebc851ba7683e27e014381221709d9a02fafb7903b574d7fd16564705acebda: Status 404 returned error can't find the container with id eebc851ba7683e27e014381221709d9a02fafb7903b574d7fd16564705acebda Dec 01 10:21:45 crc kubenswrapper[4958]: I1201 10:21:45.099485 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:45 crc kubenswrapper[4958]: I1201 10:21:45.612514 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-d9qck-config-z5vkn"] Dec 01 10:21:45 crc kubenswrapper[4958]: W1201 10:21:45.648045 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod497cbf2f_90f9_4fd5_a3c9_8b8105ec0bb8.slice/crio-8fc109e0a0e1b2afe4744aa435449f8bf131e8503d67ca578a9db5ac8e61eddc WatchSource:0}: Error finding container 8fc109e0a0e1b2afe4744aa435449f8bf131e8503d67ca578a9db5ac8e61eddc: Status 404 returned error can't find the container with id 8fc109e0a0e1b2afe4744aa435449f8bf131e8503d67ca578a9db5ac8e61eddc Dec 01 10:21:45 crc kubenswrapper[4958]: I1201 10:21:45.813227 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08d0af64-a5a3-4192-b697-133e73d9c935" path="/var/lib/kubelet/pods/08d0af64-a5a3-4192-b697-133e73d9c935/volumes" Dec 01 10:21:46 crc kubenswrapper[4958]: I1201 10:21:46.008823 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-d9qck-config-z5vkn" event={"ID":"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8","Type":"ContainerStarted","Data":"c0bfa711eee72c17bb35be6e2fb314864b58daa10417660b9896ddb1a68d6efb"} Dec 01 10:21:46 crc kubenswrapper[4958]: I1201 10:21:46.009179 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-d9qck-config-z5vkn" event={"ID":"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8","Type":"ContainerStarted","Data":"8fc109e0a0e1b2afe4744aa435449f8bf131e8503d67ca578a9db5ac8e61eddc"} Dec 01 10:21:46 crc kubenswrapper[4958]: I1201 10:21:46.011095 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ksr7n" event={"ID":"b69312c4-bb8d-4272-bd60-77f355f83f25","Type":"ContainerStarted","Data":"eebc851ba7683e27e014381221709d9a02fafb7903b574d7fd16564705acebda"} Dec 01 10:21:46 crc kubenswrapper[4958]: I1201 10:21:46.049820 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-d9qck-config-z5vkn" podStartSLOduration=2.049782523 podStartE2EDuration="2.049782523s" podCreationTimestamp="2025-12-01 10:21:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:21:46.038709513 +0000 UTC m=+1353.547498550" watchObservedRunningTime="2025-12-01 10:21:46.049782523 +0000 UTC m=+1353.558571570" Dec 01 10:21:47 crc kubenswrapper[4958]: I1201 10:21:46.387599 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ea59-account-create-tkkkr" Dec 01 10:21:47 crc kubenswrapper[4958]: I1201 10:21:46.440504 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdbht\" (UniqueName: \"kubernetes.io/projected/814bf37e-29e7-410e-b300-4e1e14ad1d31-kube-api-access-tdbht\") pod \"814bf37e-29e7-410e-b300-4e1e14ad1d31\" (UID: \"814bf37e-29e7-410e-b300-4e1e14ad1d31\") " Dec 01 10:21:47 crc kubenswrapper[4958]: I1201 10:21:46.451314 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/814bf37e-29e7-410e-b300-4e1e14ad1d31-kube-api-access-tdbht" (OuterVolumeSpecName: "kube-api-access-tdbht") pod "814bf37e-29e7-410e-b300-4e1e14ad1d31" (UID: "814bf37e-29e7-410e-b300-4e1e14ad1d31"). InnerVolumeSpecName "kube-api-access-tdbht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:21:47 crc kubenswrapper[4958]: I1201 10:21:46.543197 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdbht\" (UniqueName: \"kubernetes.io/projected/814bf37e-29e7-410e-b300-4e1e14ad1d31-kube-api-access-tdbht\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:47 crc kubenswrapper[4958]: I1201 10:21:47.025011 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ea59-account-create-tkkkr" Dec 01 10:21:47 crc kubenswrapper[4958]: I1201 10:21:47.024983 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ea59-account-create-tkkkr" event={"ID":"814bf37e-29e7-410e-b300-4e1e14ad1d31","Type":"ContainerDied","Data":"0d5e27f96e3e3b6564fbada36d173ac2e4141487af35d26bb7449dbb30fb0aa2"} Dec 01 10:21:47 crc kubenswrapper[4958]: I1201 10:21:47.025705 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d5e27f96e3e3b6564fbada36d173ac2e4141487af35d26bb7449dbb30fb0aa2" Dec 01 10:21:47 crc kubenswrapper[4958]: I1201 10:21:47.027918 4958 generic.go:334] "Generic (PLEG): container finished" podID="497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8" containerID="c0bfa711eee72c17bb35be6e2fb314864b58daa10417660b9896ddb1a68d6efb" exitCode=0 Dec 01 10:21:47 crc kubenswrapper[4958]: I1201 10:21:47.027992 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-d9qck-config-z5vkn" event={"ID":"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8","Type":"ContainerDied","Data":"c0bfa711eee72c17bb35be6e2fb314864b58daa10417660b9896ddb1a68d6efb"} Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.367065 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.388365 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.401165 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift\") pod \"swift-storage-0\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " pod="openstack/swift-storage-0" Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.489807 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-var-run\") pod \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.489974 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-var-log-ovn\") pod \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.490056 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-scripts\") pod \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\" (UID: 
\"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.490069 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-var-run" (OuterVolumeSpecName: "var-run") pod "497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8" (UID: "497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.490137 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-additional-scripts\") pod \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.490200 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtlww\" (UniqueName: \"kubernetes.io/projected/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-kube-api-access-qtlww\") pod \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.490170 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8" (UID: "497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.490300 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8" (UID: "497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.490237 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-var-run-ovn\") pod \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\" (UID: \"497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8\") " Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.491564 4958 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.491592 4958 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-var-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.491604 4958 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.491596 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8" (UID: "497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8"). 
InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.493094 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-scripts" (OuterVolumeSpecName: "scripts") pod "497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8" (UID: "497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.504252 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-kube-api-access-qtlww" (OuterVolumeSpecName: "kube-api-access-qtlww") pod "497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8" (UID: "497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8"). InnerVolumeSpecName "kube-api-access-qtlww". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.598532 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.598594 4958 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.598614 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtlww\" (UniqueName: \"kubernetes.io/projected/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8-kube-api-access-qtlww\") on node \"crc\" DevicePath \"\"" Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.629829 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.718577 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-d9qck-config-z5vkn"] Dec 01 10:21:48 crc kubenswrapper[4958]: I1201 10:21:48.730486 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-d9qck-config-z5vkn"] Dec 01 10:21:49 crc kubenswrapper[4958]: I1201 10:21:49.075446 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fc109e0a0e1b2afe4744aa435449f8bf131e8503d67ca578a9db5ac8e61eddc" Dec 01 10:21:49 crc kubenswrapper[4958]: I1201 10:21:49.075559 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-d9qck-config-z5vkn" Dec 01 10:21:49 crc kubenswrapper[4958]: I1201 10:21:49.222038 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 01 10:21:49 crc kubenswrapper[4958]: I1201 10:21:49.812573 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8" path="/var/lib/kubelet/pods/497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8/volumes" Dec 01 10:21:49 crc kubenswrapper[4958]: I1201 10:21:49.987054 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:21:50 crc kubenswrapper[4958]: I1201 10:21:50.097562 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerStarted","Data":"1c1b40356587456bb51e3ad44e98d58f98fcfc84e252bf116e2742fafa3e7267"} Dec 01 10:21:50 crc kubenswrapper[4958]: I1201 10:21:50.750150 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 01 10:21:51 crc kubenswrapper[4958]: I1201 10:21:51.110425 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerStarted","Data":"e259bff5027b46b6780434588aaaf37284c0af3fc24553b6ef8f973ea6ddfcef"} Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.124562 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerStarted","Data":"01c79ab63cc80d37a6df7c9d7d52b71938ac0e9677a2f6651e5410a4bf18f8d3"} Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.124915 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerStarted","Data":"57bd52c9ca652b19ed4e33a202a232c60bfa9768c47969bd45f0322238f2ed9b"} Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.593216 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-6xc7n"] Dec 01 10:21:52 crc kubenswrapper[4958]: E1201 10:21:52.594101 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8" containerName="ovn-config" Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.594127 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8" containerName="ovn-config" Dec 01 10:21:52 crc kubenswrapper[4958]: E1201 10:21:52.594147 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="814bf37e-29e7-410e-b300-4e1e14ad1d31" containerName="mariadb-account-create" Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.594155 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="814bf37e-29e7-410e-b300-4e1e14ad1d31" containerName="mariadb-account-create" Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.594354 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="497cbf2f-90f9-4fd5-a3c9-8b8105ec0bb8" containerName="ovn-config" Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.594382 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="814bf37e-29e7-410e-b300-4e1e14ad1d31" containerName="mariadb-account-create" Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.595951 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6xc7n" Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.619739 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6xc7n"] Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.702513 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j5zq\" (UniqueName: \"kubernetes.io/projected/ed6ba24c-b48f-4fac-b7e1-8acd44d44c82-kube-api-access-5j5zq\") pod \"barbican-db-create-6xc7n\" (UID: \"ed6ba24c-b48f-4fac-b7e1-8acd44d44c82\") " pod="openstack/barbican-db-create-6xc7n" Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.813353 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j5zq\" (UniqueName: \"kubernetes.io/projected/ed6ba24c-b48f-4fac-b7e1-8acd44d44c82-kube-api-access-5j5zq\") pod \"barbican-db-create-6xc7n\" (UID: \"ed6ba24c-b48f-4fac-b7e1-8acd44d44c82\") " pod="openstack/barbican-db-create-6xc7n" Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.838784 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-sj694"] Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.840200 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sj694" Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.857458 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j5zq\" (UniqueName: \"kubernetes.io/projected/ed6ba24c-b48f-4fac-b7e1-8acd44d44c82-kube-api-access-5j5zq\") pod \"barbican-db-create-6xc7n\" (UID: \"ed6ba24c-b48f-4fac-b7e1-8acd44d44c82\") " pod="openstack/barbican-db-create-6xc7n" Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.873756 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sj694"] Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.942772 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6xc7n" Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.943223 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-9sz8p"] Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.944419 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9sz8p" Dec 01 10:21:52 crc kubenswrapper[4958]: I1201 10:21:52.969759 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-9sz8p"] Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.017007 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh4fw\" (UniqueName: \"kubernetes.io/projected/de2699dc-0b5f-40c4-a163-d73e30013069-kube-api-access-jh4fw\") pod \"cinder-db-create-sj694\" (UID: \"de2699dc-0b5f-40c4-a163-d73e30013069\") " pod="openstack/cinder-db-create-sj694" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.026688 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-q4ccp"] Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.028149 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-q4ccp" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.037503 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.037756 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.038413 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.042332 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-62dpp" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.047776 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-q4ccp"] Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.128181 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7vfl\" (UniqueName: \"kubernetes.io/projected/de9d3333-6e5c-4663-9b79-d8fd9a9974a8-kube-api-access-n7vfl\") pod \"neutron-db-create-9sz8p\" (UID: \"de9d3333-6e5c-4663-9b79-d8fd9a9974a8\") " pod="openstack/neutron-db-create-9sz8p" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.128249 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv5nt\" (UniqueName: \"kubernetes.io/projected/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f-kube-api-access-bv5nt\") pod \"keystone-db-sync-q4ccp\" (UID: \"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f\") " pod="openstack/keystone-db-sync-q4ccp" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.128367 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh4fw\" (UniqueName: \"kubernetes.io/projected/de2699dc-0b5f-40c4-a163-d73e30013069-kube-api-access-jh4fw\") pod \"cinder-db-create-sj694\" (UID: \"de2699dc-0b5f-40c4-a163-d73e30013069\") " pod="openstack/cinder-db-create-sj694" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.128419 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f-combined-ca-bundle\") pod \"keystone-db-sync-q4ccp\" (UID: \"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f\") " pod="openstack/keystone-db-sync-q4ccp" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.128476 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f-config-data\") pod \"keystone-db-sync-q4ccp\" (UID: \"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f\") " pod="openstack/keystone-db-sync-q4ccp" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.140451 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerStarted","Data":"52603107e98f8df2078258e1d982b451d8d71b96f8736d24999132d033bea9e0"} Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.185695 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh4fw\" (UniqueName: \"kubernetes.io/projected/de2699dc-0b5f-40c4-a163-d73e30013069-kube-api-access-jh4fw\") pod \"cinder-db-create-sj694\" (UID: \"de2699dc-0b5f-40c4-a163-d73e30013069\") " 
pod="openstack/cinder-db-create-sj694" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.220392 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sj694" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.230159 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f-combined-ca-bundle\") pod \"keystone-db-sync-q4ccp\" (UID: \"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f\") " pod="openstack/keystone-db-sync-q4ccp" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.230247 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f-config-data\") pod \"keystone-db-sync-q4ccp\" (UID: \"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f\") " pod="openstack/keystone-db-sync-q4ccp" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.230282 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7vfl\" (UniqueName: \"kubernetes.io/projected/de9d3333-6e5c-4663-9b79-d8fd9a9974a8-kube-api-access-n7vfl\") pod \"neutron-db-create-9sz8p\" (UID: \"de9d3333-6e5c-4663-9b79-d8fd9a9974a8\") " pod="openstack/neutron-db-create-9sz8p" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.230299 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv5nt\" (UniqueName: \"kubernetes.io/projected/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f-kube-api-access-bv5nt\") pod \"keystone-db-sync-q4ccp\" (UID: \"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f\") " pod="openstack/keystone-db-sync-q4ccp" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.236620 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f-combined-ca-bundle\") pod \"keystone-db-sync-q4ccp\" (UID: \"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f\") " pod="openstack/keystone-db-sync-q4ccp" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.237157 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f-config-data\") pod \"keystone-db-sync-q4ccp\" (UID: \"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f\") " pod="openstack/keystone-db-sync-q4ccp" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.253150 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7vfl\" (UniqueName: \"kubernetes.io/projected/de9d3333-6e5c-4663-9b79-d8fd9a9974a8-kube-api-access-n7vfl\") pod \"neutron-db-create-9sz8p\" (UID: \"de9d3333-6e5c-4663-9b79-d8fd9a9974a8\") " pod="openstack/neutron-db-create-9sz8p" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.255977 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv5nt\" (UniqueName: \"kubernetes.io/projected/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f-kube-api-access-bv5nt\") pod \"keystone-db-sync-q4ccp\" (UID: \"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f\") " pod="openstack/keystone-db-sync-q4ccp" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.275233 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9sz8p" Dec 01 10:21:53 crc kubenswrapper[4958]: I1201 10:21:53.360726 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-q4ccp" Dec 01 10:21:58 crc kubenswrapper[4958]: I1201 10:21:58.210216 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:21:58 crc kubenswrapper[4958]: I1201 10:21:58.211154 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:21:58 crc kubenswrapper[4958]: I1201 10:21:58.211258 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 10:21:58 crc kubenswrapper[4958]: I1201 10:21:58.212401 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f09314f73af3b199ee4f78ab6cf71768969c699a4967f9d91c7d9dc73162183f"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:21:58 crc kubenswrapper[4958]: I1201 10:21:58.212571 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://f09314f73af3b199ee4f78ab6cf71768969c699a4967f9d91c7d9dc73162183f" gracePeriod=600 Dec 01 10:21:59 crc kubenswrapper[4958]: I1201 10:21:59.209713 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="f09314f73af3b199ee4f78ab6cf71768969c699a4967f9d91c7d9dc73162183f" exitCode=0 Dec 01 10:21:59 crc kubenswrapper[4958]: I1201 10:21:59.209801 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"f09314f73af3b199ee4f78ab6cf71768969c699a4967f9d91c7d9dc73162183f"} Dec 01 10:21:59 crc kubenswrapper[4958]: I1201 10:21:59.209897 4958 scope.go:117] "RemoveContainer" containerID="269d028bc69c92127b6b1ad5b3c8a371b2530bdef6d634e5295d699f9be45b99" Dec 01 10:22:01 crc kubenswrapper[4958]: E1201 10:22:01.527126 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Dec 01 10:22:01 crc kubenswrapper[4958]: E1201 10:22:01.527910 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t4wh9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-ksr7n_openstack(b69312c4-bb8d-4272-bd60-77f355f83f25): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:22:01 crc kubenswrapper[4958]: E1201 10:22:01.529448 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-ksr7n" podUID="b69312c4-bb8d-4272-bd60-77f355f83f25" Dec 01 10:22:02 crc kubenswrapper[4958]: I1201 10:22:02.145183 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sj694"] Dec 01 10:22:02 crc kubenswrapper[4958]: I1201 10:22:02.232411 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-q4ccp"] Dec 01 10:22:02 crc kubenswrapper[4958]: I1201 10:22:02.243114 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-9sz8p"] Dec 01 10:22:02 crc kubenswrapper[4958]: I1201 10:22:02.243736 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4"} Dec 01 10:22:02 crc kubenswrapper[4958]: E1201 10:22:02.245721 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" 
pod="openstack/glance-db-sync-ksr7n" podUID="b69312c4-bb8d-4272-bd60-77f355f83f25" Dec 01 10:22:02 crc kubenswrapper[4958]: I1201 10:22:02.346316 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6xc7n"] Dec 01 10:22:03 crc kubenswrapper[4958]: I1201 10:22:03.256487 4958 generic.go:334] "Generic (PLEG): container finished" podID="de9d3333-6e5c-4663-9b79-d8fd9a9974a8" containerID="a093c09932dcf809929bd91f191d41a237cb98ecb47ecd5032d496a5cc147ca2" exitCode=0 Dec 01 10:22:03 crc kubenswrapper[4958]: I1201 10:22:03.256586 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9sz8p" event={"ID":"de9d3333-6e5c-4663-9b79-d8fd9a9974a8","Type":"ContainerDied","Data":"a093c09932dcf809929bd91f191d41a237cb98ecb47ecd5032d496a5cc147ca2"} Dec 01 10:22:03 crc kubenswrapper[4958]: I1201 10:22:03.257260 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9sz8p" event={"ID":"de9d3333-6e5c-4663-9b79-d8fd9a9974a8","Type":"ContainerStarted","Data":"f3e7e40a995b2ec99c679129db742c6c1cf00e8b1520ecee8f6358c41cd30999"} Dec 01 10:22:03 crc kubenswrapper[4958]: I1201 10:22:03.259161 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q4ccp" event={"ID":"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f","Type":"ContainerStarted","Data":"97117379b771560831cfe8053256b40a3f6480f048c1b89fc5970425ee6a756f"} Dec 01 10:22:03 crc kubenswrapper[4958]: I1201 10:22:03.261563 4958 generic.go:334] "Generic (PLEG): container finished" podID="de2699dc-0b5f-40c4-a163-d73e30013069" containerID="836428302d34363977d7a765ecb4dde51fd074295bc44cfb1a06531999e062bf" exitCode=0 Dec 01 10:22:03 crc kubenswrapper[4958]: I1201 10:22:03.261672 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sj694" event={"ID":"de2699dc-0b5f-40c4-a163-d73e30013069","Type":"ContainerDied","Data":"836428302d34363977d7a765ecb4dde51fd074295bc44cfb1a06531999e062bf"} Dec 01 10:22:03 crc kubenswrapper[4958]: I1201 10:22:03.261706 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sj694" event={"ID":"de2699dc-0b5f-40c4-a163-d73e30013069","Type":"ContainerStarted","Data":"3e0f39414c8ecfb71a9040fffe8279f5bfdef9329e3b819da4782700c946bce2"} Dec 01 10:22:03 crc kubenswrapper[4958]: I1201 10:22:03.266684 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerStarted","Data":"865cc8ca1f68b1fb3bd9007f63e5fd084c77e4b6a22f2df02a8a5b900bbf54d6"} Dec 01 10:22:03 crc kubenswrapper[4958]: I1201 10:22:03.271486 4958 generic.go:334] "Generic (PLEG): container finished" podID="ed6ba24c-b48f-4fac-b7e1-8acd44d44c82" containerID="31b1419cc86dd7611c8a535d42a0b25a61d4f290db0f3338f6d6fe69be403ae5" exitCode=0 Dec 01 10:22:03 crc kubenswrapper[4958]: I1201 10:22:03.271605 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6xc7n" event={"ID":"ed6ba24c-b48f-4fac-b7e1-8acd44d44c82","Type":"ContainerDied","Data":"31b1419cc86dd7611c8a535d42a0b25a61d4f290db0f3338f6d6fe69be403ae5"} Dec 01 10:22:03 crc kubenswrapper[4958]: I1201 10:22:03.271656 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6xc7n" event={"ID":"ed6ba24c-b48f-4fac-b7e1-8acd44d44c82","Type":"ContainerStarted","Data":"7f19471dc18d7baf196d2bf4d64c5bb2fb998ffdda61a28aa9329b1f4d58e0b6"} Dec 01 10:22:04 crc kubenswrapper[4958]: I1201 10:22:04.296092 4958 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerStarted","Data":"52f003c4452815db13106faf9941e71f61b166933018883cfa708153b71da78e"} Dec 01 10:22:04 crc kubenswrapper[4958]: I1201 10:22:04.299725 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerStarted","Data":"c0de0fa89c374e451fd344254fa736a0131c4473c5d9730472766afd7c8bfd5d"} Dec 01 10:22:04 crc kubenswrapper[4958]: I1201 10:22:04.299823 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerStarted","Data":"f33975248a4564b556aea0c21d6dc5782d263058390056a1d274154337979789"} Dec 01 10:22:04 crc kubenswrapper[4958]: I1201 10:22:04.683726 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6xc7n" Dec 01 10:22:04 crc kubenswrapper[4958]: I1201 10:22:04.774472 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9sz8p" Dec 01 10:22:04 crc kubenswrapper[4958]: I1201 10:22:04.784001 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sj694" Dec 01 10:22:04 crc kubenswrapper[4958]: I1201 10:22:04.853194 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j5zq\" (UniqueName: \"kubernetes.io/projected/ed6ba24c-b48f-4fac-b7e1-8acd44d44c82-kube-api-access-5j5zq\") pod \"ed6ba24c-b48f-4fac-b7e1-8acd44d44c82\" (UID: \"ed6ba24c-b48f-4fac-b7e1-8acd44d44c82\") " Dec 01 10:22:04 crc kubenswrapper[4958]: I1201 10:22:04.871466 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed6ba24c-b48f-4fac-b7e1-8acd44d44c82-kube-api-access-5j5zq" (OuterVolumeSpecName: "kube-api-access-5j5zq") pod "ed6ba24c-b48f-4fac-b7e1-8acd44d44c82" (UID: "ed6ba24c-b48f-4fac-b7e1-8acd44d44c82"). InnerVolumeSpecName "kube-api-access-5j5zq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:22:04 crc kubenswrapper[4958]: I1201 10:22:04.955864 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh4fw\" (UniqueName: \"kubernetes.io/projected/de2699dc-0b5f-40c4-a163-d73e30013069-kube-api-access-jh4fw\") pod \"de2699dc-0b5f-40c4-a163-d73e30013069\" (UID: \"de2699dc-0b5f-40c4-a163-d73e30013069\") " Dec 01 10:22:04 crc kubenswrapper[4958]: I1201 10:22:04.956100 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7vfl\" (UniqueName: \"kubernetes.io/projected/de9d3333-6e5c-4663-9b79-d8fd9a9974a8-kube-api-access-n7vfl\") pod \"de9d3333-6e5c-4663-9b79-d8fd9a9974a8\" (UID: \"de9d3333-6e5c-4663-9b79-d8fd9a9974a8\") " Dec 01 10:22:04 crc kubenswrapper[4958]: I1201 10:22:04.956624 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j5zq\" (UniqueName: \"kubernetes.io/projected/ed6ba24c-b48f-4fac-b7e1-8acd44d44c82-kube-api-access-5j5zq\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:04 crc kubenswrapper[4958]: I1201 10:22:04.960267 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de2699dc-0b5f-40c4-a163-d73e30013069-kube-api-access-jh4fw" (OuterVolumeSpecName: "kube-api-access-jh4fw") pod "de2699dc-0b5f-40c4-a163-d73e30013069" (UID: "de2699dc-0b5f-40c4-a163-d73e30013069"). InnerVolumeSpecName "kube-api-access-jh4fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:22:04 crc kubenswrapper[4958]: I1201 10:22:04.960798 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de9d3333-6e5c-4663-9b79-d8fd9a9974a8-kube-api-access-n7vfl" (OuterVolumeSpecName: "kube-api-access-n7vfl") pod "de9d3333-6e5c-4663-9b79-d8fd9a9974a8" (UID: "de9d3333-6e5c-4663-9b79-d8fd9a9974a8"). InnerVolumeSpecName "kube-api-access-n7vfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:22:05 crc kubenswrapper[4958]: I1201 10:22:05.058797 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh4fw\" (UniqueName: \"kubernetes.io/projected/de2699dc-0b5f-40c4-a163-d73e30013069-kube-api-access-jh4fw\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:05 crc kubenswrapper[4958]: I1201 10:22:05.059295 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7vfl\" (UniqueName: \"kubernetes.io/projected/de9d3333-6e5c-4663-9b79-d8fd9a9974a8-kube-api-access-n7vfl\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:05 crc kubenswrapper[4958]: I1201 10:22:05.307513 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6xc7n" event={"ID":"ed6ba24c-b48f-4fac-b7e1-8acd44d44c82","Type":"ContainerDied","Data":"7f19471dc18d7baf196d2bf4d64c5bb2fb998ffdda61a28aa9329b1f4d58e0b6"} Dec 01 10:22:05 crc kubenswrapper[4958]: I1201 10:22:05.307582 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f19471dc18d7baf196d2bf4d64c5bb2fb998ffdda61a28aa9329b1f4d58e0b6" Dec 01 10:22:05 crc kubenswrapper[4958]: I1201 10:22:05.307537 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6xc7n" Dec 01 10:22:05 crc kubenswrapper[4958]: I1201 10:22:05.310213 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9sz8p" event={"ID":"de9d3333-6e5c-4663-9b79-d8fd9a9974a8","Type":"ContainerDied","Data":"f3e7e40a995b2ec99c679129db742c6c1cf00e8b1520ecee8f6358c41cd30999"} Dec 01 10:22:05 crc kubenswrapper[4958]: I1201 10:22:05.310275 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3e7e40a995b2ec99c679129db742c6c1cf00e8b1520ecee8f6358c41cd30999" Dec 01 10:22:05 crc kubenswrapper[4958]: I1201 10:22:05.310366 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9sz8p" Dec 01 10:22:05 crc kubenswrapper[4958]: I1201 10:22:05.333653 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sj694" event={"ID":"de2699dc-0b5f-40c4-a163-d73e30013069","Type":"ContainerDied","Data":"3e0f39414c8ecfb71a9040fffe8279f5bfdef9329e3b819da4782700c946bce2"} Dec 01 10:22:05 crc kubenswrapper[4958]: I1201 10:22:05.335606 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e0f39414c8ecfb71a9040fffe8279f5bfdef9329e3b819da4782700c946bce2" Dec 01 10:22:05 crc kubenswrapper[4958]: I1201 10:22:05.333990 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sj694" Dec 01 10:22:10 crc kubenswrapper[4958]: I1201 10:22:10.928269 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q4ccp" event={"ID":"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f","Type":"ContainerStarted","Data":"2c5862a636d2560b2ccb30504701cdf2c56d0b52932f967751e00af5eddbd64b"} Dec 01 10:22:10 crc kubenswrapper[4958]: I1201 10:22:10.942438 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerStarted","Data":"e6b9b28943742935972d97a1f62f5f52e2fde01492530eaa77b9b898fd2c9d84"} Dec 01 10:22:10 crc kubenswrapper[4958]: I1201 10:22:10.942552 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerStarted","Data":"c3c204443ad6b610f9de1fb0f8c5342451fbd72eb746d9e8e03f81b85991f68b"} Dec 01 10:22:10 crc kubenswrapper[4958]: I1201 10:22:10.942585 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerStarted","Data":"45a6973b5ad0742944d3a43778b05e2378e4234af599c52ee4c8711abdb4a457"} Dec 01 10:22:10 crc kubenswrapper[4958]: I1201 10:22:10.966833 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-q4ccp" podStartSLOduration=11.451509165 podStartE2EDuration="18.96680359s" podCreationTimestamp="2025-12-01 10:21:52 +0000 UTC" firstStartedPulling="2025-12-01 10:22:02.440092189 +0000 UTC m=+1369.948881236" lastFinishedPulling="2025-12-01 10:22:09.955386624 +0000 UTC m=+1377.464175661" observedRunningTime="2025-12-01 10:22:10.95742667 +0000 UTC m=+1378.466215717" watchObservedRunningTime="2025-12-01 10:22:10.96680359 +0000 UTC m=+1378.475592637" Dec 01 10:22:11 crc kubenswrapper[4958]: I1201 10:22:11.973513 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerStarted","Data":"4200bda3ba54fd990388524d97e1f6fdcba56a926272e8d50c30f8f1b903d07a"} Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.567020 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-5432-account-create-2x7ph"] Dec 01 10:22:12 crc kubenswrapper[4958]: E1201 10:22:12.567459 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de9d3333-6e5c-4663-9b79-d8fd9a9974a8" containerName="mariadb-database-create" Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.567479 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="de9d3333-6e5c-4663-9b79-d8fd9a9974a8" containerName="mariadb-database-create" Dec 01 10:22:12 crc kubenswrapper[4958]: E1201 10:22:12.567517 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6ba24c-b48f-4fac-b7e1-8acd44d44c82" containerName="mariadb-database-create" Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.567524 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6ba24c-b48f-4fac-b7e1-8acd44d44c82" containerName="mariadb-database-create" Dec 01 10:22:12 crc kubenswrapper[4958]: E1201 10:22:12.567540 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de2699dc-0b5f-40c4-a163-d73e30013069" containerName="mariadb-database-create" Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.567547 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2699dc-0b5f-40c4-a163-d73e30013069" containerName="mariadb-database-create" Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.567819 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed6ba24c-b48f-4fac-b7e1-8acd44d44c82" containerName="mariadb-database-create" Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.567858 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="de9d3333-6e5c-4663-9b79-d8fd9a9974a8" containerName="mariadb-database-create" Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.567881 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="de2699dc-0b5f-40c4-a163-d73e30013069" containerName="mariadb-database-create" Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.568536 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5432-account-create-2x7ph" Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.570574 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.585759 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5432-account-create-2x7ph"] Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.736952 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7hks\" (UniqueName: \"kubernetes.io/projected/13f40860-6a7c-4bb0-83a3-6f61c6da561c-kube-api-access-f7hks\") pod \"barbican-5432-account-create-2x7ph\" (UID: \"13f40860-6a7c-4bb0-83a3-6f61c6da561c\") " pod="openstack/barbican-5432-account-create-2x7ph" Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.789457 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-259f-account-create-fmfb6"] Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.791103 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-259f-account-create-fmfb6" Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.796008 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.825680 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-259f-account-create-fmfb6"] Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.840516 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7hks\" (UniqueName: \"kubernetes.io/projected/13f40860-6a7c-4bb0-83a3-6f61c6da561c-kube-api-access-f7hks\") pod \"barbican-5432-account-create-2x7ph\" (UID: \"13f40860-6a7c-4bb0-83a3-6f61c6da561c\") " pod="openstack/barbican-5432-account-create-2x7ph" Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.864148 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7hks\" (UniqueName: \"kubernetes.io/projected/13f40860-6a7c-4bb0-83a3-6f61c6da561c-kube-api-access-f7hks\") pod \"barbican-5432-account-create-2x7ph\" (UID: \"13f40860-6a7c-4bb0-83a3-6f61c6da561c\") " pod="openstack/barbican-5432-account-create-2x7ph" Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.902706 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5432-account-create-2x7ph" Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.949920 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntbvz\" (UniqueName: \"kubernetes.io/projected/f4231276-2ab4-4cfd-bb47-850bf407d477-kube-api-access-ntbvz\") pod \"cinder-259f-account-create-fmfb6\" (UID: \"f4231276-2ab4-4cfd-bb47-850bf407d477\") " pod="openstack/cinder-259f-account-create-fmfb6" Dec 01 10:22:12 crc kubenswrapper[4958]: I1201 10:22:12.985258 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-a095-account-create-pkl5w"] Dec 01 10:22:13 crc kubenswrapper[4958]: I1201 10:22:12.994710 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a095-account-create-pkl5w" Dec 01 10:22:13 crc kubenswrapper[4958]: I1201 10:22:12.995358 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerStarted","Data":"95afb3abf4bac2c438ad62074b66882bdaafff7f586ba8a6d0a76121f279399b"} Dec 01 10:22:13 crc kubenswrapper[4958]: I1201 10:22:12.995953 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a095-account-create-pkl5w"] Dec 01 10:22:13 crc kubenswrapper[4958]: I1201 10:22:12.997534 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 01 10:22:13 crc kubenswrapper[4958]: I1201 10:22:13.052000 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntbvz\" (UniqueName: \"kubernetes.io/projected/f4231276-2ab4-4cfd-bb47-850bf407d477-kube-api-access-ntbvz\") pod \"cinder-259f-account-create-fmfb6\" (UID: \"f4231276-2ab4-4cfd-bb47-850bf407d477\") " pod="openstack/cinder-259f-account-create-fmfb6" Dec 01 10:22:13 crc kubenswrapper[4958]: I1201 10:22:13.202800 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntbvz\" (UniqueName: \"kubernetes.io/projected/f4231276-2ab4-4cfd-bb47-850bf407d477-kube-api-access-ntbvz\") pod \"cinder-259f-account-create-fmfb6\" (UID: \"f4231276-2ab4-4cfd-bb47-850bf407d477\") " pod="openstack/cinder-259f-account-create-fmfb6" Dec 01 10:22:13 crc kubenswrapper[4958]: I1201 10:22:13.203930 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6bcf\" (UniqueName: \"kubernetes.io/projected/e3fa6e71-f3b7-4b98-a760-10a4ae7ed049-kube-api-access-c6bcf\") pod \"neutron-a095-account-create-pkl5w\" (UID: \"e3fa6e71-f3b7-4b98-a760-10a4ae7ed049\") " pod="openstack/neutron-a095-account-create-pkl5w" Dec 01 10:22:13 crc kubenswrapper[4958]: I1201 10:22:13.321076 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6bcf\" (UniqueName: \"kubernetes.io/projected/e3fa6e71-f3b7-4b98-a760-10a4ae7ed049-kube-api-access-c6bcf\") pod \"neutron-a095-account-create-pkl5w\" (UID: \"e3fa6e71-f3b7-4b98-a760-10a4ae7ed049\") " pod="openstack/neutron-a095-account-create-pkl5w" Dec 01 10:22:13 crc kubenswrapper[4958]: I1201 10:22:13.357742 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6bcf\" (UniqueName: \"kubernetes.io/projected/e3fa6e71-f3b7-4b98-a760-10a4ae7ed049-kube-api-access-c6bcf\") pod \"neutron-a095-account-create-pkl5w\" (UID: \"e3fa6e71-f3b7-4b98-a760-10a4ae7ed049\") " pod="openstack/neutron-a095-account-create-pkl5w" Dec 01 10:22:13 crc kubenswrapper[4958]: I1201 10:22:13.421342 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-259f-account-create-fmfb6" Dec 01 10:22:13 crc kubenswrapper[4958]: I1201 10:22:13.554486 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5432-account-create-2x7ph"] Dec 01 10:22:13 crc kubenswrapper[4958]: W1201 10:22:13.565420 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13f40860_6a7c_4bb0_83a3_6f61c6da561c.slice/crio-aff67ca35ede7b2dbc88b693cb6d5462ed1884b592ce6f9767609de7af30f6f6 WatchSource:0}: Error finding container aff67ca35ede7b2dbc88b693cb6d5462ed1884b592ce6f9767609de7af30f6f6: Status 404 returned error can't find the container with id aff67ca35ede7b2dbc88b693cb6d5462ed1884b592ce6f9767609de7af30f6f6 Dec 01 10:22:13 crc kubenswrapper[4958]: I1201 10:22:13.625723 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a095-account-create-pkl5w" Dec 01 10:22:13 crc kubenswrapper[4958]: I1201 10:22:13.916213 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-259f-account-create-fmfb6"] Dec 01 10:22:13 crc kubenswrapper[4958]: I1201 10:22:13.924774 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a095-account-create-pkl5w"] Dec 01 10:22:13 crc kubenswrapper[4958]: I1201 10:22:13.926528 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 01 10:22:13 crc kubenswrapper[4958]: W1201 10:22:13.936655 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3fa6e71_f3b7_4b98_a760_10a4ae7ed049.slice/crio-dea85770b58c15a0e1aac7f437ecfde5dd7182b9f8573f7dd6a766d5b51e23c3 WatchSource:0}: Error finding container dea85770b58c15a0e1aac7f437ecfde5dd7182b9f8573f7dd6a766d5b51e23c3: Status 404 returned error can't find the container with id dea85770b58c15a0e1aac7f437ecfde5dd7182b9f8573f7dd6a766d5b51e23c3 Dec 01 10:22:13 crc kubenswrapper[4958]: I1201 10:22:13.960517 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.013878 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a095-account-create-pkl5w" event={"ID":"e3fa6e71-f3b7-4b98-a760-10a4ae7ed049","Type":"ContainerStarted","Data":"dea85770b58c15a0e1aac7f437ecfde5dd7182b9f8573f7dd6a766d5b51e23c3"} Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.016706 4958 generic.go:334] "Generic (PLEG): container finished" podID="13f40860-6a7c-4bb0-83a3-6f61c6da561c" containerID="7da282837f15c5c330a6de482a830c5d5983b27e8b8422d3b3404e3df5d4677c" exitCode=0 Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.016804 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5432-account-create-2x7ph" event={"ID":"13f40860-6a7c-4bb0-83a3-6f61c6da561c","Type":"ContainerDied","Data":"7da282837f15c5c330a6de482a830c5d5983b27e8b8422d3b3404e3df5d4677c"} Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.016830 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5432-account-create-2x7ph" event={"ID":"13f40860-6a7c-4bb0-83a3-6f61c6da561c","Type":"ContainerStarted","Data":"aff67ca35ede7b2dbc88b693cb6d5462ed1884b592ce6f9767609de7af30f6f6"} Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.036506 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerStarted","Data":"2b1bd8bfd2b7718248c0a4322bcf007f21088eaf6f456a3a5abdc4aa9dd4a573"} Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.036570 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerStarted","Data":"f29a065b0ddf26169aabe7ae6ad8b6d3377b8f3389795449dc71e2742353d609"} Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.038650 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-259f-account-create-fmfb6" event={"ID":"f4231276-2ab4-4cfd-bb47-850bf407d477","Type":"ContainerStarted","Data":"b5358ed4a68eb4b11c38f5ad48cbe9949ccdbbcd9773cea377b83b4831001e4d"} Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.081532 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=38.359435547 podStartE2EDuration="59.081506181s" podCreationTimestamp="2025-12-01 10:21:15 +0000 UTC" firstStartedPulling="2025-12-01 10:21:49.223637022 +0000 UTC m=+1356.732426049" lastFinishedPulling="2025-12-01 10:22:09.945707636 +0000 UTC m=+1377.454496683" observedRunningTime="2025-12-01 10:22:14.07349744 +0000 UTC m=+1381.582286477" watchObservedRunningTime="2025-12-01 10:22:14.081506181 +0000 UTC m=+1381.590295218" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.413556 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-vnkgp"] Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.602910 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.620307 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-vnkgp\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.621268 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-vnkgp\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.621519 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-dns-svc\") pod \"dnsmasq-dns-764c5664d7-vnkgp\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.621648 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-vnkgp\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.621784 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-config\") pod \"dnsmasq-dns-764c5664d7-vnkgp\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.641147 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9xlt\" (UniqueName: \"kubernetes.io/projected/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-kube-api-access-x9xlt\") pod \"dnsmasq-dns-764c5664d7-vnkgp\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.620450 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.673531 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-vnkgp"] Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.743488 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-vnkgp\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.744277 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-dns-svc\") pod \"dnsmasq-dns-764c5664d7-vnkgp\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.744490 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-vnkgp\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.744569 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-config\") pod \"dnsmasq-dns-764c5664d7-vnkgp\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.744927 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9xlt\" (UniqueName: \"kubernetes.io/projected/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-kube-api-access-x9xlt\") pod \"dnsmasq-dns-764c5664d7-vnkgp\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.745176 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-vnkgp\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.745338 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-ovsdbserver-sb\") pod 
\"dnsmasq-dns-764c5664d7-vnkgp\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.745578 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-dns-svc\") pod \"dnsmasq-dns-764c5664d7-vnkgp\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.746084 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-vnkgp\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.746675 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-vnkgp\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.747168 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-config\") pod \"dnsmasq-dns-764c5664d7-vnkgp\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:14 crc kubenswrapper[4958]: I1201 10:22:14.774069 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9xlt\" (UniqueName: \"kubernetes.io/projected/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-kube-api-access-x9xlt\") pod \"dnsmasq-dns-764c5664d7-vnkgp\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:15 crc kubenswrapper[4958]: I1201 10:22:15.000227 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:15 crc kubenswrapper[4958]: I1201 10:22:15.051459 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4231276-2ab4-4cfd-bb47-850bf407d477" containerID="763193f6deea6c4c07d81eb81043b214061144fe250629bb1739bcf78abea07b" exitCode=0 Dec 01 10:22:15 crc kubenswrapper[4958]: I1201 10:22:15.051603 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-259f-account-create-fmfb6" event={"ID":"f4231276-2ab4-4cfd-bb47-850bf407d477","Type":"ContainerDied","Data":"763193f6deea6c4c07d81eb81043b214061144fe250629bb1739bcf78abea07b"} Dec 01 10:22:15 crc kubenswrapper[4958]: I1201 10:22:15.057123 4958 generic.go:334] "Generic (PLEG): container finished" podID="e3fa6e71-f3b7-4b98-a760-10a4ae7ed049" containerID="70bb5873249a6ef3c7a1f7875843e58f1ab41d3b195ac1f99ab710c1efd9105f" exitCode=0 Dec 01 10:22:15 crc kubenswrapper[4958]: I1201 10:22:15.057230 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a095-account-create-pkl5w" event={"ID":"e3fa6e71-f3b7-4b98-a760-10a4ae7ed049","Type":"ContainerDied","Data":"70bb5873249a6ef3c7a1f7875843e58f1ab41d3b195ac1f99ab710c1efd9105f"} Dec 01 10:22:15 crc kubenswrapper[4958]: I1201 10:22:15.065164 4958 generic.go:334] "Generic (PLEG): container finished" podID="f5e0c8a9-ee98-44c8-95b2-9595ea834b9f" containerID="2c5862a636d2560b2ccb30504701cdf2c56d0b52932f967751e00af5eddbd64b" exitCode=0 Dec 01 10:22:15 crc kubenswrapper[4958]: I1201 10:22:15.065574 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q4ccp" event={"ID":"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f","Type":"ContainerDied","Data":"2c5862a636d2560b2ccb30504701cdf2c56d0b52932f967751e00af5eddbd64b"} Dec 01 10:22:15 crc kubenswrapper[4958]: W1201 10:22:15.519097 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod358d4d03_da3c_44c6_a4a4_32f6146fa1d9.slice/crio-0793517bb2464dda463ba258f06057e5eb04140cd31fddb02f569cae0ddc4fdb WatchSource:0}: Error finding container 0793517bb2464dda463ba258f06057e5eb04140cd31fddb02f569cae0ddc4fdb: Status 404 returned error can't find the container with id 0793517bb2464dda463ba258f06057e5eb04140cd31fddb02f569cae0ddc4fdb Dec 01 10:22:15 crc kubenswrapper[4958]: I1201 10:22:15.521368 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-vnkgp"] Dec 01 10:22:15 crc kubenswrapper[4958]: I1201 10:22:15.529306 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5432-account-create-2x7ph" Dec 01 10:22:15 crc kubenswrapper[4958]: I1201 10:22:15.563747 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7hks\" (UniqueName: \"kubernetes.io/projected/13f40860-6a7c-4bb0-83a3-6f61c6da561c-kube-api-access-f7hks\") pod \"13f40860-6a7c-4bb0-83a3-6f61c6da561c\" (UID: \"13f40860-6a7c-4bb0-83a3-6f61c6da561c\") " Dec 01 10:22:15 crc kubenswrapper[4958]: I1201 10:22:15.569608 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f40860-6a7c-4bb0-83a3-6f61c6da561c-kube-api-access-f7hks" (OuterVolumeSpecName: "kube-api-access-f7hks") pod "13f40860-6a7c-4bb0-83a3-6f61c6da561c" (UID: "13f40860-6a7c-4bb0-83a3-6f61c6da561c"). InnerVolumeSpecName "kube-api-access-f7hks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:22:15 crc kubenswrapper[4958]: I1201 10:22:15.666408 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7hks\" (UniqueName: \"kubernetes.io/projected/13f40860-6a7c-4bb0-83a3-6f61c6da561c-kube-api-access-f7hks\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.079161 4958 generic.go:334] "Generic (PLEG): container finished" podID="358d4d03-da3c-44c6-a4a4-32f6146fa1d9" containerID="11ffc71d928f06a11d9f1087ae4610242eab2f7c6c9d9773032cbfbafcf090d6" exitCode=0 Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.079273 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" event={"ID":"358d4d03-da3c-44c6-a4a4-32f6146fa1d9","Type":"ContainerDied","Data":"11ffc71d928f06a11d9f1087ae4610242eab2f7c6c9d9773032cbfbafcf090d6"} Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.079704 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" event={"ID":"358d4d03-da3c-44c6-a4a4-32f6146fa1d9","Type":"ContainerStarted","Data":"0793517bb2464dda463ba258f06057e5eb04140cd31fddb02f569cae0ddc4fdb"} Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.083182 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5432-account-create-2x7ph" event={"ID":"13f40860-6a7c-4bb0-83a3-6f61c6da561c","Type":"ContainerDied","Data":"aff67ca35ede7b2dbc88b693cb6d5462ed1884b592ce6f9767609de7af30f6f6"} Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.083219 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5432-account-create-2x7ph" Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.083253 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aff67ca35ede7b2dbc88b693cb6d5462ed1884b592ce6f9767609de7af30f6f6" Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.507175 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a095-account-create-pkl5w" Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.543757 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-q4ccp" Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.579034 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-259f-account-create-fmfb6" Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.658403 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv5nt\" (UniqueName: \"kubernetes.io/projected/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f-kube-api-access-bv5nt\") pod \"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f\" (UID: \"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f\") " Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.658552 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f-config-data\") pod \"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f\" (UID: \"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f\") " Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.658665 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6bcf\" (UniqueName: \"kubernetes.io/projected/e3fa6e71-f3b7-4b98-a760-10a4ae7ed049-kube-api-access-c6bcf\") pod \"e3fa6e71-f3b7-4b98-a760-10a4ae7ed049\" (UID: \"e3fa6e71-f3b7-4b98-a760-10a4ae7ed049\") " Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.658754 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f-combined-ca-bundle\") pod \"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f\" (UID: \"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f\") " Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.665164 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3fa6e71-f3b7-4b98-a760-10a4ae7ed049-kube-api-access-c6bcf" (OuterVolumeSpecName: "kube-api-access-c6bcf") pod "e3fa6e71-f3b7-4b98-a760-10a4ae7ed049" (UID: "e3fa6e71-f3b7-4b98-a760-10a4ae7ed049"). InnerVolumeSpecName "kube-api-access-c6bcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.665347 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f-kube-api-access-bv5nt" (OuterVolumeSpecName: "kube-api-access-bv5nt") pod "f5e0c8a9-ee98-44c8-95b2-9595ea834b9f" (UID: "f5e0c8a9-ee98-44c8-95b2-9595ea834b9f"). InnerVolumeSpecName "kube-api-access-bv5nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.686955 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5e0c8a9-ee98-44c8-95b2-9595ea834b9f" (UID: "f5e0c8a9-ee98-44c8-95b2-9595ea834b9f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.705117 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f-config-data" (OuterVolumeSpecName: "config-data") pod "f5e0c8a9-ee98-44c8-95b2-9595ea834b9f" (UID: "f5e0c8a9-ee98-44c8-95b2-9595ea834b9f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.761098 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntbvz\" (UniqueName: \"kubernetes.io/projected/f4231276-2ab4-4cfd-bb47-850bf407d477-kube-api-access-ntbvz\") pod \"f4231276-2ab4-4cfd-bb47-850bf407d477\" (UID: \"f4231276-2ab4-4cfd-bb47-850bf407d477\") " Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.762033 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv5nt\" (UniqueName: \"kubernetes.io/projected/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f-kube-api-access-bv5nt\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.762065 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.762083 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6bcf\" (UniqueName: \"kubernetes.io/projected/e3fa6e71-f3b7-4b98-a760-10a4ae7ed049-kube-api-access-c6bcf\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.762096 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.765585 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4231276-2ab4-4cfd-bb47-850bf407d477-kube-api-access-ntbvz" (OuterVolumeSpecName: "kube-api-access-ntbvz") pod "f4231276-2ab4-4cfd-bb47-850bf407d477" (UID: "f4231276-2ab4-4cfd-bb47-850bf407d477"). InnerVolumeSpecName "kube-api-access-ntbvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:22:16 crc kubenswrapper[4958]: I1201 10:22:16.865219 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntbvz\" (UniqueName: \"kubernetes.io/projected/f4231276-2ab4-4cfd-bb47-850bf407d477-kube-api-access-ntbvz\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.096144 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-259f-account-create-fmfb6" event={"ID":"f4231276-2ab4-4cfd-bb47-850bf407d477","Type":"ContainerDied","Data":"b5358ed4a68eb4b11c38f5ad48cbe9949ccdbbcd9773cea377b83b4831001e4d"} Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.096680 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5358ed4a68eb4b11c38f5ad48cbe9949ccdbbcd9773cea377b83b4831001e4d" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.096192 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-259f-account-create-fmfb6" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.098259 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a095-account-create-pkl5w" event={"ID":"e3fa6e71-f3b7-4b98-a760-10a4ae7ed049","Type":"ContainerDied","Data":"dea85770b58c15a0e1aac7f437ecfde5dd7182b9f8573f7dd6a766d5b51e23c3"} Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.098286 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dea85770b58c15a0e1aac7f437ecfde5dd7182b9f8573f7dd6a766d5b51e23c3" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.098530 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a095-account-create-pkl5w" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.100405 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-q4ccp" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.100449 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q4ccp" event={"ID":"f5e0c8a9-ee98-44c8-95b2-9595ea834b9f","Type":"ContainerDied","Data":"97117379b771560831cfe8053256b40a3f6480f048c1b89fc5970425ee6a756f"} Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.100508 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97117379b771560831cfe8053256b40a3f6480f048c1b89fc5970425ee6a756f" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.102372 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" event={"ID":"358d4d03-da3c-44c6-a4a4-32f6146fa1d9","Type":"ContainerStarted","Data":"5fde18068adabdab5fd54e0f1fd886e21dcabc3f60fcd38d9230dacf2d936b12"} Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.102575 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.141342 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" podStartSLOduration=3.141318601 podStartE2EDuration="3.141318601s" podCreationTimestamp="2025-12-01 10:22:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:22:17.124104456 +0000 UTC m=+1384.632893493" watchObservedRunningTime="2025-12-01 10:22:17.141318601 +0000 UTC m=+1384.650107638" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.558702 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-k75sq"] Dec 01 10:22:17 crc kubenswrapper[4958]: E1201 10:22:17.559518 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5e0c8a9-ee98-44c8-95b2-9595ea834b9f" containerName="keystone-db-sync" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.559547 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5e0c8a9-ee98-44c8-95b2-9595ea834b9f" containerName="keystone-db-sync" Dec 01 10:22:17 crc kubenswrapper[4958]: E1201 10:22:17.559571 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13f40860-6a7c-4bb0-83a3-6f61c6da561c" containerName="mariadb-account-create" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.559580 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13f40860-6a7c-4bb0-83a3-6f61c6da561c" containerName="mariadb-account-create" Dec 01 
10:22:17 crc kubenswrapper[4958]: E1201 10:22:17.559599 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4231276-2ab4-4cfd-bb47-850bf407d477" containerName="mariadb-account-create" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.559608 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4231276-2ab4-4cfd-bb47-850bf407d477" containerName="mariadb-account-create" Dec 01 10:22:17 crc kubenswrapper[4958]: E1201 10:22:17.559637 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3fa6e71-f3b7-4b98-a760-10a4ae7ed049" containerName="mariadb-account-create" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.559649 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3fa6e71-f3b7-4b98-a760-10a4ae7ed049" containerName="mariadb-account-create" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.560014 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5e0c8a9-ee98-44c8-95b2-9595ea834b9f" containerName="keystone-db-sync" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.560055 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3fa6e71-f3b7-4b98-a760-10a4ae7ed049" containerName="mariadb-account-create" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.560069 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13f40860-6a7c-4bb0-83a3-6f61c6da561c" containerName="mariadb-account-create" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.560083 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4231276-2ab4-4cfd-bb47-850bf407d477" containerName="mariadb-account-create" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.560966 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.567544 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.573178 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.574502 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-62dpp" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.582066 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.667480 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-credential-keys\") pod \"keystone-bootstrap-k75sq\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.667588 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-scripts\") pod \"keystone-bootstrap-k75sq\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.667698 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vzmg\" (UniqueName: 
\"kubernetes.io/projected/e866437f-6eec-468d-a421-cc967da03db1-kube-api-access-2vzmg\") pod \"keystone-bootstrap-k75sq\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.667768 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-combined-ca-bundle\") pod \"keystone-bootstrap-k75sq\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.667798 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-config-data\") pod \"keystone-bootstrap-k75sq\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.667947 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-fernet-keys\") pod \"keystone-bootstrap-k75sq\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.669947 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k75sq"] Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.746461 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-vnkgp"] Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.776718 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-scripts\") pod \"keystone-bootstrap-k75sq\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.777170 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vzmg\" (UniqueName: \"kubernetes.io/projected/e866437f-6eec-468d-a421-cc967da03db1-kube-api-access-2vzmg\") pod \"keystone-bootstrap-k75sq\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.777306 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-combined-ca-bundle\") pod \"keystone-bootstrap-k75sq\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.777347 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-config-data\") pod \"keystone-bootstrap-k75sq\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.777532 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-fernet-keys\") pod \"keystone-bootstrap-k75sq\" (UID: 
\"e866437f-6eec-468d-a421-cc967da03db1\") " pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.777640 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-credential-keys\") pod \"keystone-bootstrap-k75sq\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.791354 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-scripts\") pod \"keystone-bootstrap-k75sq\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.795971 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-combined-ca-bundle\") pod \"keystone-bootstrap-k75sq\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.796246 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-fernet-keys\") pod \"keystone-bootstrap-k75sq\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.798443 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-credential-keys\") pod \"keystone-bootstrap-k75sq\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.806719 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-config-data\") pod \"keystone-bootstrap-k75sq\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.818533 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vzmg\" (UniqueName: \"kubernetes.io/projected/e866437f-6eec-468d-a421-cc967da03db1-kube-api-access-2vzmg\") pod \"keystone-bootstrap-k75sq\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.883385 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-bnscj"] Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.899209 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.923919 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-bnscj"] Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.987642 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-bnscj\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.987699 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-bnscj\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.987726 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-dns-svc\") pod \"dnsmasq-dns-5959f8865f-bnscj\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.987782 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-config\") pod \"dnsmasq-dns-5959f8865f-bnscj\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.987923 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-bnscj\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:17 crc kubenswrapper[4958]: I1201 10:22:17.987961 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxxqh\" (UniqueName: \"kubernetes.io/projected/69b2a8de-640a-4212-b033-7fa41997e1cd-kube-api-access-dxxqh\") pod \"dnsmasq-dns-5959f8865f-bnscj\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.003272 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.004790 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.007410 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.011566 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.023214 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.049455 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-bnscj"] Dec 01 10:22:18 crc kubenswrapper[4958]: E1201 10:22:18.050610 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-dxxqh ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5959f8865f-bnscj" podUID="69b2a8de-640a-4212-b033-7fa41997e1cd" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.067182 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-6864p"] Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.068881 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6864p" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.074059 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-s6l4v" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.074293 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.074421 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.092656 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-config-data\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.092721 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88a7373f-c8a8-4721-bf49-1ffc1887309e-logs\") pod \"placement-db-sync-6864p\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " pod="openstack/placement-db-sync-6864p" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.092776 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-config\") pod \"dnsmasq-dns-5959f8865f-bnscj\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.092898 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.092948 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vhc2\" (UniqueName: 
\"kubernetes.io/projected/88a7373f-c8a8-4721-bf49-1ffc1887309e-kube-api-access-8vhc2\") pod \"placement-db-sync-6864p\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " pod="openstack/placement-db-sync-6864p" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.093021 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-scripts\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.093054 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62be9aa8-5618-470f-990a-448f46a926cf-run-httpd\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.093086 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a7373f-c8a8-4721-bf49-1ffc1887309e-scripts\") pod \"placement-db-sync-6864p\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " pod="openstack/placement-db-sync-6864p" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.093131 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrnzm\" (UniqueName: \"kubernetes.io/projected/62be9aa8-5618-470f-990a-448f46a926cf-kube-api-access-zrnzm\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.093176 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.093233 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-bnscj\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.093269 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxxqh\" (UniqueName: \"kubernetes.io/projected/69b2a8de-640a-4212-b033-7fa41997e1cd-kube-api-access-dxxqh\") pod \"dnsmasq-dns-5959f8865f-bnscj\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.093347 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a7373f-c8a8-4721-bf49-1ffc1887309e-config-data\") pod \"placement-db-sync-6864p\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " pod="openstack/placement-db-sync-6864p" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.093386 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a7373f-c8a8-4721-bf49-1ffc1887309e-combined-ca-bundle\") pod 
\"placement-db-sync-6864p\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " pod="openstack/placement-db-sync-6864p" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.093414 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-bnscj\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.093465 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-bnscj\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.093559 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-dns-svc\") pod \"dnsmasq-dns-5959f8865f-bnscj\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.093590 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62be9aa8-5618-470f-990a-448f46a926cf-log-httpd\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.097013 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-config\") pod \"dnsmasq-dns-5959f8865f-bnscj\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.097192 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-bnscj\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.098118 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-bnscj\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.098765 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-dns-svc\") pod \"dnsmasq-dns-5959f8865f-bnscj\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.099163 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-bnscj\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:18 crc 
kubenswrapper[4958]: I1201 10:22:18.124665 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-6864p"] Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.136468 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxxqh\" (UniqueName: \"kubernetes.io/projected/69b2a8de-640a-4212-b033-7fa41997e1cd-kube-api-access-dxxqh\") pod \"dnsmasq-dns-5959f8865f-bnscj\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.141605 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.143211 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-6frrm"] Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.153389 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.156907 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.174194 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-6frrm"] Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.186545 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-8zf5v"] Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.188250 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8zf5v" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.198701 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ld9kv" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.198755 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.202486 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwlsq\" (UniqueName: \"kubernetes.io/projected/a3dc24bc-1938-490b-a88a-286e6af1d269-kube-api-access-dwlsq\") pod \"barbican-db-sync-8zf5v\" (UID: \"a3dc24bc-1938-490b-a88a-286e6af1d269\") " pod="openstack/barbican-db-sync-8zf5v" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.202576 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.202616 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-6frrm\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.202665 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vhc2\" (UniqueName: \"kubernetes.io/projected/88a7373f-c8a8-4721-bf49-1ffc1887309e-kube-api-access-8vhc2\") pod \"placement-db-sync-6864p\" (UID: 
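Every kubenswrapper entry in this journal carries the klog header: a severity letter (I/W/E), an MMDD date, wall-clock time, the PID (4958 throughout), and the emitting source file:line. A small sketch that pulls those fields apart; the regular expression is an assumption fitted to the lines in this journal, not an official grammar:

```go
package main

import (
	"fmt"
	"regexp"
)

// klogHeader matches the glog/klog prefix seen throughout this log:
// severity letter, MMDD date, time, PID, and source file:line.
var klogHeader = regexp.MustCompile(`^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+) ([\w./-]+:\d+)\]`)

func main() {
	line := `I1201 10:22:18.124665    4958 kubelet.go:2428] "SyncLoop UPDATE" source="api"`
	if m := klogHeader.FindStringSubmatch(line); m != nil {
		fmt.Printf("severity=%s date=%s time=%s pid=%s at=%s\n", m[1], m[2], m[3], m[4], m[5])
	}
}
```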
\"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " pod="openstack/placement-db-sync-6864p" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.202711 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-scripts\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.202748 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62be9aa8-5618-470f-990a-448f46a926cf-run-httpd\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.202775 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a7373f-c8a8-4721-bf49-1ffc1887309e-scripts\") pod \"placement-db-sync-6864p\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " pod="openstack/placement-db-sync-6864p" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.202801 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9khx\" (UniqueName: \"kubernetes.io/projected/376e7f67-87de-4bab-91d1-5624ee49a979-kube-api-access-z9khx\") pod \"dnsmasq-dns-58dd9ff6bc-6frrm\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.202829 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrnzm\" (UniqueName: \"kubernetes.io/projected/62be9aa8-5618-470f-990a-448f46a926cf-kube-api-access-zrnzm\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.202880 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-6frrm\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.202912 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.202939 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-6frrm\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.202964 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-6frrm\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 
10:22:18.202988 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a3dc24bc-1938-490b-a88a-286e6af1d269-db-sync-config-data\") pod \"barbican-db-sync-8zf5v\" (UID: \"a3dc24bc-1938-490b-a88a-286e6af1d269\") " pod="openstack/barbican-db-sync-8zf5v" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.203024 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-config\") pod \"dnsmasq-dns-58dd9ff6bc-6frrm\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.203093 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a7373f-c8a8-4721-bf49-1ffc1887309e-config-data\") pod \"placement-db-sync-6864p\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " pod="openstack/placement-db-sync-6864p" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.203132 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a7373f-c8a8-4721-bf49-1ffc1887309e-combined-ca-bundle\") pod \"placement-db-sync-6864p\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " pod="openstack/placement-db-sync-6864p" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.203177 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62be9aa8-5618-470f-990a-448f46a926cf-log-httpd\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.203233 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dc24bc-1938-490b-a88a-286e6af1d269-combined-ca-bundle\") pod \"barbican-db-sync-8zf5v\" (UID: \"a3dc24bc-1938-490b-a88a-286e6af1d269\") " pod="openstack/barbican-db-sync-8zf5v" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.203280 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-config-data\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.203298 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88a7373f-c8a8-4721-bf49-1ffc1887309e-logs\") pod \"placement-db-sync-6864p\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " pod="openstack/placement-db-sync-6864p" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.205637 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62be9aa8-5618-470f-990a-448f46a926cf-run-httpd\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.207087 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88a7373f-c8a8-4721-bf49-1ffc1887309e-logs\") pod 
\"placement-db-sync-6864p\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " pod="openstack/placement-db-sync-6864p" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.209313 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62be9aa8-5618-470f-990a-448f46a926cf-log-httpd\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.209428 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a7373f-c8a8-4721-bf49-1ffc1887309e-combined-ca-bundle\") pod \"placement-db-sync-6864p\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " pod="openstack/placement-db-sync-6864p" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.211167 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a7373f-c8a8-4721-bf49-1ffc1887309e-scripts\") pod \"placement-db-sync-6864p\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " pod="openstack/placement-db-sync-6864p" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.211278 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8zf5v"] Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.216004 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a7373f-c8a8-4721-bf49-1ffc1887309e-config-data\") pod \"placement-db-sync-6864p\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " pod="openstack/placement-db-sync-6864p" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.219263 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.221079 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.223318 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-scripts\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.226631 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-config-data\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.227730 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.234013 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-qk422"] Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.240496 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.241646 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrnzm\" (UniqueName: \"kubernetes.io/projected/62be9aa8-5618-470f-990a-448f46a926cf-kube-api-access-zrnzm\") pod \"ceilometer-0\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.257625 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.257910 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.258140 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-bbpvk" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.263994 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qk422"] Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.286390 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vhc2\" (UniqueName: \"kubernetes.io/projected/88a7373f-c8a8-4721-bf49-1ffc1887309e-kube-api-access-8vhc2\") pod \"placement-db-sync-6864p\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " pod="openstack/placement-db-sync-6864p" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.306065 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-dns-svc\") pod \"69b2a8de-640a-4212-b033-7fa41997e1cd\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.306126 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-ovsdbserver-sb\") pod \"69b2a8de-640a-4212-b033-7fa41997e1cd\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.306188 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-config\") pod \"69b2a8de-640a-4212-b033-7fa41997e1cd\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.306243 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-dns-swift-storage-0\") pod \"69b2a8de-640a-4212-b033-7fa41997e1cd\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.306368 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-ovsdbserver-nb\") pod \"69b2a8de-640a-4212-b033-7fa41997e1cd\" (UID: \"69b2a8de-640a-4212-b033-7fa41997e1cd\") " Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.306409 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxxqh\" (UniqueName: \"kubernetes.io/projected/69b2a8de-640a-4212-b033-7fa41997e1cd-kube-api-access-dxxqh\") pod \"69b2a8de-640a-4212-b033-7fa41997e1cd\" (UID: 
\"69b2a8de-640a-4212-b033-7fa41997e1cd\") " Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.307008 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-config-data\") pod \"cinder-db-sync-qk422\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.307160 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78nml\" (UniqueName: \"kubernetes.io/projected/f960833c-3f07-4613-8d69-b7c563b3dd5d-kube-api-access-78nml\") pod \"cinder-db-sync-qk422\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.307204 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dc24bc-1938-490b-a88a-286e6af1d269-combined-ca-bundle\") pod \"barbican-db-sync-8zf5v\" (UID: \"a3dc24bc-1938-490b-a88a-286e6af1d269\") " pod="openstack/barbican-db-sync-8zf5v" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.307274 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwlsq\" (UniqueName: \"kubernetes.io/projected/a3dc24bc-1938-490b-a88a-286e6af1d269-kube-api-access-dwlsq\") pod \"barbican-db-sync-8zf5v\" (UID: \"a3dc24bc-1938-490b-a88a-286e6af1d269\") " pod="openstack/barbican-db-sync-8zf5v" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.307300 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f960833c-3f07-4613-8d69-b7c563b3dd5d-etc-machine-id\") pod \"cinder-db-sync-qk422\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.307349 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-combined-ca-bundle\") pod \"cinder-db-sync-qk422\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.307410 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-6frrm\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.307526 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9khx\" (UniqueName: \"kubernetes.io/projected/376e7f67-87de-4bab-91d1-5624ee49a979-kube-api-access-z9khx\") pod \"dnsmasq-dns-58dd9ff6bc-6frrm\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.307564 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-6frrm\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " 
pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.307592 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-db-sync-config-data\") pod \"cinder-db-sync-qk422\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.307629 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-scripts\") pod \"cinder-db-sync-qk422\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.307660 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-6frrm\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.307689 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a3dc24bc-1938-490b-a88a-286e6af1d269-db-sync-config-data\") pod \"barbican-db-sync-8zf5v\" (UID: \"a3dc24bc-1938-490b-a88a-286e6af1d269\") " pod="openstack/barbican-db-sync-8zf5v" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.307712 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-6frrm\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.307761 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-config\") pod \"dnsmasq-dns-58dd9ff6bc-6frrm\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.311103 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-config\") pod \"dnsmasq-dns-58dd9ff6bc-6frrm\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.311895 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "69b2a8de-640a-4212-b033-7fa41997e1cd" (UID: "69b2a8de-640a-4212-b033-7fa41997e1cd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.312342 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-config" (OuterVolumeSpecName: "config") pod "69b2a8de-640a-4212-b033-7fa41997e1cd" (UID: "69b2a8de-640a-4212-b033-7fa41997e1cd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.312961 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-6frrm\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.313528 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "69b2a8de-640a-4212-b033-7fa41997e1cd" (UID: "69b2a8de-640a-4212-b033-7fa41997e1cd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.314322 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "69b2a8de-640a-4212-b033-7fa41997e1cd" (UID: "69b2a8de-640a-4212-b033-7fa41997e1cd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.315204 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "69b2a8de-640a-4212-b033-7fa41997e1cd" (UID: "69b2a8de-640a-4212-b033-7fa41997e1cd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.316364 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-6frrm\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.317653 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-6frrm\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.318209 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-6frrm\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.318681 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69b2a8de-640a-4212-b033-7fa41997e1cd-kube-api-access-dxxqh" (OuterVolumeSpecName: "kube-api-access-dxxqh") pod "69b2a8de-640a-4212-b033-7fa41997e1cd" (UID: "69b2a8de-640a-4212-b033-7fa41997e1cd"). InnerVolumeSpecName "kube-api-access-dxxqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.320749 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dc24bc-1938-490b-a88a-286e6af1d269-combined-ca-bundle\") pod \"barbican-db-sync-8zf5v\" (UID: \"a3dc24bc-1938-490b-a88a-286e6af1d269\") " pod="openstack/barbican-db-sync-8zf5v" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.320800 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a3dc24bc-1938-490b-a88a-286e6af1d269-db-sync-config-data\") pod \"barbican-db-sync-8zf5v\" (UID: \"a3dc24bc-1938-490b-a88a-286e6af1d269\") " pod="openstack/barbican-db-sync-8zf5v" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.334163 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-cktqn"] Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.362909 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9khx\" (UniqueName: \"kubernetes.io/projected/376e7f67-87de-4bab-91d1-5624ee49a979-kube-api-access-z9khx\") pod \"dnsmasq-dns-58dd9ff6bc-6frrm\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.381332 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cktqn" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.386194 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.388057 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwlsq\" (UniqueName: \"kubernetes.io/projected/a3dc24bc-1938-490b-a88a-286e6af1d269-kube-api-access-dwlsq\") pod \"barbican-db-sync-8zf5v\" (UID: \"a3dc24bc-1938-490b-a88a-286e6af1d269\") " pod="openstack/barbican-db-sync-8zf5v" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.389713 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.421083 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6qlnd" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.422254 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-config-data\") pod \"cinder-db-sync-qk422\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.425332 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cktqn"] Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.425455 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78nml\" (UniqueName: \"kubernetes.io/projected/f960833c-3f07-4613-8d69-b7c563b3dd5d-kube-api-access-78nml\") pod \"cinder-db-sync-qk422\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.425656 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/f960833c-3f07-4613-8d69-b7c563b3dd5d-etc-machine-id\") pod \"cinder-db-sync-qk422\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.425727 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-combined-ca-bundle\") pod \"cinder-db-sync-qk422\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.425970 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-db-sync-config-data\") pod \"cinder-db-sync-qk422\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.426020 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-scripts\") pod \"cinder-db-sync-qk422\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.426255 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.426282 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.426303 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.426314 4958 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.426326 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69b2a8de-640a-4212-b033-7fa41997e1cd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.426342 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxxqh\" (UniqueName: \"kubernetes.io/projected/69b2a8de-640a-4212-b033-7fa41997e1cd-kube-api-access-dxxqh\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.426476 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f960833c-3f07-4613-8d69-b7c563b3dd5d-etc-machine-id\") pod \"cinder-db-sync-qk422\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.434080 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-db-sync-config-data\") pod 
\"cinder-db-sync-qk422\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.435178 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.443666 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-scripts\") pod \"cinder-db-sync-qk422\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.456832 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-combined-ca-bundle\") pod \"cinder-db-sync-qk422\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.458042 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78nml\" (UniqueName: \"kubernetes.io/projected/f960833c-3f07-4613-8d69-b7c563b3dd5d-kube-api-access-78nml\") pod \"cinder-db-sync-qk422\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.459583 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-config-data\") pod \"cinder-db-sync-qk422\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.507759 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6864p" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.528645 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/81ffec42-04be-4d06-b6d8-44c5b1eb1d53-config\") pod \"neutron-db-sync-cktqn\" (UID: \"81ffec42-04be-4d06-b6d8-44c5b1eb1d53\") " pod="openstack/neutron-db-sync-cktqn" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.528729 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ffec42-04be-4d06-b6d8-44c5b1eb1d53-combined-ca-bundle\") pod \"neutron-db-sync-cktqn\" (UID: \"81ffec42-04be-4d06-b6d8-44c5b1eb1d53\") " pod="openstack/neutron-db-sync-cktqn" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.528981 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qq7p\" (UniqueName: \"kubernetes.io/projected/81ffec42-04be-4d06-b6d8-44c5b1eb1d53-kube-api-access-8qq7p\") pod \"neutron-db-sync-cktqn\" (UID: \"81ffec42-04be-4d06-b6d8-44c5b1eb1d53\") " pod="openstack/neutron-db-sync-cktqn" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.571307 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.597600 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-8zf5v" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.612729 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qk422" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.630761 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/81ffec42-04be-4d06-b6d8-44c5b1eb1d53-config\") pod \"neutron-db-sync-cktqn\" (UID: \"81ffec42-04be-4d06-b6d8-44c5b1eb1d53\") " pod="openstack/neutron-db-sync-cktqn" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.630857 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ffec42-04be-4d06-b6d8-44c5b1eb1d53-combined-ca-bundle\") pod \"neutron-db-sync-cktqn\" (UID: \"81ffec42-04be-4d06-b6d8-44c5b1eb1d53\") " pod="openstack/neutron-db-sync-cktqn" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.630965 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qq7p\" (UniqueName: \"kubernetes.io/projected/81ffec42-04be-4d06-b6d8-44c5b1eb1d53-kube-api-access-8qq7p\") pod \"neutron-db-sync-cktqn\" (UID: \"81ffec42-04be-4d06-b6d8-44c5b1eb1d53\") " pod="openstack/neutron-db-sync-cktqn" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.637975 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/81ffec42-04be-4d06-b6d8-44c5b1eb1d53-config\") pod \"neutron-db-sync-cktqn\" (UID: \"81ffec42-04be-4d06-b6d8-44c5b1eb1d53\") " pod="openstack/neutron-db-sync-cktqn" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.648491 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ffec42-04be-4d06-b6d8-44c5b1eb1d53-combined-ca-bundle\") pod \"neutron-db-sync-cktqn\" (UID: \"81ffec42-04be-4d06-b6d8-44c5b1eb1d53\") " pod="openstack/neutron-db-sync-cktqn" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.663068 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qq7p\" (UniqueName: \"kubernetes.io/projected/81ffec42-04be-4d06-b6d8-44c5b1eb1d53-kube-api-access-8qq7p\") pod \"neutron-db-sync-cktqn\" (UID: \"81ffec42-04be-4d06-b6d8-44c5b1eb1d53\") " pod="openstack/neutron-db-sync-cktqn" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.787077 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cktqn" Dec 01 10:22:18 crc kubenswrapper[4958]: I1201 10:22:18.849110 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k75sq"] Dec 01 10:22:18 crc kubenswrapper[4958]: W1201 10:22:18.851280 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode866437f_6eec_468d_a421_cc967da03db1.slice/crio-5f1ef2848a807d93e272a836d5a613c9fc5587e35e7c240d846f333067104e76 WatchSource:0}: Error finding container 5f1ef2848a807d93e272a836d5a613c9fc5587e35e7c240d846f333067104e76: Status 404 returned error can't find the container with id 5f1ef2848a807d93e272a836d5a613c9fc5587e35e7c240d846f333067104e76 Dec 01 10:22:19 crc kubenswrapper[4958]: I1201 10:22:19.266670 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-bnscj" Dec 01 10:22:19 crc kubenswrapper[4958]: I1201 10:22:19.268241 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k75sq" event={"ID":"e866437f-6eec-468d-a421-cc967da03db1","Type":"ContainerStarted","Data":"5f1ef2848a807d93e272a836d5a613c9fc5587e35e7c240d846f333067104e76"} Dec 01 10:22:19 crc kubenswrapper[4958]: I1201 10:22:19.268681 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" podUID="358d4d03-da3c-44c6-a4a4-32f6146fa1d9" containerName="dnsmasq-dns" containerID="cri-o://5fde18068adabdab5fd54e0f1fd886e21dcabc3f60fcd38d9230dacf2d936b12" gracePeriod=10 Dec 01 10:22:19 crc kubenswrapper[4958]: I1201 10:22:19.273517 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:22:19 crc kubenswrapper[4958]: I1201 10:22:19.362207 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-6864p"] Dec 01 10:22:19 crc kubenswrapper[4958]: I1201 10:22:19.444522 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-bnscj"] Dec 01 10:22:19 crc kubenswrapper[4958]: I1201 10:22:19.458663 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-bnscj"] Dec 01 10:22:19 crc kubenswrapper[4958]: I1201 10:22:19.506663 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8zf5v"] Dec 01 10:22:19 crc kubenswrapper[4958]: I1201 10:22:19.628417 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-6frrm"] Dec 01 10:22:19 crc kubenswrapper[4958]: I1201 10:22:19.711867 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qk422"] Dec 01 10:22:19 crc kubenswrapper[4958]: W1201 10:22:19.722472 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf960833c_3f07_4613_8d69_b7c563b3dd5d.slice/crio-e375a610c469db4dff9fe7d098ca72f4e065b4ddc97556d46fb8427ea8e02343 WatchSource:0}: Error finding container e375a610c469db4dff9fe7d098ca72f4e065b4ddc97556d46fb8427ea8e02343: Status 404 returned error can't find the container with id e375a610c469db4dff9fe7d098ca72f4e065b4ddc97556d46fb8427ea8e02343 Dec 01 10:22:19 crc kubenswrapper[4958]: I1201 10:22:19.781086 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cktqn"] Dec 01 10:22:19 crc kubenswrapper[4958]: W1201 10:22:19.789032 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81ffec42_04be_4d06_b6d8_44c5b1eb1d53.slice/crio-2a2eec7c23b20be81a99db6ef8bbac94e2a7e22310373e210f80b3dc870a8873 WatchSource:0}: Error finding container 2a2eec7c23b20be81a99db6ef8bbac94e2a7e22310373e210f80b3dc870a8873: Status 404 returned error can't find the container with id 2a2eec7c23b20be81a99db6ef8bbac94e2a7e22310373e210f80b3dc870a8873 Dec 01 10:22:19 crc kubenswrapper[4958]: I1201 10:22:19.814665 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69b2a8de-640a-4212-b033-7fa41997e1cd" path="/var/lib/kubelet/pods/69b2a8de-640a-4212-b033-7fa41997e1cd/volumes" Dec 01 10:22:19 crc kubenswrapper[4958]: I1201 10:22:19.993251 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.048667 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-ovsdbserver-nb\") pod \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.049035 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-dns-swift-storage-0\") pod \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.049191 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-dns-svc\") pod \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.049666 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-ovsdbserver-sb\") pod \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.049744 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9xlt\" (UniqueName: \"kubernetes.io/projected/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-kube-api-access-x9xlt\") pod \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.049775 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-config\") pod \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\" (UID: \"358d4d03-da3c-44c6-a4a4-32f6146fa1d9\") " Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.106378 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-kube-api-access-x9xlt" (OuterVolumeSpecName: "kube-api-access-x9xlt") pod "358d4d03-da3c-44c6-a4a4-32f6146fa1d9" (UID: "358d4d03-da3c-44c6-a4a4-32f6146fa1d9"). InnerVolumeSpecName "kube-api-access-x9xlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.155590 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9xlt\" (UniqueName: \"kubernetes.io/projected/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-kube-api-access-x9xlt\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.189311 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "358d4d03-da3c-44c6-a4a4-32f6146fa1d9" (UID: "358d4d03-da3c-44c6-a4a4-32f6146fa1d9"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.200516 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.258082 4958 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.296289 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8zf5v" event={"ID":"a3dc24bc-1938-490b-a88a-286e6af1d269","Type":"ContainerStarted","Data":"42a0e6e62eafc24b13f646543d7b96eaaea8bd5997aadc2261c996baf31ac700"} Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.302328 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qk422" event={"ID":"f960833c-3f07-4613-8d69-b7c563b3dd5d","Type":"ContainerStarted","Data":"e375a610c469db4dff9fe7d098ca72f4e065b4ddc97556d46fb8427ea8e02343"} Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.305303 4958 generic.go:334] "Generic (PLEG): container finished" podID="358d4d03-da3c-44c6-a4a4-32f6146fa1d9" containerID="5fde18068adabdab5fd54e0f1fd886e21dcabc3f60fcd38d9230dacf2d936b12" exitCode=0 Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.305359 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" event={"ID":"358d4d03-da3c-44c6-a4a4-32f6146fa1d9","Type":"ContainerDied","Data":"5fde18068adabdab5fd54e0f1fd886e21dcabc3f60fcd38d9230dacf2d936b12"} Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.305402 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" event={"ID":"358d4d03-da3c-44c6-a4a4-32f6146fa1d9","Type":"ContainerDied","Data":"0793517bb2464dda463ba258f06057e5eb04140cd31fddb02f569cae0ddc4fdb"} Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.305418 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-vnkgp" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.305438 4958 scope.go:117] "RemoveContainer" containerID="5fde18068adabdab5fd54e0f1fd886e21dcabc3f60fcd38d9230dacf2d936b12" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.307259 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" event={"ID":"376e7f67-87de-4bab-91d1-5624ee49a979","Type":"ContainerStarted","Data":"0132f0534053c14480c52707703eb5da32d9624e2a1e486b96acb0d1e79bf5fc"} Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.312524 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k75sq" event={"ID":"e866437f-6eec-468d-a421-cc967da03db1","Type":"ContainerStarted","Data":"f8e01024d3d1f7366ea417fc2cc49c8d648cbdcab196552e1b491adcaca19787"} Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.314568 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62be9aa8-5618-470f-990a-448f46a926cf","Type":"ContainerStarted","Data":"bc77f0f8dd60f6d0fe868e7ee78f7db2c867f12ba469932074ce6643205e195e"} Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.317229 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cktqn" event={"ID":"81ffec42-04be-4d06-b6d8-44c5b1eb1d53","Type":"ContainerStarted","Data":"2a2eec7c23b20be81a99db6ef8bbac94e2a7e22310373e210f80b3dc870a8873"} Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.318801 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6864p" event={"ID":"88a7373f-c8a8-4721-bf49-1ffc1887309e","Type":"ContainerStarted","Data":"f3fd8dd027143b1b7f64ef4b9d92a757f58fbe4f74251866bc8bd0745e740a1f"} Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.343686 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-k75sq" podStartSLOduration=3.343659643 podStartE2EDuration="3.343659643s" podCreationTimestamp="2025-12-01 10:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:22:20.339913886 +0000 UTC m=+1387.848702913" watchObservedRunningTime="2025-12-01 10:22:20.343659643 +0000 UTC m=+1387.852448680" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.459145 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "358d4d03-da3c-44c6-a4a4-32f6146fa1d9" (UID: "358d4d03-da3c-44c6-a4a4-32f6146fa1d9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.463432 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.487087 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "358d4d03-da3c-44c6-a4a4-32f6146fa1d9" (UID: "358d4d03-da3c-44c6-a4a4-32f6146fa1d9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.494829 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-config" (OuterVolumeSpecName: "config") pod "358d4d03-da3c-44c6-a4a4-32f6146fa1d9" (UID: "358d4d03-da3c-44c6-a4a4-32f6146fa1d9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.517726 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "358d4d03-da3c-44c6-a4a4-32f6146fa1d9" (UID: "358d4d03-da3c-44c6-a4a4-32f6146fa1d9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.520661 4958 scope.go:117] "RemoveContainer" containerID="11ffc71d928f06a11d9f1087ae4610242eab2f7c6c9d9773032cbfbafcf090d6" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.566242 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.566280 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.566295 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/358d4d03-da3c-44c6-a4a4-32f6146fa1d9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.590600 4958 scope.go:117] "RemoveContainer" containerID="5fde18068adabdab5fd54e0f1fd886e21dcabc3f60fcd38d9230dacf2d936b12" Dec 01 10:22:20 crc kubenswrapper[4958]: E1201 10:22:20.596921 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fde18068adabdab5fd54e0f1fd886e21dcabc3f60fcd38d9230dacf2d936b12\": container with ID starting with 5fde18068adabdab5fd54e0f1fd886e21dcabc3f60fcd38d9230dacf2d936b12 not found: ID does not exist" containerID="5fde18068adabdab5fd54e0f1fd886e21dcabc3f60fcd38d9230dacf2d936b12" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.597004 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fde18068adabdab5fd54e0f1fd886e21dcabc3f60fcd38d9230dacf2d936b12"} err="failed to get container status \"5fde18068adabdab5fd54e0f1fd886e21dcabc3f60fcd38d9230dacf2d936b12\": rpc error: code = NotFound desc = could not find container \"5fde18068adabdab5fd54e0f1fd886e21dcabc3f60fcd38d9230dacf2d936b12\": container with ID starting with 5fde18068adabdab5fd54e0f1fd886e21dcabc3f60fcd38d9230dacf2d936b12 not found: ID does not exist" Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.597039 4958 scope.go:117] "RemoveContainer" containerID="11ffc71d928f06a11d9f1087ae4610242eab2f7c6c9d9773032cbfbafcf090d6" Dec 01 10:22:20 crc kubenswrapper[4958]: E1201 10:22:20.597710 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11ffc71d928f06a11d9f1087ae4610242eab2f7c6c9d9773032cbfbafcf090d6\": container 
Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.597767 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ffc71d928f06a11d9f1087ae4610242eab2f7c6c9d9773032cbfbafcf090d6"} err="failed to get container status \"11ffc71d928f06a11d9f1087ae4610242eab2f7c6c9d9773032cbfbafcf090d6\": rpc error: code = NotFound desc = could not find container \"11ffc71d928f06a11d9f1087ae4610242eab2f7c6c9d9773032cbfbafcf090d6\": container with ID starting with 11ffc71d928f06a11d9f1087ae4610242eab2f7c6c9d9773032cbfbafcf090d6 not found: ID does not exist"
Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.651033 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-vnkgp"]
Dec 01 10:22:20 crc kubenswrapper[4958]: I1201 10:22:20.665234 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-vnkgp"]
Dec 01 10:22:21 crc kubenswrapper[4958]: I1201 10:22:21.347522 4958 generic.go:334] "Generic (PLEG): container finished" podID="376e7f67-87de-4bab-91d1-5624ee49a979" containerID="f7afb037391afb9c05b2698507b1d518442bd7d03716e6c47a638e7e44493e16" exitCode=0
Dec 01 10:22:21 crc kubenswrapper[4958]: I1201 10:22:21.348126 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" event={"ID":"376e7f67-87de-4bab-91d1-5624ee49a979","Type":"ContainerDied","Data":"f7afb037391afb9c05b2698507b1d518442bd7d03716e6c47a638e7e44493e16"}
Dec 01 10:22:21 crc kubenswrapper[4958]: I1201 10:22:21.354270 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ksr7n" event={"ID":"b69312c4-bb8d-4272-bd60-77f355f83f25","Type":"ContainerStarted","Data":"67ff930450c58e6b7b81689867e0c06157ecda7661b4fa1b9f842961e72aa562"}
Dec 01 10:22:21 crc kubenswrapper[4958]: I1201 10:22:21.363255 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cktqn" event={"ID":"81ffec42-04be-4d06-b6d8-44c5b1eb1d53","Type":"ContainerStarted","Data":"a0cfc1461041500cfd7cf071381e4add7d3a6ba36c83834977f794ded000977b"}
Dec 01 10:22:21 crc kubenswrapper[4958]: I1201 10:22:21.405656 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-ksr7n" podStartSLOduration=5.064511351 podStartE2EDuration="38.40561588s" podCreationTimestamp="2025-12-01 10:21:43 +0000 UTC" firstStartedPulling="2025-12-01 10:21:45.144540554 +0000 UTC m=+1352.653329591" lastFinishedPulling="2025-12-01 10:22:18.485645083 +0000 UTC m=+1385.994434120" observedRunningTime="2025-12-01 10:22:21.40494327 +0000 UTC m=+1388.913732307" watchObservedRunningTime="2025-12-01 10:22:21.40561588 +0000 UTC m=+1388.914404917"
Dec 01 10:22:21 crc kubenswrapper[4958]: I1201 10:22:21.456894 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-cktqn" podStartSLOduration=3.456862334 podStartE2EDuration="3.456862334s" podCreationTimestamp="2025-12-01 10:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:22:21.434374937 +0000 UTC m=+1388.943163994" watchObservedRunningTime="2025-12-01 10:22:21.456862334 +0000 UTC m=+1388.965651371"
Dec 01 10:22:21 crc kubenswrapper[4958]: I1201 10:22:21.913738 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="358d4d03-da3c-44c6-a4a4-32f6146fa1d9" path="/var/lib/kubelet/pods/358d4d03-da3c-44c6-a4a4-32f6146fa1d9/volumes"
Dec 01 10:22:22 crc kubenswrapper[4958]: I1201 10:22:22.423124 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" event={"ID":"376e7f67-87de-4bab-91d1-5624ee49a979","Type":"ContainerStarted","Data":"334fedf56196b850baf942bdd4fc2e146a6c4fbff9ffbd3718e50b92cb4b6af2"}
Dec 01 10:22:22 crc kubenswrapper[4958]: I1201 10:22:22.423253 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm"
Dec 01 10:22:22 crc kubenswrapper[4958]: I1201 10:22:22.460528 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" podStartSLOduration=5.460501542 podStartE2EDuration="5.460501542s" podCreationTimestamp="2025-12-01 10:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:22:22.454651244 +0000 UTC m=+1389.963440281" watchObservedRunningTime="2025-12-01 10:22:22.460501542 +0000 UTC m=+1389.969290579"
Dec 01 10:22:24 crc kubenswrapper[4958]: I1201 10:22:24.451714 4958 generic.go:334] "Generic (PLEG): container finished" podID="e866437f-6eec-468d-a421-cc967da03db1" containerID="f8e01024d3d1f7366ea417fc2cc49c8d648cbdcab196552e1b491adcaca19787" exitCode=0
Dec 01 10:22:24 crc kubenswrapper[4958]: I1201 10:22:24.451825 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k75sq" event={"ID":"e866437f-6eec-468d-a421-cc967da03db1","Type":"ContainerDied","Data":"f8e01024d3d1f7366ea417fc2cc49c8d648cbdcab196552e1b491adcaca19787"}
Dec 01 10:22:28 crc kubenswrapper[4958]: I1201 10:22:28.574482 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm"
Dec 01 10:22:28 crc kubenswrapper[4958]: I1201 10:22:28.645415 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qs6kc"]
Dec 01 10:22:28 crc kubenswrapper[4958]: I1201 10:22:28.645732 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-qs6kc" podUID="ee13749a-e25a-439d-8da4-151757afbe82" containerName="dnsmasq-dns" containerID="cri-o://c77aa12b5c87710bd0d9b31aec69c271fb2357ed323a268969ec00abc58208e7" gracePeriod=10
Dec 01 10:22:29 crc kubenswrapper[4958]: I1201 10:22:29.830765 4958 generic.go:334] "Generic (PLEG): container finished" podID="ee13749a-e25a-439d-8da4-151757afbe82" containerID="c77aa12b5c87710bd0d9b31aec69c271fb2357ed323a268969ec00abc58208e7" exitCode=0
Dec 01 10:22:29 crc kubenswrapper[4958]: I1201 10:22:29.831504 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qs6kc" event={"ID":"ee13749a-e25a-439d-8da4-151757afbe82","Type":"ContainerDied","Data":"c77aa12b5c87710bd0d9b31aec69c271fb2357ed323a268969ec00abc58208e7"}
Dec 01 10:22:30 crc kubenswrapper[4958]: I1201 10:22:30.729180 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-qs6kc" podUID="ee13749a-e25a-439d-8da4-151757afbe82" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused"
Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.689251 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k75sq"
Need to start a new one" pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.887983 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-credential-keys\") pod \"e866437f-6eec-468d-a421-cc967da03db1\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.888078 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-combined-ca-bundle\") pod \"e866437f-6eec-468d-a421-cc967da03db1\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.888147 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-scripts\") pod \"e866437f-6eec-468d-a421-cc967da03db1\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.888197 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-config-data\") pod \"e866437f-6eec-468d-a421-cc967da03db1\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.888294 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-fernet-keys\") pod \"e866437f-6eec-468d-a421-cc967da03db1\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.888341 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vzmg\" (UniqueName: \"kubernetes.io/projected/e866437f-6eec-468d-a421-cc967da03db1-kube-api-access-2vzmg\") pod \"e866437f-6eec-468d-a421-cc967da03db1\" (UID: \"e866437f-6eec-468d-a421-cc967da03db1\") " Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.898530 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-scripts" (OuterVolumeSpecName: "scripts") pod "e866437f-6eec-468d-a421-cc967da03db1" (UID: "e866437f-6eec-468d-a421-cc967da03db1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.903026 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e866437f-6eec-468d-a421-cc967da03db1" (UID: "e866437f-6eec-468d-a421-cc967da03db1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.914666 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e866437f-6eec-468d-a421-cc967da03db1-kube-api-access-2vzmg" (OuterVolumeSpecName: "kube-api-access-2vzmg") pod "e866437f-6eec-468d-a421-cc967da03db1" (UID: "e866437f-6eec-468d-a421-cc967da03db1"). InnerVolumeSpecName "kube-api-access-2vzmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.941075 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e866437f-6eec-468d-a421-cc967da03db1" (UID: "e866437f-6eec-468d-a421-cc967da03db1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.949049 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k75sq" event={"ID":"e866437f-6eec-468d-a421-cc967da03db1","Type":"ContainerDied","Data":"5f1ef2848a807d93e272a836d5a613c9fc5587e35e7c240d846f333067104e76"} Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.949113 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f1ef2848a807d93e272a836d5a613c9fc5587e35e7c240d846f333067104e76" Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.949227 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k75sq" Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.952709 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e866437f-6eec-468d-a421-cc967da03db1" (UID: "e866437f-6eec-468d-a421-cc967da03db1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.970382 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-config-data" (OuterVolumeSpecName: "config-data") pod "e866437f-6eec-468d-a421-cc967da03db1" (UID: "e866437f-6eec-468d-a421-cc967da03db1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.991965 4958 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.991999 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.992010 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.992021 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.992030 4958 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e866437f-6eec-468d-a421-cc967da03db1-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:31 crc kubenswrapper[4958]: I1201 10:22:31.992040 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vzmg\" (UniqueName: \"kubernetes.io/projected/e866437f-6eec-468d-a421-cc967da03db1-kube-api-access-2vzmg\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:32 crc kubenswrapper[4958]: E1201 10:22:32.288462 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 01 10:22:32 crc kubenswrapper[4958]: E1201 10:22:32.288747 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.518478 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-k75sq"]
Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.537124 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-k75sq"]
Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.553643 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ms7jr"]
Dec 01 10:22:33 crc kubenswrapper[4958]: E1201 10:22:33.554235 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e866437f-6eec-468d-a421-cc967da03db1" containerName="keystone-bootstrap"
Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.554266 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e866437f-6eec-468d-a421-cc967da03db1" containerName="keystone-bootstrap"
Dec 01 10:22:33 crc kubenswrapper[4958]: E1201 10:22:33.554295 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="358d4d03-da3c-44c6-a4a4-32f6146fa1d9" containerName="init"
Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.554304 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="358d4d03-da3c-44c6-a4a4-32f6146fa1d9" containerName="init"
assignment" podUID="358d4d03-da3c-44c6-a4a4-32f6146fa1d9" containerName="init" Dec 01 10:22:33 crc kubenswrapper[4958]: E1201 10:22:33.554326 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="358d4d03-da3c-44c6-a4a4-32f6146fa1d9" containerName="dnsmasq-dns" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.554333 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="358d4d03-da3c-44c6-a4a4-32f6146fa1d9" containerName="dnsmasq-dns" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.554536 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="358d4d03-da3c-44c6-a4a4-32f6146fa1d9" containerName="dnsmasq-dns" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.554558 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="e866437f-6eec-468d-a421-cc967da03db1" containerName="keystone-bootstrap" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.555211 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ms7jr" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.558680 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.559038 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.559979 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.560093 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-62dpp" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.579083 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ms7jr"] Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.662743 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhfww\" (UniqueName: \"kubernetes.io/projected/6c3f553a-d19b-494d-a009-18e2ba2aa110-kube-api-access-vhfww\") pod \"keystone-bootstrap-ms7jr\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " pod="openstack/keystone-bootstrap-ms7jr" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.662924 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-fernet-keys\") pod \"keystone-bootstrap-ms7jr\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " pod="openstack/keystone-bootstrap-ms7jr" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.662974 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-credential-keys\") pod \"keystone-bootstrap-ms7jr\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " pod="openstack/keystone-bootstrap-ms7jr" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.663012 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-combined-ca-bundle\") pod \"keystone-bootstrap-ms7jr\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " pod="openstack/keystone-bootstrap-ms7jr" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.663070 4958 
Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.663110 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-config-data\") pod \"keystone-bootstrap-ms7jr\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " pod="openstack/keystone-bootstrap-ms7jr"
Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.764482 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-fernet-keys\") pod \"keystone-bootstrap-ms7jr\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " pod="openstack/keystone-bootstrap-ms7jr"
Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.764826 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-credential-keys\") pod \"keystone-bootstrap-ms7jr\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " pod="openstack/keystone-bootstrap-ms7jr"
Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.764985 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-combined-ca-bundle\") pod \"keystone-bootstrap-ms7jr\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " pod="openstack/keystone-bootstrap-ms7jr"
Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.765107 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-scripts\") pod \"keystone-bootstrap-ms7jr\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " pod="openstack/keystone-bootstrap-ms7jr"
Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.765239 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-config-data\") pod \"keystone-bootstrap-ms7jr\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " pod="openstack/keystone-bootstrap-ms7jr"
Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.765909 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhfww\" (UniqueName: \"kubernetes.io/projected/6c3f553a-d19b-494d-a009-18e2ba2aa110-kube-api-access-vhfww\") pod \"keystone-bootstrap-ms7jr\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " pod="openstack/keystone-bootstrap-ms7jr"
Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.771706 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-scripts\") pod \"keystone-bootstrap-ms7jr\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " pod="openstack/keystone-bootstrap-ms7jr"
Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.772083 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-config-data\") pod \"keystone-bootstrap-ms7jr\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " pod="openstack/keystone-bootstrap-ms7jr"
\"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-config-data\") pod \"keystone-bootstrap-ms7jr\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " pod="openstack/keystone-bootstrap-ms7jr" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.772826 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-fernet-keys\") pod \"keystone-bootstrap-ms7jr\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " pod="openstack/keystone-bootstrap-ms7jr" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.773461 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-combined-ca-bundle\") pod \"keystone-bootstrap-ms7jr\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " pod="openstack/keystone-bootstrap-ms7jr" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.779562 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-credential-keys\") pod \"keystone-bootstrap-ms7jr\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " pod="openstack/keystone-bootstrap-ms7jr" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.784576 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhfww\" (UniqueName: \"kubernetes.io/projected/6c3f553a-d19b-494d-a009-18e2ba2aa110-kube-api-access-vhfww\") pod \"keystone-bootstrap-ms7jr\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " pod="openstack/keystone-bootstrap-ms7jr" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.820053 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e866437f-6eec-468d-a421-cc967da03db1" path="/var/lib/kubelet/pods/e866437f-6eec-468d-a421-cc967da03db1/volumes" Dec 01 10:22:33 crc kubenswrapper[4958]: I1201 10:22:33.935303 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ms7jr" Dec 01 10:22:38 crc kubenswrapper[4958]: I1201 10:22:38.043826 4958 generic.go:334] "Generic (PLEG): container finished" podID="b69312c4-bb8d-4272-bd60-77f355f83f25" containerID="67ff930450c58e6b7b81689867e0c06157ecda7661b4fa1b9f842961e72aa562" exitCode=0 Dec 01 10:22:38 crc kubenswrapper[4958]: I1201 10:22:38.043981 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ksr7n" event={"ID":"b69312c4-bb8d-4272-bd60-77f355f83f25","Type":"ContainerDied","Data":"67ff930450c58e6b7b81689867e0c06157ecda7661b4fa1b9f842961e72aa562"} Dec 01 10:22:40 crc kubenswrapper[4958]: I1201 10:22:40.728760 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-qs6kc" podUID="ee13749a-e25a-439d-8da4-151757afbe82" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: i/o timeout" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.335660 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.345921 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ksr7n" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.501717 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-config\") pod \"ee13749a-e25a-439d-8da4-151757afbe82\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.501770 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b69312c4-bb8d-4272-bd60-77f355f83f25-db-sync-config-data\") pod \"b69312c4-bb8d-4272-bd60-77f355f83f25\" (UID: \"b69312c4-bb8d-4272-bd60-77f355f83f25\") " Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.501836 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b69312c4-bb8d-4272-bd60-77f355f83f25-config-data\") pod \"b69312c4-bb8d-4272-bd60-77f355f83f25\" (UID: \"b69312c4-bb8d-4272-bd60-77f355f83f25\") " Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.501881 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b69312c4-bb8d-4272-bd60-77f355f83f25-combined-ca-bundle\") pod \"b69312c4-bb8d-4272-bd60-77f355f83f25\" (UID: \"b69312c4-bb8d-4272-bd60-77f355f83f25\") " Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.501961 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-ovsdbserver-nb\") pod \"ee13749a-e25a-439d-8da4-151757afbe82\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.503664 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4wh9\" (UniqueName: \"kubernetes.io/projected/b69312c4-bb8d-4272-bd60-77f355f83f25-kube-api-access-t4wh9\") pod \"b69312c4-bb8d-4272-bd60-77f355f83f25\" (UID: \"b69312c4-bb8d-4272-bd60-77f355f83f25\") " Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.503859 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-dns-svc\") pod \"ee13749a-e25a-439d-8da4-151757afbe82\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.504048 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-ovsdbserver-sb\") pod \"ee13749a-e25a-439d-8da4-151757afbe82\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.504079 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g625d\" (UniqueName: \"kubernetes.io/projected/ee13749a-e25a-439d-8da4-151757afbe82-kube-api-access-g625d\") pod \"ee13749a-e25a-439d-8da4-151757afbe82\" (UID: \"ee13749a-e25a-439d-8da4-151757afbe82\") " Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.509175 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b69312c4-bb8d-4272-bd60-77f355f83f25-kube-api-access-t4wh9" (OuterVolumeSpecName: "kube-api-access-t4wh9") pod 
"b69312c4-bb8d-4272-bd60-77f355f83f25" (UID: "b69312c4-bb8d-4272-bd60-77f355f83f25"). InnerVolumeSpecName "kube-api-access-t4wh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.510185 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee13749a-e25a-439d-8da4-151757afbe82-kube-api-access-g625d" (OuterVolumeSpecName: "kube-api-access-g625d") pod "ee13749a-e25a-439d-8da4-151757afbe82" (UID: "ee13749a-e25a-439d-8da4-151757afbe82"). InnerVolumeSpecName "kube-api-access-g625d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.510484 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b69312c4-bb8d-4272-bd60-77f355f83f25-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b69312c4-bb8d-4272-bd60-77f355f83f25" (UID: "b69312c4-bb8d-4272-bd60-77f355f83f25"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.538544 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b69312c4-bb8d-4272-bd60-77f355f83f25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b69312c4-bb8d-4272-bd60-77f355f83f25" (UID: "b69312c4-bb8d-4272-bd60-77f355f83f25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.563967 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee13749a-e25a-439d-8da4-151757afbe82" (UID: "ee13749a-e25a-439d-8da4-151757afbe82"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.566034 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b69312c4-bb8d-4272-bd60-77f355f83f25-config-data" (OuterVolumeSpecName: "config-data") pod "b69312c4-bb8d-4272-bd60-77f355f83f25" (UID: "b69312c4-bb8d-4272-bd60-77f355f83f25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.567051 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee13749a-e25a-439d-8da4-151757afbe82" (UID: "ee13749a-e25a-439d-8da4-151757afbe82"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.582827 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ee13749a-e25a-439d-8da4-151757afbe82" (UID: "ee13749a-e25a-439d-8da4-151757afbe82"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.582892 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-config" (OuterVolumeSpecName: "config") pod "ee13749a-e25a-439d-8da4-151757afbe82" (UID: "ee13749a-e25a-439d-8da4-151757afbe82"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.606920 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.606964 4958 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b69312c4-bb8d-4272-bd60-77f355f83f25-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.606976 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b69312c4-bb8d-4272-bd60-77f355f83f25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.606987 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b69312c4-bb8d-4272-bd60-77f355f83f25-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.607002 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.607012 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4wh9\" (UniqueName: \"kubernetes.io/projected/b69312c4-bb8d-4272-bd60-77f355f83f25-kube-api-access-t4wh9\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.607022 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.607031 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g625d\" (UniqueName: \"kubernetes.io/projected/ee13749a-e25a-439d-8da4-151757afbe82-kube-api-access-g625d\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:42 crc kubenswrapper[4958]: I1201 10:22:42.607043 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee13749a-e25a-439d-8da4-151757afbe82-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:43 crc kubenswrapper[4958]: I1201 10:22:43.133160 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ksr7n" event={"ID":"b69312c4-bb8d-4272-bd60-77f355f83f25","Type":"ContainerDied","Data":"eebc851ba7683e27e014381221709d9a02fafb7903b574d7fd16564705acebda"} Dec 01 10:22:43 crc kubenswrapper[4958]: I1201 10:22:43.133211 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eebc851ba7683e27e014381221709d9a02fafb7903b574d7fd16564705acebda" Dec 01 10:22:43 crc kubenswrapper[4958]: I1201 10:22:43.133298 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ksr7n" Dec 01 10:22:43 crc kubenswrapper[4958]: I1201 10:22:43.138134 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qs6kc" event={"ID":"ee13749a-e25a-439d-8da4-151757afbe82","Type":"ContainerDied","Data":"28be34d84a8be3a62dd54c32b8ccebe657b144ad714d6da0ab68968d48421a24"} Dec 01 10:22:43 crc kubenswrapper[4958]: I1201 10:22:43.138206 4958 scope.go:117] "RemoveContainer" containerID="c77aa12b5c87710bd0d9b31aec69c271fb2357ed323a268969ec00abc58208e7" Dec 01 10:22:43 crc kubenswrapper[4958]: I1201 10:22:43.138277 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-qs6kc" Dec 01 10:22:43 crc kubenswrapper[4958]: I1201 10:22:43.195004 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qs6kc"] Dec 01 10:22:43 crc kubenswrapper[4958]: I1201 10:22:43.204129 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qs6kc"] Dec 01 10:22:43 crc kubenswrapper[4958]: I1201 10:22:43.814511 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee13749a-e25a-439d-8da4-151757afbe82" path="/var/lib/kubelet/pods/ee13749a-e25a-439d-8da4-151757afbe82/volumes" Dec 01 10:22:43 crc kubenswrapper[4958]: I1201 10:22:43.878460 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2mtdq"] Dec 01 10:22:43 crc kubenswrapper[4958]: E1201 10:22:43.881679 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b69312c4-bb8d-4272-bd60-77f355f83f25" containerName="glance-db-sync" Dec 01 10:22:43 crc kubenswrapper[4958]: I1201 10:22:43.881726 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b69312c4-bb8d-4272-bd60-77f355f83f25" containerName="glance-db-sync" Dec 01 10:22:43 crc kubenswrapper[4958]: E1201 10:22:43.881767 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee13749a-e25a-439d-8da4-151757afbe82" containerName="init" Dec 01 10:22:43 crc kubenswrapper[4958]: I1201 10:22:43.881782 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee13749a-e25a-439d-8da4-151757afbe82" containerName="init" Dec 01 10:22:43 crc kubenswrapper[4958]: E1201 10:22:43.881802 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee13749a-e25a-439d-8da4-151757afbe82" containerName="dnsmasq-dns" Dec 01 10:22:43 crc kubenswrapper[4958]: I1201 10:22:43.881809 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee13749a-e25a-439d-8da4-151757afbe82" containerName="dnsmasq-dns" Dec 01 10:22:43 crc kubenswrapper[4958]: I1201 10:22:43.882023 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee13749a-e25a-439d-8da4-151757afbe82" containerName="dnsmasq-dns" Dec 01 10:22:43 crc kubenswrapper[4958]: I1201 10:22:43.882055 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="b69312c4-bb8d-4272-bd60-77f355f83f25" containerName="glance-db-sync" Dec 01 10:22:43 crc kubenswrapper[4958]: I1201 10:22:43.885169 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" Dec 01 10:22:43 crc kubenswrapper[4958]: I1201 10:22:43.980892 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2mtdq"] Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.005519 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-2mtdq\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.005612 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-2mtdq\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.005637 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-2mtdq\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.005667 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-2mtdq\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.005815 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-config\") pod \"dnsmasq-dns-785d8bcb8c-2mtdq\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.005951 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fljk\" (UniqueName: \"kubernetes.io/projected/a8eced95-37bf-4219-bd34-e543ebad21a6-kube-api-access-8fljk\") pod \"dnsmasq-dns-785d8bcb8c-2mtdq\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.108475 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fljk\" (UniqueName: \"kubernetes.io/projected/a8eced95-37bf-4219-bd34-e543ebad21a6-kube-api-access-8fljk\") pod \"dnsmasq-dns-785d8bcb8c-2mtdq\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.108756 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-2mtdq\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.108982 4958 
Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.109008 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-2mtdq\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq"
Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.109072 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-2mtdq\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq"
Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.109201 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-config\") pod \"dnsmasq-dns-785d8bcb8c-2mtdq\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq"
Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.110568 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-2mtdq\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq"
Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.111327 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-2mtdq\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq"
Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.111447 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-2mtdq\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq"
Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.111999 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-2mtdq\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq"
Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.112627 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-config\") pod \"dnsmasq-dns-785d8bcb8c-2mtdq\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq"
Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.132214 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fljk\" (UniqueName: \"kubernetes.io/projected/a8eced95-37bf-4219-bd34-e543ebad21a6-kube-api-access-8fljk\") pod \"dnsmasq-dns-785d8bcb8c-2mtdq\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq"
\"kubernetes.io/projected/a8eced95-37bf-4219-bd34-e543ebad21a6-kube-api-access-8fljk\") pod \"dnsmasq-dns-785d8bcb8c-2mtdq\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.221012 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.836904 4958 scope.go:117] "RemoveContainer" containerID="1780ad2c68c6bed13e8f4a981f509569f90328850865c3a3d0e0ac2efc955cd1" Dec 01 10:22:44 crc kubenswrapper[4958]: E1201 10:22:44.860279 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 01 10:22:44 crc kubenswrapper[4958]: E1201 10:22:44.860488 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-78nml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-qk422_openstack(f960833c-3f07-4613-8d69-b7c563b3dd5d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 
01 10:22:44 crc kubenswrapper[4958]: E1201 10:22:44.861659 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-qk422" podUID="f960833c-3f07-4613-8d69-b7c563b3dd5d" Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.902759 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.904979 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.910760 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-smrjw" Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.911152 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.912375 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 01 10:22:44 crc kubenswrapper[4958]: I1201 10:22:44.978176 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.031499 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bb598f5-3dd4-40ef-8098-191280bc4d18-config-data\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.032065 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bb598f5-3dd4-40ef-8098-191280bc4d18-scripts\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.032090 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bb598f5-3dd4-40ef-8098-191280bc4d18-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.032131 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7bb598f5-3dd4-40ef-8098-191280bc4d18-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.032156 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bb598f5-3dd4-40ef-8098-191280bc4d18-logs\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.032185 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.032205 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thb2h\" (UniqueName: \"kubernetes.io/projected/7bb598f5-3dd4-40ef-8098-191280bc4d18-kube-api-access-thb2h\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.034480 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.038105 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.048517 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.059888 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.135106 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/26ebeb4c-f867-45a5-894c-9f10f14c870e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.135178 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bb598f5-3dd4-40ef-8098-191280bc4d18-config-data\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.135214 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26ebeb4c-f867-45a5-894c-9f10f14c870e-logs\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.135251 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26ebeb4c-f867-45a5-894c-9f10f14c870e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.135284 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5rjr\" (UniqueName: \"kubernetes.io/projected/26ebeb4c-f867-45a5-894c-9f10f14c870e-kube-api-access-w5rjr\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.135314 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/26ebeb4c-f867-45a5-894c-9f10f14c870e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.135338 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26ebeb4c-f867-45a5-894c-9f10f14c870e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.135361 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bb598f5-3dd4-40ef-8098-191280bc4d18-scripts\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.135386 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bb598f5-3dd4-40ef-8098-191280bc4d18-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.135415 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.135442 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7bb598f5-3dd4-40ef-8098-191280bc4d18-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.135469 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bb598f5-3dd4-40ef-8098-191280bc4d18-logs\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.135497 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.135519 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thb2h\" (UniqueName: \"kubernetes.io/projected/7bb598f5-3dd4-40ef-8098-191280bc4d18-kube-api-access-thb2h\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.136473 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7bb598f5-3dd4-40ef-8098-191280bc4d18-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.137518 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bb598f5-3dd4-40ef-8098-191280bc4d18-logs\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.137595 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.146180 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bb598f5-3dd4-40ef-8098-191280bc4d18-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.162250 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thb2h\" (UniqueName: \"kubernetes.io/projected/7bb598f5-3dd4-40ef-8098-191280bc4d18-kube-api-access-thb2h\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.169213 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bb598f5-3dd4-40ef-8098-191280bc4d18-config-data\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.178495 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bb598f5-3dd4-40ef-8098-191280bc4d18-scripts\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.196456 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: E1201 10:22:45.208407 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-qk422" podUID="f960833c-3f07-4613-8d69-b7c563b3dd5d" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.242770 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/26ebeb4c-f867-45a5-894c-9f10f14c870e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" 
Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.242881 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26ebeb4c-f867-45a5-894c-9f10f14c870e-logs\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.242927 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26ebeb4c-f867-45a5-894c-9f10f14c870e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.242970 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5rjr\" (UniqueName: \"kubernetes.io/projected/26ebeb4c-f867-45a5-894c-9f10f14c870e-kube-api-access-w5rjr\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.243016 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26ebeb4c-f867-45a5-894c-9f10f14c870e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.243052 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26ebeb4c-f867-45a5-894c-9f10f14c870e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.243102 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.243593 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.245435 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26ebeb4c-f867-45a5-894c-9f10f14c870e-logs\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.246027 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/26ebeb4c-f867-45a5-894c-9f10f14c870e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.264952 4958 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26ebeb4c-f867-45a5-894c-9f10f14c870e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.275421 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26ebeb4c-f867-45a5-894c-9f10f14c870e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.280278 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5rjr\" (UniqueName: \"kubernetes.io/projected/26ebeb4c-f867-45a5-894c-9f10f14c870e-kube-api-access-w5rjr\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.281147 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26ebeb4c-f867-45a5-894c-9f10f14c870e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.290143 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.308369 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.380631 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.455739 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2mtdq"] Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.501681 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ms7jr"] Dec 01 10:22:45 crc kubenswrapper[4958]: I1201 10:22:45.730212 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-qs6kc" podUID="ee13749a-e25a-439d-8da4-151757afbe82" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: i/o timeout" Dec 01 10:22:46 crc kubenswrapper[4958]: I1201 10:22:46.231489 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6864p" event={"ID":"88a7373f-c8a8-4721-bf49-1ffc1887309e","Type":"ContainerStarted","Data":"4b8027571ba9ea338e49eff27a4a3599eeda73550a750f774a719fece40e20fe"} Dec 01 10:22:46 crc kubenswrapper[4958]: I1201 10:22:46.254630 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8zf5v" event={"ID":"a3dc24bc-1938-490b-a88a-286e6af1d269","Type":"ContainerStarted","Data":"bc7066c75cdf8d7cc442eef94e9c6386e682d8da59456b2e98254a41c5f417c9"} Dec 01 10:22:46 crc kubenswrapper[4958]: I1201 10:22:46.300598 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" event={"ID":"a8eced95-37bf-4219-bd34-e543ebad21a6","Type":"ContainerStarted","Data":"d436c1e2bec283f80ba2eba8079dc3ec5a89dafc3c34f22da51c5a4232a1b415"} Dec 01 10:22:46 crc kubenswrapper[4958]: I1201 10:22:46.300692 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" event={"ID":"a8eced95-37bf-4219-bd34-e543ebad21a6","Type":"ContainerStarted","Data":"06c7d08ce5fb176e5ba905ff79275a9a34dfaae6f49878a1023a1a6b9b45e1b4"} Dec 01 10:22:46 crc kubenswrapper[4958]: I1201 10:22:46.317725 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ms7jr" event={"ID":"6c3f553a-d19b-494d-a009-18e2ba2aa110","Type":"ContainerStarted","Data":"472532cca7defb53f2736d76284443280764652622f60a06947378c292479037"} Dec 01 10:22:46 crc kubenswrapper[4958]: I1201 10:22:46.395364 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:22:46 crc kubenswrapper[4958]: I1201 10:22:46.399155 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cktqn" event={"ID":"81ffec42-04be-4d06-b6d8-44c5b1eb1d53","Type":"ContainerDied","Data":"a0cfc1461041500cfd7cf071381e4add7d3a6ba36c83834977f794ded000977b"} Dec 01 10:22:46 crc kubenswrapper[4958]: I1201 10:22:46.399152 4958 generic.go:334] "Generic (PLEG): container finished" podID="81ffec42-04be-4d06-b6d8-44c5b1eb1d53" containerID="a0cfc1461041500cfd7cf071381e4add7d3a6ba36c83834977f794ded000977b" exitCode=0 Dec 01 10:22:46 crc kubenswrapper[4958]: I1201 10:22:46.430387 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-6864p" podStartSLOduration=6.595310705 podStartE2EDuration="29.430350484s" podCreationTimestamp="2025-12-01 10:22:17 +0000 UTC" firstStartedPulling="2025-12-01 10:22:19.393351491 +0000 UTC m=+1386.902140528" lastFinishedPulling="2025-12-01 10:22:42.22839127 +0000 UTC m=+1409.737180307" observedRunningTime="2025-12-01 10:22:46.269206067 +0000 UTC m=+1413.777995104" 
watchObservedRunningTime="2025-12-01 10:22:46.430350484 +0000 UTC m=+1413.939139521" Dec 01 10:22:46 crc kubenswrapper[4958]: I1201 10:22:46.476137 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-8zf5v" podStartSLOduration=3.143448507 podStartE2EDuration="28.476104541s" podCreationTimestamp="2025-12-01 10:22:18 +0000 UTC" firstStartedPulling="2025-12-01 10:22:19.523710391 +0000 UTC m=+1387.032499418" lastFinishedPulling="2025-12-01 10:22:44.856366415 +0000 UTC m=+1412.365155452" observedRunningTime="2025-12-01 10:22:46.300285412 +0000 UTC m=+1413.809074459" watchObservedRunningTime="2025-12-01 10:22:46.476104541 +0000 UTC m=+1413.984893578" Dec 01 10:22:46 crc kubenswrapper[4958]: I1201 10:22:46.514792 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ms7jr" podStartSLOduration=13.514761993 podStartE2EDuration="13.514761993s" podCreationTimestamp="2025-12-01 10:22:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:22:46.380349605 +0000 UTC m=+1413.889138632" watchObservedRunningTime="2025-12-01 10:22:46.514761993 +0000 UTC m=+1414.023551040" Dec 01 10:22:46 crc kubenswrapper[4958]: I1201 10:22:46.578962 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:22:46 crc kubenswrapper[4958]: W1201 10:22:46.610363 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bb598f5_3dd4_40ef_8098_191280bc4d18.slice/crio-a8b30cc9b33757fd9d0589e66b89abb35fb7adea536e901232beb257889ad9c5 WatchSource:0}: Error finding container a8b30cc9b33757fd9d0589e66b89abb35fb7adea536e901232beb257889ad9c5: Status 404 returned error can't find the container with id a8b30cc9b33757fd9d0589e66b89abb35fb7adea536e901232beb257889ad9c5 Dec 01 10:22:47 crc kubenswrapper[4958]: I1201 10:22:47.141509 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:22:47 crc kubenswrapper[4958]: I1201 10:22:47.234253 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:22:47 crc kubenswrapper[4958]: I1201 10:22:47.448760 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"26ebeb4c-f867-45a5-894c-9f10f14c870e","Type":"ContainerStarted","Data":"71440ae53f7a19889f5b00664b31ee8fc903a694de88cef9e40ef774dd39911c"} Dec 01 10:22:47 crc kubenswrapper[4958]: I1201 10:22:47.466126 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62be9aa8-5618-470f-990a-448f46a926cf","Type":"ContainerStarted","Data":"00ef342e3a064c635a6542e8f1a996239ceaac9038dc1cf0ab933766181e165d"} Dec 01 10:22:47 crc kubenswrapper[4958]: I1201 10:22:47.476073 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7bb598f5-3dd4-40ef-8098-191280bc4d18","Type":"ContainerStarted","Data":"a8b30cc9b33757fd9d0589e66b89abb35fb7adea536e901232beb257889ad9c5"} Dec 01 10:22:47 crc kubenswrapper[4958]: I1201 10:22:47.488256 4958 generic.go:334] "Generic (PLEG): container finished" podID="a8eced95-37bf-4219-bd34-e543ebad21a6" containerID="d436c1e2bec283f80ba2eba8079dc3ec5a89dafc3c34f22da51c5a4232a1b415" exitCode=0 Dec 01 10:22:47 crc kubenswrapper[4958]: I1201 
10:22:47.488396 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" event={"ID":"a8eced95-37bf-4219-bd34-e543ebad21a6","Type":"ContainerDied","Data":"d436c1e2bec283f80ba2eba8079dc3ec5a89dafc3c34f22da51c5a4232a1b415"} Dec 01 10:22:47 crc kubenswrapper[4958]: I1201 10:22:47.488429 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" event={"ID":"a8eced95-37bf-4219-bd34-e543ebad21a6","Type":"ContainerStarted","Data":"3546c9e568f9ac554dcd36fe0e4457387b261ba4d72c2e98a44041487a6b1410"} Dec 01 10:22:47 crc kubenswrapper[4958]: I1201 10:22:47.488806 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" Dec 01 10:22:47 crc kubenswrapper[4958]: I1201 10:22:47.516834 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ms7jr" event={"ID":"6c3f553a-d19b-494d-a009-18e2ba2aa110","Type":"ContainerStarted","Data":"01d50f6c5d2b0216863f0e7c400427a1bc20e5a49e0b2002b3e0226d6cf84d7c"} Dec 01 10:22:47 crc kubenswrapper[4958]: I1201 10:22:47.537122 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" podStartSLOduration=4.537090979 podStartE2EDuration="4.537090979s" podCreationTimestamp="2025-12-01 10:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:22:47.523323583 +0000 UTC m=+1415.032112620" watchObservedRunningTime="2025-12-01 10:22:47.537090979 +0000 UTC m=+1415.045880016" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.117922 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cktqn" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.261729 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qq7p\" (UniqueName: \"kubernetes.io/projected/81ffec42-04be-4d06-b6d8-44c5b1eb1d53-kube-api-access-8qq7p\") pod \"81ffec42-04be-4d06-b6d8-44c5b1eb1d53\" (UID: \"81ffec42-04be-4d06-b6d8-44c5b1eb1d53\") " Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.261833 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/81ffec42-04be-4d06-b6d8-44c5b1eb1d53-config\") pod \"81ffec42-04be-4d06-b6d8-44c5b1eb1d53\" (UID: \"81ffec42-04be-4d06-b6d8-44c5b1eb1d53\") " Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.262149 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ffec42-04be-4d06-b6d8-44c5b1eb1d53-combined-ca-bundle\") pod \"81ffec42-04be-4d06-b6d8-44c5b1eb1d53\" (UID: \"81ffec42-04be-4d06-b6d8-44c5b1eb1d53\") " Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.268168 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ffec42-04be-4d06-b6d8-44c5b1eb1d53-kube-api-access-8qq7p" (OuterVolumeSpecName: "kube-api-access-8qq7p") pod "81ffec42-04be-4d06-b6d8-44c5b1eb1d53" (UID: "81ffec42-04be-4d06-b6d8-44c5b1eb1d53"). InnerVolumeSpecName "kube-api-access-8qq7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.294099 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ffec42-04be-4d06-b6d8-44c5b1eb1d53-config" (OuterVolumeSpecName: "config") pod "81ffec42-04be-4d06-b6d8-44c5b1eb1d53" (UID: "81ffec42-04be-4d06-b6d8-44c5b1eb1d53"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.296210 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ffec42-04be-4d06-b6d8-44c5b1eb1d53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81ffec42-04be-4d06-b6d8-44c5b1eb1d53" (UID: "81ffec42-04be-4d06-b6d8-44c5b1eb1d53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.368545 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ffec42-04be-4d06-b6d8-44c5b1eb1d53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.368597 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qq7p\" (UniqueName: \"kubernetes.io/projected/81ffec42-04be-4d06-b6d8-44c5b1eb1d53-kube-api-access-8qq7p\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.368613 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/81ffec42-04be-4d06-b6d8-44c5b1eb1d53-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.560382 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cktqn" event={"ID":"81ffec42-04be-4d06-b6d8-44c5b1eb1d53","Type":"ContainerDied","Data":"2a2eec7c23b20be81a99db6ef8bbac94e2a7e22310373e210f80b3dc870a8873"} Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.560439 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a2eec7c23b20be81a99db6ef8bbac94e2a7e22310373e210f80b3dc870a8873" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.560517 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-cktqn" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.574113 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7bb598f5-3dd4-40ef-8098-191280bc4d18","Type":"ContainerStarted","Data":"8ffa4ef480522806af76c3a36d0bde9518212fbc93b8c924bceca9fd7786a38d"} Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.603997 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"26ebeb4c-f867-45a5-894c-9f10f14c870e","Type":"ContainerStarted","Data":"a3b4aa1b3a1f3e82650d5649022a9aa26a249a955eb52e564dfedbc3b058b7b4"} Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.706635 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2mtdq"] Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.774799 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-q7fvg"] Dec 01 10:22:48 crc kubenswrapper[4958]: E1201 10:22:48.775769 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ffec42-04be-4d06-b6d8-44c5b1eb1d53" containerName="neutron-db-sync" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.775802 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ffec42-04be-4d06-b6d8-44c5b1eb1d53" containerName="neutron-db-sync" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.776138 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="81ffec42-04be-4d06-b6d8-44c5b1eb1d53" containerName="neutron-db-sync" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.782338 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.797861 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-bff445bfd-mg9pk"] Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.819368 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-q7fvg\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.819519 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-q7fvg\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.819561 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-config\") pod \"dnsmasq-dns-55f844cf75-q7fvg\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.819593 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-q7fvg\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " 
pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.819634 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-przrv\" (UniqueName: \"kubernetes.io/projected/d0ac5474-7911-4334-9a06-760d22e9d524-kube-api-access-przrv\") pod \"dnsmasq-dns-55f844cf75-q7fvg\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.819692 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-dns-svc\") pod \"dnsmasq-dns-55f844cf75-q7fvg\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.822841 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.824543 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-q7fvg"] Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.830028 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.830308 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.833413 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.840019 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6qlnd" Dec 01 10:22:48 crc kubenswrapper[4958]: I1201 10:22:48.868686 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bff445bfd-mg9pk"] Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.089466 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-q7fvg\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.090893 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-q7fvg\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.091302 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-config\") pod \"dnsmasq-dns-55f844cf75-q7fvg\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.091552 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-q7fvg\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " 
pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.092474 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-przrv\" (UniqueName: \"kubernetes.io/projected/d0ac5474-7911-4334-9a06-760d22e9d524-kube-api-access-przrv\") pod \"dnsmasq-dns-55f844cf75-q7fvg\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.092719 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-dns-svc\") pod \"dnsmasq-dns-55f844cf75-q7fvg\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.092921 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-q7fvg\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.095091 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-q7fvg\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.097106 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-q7fvg\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.099353 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-config\") pod \"dnsmasq-dns-55f844cf75-q7fvg\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.110781 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-dns-svc\") pod \"dnsmasq-dns-55f844cf75-q7fvg\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.177756 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-przrv\" (UniqueName: \"kubernetes.io/projected/d0ac5474-7911-4334-9a06-760d22e9d524-kube-api-access-przrv\") pod \"dnsmasq-dns-55f844cf75-q7fvg\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.198927 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crgfd\" (UniqueName: \"kubernetes.io/projected/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-kube-api-access-crgfd\") pod \"neutron-bff445bfd-mg9pk\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:22:49 crc 
kubenswrapper[4958]: I1201 10:22:49.199059 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-ovndb-tls-certs\") pod \"neutron-bff445bfd-mg9pk\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.199116 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-combined-ca-bundle\") pod \"neutron-bff445bfd-mg9pk\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.199158 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-config\") pod \"neutron-bff445bfd-mg9pk\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.199191 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-httpd-config\") pod \"neutron-bff445bfd-mg9pk\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.303179 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-combined-ca-bundle\") pod \"neutron-bff445bfd-mg9pk\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.303291 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-config\") pod \"neutron-bff445bfd-mg9pk\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.303324 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-httpd-config\") pod \"neutron-bff445bfd-mg9pk\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.303432 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crgfd\" (UniqueName: \"kubernetes.io/projected/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-kube-api-access-crgfd\") pod \"neutron-bff445bfd-mg9pk\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.303531 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-ovndb-tls-certs\") pod \"neutron-bff445bfd-mg9pk\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.310558 4958 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-config\") pod \"neutron-bff445bfd-mg9pk\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.319310 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-combined-ca-bundle\") pod \"neutron-bff445bfd-mg9pk\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.320215 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-ovndb-tls-certs\") pod \"neutron-bff445bfd-mg9pk\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.325059 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-httpd-config\") pod \"neutron-bff445bfd-mg9pk\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.336813 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crgfd\" (UniqueName: \"kubernetes.io/projected/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-kube-api-access-crgfd\") pod \"neutron-bff445bfd-mg9pk\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.455401 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.493659 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:22:49 crc kubenswrapper[4958]: I1201 10:22:49.628293 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" podUID="a8eced95-37bf-4219-bd34-e543ebad21a6" containerName="dnsmasq-dns" containerID="cri-o://3546c9e568f9ac554dcd36fe0e4457387b261ba4d72c2e98a44041487a6b1410" gracePeriod=10 Dec 01 10:22:50 crc kubenswrapper[4958]: I1201 10:22:50.046242 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-q7fvg"] Dec 01 10:22:50 crc kubenswrapper[4958]: W1201 10:22:50.067502 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0ac5474_7911_4334_9a06_760d22e9d524.slice/crio-7ba77042a730599f93e4bce1f5c9db6158e3a0737e9e08ac822d935a0a8436ff WatchSource:0}: Error finding container 7ba77042a730599f93e4bce1f5c9db6158e3a0737e9e08ac822d935a0a8436ff: Status 404 returned error can't find the container with id 7ba77042a730599f93e4bce1f5c9db6158e3a0737e9e08ac822d935a0a8436ff Dec 01 10:22:50 crc kubenswrapper[4958]: I1201 10:22:50.650581 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"26ebeb4c-f867-45a5-894c-9f10f14c870e","Type":"ContainerStarted","Data":"4cb474fef6e974394935b5701490f31aac53b71311f3dccc52e137e3d3cd32f5"} Dec 01 10:22:50 crc kubenswrapper[4958]: I1201 10:22:50.650759 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="26ebeb4c-f867-45a5-894c-9f10f14c870e" containerName="glance-log" containerID="cri-o://a3b4aa1b3a1f3e82650d5649022a9aa26a249a955eb52e564dfedbc3b058b7b4" gracePeriod=30 Dec 01 10:22:50 crc kubenswrapper[4958]: I1201 10:22:50.651486 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="26ebeb4c-f867-45a5-894c-9f10f14c870e" containerName="glance-httpd" containerID="cri-o://4cb474fef6e974394935b5701490f31aac53b71311f3dccc52e137e3d3cd32f5" gracePeriod=30 Dec 01 10:22:50 crc kubenswrapper[4958]: I1201 10:22:50.674553 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7bb598f5-3dd4-40ef-8098-191280bc4d18","Type":"ContainerStarted","Data":"43a06800529e0fa26d0936d2740b8ec37ac138b8c3ee89642c2e9b5ae5370ff3"} Dec 01 10:22:50 crc kubenswrapper[4958]: I1201 10:22:50.674922 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7bb598f5-3dd4-40ef-8098-191280bc4d18" containerName="glance-log" containerID="cri-o://8ffa4ef480522806af76c3a36d0bde9518212fbc93b8c924bceca9fd7786a38d" gracePeriod=30 Dec 01 10:22:50 crc kubenswrapper[4958]: I1201 10:22:50.675469 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7bb598f5-3dd4-40ef-8098-191280bc4d18" containerName="glance-httpd" containerID="cri-o://43a06800529e0fa26d0936d2740b8ec37ac138b8c3ee89642c2e9b5ae5370ff3" gracePeriod=30 Dec 01 10:22:50 crc kubenswrapper[4958]: I1201 10:22:50.688612 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" event={"ID":"d0ac5474-7911-4334-9a06-760d22e9d524","Type":"ContainerStarted","Data":"7ba77042a730599f93e4bce1f5c9db6158e3a0737e9e08ac822d935a0a8436ff"} Dec 01 10:22:50 crc 
kubenswrapper[4958]: I1201 10:22:50.694660 4958 generic.go:334] "Generic (PLEG): container finished" podID="a8eced95-37bf-4219-bd34-e543ebad21a6" containerID="3546c9e568f9ac554dcd36fe0e4457387b261ba4d72c2e98a44041487a6b1410" exitCode=0 Dec 01 10:22:50 crc kubenswrapper[4958]: I1201 10:22:50.694733 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" event={"ID":"a8eced95-37bf-4219-bd34-e543ebad21a6","Type":"ContainerDied","Data":"3546c9e568f9ac554dcd36fe0e4457387b261ba4d72c2e98a44041487a6b1410"} Dec 01 10:22:50 crc kubenswrapper[4958]: I1201 10:22:50.721557 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.721518105 podStartE2EDuration="7.721518105s" podCreationTimestamp="2025-12-01 10:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:22:50.705246737 +0000 UTC m=+1418.214035774" watchObservedRunningTime="2025-12-01 10:22:50.721518105 +0000 UTC m=+1418.230307142" Dec 01 10:22:50 crc kubenswrapper[4958]: I1201 10:22:50.784603 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bff445bfd-mg9pk"] Dec 01 10:22:50 crc kubenswrapper[4958]: I1201 10:22:50.795457 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.795424982 podStartE2EDuration="7.795424982s" podCreationTimestamp="2025-12-01 10:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:22:50.765285935 +0000 UTC m=+1418.274074972" watchObservedRunningTime="2025-12-01 10:22:50.795424982 +0000 UTC m=+1418.304214019" Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.664050 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.852585 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-config\") pod \"a8eced95-37bf-4219-bd34-e543ebad21a6\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.852974 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-ovsdbserver-nb\") pod \"a8eced95-37bf-4219-bd34-e543ebad21a6\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.853504 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-ovsdbserver-sb\") pod \"a8eced95-37bf-4219-bd34-e543ebad21a6\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.853650 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-dns-svc\") pod \"a8eced95-37bf-4219-bd34-e543ebad21a6\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.853748 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fljk\" (UniqueName: \"kubernetes.io/projected/a8eced95-37bf-4219-bd34-e543ebad21a6-kube-api-access-8fljk\") pod \"a8eced95-37bf-4219-bd34-e543ebad21a6\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.853950 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-dns-swift-storage-0\") pod \"a8eced95-37bf-4219-bd34-e543ebad21a6\" (UID: \"a8eced95-37bf-4219-bd34-e543ebad21a6\") " Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.869714 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8eced95-37bf-4219-bd34-e543ebad21a6-kube-api-access-8fljk" (OuterVolumeSpecName: "kube-api-access-8fljk") pod "a8eced95-37bf-4219-bd34-e543ebad21a6" (UID: "a8eced95-37bf-4219-bd34-e543ebad21a6"). InnerVolumeSpecName "kube-api-access-8fljk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.889154 4958 generic.go:334] "Generic (PLEG): container finished" podID="7bb598f5-3dd4-40ef-8098-191280bc4d18" containerID="43a06800529e0fa26d0936d2740b8ec37ac138b8c3ee89642c2e9b5ae5370ff3" exitCode=0 Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.889207 4958 generic.go:334] "Generic (PLEG): container finished" podID="7bb598f5-3dd4-40ef-8098-191280bc4d18" containerID="8ffa4ef480522806af76c3a36d0bde9518212fbc93b8c924bceca9fd7786a38d" exitCode=143 Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.889321 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7bb598f5-3dd4-40ef-8098-191280bc4d18","Type":"ContainerDied","Data":"43a06800529e0fa26d0936d2740b8ec37ac138b8c3ee89642c2e9b5ae5370ff3"} Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.889363 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7bb598f5-3dd4-40ef-8098-191280bc4d18","Type":"ContainerDied","Data":"8ffa4ef480522806af76c3a36d0bde9518212fbc93b8c924bceca9fd7786a38d"} Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.905987 4958 generic.go:334] "Generic (PLEG): container finished" podID="88a7373f-c8a8-4721-bf49-1ffc1887309e" containerID="4b8027571ba9ea338e49eff27a4a3599eeda73550a750f774a719fece40e20fe" exitCode=0 Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.906095 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6864p" event={"ID":"88a7373f-c8a8-4721-bf49-1ffc1887309e","Type":"ContainerDied","Data":"4b8027571ba9ea338e49eff27a4a3599eeda73550a750f774a719fece40e20fe"} Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.947618 4958 generic.go:334] "Generic (PLEG): container finished" podID="d0ac5474-7911-4334-9a06-760d22e9d524" containerID="b8c6d02f07e1ac6fb7fb78825570ffb5d59f66d76818bc92450f9f78d930a90a" exitCode=0 Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.948239 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" event={"ID":"d0ac5474-7911-4334-9a06-760d22e9d524","Type":"ContainerDied","Data":"b8c6d02f07e1ac6fb7fb78825570ffb5d59f66d76818bc92450f9f78d930a90a"} Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.952285 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a8eced95-37bf-4219-bd34-e543ebad21a6" (UID: "a8eced95-37bf-4219-bd34-e543ebad21a6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.953763 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a8eced95-37bf-4219-bd34-e543ebad21a6" (UID: "a8eced95-37bf-4219-bd34-e543ebad21a6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.960425 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.960476 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fljk\" (UniqueName: \"kubernetes.io/projected/a8eced95-37bf-4219-bd34-e543ebad21a6-kube-api-access-8fljk\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.960493 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.968635 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-config" (OuterVolumeSpecName: "config") pod "a8eced95-37bf-4219-bd34-e543ebad21a6" (UID: "a8eced95-37bf-4219-bd34-e543ebad21a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.977274 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bff445bfd-mg9pk" event={"ID":"4777ce5e-b8d9-4486-9ebb-a8521c18dd15","Type":"ContainerStarted","Data":"f25d73ad8101d9d94fbe58d238fad7c269a1d2a06d0269d8951bb908cc344ee5"} Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.997299 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" event={"ID":"a8eced95-37bf-4219-bd34-e543ebad21a6","Type":"ContainerDied","Data":"06c7d08ce5fb176e5ba905ff79275a9a34dfaae6f49878a1023a1a6b9b45e1b4"} Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.997395 4958 scope.go:117] "RemoveContainer" containerID="3546c9e568f9ac554dcd36fe0e4457387b261ba4d72c2e98a44041487a6b1410" Dec 01 10:22:51 crc kubenswrapper[4958]: I1201 10:22:51.997627 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2mtdq" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.002234 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a8eced95-37bf-4219-bd34-e543ebad21a6" (UID: "a8eced95-37bf-4219-bd34-e543ebad21a6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.024011 4958 generic.go:334] "Generic (PLEG): container finished" podID="26ebeb4c-f867-45a5-894c-9f10f14c870e" containerID="4cb474fef6e974394935b5701490f31aac53b71311f3dccc52e137e3d3cd32f5" exitCode=0 Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.024057 4958 generic.go:334] "Generic (PLEG): container finished" podID="26ebeb4c-f867-45a5-894c-9f10f14c870e" containerID="a3b4aa1b3a1f3e82650d5649022a9aa26a249a955eb52e564dfedbc3b058b7b4" exitCode=143 Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.024080 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"26ebeb4c-f867-45a5-894c-9f10f14c870e","Type":"ContainerDied","Data":"4cb474fef6e974394935b5701490f31aac53b71311f3dccc52e137e3d3cd32f5"} Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.024169 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"26ebeb4c-f867-45a5-894c-9f10f14c870e","Type":"ContainerDied","Data":"a3b4aa1b3a1f3e82650d5649022a9aa26a249a955eb52e564dfedbc3b058b7b4"} Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.073080 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.073135 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.096640 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a8eced95-37bf-4219-bd34-e543ebad21a6" (UID: "a8eced95-37bf-4219-bd34-e543ebad21a6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.123285 4958 scope.go:117] "RemoveContainer" containerID="d436c1e2bec283f80ba2eba8079dc3ec5a89dafc3c34f22da51c5a4232a1b415" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.180732 4958 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8eced95-37bf-4219-bd34-e543ebad21a6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.391913 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2mtdq"] Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.406638 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2mtdq"] Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.521103 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.591985 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5rjr\" (UniqueName: \"kubernetes.io/projected/26ebeb4c-f867-45a5-894c-9f10f14c870e-kube-api-access-w5rjr\") pod \"26ebeb4c-f867-45a5-894c-9f10f14c870e\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.592066 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26ebeb4c-f867-45a5-894c-9f10f14c870e-logs\") pod \"26ebeb4c-f867-45a5-894c-9f10f14c870e\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.592136 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"26ebeb4c-f867-45a5-894c-9f10f14c870e\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.592554 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26ebeb4c-f867-45a5-894c-9f10f14c870e-config-data\") pod \"26ebeb4c-f867-45a5-894c-9f10f14c870e\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.592587 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26ebeb4c-f867-45a5-894c-9f10f14c870e-combined-ca-bundle\") pod \"26ebeb4c-f867-45a5-894c-9f10f14c870e\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.592631 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/26ebeb4c-f867-45a5-894c-9f10f14c870e-httpd-run\") pod \"26ebeb4c-f867-45a5-894c-9f10f14c870e\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.592665 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26ebeb4c-f867-45a5-894c-9f10f14c870e-scripts\") pod \"26ebeb4c-f867-45a5-894c-9f10f14c870e\" (UID: \"26ebeb4c-f867-45a5-894c-9f10f14c870e\") " Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.595572 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26ebeb4c-f867-45a5-894c-9f10f14c870e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "26ebeb4c-f867-45a5-894c-9f10f14c870e" (UID: "26ebeb4c-f867-45a5-894c-9f10f14c870e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.597289 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26ebeb4c-f867-45a5-894c-9f10f14c870e-logs" (OuterVolumeSpecName: "logs") pod "26ebeb4c-f867-45a5-894c-9f10f14c870e" (UID: "26ebeb4c-f867-45a5-894c-9f10f14c870e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.602119 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "26ebeb4c-f867-45a5-894c-9f10f14c870e" (UID: "26ebeb4c-f867-45a5-894c-9f10f14c870e"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.607318 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26ebeb4c-f867-45a5-894c-9f10f14c870e-kube-api-access-w5rjr" (OuterVolumeSpecName: "kube-api-access-w5rjr") pod "26ebeb4c-f867-45a5-894c-9f10f14c870e" (UID: "26ebeb4c-f867-45a5-894c-9f10f14c870e"). InnerVolumeSpecName "kube-api-access-w5rjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.607450 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ebeb4c-f867-45a5-894c-9f10f14c870e-scripts" (OuterVolumeSpecName: "scripts") pod "26ebeb4c-f867-45a5-894c-9f10f14c870e" (UID: "26ebeb4c-f867-45a5-894c-9f10f14c870e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.653218 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.658115 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ebeb4c-f867-45a5-894c-9f10f14c870e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26ebeb4c-f867-45a5-894c-9f10f14c870e" (UID: "26ebeb4c-f867-45a5-894c-9f10f14c870e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.682943 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ebeb4c-f867-45a5-894c-9f10f14c870e-config-data" (OuterVolumeSpecName: "config-data") pod "26ebeb4c-f867-45a5-894c-9f10f14c870e" (UID: "26ebeb4c-f867-45a5-894c-9f10f14c870e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.701829 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26ebeb4c-f867-45a5-894c-9f10f14c870e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.701895 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26ebeb4c-f867-45a5-894c-9f10f14c870e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.701910 4958 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/26ebeb4c-f867-45a5-894c-9f10f14c870e-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.701922 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26ebeb4c-f867-45a5-894c-9f10f14c870e-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.701934 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5rjr\" (UniqueName: \"kubernetes.io/projected/26ebeb4c-f867-45a5-894c-9f10f14c870e-kube-api-access-w5rjr\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.701946 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26ebeb4c-f867-45a5-894c-9f10f14c870e-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.701993 4958 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.752507 4958 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.808080 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bb598f5-3dd4-40ef-8098-191280bc4d18-config-data\") pod \"7bb598f5-3dd4-40ef-8098-191280bc4d18\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.808742 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"7bb598f5-3dd4-40ef-8098-191280bc4d18\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.808812 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bb598f5-3dd4-40ef-8098-191280bc4d18-scripts\") pod \"7bb598f5-3dd4-40ef-8098-191280bc4d18\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.808895 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bb598f5-3dd4-40ef-8098-191280bc4d18-combined-ca-bundle\") pod \"7bb598f5-3dd4-40ef-8098-191280bc4d18\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.808956 4958 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bb598f5-3dd4-40ef-8098-191280bc4d18-logs\") pod \"7bb598f5-3dd4-40ef-8098-191280bc4d18\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.808981 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7bb598f5-3dd4-40ef-8098-191280bc4d18-httpd-run\") pod \"7bb598f5-3dd4-40ef-8098-191280bc4d18\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.809173 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thb2h\" (UniqueName: \"kubernetes.io/projected/7bb598f5-3dd4-40ef-8098-191280bc4d18-kube-api-access-thb2h\") pod \"7bb598f5-3dd4-40ef-8098-191280bc4d18\" (UID: \"7bb598f5-3dd4-40ef-8098-191280bc4d18\") " Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.809718 4958 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.809769 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bb598f5-3dd4-40ef-8098-191280bc4d18-logs" (OuterVolumeSpecName: "logs") pod "7bb598f5-3dd4-40ef-8098-191280bc4d18" (UID: "7bb598f5-3dd4-40ef-8098-191280bc4d18"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.810717 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bb598f5-3dd4-40ef-8098-191280bc4d18-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7bb598f5-3dd4-40ef-8098-191280bc4d18" (UID: "7bb598f5-3dd4-40ef-8098-191280bc4d18"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.813889 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "7bb598f5-3dd4-40ef-8098-191280bc4d18" (UID: "7bb598f5-3dd4-40ef-8098-191280bc4d18"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.814292 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bb598f5-3dd4-40ef-8098-191280bc4d18-scripts" (OuterVolumeSpecName: "scripts") pod "7bb598f5-3dd4-40ef-8098-191280bc4d18" (UID: "7bb598f5-3dd4-40ef-8098-191280bc4d18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.814953 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb598f5-3dd4-40ef-8098-191280bc4d18-kube-api-access-thb2h" (OuterVolumeSpecName: "kube-api-access-thb2h") pod "7bb598f5-3dd4-40ef-8098-191280bc4d18" (UID: "7bb598f5-3dd4-40ef-8098-191280bc4d18"). InnerVolumeSpecName "kube-api-access-thb2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.865626 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bb598f5-3dd4-40ef-8098-191280bc4d18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7bb598f5-3dd4-40ef-8098-191280bc4d18" (UID: "7bb598f5-3dd4-40ef-8098-191280bc4d18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.870782 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bb598f5-3dd4-40ef-8098-191280bc4d18-config-data" (OuterVolumeSpecName: "config-data") pod "7bb598f5-3dd4-40ef-8098-191280bc4d18" (UID: "7bb598f5-3dd4-40ef-8098-191280bc4d18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.913265 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thb2h\" (UniqueName: \"kubernetes.io/projected/7bb598f5-3dd4-40ef-8098-191280bc4d18-kube-api-access-thb2h\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.913339 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bb598f5-3dd4-40ef-8098-191280bc4d18-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.913423 4958 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.913442 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7bb598f5-3dd4-40ef-8098-191280bc4d18-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.937125 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bb598f5-3dd4-40ef-8098-191280bc4d18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.937149 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bb598f5-3dd4-40ef-8098-191280bc4d18-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.937158 4958 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7bb598f5-3dd4-40ef-8098-191280bc4d18-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:52 crc kubenswrapper[4958]: I1201 10:22:52.954382 4958 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.039762 4958 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.042394 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bff445bfd-mg9pk" event={"ID":"4777ce5e-b8d9-4486-9ebb-a8521c18dd15","Type":"ContainerStarted","Data":"0800009f1f87e2b0e639aa1625f201a95856480de1a188eb073dc4dc47524d2a"} Dec 01 10:22:53 
crc kubenswrapper[4958]: I1201 10:22:53.042459 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bff445bfd-mg9pk" event={"ID":"4777ce5e-b8d9-4486-9ebb-a8521c18dd15","Type":"ContainerStarted","Data":"c4e4a17cabd5aa2978dceb71175d7f962d7c7ec975783f5fd25c57bb34cf6b2f"} Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.042543 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.049428 4958 generic.go:334] "Generic (PLEG): container finished" podID="6c3f553a-d19b-494d-a009-18e2ba2aa110" containerID="01d50f6c5d2b0216863f0e7c400427a1bc20e5a49e0b2002b3e0226d6cf84d7c" exitCode=0 Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.049531 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ms7jr" event={"ID":"6c3f553a-d19b-494d-a009-18e2ba2aa110","Type":"ContainerDied","Data":"01d50f6c5d2b0216863f0e7c400427a1bc20e5a49e0b2002b3e0226d6cf84d7c"} Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.053619 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"26ebeb4c-f867-45a5-894c-9f10f14c870e","Type":"ContainerDied","Data":"71440ae53f7a19889f5b00664b31ee8fc903a694de88cef9e40ef774dd39911c"} Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.053699 4958 scope.go:117] "RemoveContainer" containerID="4cb474fef6e974394935b5701490f31aac53b71311f3dccc52e137e3d3cd32f5" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.053987 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.057409 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7bb598f5-3dd4-40ef-8098-191280bc4d18","Type":"ContainerDied","Data":"a8b30cc9b33757fd9d0589e66b89abb35fb7adea536e901232beb257889ad9c5"} Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.057554 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.059551 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" event={"ID":"d0ac5474-7911-4334-9a06-760d22e9d524","Type":"ContainerStarted","Data":"88c650102fea04cd849b5aec8074db6efd829b1782dfbc53cab75f51c83e96f7"} Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.060159 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.075183 4958 generic.go:334] "Generic (PLEG): container finished" podID="a3dc24bc-1938-490b-a88a-286e6af1d269" containerID="bc7066c75cdf8d7cc442eef94e9c6386e682d8da59456b2e98254a41c5f417c9" exitCode=0 Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.075237 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8zf5v" event={"ID":"a3dc24bc-1938-490b-a88a-286e6af1d269","Type":"ContainerDied","Data":"bc7066c75cdf8d7cc442eef94e9c6386e682d8da59456b2e98254a41c5f417c9"} Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.077833 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-bff445bfd-mg9pk" podStartSLOduration=5.077809283 podStartE2EDuration="5.077809283s" podCreationTimestamp="2025-12-01 10:22:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:22:53.072806939 +0000 UTC m=+1420.581595976" watchObservedRunningTime="2025-12-01 10:22:53.077809283 +0000 UTC m=+1420.586598320" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.113774 4958 scope.go:117] "RemoveContainer" containerID="a3b4aa1b3a1f3e82650d5649022a9aa26a249a955eb52e564dfedbc3b058b7b4" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.123249 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" podStartSLOduration=5.12322342 podStartE2EDuration="5.12322342s" podCreationTimestamp="2025-12-01 10:22:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:22:53.121617654 +0000 UTC m=+1420.630406691" watchObservedRunningTime="2025-12-01 10:22:53.12322342 +0000 UTC m=+1420.632012457" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.173445 4958 scope.go:117] "RemoveContainer" containerID="43a06800529e0fa26d0936d2740b8ec37ac138b8c3ee89642c2e9b5ae5370ff3" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.275164 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.312219 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.339499 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.361556 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:22:53 crc kubenswrapper[4958]: E1201 10:22:53.363376 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26ebeb4c-f867-45a5-894c-9f10f14c870e" containerName="glance-httpd" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.363399 4958 
state_mem.go:107] "Deleted CPUSet assignment" podUID="26ebeb4c-f867-45a5-894c-9f10f14c870e" containerName="glance-httpd" Dec 01 10:22:53 crc kubenswrapper[4958]: E1201 10:22:53.363414 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8eced95-37bf-4219-bd34-e543ebad21a6" containerName="init" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.363422 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8eced95-37bf-4219-bd34-e543ebad21a6" containerName="init" Dec 01 10:22:53 crc kubenswrapper[4958]: E1201 10:22:53.363438 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26ebeb4c-f867-45a5-894c-9f10f14c870e" containerName="glance-log" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.363444 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="26ebeb4c-f867-45a5-894c-9f10f14c870e" containerName="glance-log" Dec 01 10:22:53 crc kubenswrapper[4958]: E1201 10:22:53.363454 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb598f5-3dd4-40ef-8098-191280bc4d18" containerName="glance-log" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.363463 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb598f5-3dd4-40ef-8098-191280bc4d18" containerName="glance-log" Dec 01 10:22:53 crc kubenswrapper[4958]: E1201 10:22:53.363484 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb598f5-3dd4-40ef-8098-191280bc4d18" containerName="glance-httpd" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.363490 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb598f5-3dd4-40ef-8098-191280bc4d18" containerName="glance-httpd" Dec 01 10:22:53 crc kubenswrapper[4958]: E1201 10:22:53.363505 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8eced95-37bf-4219-bd34-e543ebad21a6" containerName="dnsmasq-dns" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.363513 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8eced95-37bf-4219-bd34-e543ebad21a6" containerName="dnsmasq-dns" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.363723 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bb598f5-3dd4-40ef-8098-191280bc4d18" containerName="glance-log" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.363738 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8eced95-37bf-4219-bd34-e543ebad21a6" containerName="dnsmasq-dns" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.363747 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="26ebeb4c-f867-45a5-894c-9f10f14c870e" containerName="glance-log" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.363758 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="26ebeb4c-f867-45a5-894c-9f10f14c870e" containerName="glance-httpd" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.363768 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bb598f5-3dd4-40ef-8098-191280bc4d18" containerName="glance-httpd" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.365553 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.368813 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.369034 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.369169 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-smrjw" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.372874 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.375994 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.385552 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.411225 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.413990 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.423086 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.423281 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.458164 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.458337 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.458487 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.458596 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.458669 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.458723 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-logs\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.458819 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.458881 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj22c\" (UniqueName: \"kubernetes.io/projected/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-kube-api-access-mj22c\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.459393 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.570871 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.570952 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.570975 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-config-data\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.571002 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.571036 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0a0701a-31c0-41fb-ba17-1919483e6248-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.571126 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-logs\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.571201 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.571235 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.571263 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0a0701a-31c0-41fb-ba17-1919483e6248-logs\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.571288 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjqk7\" (UniqueName: \"kubernetes.io/projected/c0a0701a-31c0-41fb-ba17-1919483e6248-kube-api-access-wjqk7\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.571325 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj22c\" (UniqueName: \"kubernetes.io/projected/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-kube-api-access-mj22c\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.571398 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.571425 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-scripts\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.571462 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.571529 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.571565 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.571838 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.573533 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-logs\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.574290 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.610282 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.612135 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.624229 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj22c\" (UniqueName: \"kubernetes.io/projected/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-kube-api-access-mj22c\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.624901 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " 
pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.638132 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.674092 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-scripts\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.674169 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.674260 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.674280 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-config-data\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.674304 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0a0701a-31c0-41fb-ba17-1919483e6248-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.674358 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.674375 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0a0701a-31c0-41fb-ba17-1919483e6248-logs\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.674392 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjqk7\" (UniqueName: \"kubernetes.io/projected/c0a0701a-31c0-41fb-ba17-1919483e6248-kube-api-access-wjqk7\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.679394 4958 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0a0701a-31c0-41fb-ba17-1919483e6248-logs\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.679384 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.679732 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0a0701a-31c0-41fb-ba17-1919483e6248-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.695140 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.713701 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-config-data\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.724098 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjqk7\" (UniqueName: \"kubernetes.io/projected/c0a0701a-31c0-41fb-ba17-1919483e6248-kube-api-access-wjqk7\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.724595 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-scripts\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.728116 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.729304 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.732284 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.764544 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " pod="openstack/glance-default-external-api-0" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.846267 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26ebeb4c-f867-45a5-894c-9f10f14c870e" path="/var/lib/kubelet/pods/26ebeb4c-f867-45a5-894c-9f10f14c870e/volumes" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.847835 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb598f5-3dd4-40ef-8098-191280bc4d18" path="/var/lib/kubelet/pods/7bb598f5-3dd4-40ef-8098-191280bc4d18/volumes" Dec 01 10:22:53 crc kubenswrapper[4958]: I1201 10:22:53.848826 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8eced95-37bf-4219-bd34-e543ebad21a6" path="/var/lib/kubelet/pods/a8eced95-37bf-4219-bd34-e543ebad21a6/volumes" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.055862 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.194121 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-768454b56f-84xc8"] Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.196549 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.199540 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.208300 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.208733 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-768454b56f-84xc8"] Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.298627 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-internal-tls-certs\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.298722 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-ovndb-tls-certs\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.298767 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-config\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.298890 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"httpd-config\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-httpd-config\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.298952 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-public-tls-certs\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.299000 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzjbr\" (UniqueName: \"kubernetes.io/projected/305e96f8-0597-4c64-9026-6d6f2aa454d4-kube-api-access-lzjbr\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.299048 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-combined-ca-bundle\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.401581 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-config\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.401675 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-httpd-config\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.401723 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-public-tls-certs\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.401755 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzjbr\" (UniqueName: \"kubernetes.io/projected/305e96f8-0597-4c64-9026-6d6f2aa454d4-kube-api-access-lzjbr\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.401788 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-combined-ca-bundle\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.401865 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-internal-tls-certs\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.401899 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-ovndb-tls-certs\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.410606 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-combined-ca-bundle\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.411587 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-public-tls-certs\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.412911 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-config\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.415378 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-httpd-config\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.417233 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-ovndb-tls-certs\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.420766 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-internal-tls-certs\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.442215 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzjbr\" (UniqueName: \"kubernetes.io/projected/305e96f8-0597-4c64-9026-6d6f2aa454d4-kube-api-access-lzjbr\") pod \"neutron-768454b56f-84xc8\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:54 crc kubenswrapper[4958]: I1201 10:22:54.560539 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:22:59 crc kubenswrapper[4958]: I1201 10:22:59.458068 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:22:59 crc kubenswrapper[4958]: I1201 10:22:59.567074 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-6frrm"] Dec 01 10:22:59 crc kubenswrapper[4958]: I1201 10:22:59.567473 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" podUID="376e7f67-87de-4bab-91d1-5624ee49a979" containerName="dnsmasq-dns" containerID="cri-o://334fedf56196b850baf942bdd4fc2e146a6c4fbff9ffbd3718e50b92cb4b6af2" gracePeriod=10 Dec 01 10:22:59 crc kubenswrapper[4958]: I1201 10:22:59.723759 4958 scope.go:117] "RemoveContainer" containerID="8ffa4ef480522806af76c3a36d0bde9518212fbc93b8c924bceca9fd7786a38d" Dec 01 10:22:59 crc kubenswrapper[4958]: E1201 10:22:59.788497 4958 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod376e7f67_87de_4bab_91d1_5624ee49a979.slice/crio-334fedf56196b850baf942bdd4fc2e146a6c4fbff9ffbd3718e50b92cb4b6af2.scope\": RecentStats: unable to find data in memory cache]" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.063717 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ms7jr" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.074205 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6864p" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.080235 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-8zf5v" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.174465 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhfww\" (UniqueName: \"kubernetes.io/projected/6c3f553a-d19b-494d-a009-18e2ba2aa110-kube-api-access-vhfww\") pod \"6c3f553a-d19b-494d-a009-18e2ba2aa110\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.174541 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88a7373f-c8a8-4721-bf49-1ffc1887309e-logs\") pod \"88a7373f-c8a8-4721-bf49-1ffc1887309e\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.174598 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a3dc24bc-1938-490b-a88a-286e6af1d269-db-sync-config-data\") pod \"a3dc24bc-1938-490b-a88a-286e6af1d269\" (UID: \"a3dc24bc-1938-490b-a88a-286e6af1d269\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.174626 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vhc2\" (UniqueName: \"kubernetes.io/projected/88a7373f-c8a8-4721-bf49-1ffc1887309e-kube-api-access-8vhc2\") pod \"88a7373f-c8a8-4721-bf49-1ffc1887309e\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.174659 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a7373f-c8a8-4721-bf49-1ffc1887309e-config-data\") pod \"88a7373f-c8a8-4721-bf49-1ffc1887309e\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.174740 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dc24bc-1938-490b-a88a-286e6af1d269-combined-ca-bundle\") pod \"a3dc24bc-1938-490b-a88a-286e6af1d269\" (UID: \"a3dc24bc-1938-490b-a88a-286e6af1d269\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.174887 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-combined-ca-bundle\") pod \"6c3f553a-d19b-494d-a009-18e2ba2aa110\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.174914 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-fernet-keys\") pod \"6c3f553a-d19b-494d-a009-18e2ba2aa110\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.174960 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-config-data\") pod \"6c3f553a-d19b-494d-a009-18e2ba2aa110\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.175008 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a7373f-c8a8-4721-bf49-1ffc1887309e-combined-ca-bundle\") pod 
\"88a7373f-c8a8-4721-bf49-1ffc1887309e\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.175060 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a7373f-c8a8-4721-bf49-1ffc1887309e-scripts\") pod \"88a7373f-c8a8-4721-bf49-1ffc1887309e\" (UID: \"88a7373f-c8a8-4721-bf49-1ffc1887309e\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.175069 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88a7373f-c8a8-4721-bf49-1ffc1887309e-logs" (OuterVolumeSpecName: "logs") pod "88a7373f-c8a8-4721-bf49-1ffc1887309e" (UID: "88a7373f-c8a8-4721-bf49-1ffc1887309e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.175115 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-scripts\") pod \"6c3f553a-d19b-494d-a009-18e2ba2aa110\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.175143 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwlsq\" (UniqueName: \"kubernetes.io/projected/a3dc24bc-1938-490b-a88a-286e6af1d269-kube-api-access-dwlsq\") pod \"a3dc24bc-1938-490b-a88a-286e6af1d269\" (UID: \"a3dc24bc-1938-490b-a88a-286e6af1d269\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.175190 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-credential-keys\") pod \"6c3f553a-d19b-494d-a009-18e2ba2aa110\" (UID: \"6c3f553a-d19b-494d-a009-18e2ba2aa110\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.175591 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88a7373f-c8a8-4721-bf49-1ffc1887309e-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.183407 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-scripts" (OuterVolumeSpecName: "scripts") pod "6c3f553a-d19b-494d-a009-18e2ba2aa110" (UID: "6c3f553a-d19b-494d-a009-18e2ba2aa110"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.183827 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6c3f553a-d19b-494d-a009-18e2ba2aa110" (UID: "6c3f553a-d19b-494d-a009-18e2ba2aa110"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.183951 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3dc24bc-1938-490b-a88a-286e6af1d269-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a3dc24bc-1938-490b-a88a-286e6af1d269" (UID: "a3dc24bc-1938-490b-a88a-286e6af1d269"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.184765 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88a7373f-c8a8-4721-bf49-1ffc1887309e-kube-api-access-8vhc2" (OuterVolumeSpecName: "kube-api-access-8vhc2") pod "88a7373f-c8a8-4721-bf49-1ffc1887309e" (UID: "88a7373f-c8a8-4721-bf49-1ffc1887309e"). InnerVolumeSpecName "kube-api-access-8vhc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.199987 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6c3f553a-d19b-494d-a009-18e2ba2aa110" (UID: "6c3f553a-d19b-494d-a009-18e2ba2aa110"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.204905 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a7373f-c8a8-4721-bf49-1ffc1887309e-scripts" (OuterVolumeSpecName: "scripts") pod "88a7373f-c8a8-4721-bf49-1ffc1887309e" (UID: "88a7373f-c8a8-4721-bf49-1ffc1887309e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.207104 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c3f553a-d19b-494d-a009-18e2ba2aa110-kube-api-access-vhfww" (OuterVolumeSpecName: "kube-api-access-vhfww") pod "6c3f553a-d19b-494d-a009-18e2ba2aa110" (UID: "6c3f553a-d19b-494d-a009-18e2ba2aa110"). InnerVolumeSpecName "kube-api-access-vhfww". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.207630 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3dc24bc-1938-490b-a88a-286e6af1d269-kube-api-access-dwlsq" (OuterVolumeSpecName: "kube-api-access-dwlsq") pod "a3dc24bc-1938-490b-a88a-286e6af1d269" (UID: "a3dc24bc-1938-490b-a88a-286e6af1d269"). InnerVolumeSpecName "kube-api-access-dwlsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.217669 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ms7jr" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.218260 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ms7jr" event={"ID":"6c3f553a-d19b-494d-a009-18e2ba2aa110","Type":"ContainerDied","Data":"472532cca7defb53f2736d76284443280764652622f60a06947378c292479037"} Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.218307 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="472532cca7defb53f2736d76284443280764652622f60a06947378c292479037" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.224042 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-config-data" (OuterVolumeSpecName: "config-data") pod "6c3f553a-d19b-494d-a009-18e2ba2aa110" (UID: "6c3f553a-d19b-494d-a009-18e2ba2aa110"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.224560 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3dc24bc-1938-490b-a88a-286e6af1d269-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3dc24bc-1938-490b-a88a-286e6af1d269" (UID: "a3dc24bc-1938-490b-a88a-286e6af1d269"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.224585 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" event={"ID":"376e7f67-87de-4bab-91d1-5624ee49a979","Type":"ContainerDied","Data":"334fedf56196b850baf942bdd4fc2e146a6c4fbff9ffbd3718e50b92cb4b6af2"} Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.224311 4958 generic.go:334] "Generic (PLEG): container finished" podID="376e7f67-87de-4bab-91d1-5624ee49a979" containerID="334fedf56196b850baf942bdd4fc2e146a6c4fbff9ffbd3718e50b92cb4b6af2" exitCode=0 Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.236379 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a7373f-c8a8-4721-bf49-1ffc1887309e-config-data" (OuterVolumeSpecName: "config-data") pod "88a7373f-c8a8-4721-bf49-1ffc1887309e" (UID: "88a7373f-c8a8-4721-bf49-1ffc1887309e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.240799 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6864p" event={"ID":"88a7373f-c8a8-4721-bf49-1ffc1887309e","Type":"ContainerDied","Data":"f3fd8dd027143b1b7f64ef4b9d92a757f58fbe4f74251866bc8bd0745e740a1f"} Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.240962 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3fd8dd027143b1b7f64ef4b9d92a757f58fbe4f74251866bc8bd0745e740a1f" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.241063 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6864p" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.245767 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8zf5v" event={"ID":"a3dc24bc-1938-490b-a88a-286e6af1d269","Type":"ContainerDied","Data":"42a0e6e62eafc24b13f646543d7b96eaaea8bd5997aadc2261c996baf31ac700"} Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.245825 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42a0e6e62eafc24b13f646543d7b96eaaea8bd5997aadc2261c996baf31ac700" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.245919 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8zf5v" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.277080 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a7373f-c8a8-4721-bf49-1ffc1887309e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88a7373f-c8a8-4721-bf49-1ffc1887309e" (UID: "88a7373f-c8a8-4721-bf49-1ffc1887309e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.281816 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a7373f-c8a8-4721-bf49-1ffc1887309e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.281918 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a7373f-c8a8-4721-bf49-1ffc1887309e-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.281936 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.281950 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwlsq\" (UniqueName: \"kubernetes.io/projected/a3dc24bc-1938-490b-a88a-286e6af1d269-kube-api-access-dwlsq\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.281968 4958 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.281982 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhfww\" (UniqueName: \"kubernetes.io/projected/6c3f553a-d19b-494d-a009-18e2ba2aa110-kube-api-access-vhfww\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.281997 4958 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a3dc24bc-1938-490b-a88a-286e6af1d269-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.282035 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vhc2\" (UniqueName: \"kubernetes.io/projected/88a7373f-c8a8-4721-bf49-1ffc1887309e-kube-api-access-8vhc2\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.282051 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a7373f-c8a8-4721-bf49-1ffc1887309e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.282092 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dc24bc-1938-490b-a88a-286e6af1d269-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.282103 4958 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.282115 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.303177 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"6c3f553a-d19b-494d-a009-18e2ba2aa110" (UID: "6c3f553a-d19b-494d-a009-18e2ba2aa110"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.386826 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c3f553a-d19b-494d-a009-18e2ba2aa110-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.498667 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.595819 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-dns-swift-storage-0\") pod \"376e7f67-87de-4bab-91d1-5624ee49a979\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.595942 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-ovsdbserver-sb\") pod \"376e7f67-87de-4bab-91d1-5624ee49a979\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.595990 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-config\") pod \"376e7f67-87de-4bab-91d1-5624ee49a979\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.596058 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-ovsdbserver-nb\") pod \"376e7f67-87de-4bab-91d1-5624ee49a979\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.596128 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-dns-svc\") pod \"376e7f67-87de-4bab-91d1-5624ee49a979\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.596177 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9khx\" (UniqueName: \"kubernetes.io/projected/376e7f67-87de-4bab-91d1-5624ee49a979-kube-api-access-z9khx\") pod \"376e7f67-87de-4bab-91d1-5624ee49a979\" (UID: \"376e7f67-87de-4bab-91d1-5624ee49a979\") " Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.652190 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/376e7f67-87de-4bab-91d1-5624ee49a979-kube-api-access-z9khx" (OuterVolumeSpecName: "kube-api-access-z9khx") pod "376e7f67-87de-4bab-91d1-5624ee49a979" (UID: "376e7f67-87de-4bab-91d1-5624ee49a979"). InnerVolumeSpecName "kube-api-access-z9khx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.699175 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9khx\" (UniqueName: \"kubernetes.io/projected/376e7f67-87de-4bab-91d1-5624ee49a979-kube-api-access-z9khx\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.801741 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "376e7f67-87de-4bab-91d1-5624ee49a979" (UID: "376e7f67-87de-4bab-91d1-5624ee49a979"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.804278 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.816702 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "376e7f67-87de-4bab-91d1-5624ee49a979" (UID: "376e7f67-87de-4bab-91d1-5624ee49a979"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.836428 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "376e7f67-87de-4bab-91d1-5624ee49a979" (UID: "376e7f67-87de-4bab-91d1-5624ee49a979"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.846636 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "376e7f67-87de-4bab-91d1-5624ee49a979" (UID: "376e7f67-87de-4bab-91d1-5624ee49a979"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.874424 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-config" (OuterVolumeSpecName: "config") pod "376e7f67-87de-4bab-91d1-5624ee49a979" (UID: "376e7f67-87de-4bab-91d1-5624ee49a979"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.906575 4958 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.906621 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.906637 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.906652 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/376e7f67-87de-4bab-91d1-5624ee49a979-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.923537 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:23:00 crc kubenswrapper[4958]: I1201 10:23:00.998798 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-768454b56f-84xc8"] Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.273042 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.278135 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-6frrm" event={"ID":"376e7f67-87de-4bab-91d1-5624ee49a979","Type":"ContainerDied","Data":"0132f0534053c14480c52707703eb5da32d9624e2a1e486b96acb0d1e79bf5fc"} Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.278220 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5f6877488-9nzb2"] Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.282318 4958 scope.go:117] "RemoveContainer" containerID="334fedf56196b850baf942bdd4fc2e146a6c4fbff9ffbd3718e50b92cb4b6af2" Dec 01 10:23:01 crc kubenswrapper[4958]: E1201 10:23:01.291558 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="376e7f67-87de-4bab-91d1-5624ee49a979" containerName="init" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.291623 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="376e7f67-87de-4bab-91d1-5624ee49a979" containerName="init" Dec 01 10:23:01 crc kubenswrapper[4958]: E1201 10:23:01.291730 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="376e7f67-87de-4bab-91d1-5624ee49a979" containerName="dnsmasq-dns" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.291749 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="376e7f67-87de-4bab-91d1-5624ee49a979" containerName="dnsmasq-dns" Dec 01 10:23:01 crc kubenswrapper[4958]: E1201 10:23:01.291781 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3dc24bc-1938-490b-a88a-286e6af1d269" containerName="barbican-db-sync" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.291793 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3dc24bc-1938-490b-a88a-286e6af1d269" containerName="barbican-db-sync" Dec 01 10:23:01 crc kubenswrapper[4958]: E1201 10:23:01.291826 4958 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6c3f553a-d19b-494d-a009-18e2ba2aa110" containerName="keystone-bootstrap" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.291836 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c3f553a-d19b-494d-a009-18e2ba2aa110" containerName="keystone-bootstrap" Dec 01 10:23:01 crc kubenswrapper[4958]: E1201 10:23:01.291896 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a7373f-c8a8-4721-bf49-1ffc1887309e" containerName="placement-db-sync" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.291905 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a7373f-c8a8-4721-bf49-1ffc1887309e" containerName="placement-db-sync" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.293959 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="376e7f67-87de-4bab-91d1-5624ee49a979" containerName="dnsmasq-dns" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.294218 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="88a7373f-c8a8-4721-bf49-1ffc1887309e" containerName="placement-db-sync" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.294239 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3dc24bc-1938-490b-a88a-286e6af1d269" containerName="barbican-db-sync" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.294453 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c3f553a-d19b-494d-a009-18e2ba2aa110" containerName="keystone-bootstrap" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.300268 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c0a0701a-31c0-41fb-ba17-1919483e6248","Type":"ContainerStarted","Data":"dec08121b3df3ca5ed69f73b62573608d9253d17dcc314f15f7399a9134328f5"} Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.300968 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.316677 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.317040 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.317299 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-62dpp" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.317513 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.318207 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.322623 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62be9aa8-5618-470f-990a-448f46a926cf","Type":"ContainerStarted","Data":"b2370710dbb51466cc749b9978dd4f16bc4f2f3a1e9e466c2fa25742b461d275"} Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.323534 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.325146 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-fernet-keys\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.325194 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-scripts\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.325223 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-public-tls-certs\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.325272 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-combined-ca-bundle\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.325352 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-config-data\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.325412 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-internal-tls-certs\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.325452 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b5vj\" (UniqueName: \"kubernetes.io/projected/caf6f35f-9e82-4899-801f-3b5e94189f6d-kube-api-access-6b5vj\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.325482 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-credential-keys\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.341129 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8d6888456-hv67t"] Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.348110 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-768454b56f-84xc8" event={"ID":"305e96f8-0597-4c64-9026-6d6f2aa454d4","Type":"ContainerStarted","Data":"dbfebfbd52c8c0ea998333ec0ccefde14e6c7bdf7bb724774781839cf2aaa114"} Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.348298 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.363563 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.364785 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.366029 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.366269 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-s6l4v" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.376983 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8d6888456-hv67t"] Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.377233 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.394947 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5f6877488-9nzb2"] Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.430644 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n8gv\" (UniqueName: \"kubernetes.io/projected/207148af-4b76-49b6-80cc-883ec14bb268-kube-api-access-5n8gv\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.432099 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-internal-tls-certs\") pod 
\"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.432272 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-config-data\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.443947 4958 scope.go:117] "RemoveContainer" containerID="f7afb037391afb9c05b2698507b1d518442bd7d03716e6c47a638e7e44493e16" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.447067 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-public-tls-certs\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.447208 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-internal-tls-certs\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.447422 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b5vj\" (UniqueName: \"kubernetes.io/projected/caf6f35f-9e82-4899-801f-3b5e94189f6d-kube-api-access-6b5vj\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.447526 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-credential-keys\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.447660 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-fernet-keys\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.447763 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-config-data\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.447828 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-scripts\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.447903 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-public-tls-certs\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.448004 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/207148af-4b76-49b6-80cc-883ec14bb268-logs\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.448132 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-combined-ca-bundle\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.448191 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-scripts\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.448439 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-combined-ca-bundle\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.457879 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-config-data\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.474106 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-fernet-keys\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.474636 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-credential-keys\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.476272 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-scripts\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.476951 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-public-tls-certs\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " 
pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.477020 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-internal-tls-certs\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.484659 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-combined-ca-bundle\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.494509 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b5vj\" (UniqueName: \"kubernetes.io/projected/caf6f35f-9e82-4899-801f-3b5e94189f6d-kube-api-access-6b5vj\") pod \"keystone-5f6877488-9nzb2\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.517317 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-6frrm"] Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.536957 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-6frrm"] Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.552560 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-config-data\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.552700 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/207148af-4b76-49b6-80cc-883ec14bb268-logs\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.552753 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-combined-ca-bundle\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.552781 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-scripts\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.552868 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n8gv\" (UniqueName: \"kubernetes.io/projected/207148af-4b76-49b6-80cc-883ec14bb268-kube-api-access-5n8gv\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.552933 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-internal-tls-certs\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.553014 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-public-tls-certs\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.555396 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/207148af-4b76-49b6-80cc-883ec14bb268-logs\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.577185 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-public-tls-certs\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.593162 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-internal-tls-certs\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.594069 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-combined-ca-bundle\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.594982 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-config-data\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.600240 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-scripts\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.651788 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n8gv\" (UniqueName: \"kubernetes.io/projected/207148af-4b76-49b6-80cc-883ec14bb268-kube-api-access-5n8gv\") pod \"placement-8d6888456-hv67t\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.656328 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-85fc6f9f59-gdn47"] Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.658602 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-85fc6f9f59-gdn47" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.684630 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.685073 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ld9kv" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.685410 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.715042 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd"] Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.718298 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.722377 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.749291 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.761483 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.778327 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81c0931c-8919-4128-97bb-21c5872d5cf0-config-data-custom\") pod \"barbican-worker-85fc6f9f59-gdn47\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") " pod="openstack/barbican-worker-85fc6f9f59-gdn47" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.778392 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c0931c-8919-4128-97bb-21c5872d5cf0-combined-ca-bundle\") pod \"barbican-worker-85fc6f9f59-gdn47\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") " pod="openstack/barbican-worker-85fc6f9f59-gdn47" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.778519 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ceca985e-bec2-42fe-9758-edad828586c2-logs\") pod \"barbican-keystone-listener-6ccf6d66f4-d9wbd\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") " pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.778744 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81c0931c-8919-4128-97bb-21c5872d5cf0-logs\") pod \"barbican-worker-85fc6f9f59-gdn47\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") " pod="openstack/barbican-worker-85fc6f9f59-gdn47" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.778941 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c0931c-8919-4128-97bb-21c5872d5cf0-config-data\") pod \"barbican-worker-85fc6f9f59-gdn47\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") " 
pod="openstack/barbican-worker-85fc6f9f59-gdn47" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.779006 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm594\" (UniqueName: \"kubernetes.io/projected/ceca985e-bec2-42fe-9758-edad828586c2-kube-api-access-nm594\") pod \"barbican-keystone-listener-6ccf6d66f4-d9wbd\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") " pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.779039 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbfls\" (UniqueName: \"kubernetes.io/projected/81c0931c-8919-4128-97bb-21c5872d5cf0-kube-api-access-jbfls\") pod \"barbican-worker-85fc6f9f59-gdn47\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") " pod="openstack/barbican-worker-85fc6f9f59-gdn47" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.779161 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceca985e-bec2-42fe-9758-edad828586c2-config-data\") pod \"barbican-keystone-listener-6ccf6d66f4-d9wbd\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") " pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.779178 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ceca985e-bec2-42fe-9758-edad828586c2-config-data-custom\") pod \"barbican-keystone-listener-6ccf6d66f4-d9wbd\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") " pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.779222 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceca985e-bec2-42fe-9758-edad828586c2-combined-ca-bundle\") pod \"barbican-keystone-listener-6ccf6d66f4-d9wbd\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") " pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.792552 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-85fc6f9f59-gdn47"] Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.880416 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="376e7f67-87de-4bab-91d1-5624ee49a979" path="/var/lib/kubelet/pods/376e7f67-87de-4bab-91d1-5624ee49a979/volumes" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.882739 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd"] Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.884088 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c0931c-8919-4128-97bb-21c5872d5cf0-config-data\") pod \"barbican-worker-85fc6f9f59-gdn47\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") " pod="openstack/barbican-worker-85fc6f9f59-gdn47" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.884170 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm594\" (UniqueName: \"kubernetes.io/projected/ceca985e-bec2-42fe-9758-edad828586c2-kube-api-access-nm594\") pod \"barbican-keystone-listener-6ccf6d66f4-d9wbd\" 
(UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") " pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.884221 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbfls\" (UniqueName: \"kubernetes.io/projected/81c0931c-8919-4128-97bb-21c5872d5cf0-kube-api-access-jbfls\") pod \"barbican-worker-85fc6f9f59-gdn47\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") " pod="openstack/barbican-worker-85fc6f9f59-gdn47" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.884338 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ceca985e-bec2-42fe-9758-edad828586c2-config-data-custom\") pod \"barbican-keystone-listener-6ccf6d66f4-d9wbd\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") " pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.884369 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceca985e-bec2-42fe-9758-edad828586c2-config-data\") pod \"barbican-keystone-listener-6ccf6d66f4-d9wbd\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") " pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.884428 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceca985e-bec2-42fe-9758-edad828586c2-combined-ca-bundle\") pod \"barbican-keystone-listener-6ccf6d66f4-d9wbd\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") " pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.884458 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81c0931c-8919-4128-97bb-21c5872d5cf0-config-data-custom\") pod \"barbican-worker-85fc6f9f59-gdn47\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") " pod="openstack/barbican-worker-85fc6f9f59-gdn47" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.884504 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c0931c-8919-4128-97bb-21c5872d5cf0-combined-ca-bundle\") pod \"barbican-worker-85fc6f9f59-gdn47\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") " pod="openstack/barbican-worker-85fc6f9f59-gdn47" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.884548 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ceca985e-bec2-42fe-9758-edad828586c2-logs\") pod \"barbican-keystone-listener-6ccf6d66f4-d9wbd\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") " pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.884697 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81c0931c-8919-4128-97bb-21c5872d5cf0-logs\") pod \"barbican-worker-85fc6f9f59-gdn47\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") " pod="openstack/barbican-worker-85fc6f9f59-gdn47" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.896115 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ceca985e-bec2-42fe-9758-edad828586c2-logs\") pod \"barbican-keystone-listener-6ccf6d66f4-d9wbd\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") " pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.896645 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81c0931c-8919-4128-97bb-21c5872d5cf0-logs\") pod \"barbican-worker-85fc6f9f59-gdn47\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") " pod="openstack/barbican-worker-85fc6f9f59-gdn47" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.941336 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c0931c-8919-4128-97bb-21c5872d5cf0-config-data\") pod \"barbican-worker-85fc6f9f59-gdn47\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") " pod="openstack/barbican-worker-85fc6f9f59-gdn47" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.945213 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81c0931c-8919-4128-97bb-21c5872d5cf0-config-data-custom\") pod \"barbican-worker-85fc6f9f59-gdn47\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") " pod="openstack/barbican-worker-85fc6f9f59-gdn47" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.946224 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceca985e-bec2-42fe-9758-edad828586c2-config-data\") pod \"barbican-keystone-listener-6ccf6d66f4-d9wbd\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") " pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.953729 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceca985e-bec2-42fe-9758-edad828586c2-combined-ca-bundle\") pod \"barbican-keystone-listener-6ccf6d66f4-d9wbd\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") " pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.956560 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ceca985e-bec2-42fe-9758-edad828586c2-config-data-custom\") pod \"barbican-keystone-listener-6ccf6d66f4-d9wbd\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") " pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.960247 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c0931c-8919-4128-97bb-21c5872d5cf0-combined-ca-bundle\") pod \"barbican-worker-85fc6f9f59-gdn47\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") " pod="openstack/barbican-worker-85fc6f9f59-gdn47" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.960446 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbfls\" (UniqueName: \"kubernetes.io/projected/81c0931c-8919-4128-97bb-21c5872d5cf0-kube-api-access-jbfls\") pod \"barbican-worker-85fc6f9f59-gdn47\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") " pod="openstack/barbican-worker-85fc6f9f59-gdn47" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.980404 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nm594\" (UniqueName: \"kubernetes.io/projected/ceca985e-bec2-42fe-9758-edad828586c2-kube-api-access-nm594\") pod \"barbican-keystone-listener-6ccf6d66f4-d9wbd\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") " pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" Dec 01 10:23:01 crc kubenswrapper[4958]: I1201 10:23:01.992211 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8kpf2"] Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.021011 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.024285 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-85fc6f9f59-gdn47" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.100255 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-8kpf2\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.100320 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-config\") pod \"dnsmasq-dns-85ff748b95-8kpf2\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.100385 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-8kpf2\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.100428 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-dns-svc\") pod \"dnsmasq-dns-85ff748b95-8kpf2\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.100545 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-8kpf2\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.100584 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j6dh\" (UniqueName: \"kubernetes.io/projected/2ac54227-1acb-4bbb-8acb-5f7bba007274-kube-api-access-7j6dh\") pod \"dnsmasq-dns-85ff748b95-8kpf2\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.104764 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8kpf2"] Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.124010 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.140223 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7545f98668-s6xmj"] Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.145967 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.146472 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.153086 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.167394 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7545f98668-s6xmj"] Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.202442 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zltb9\" (UniqueName: \"kubernetes.io/projected/c93f0461-fbb7-4446-99e2-d58df0038b16-kube-api-access-zltb9\") pod \"barbican-api-7545f98668-s6xmj\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.202511 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-dns-svc\") pod \"dnsmasq-dns-85ff748b95-8kpf2\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.202580 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c93f0461-fbb7-4446-99e2-d58df0038b16-combined-ca-bundle\") pod \"barbican-api-7545f98668-s6xmj\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.202635 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-8kpf2\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.202672 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j6dh\" (UniqueName: \"kubernetes.io/projected/2ac54227-1acb-4bbb-8acb-5f7bba007274-kube-api-access-7j6dh\") pod \"dnsmasq-dns-85ff748b95-8kpf2\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.202725 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c93f0461-fbb7-4446-99e2-d58df0038b16-config-data-custom\") pod \"barbican-api-7545f98668-s6xmj\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.202759 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c93f0461-fbb7-4446-99e2-d58df0038b16-config-data\") pod \"barbican-api-7545f98668-s6xmj\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.202781 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-8kpf2\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.202808 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-config\") pod \"dnsmasq-dns-85ff748b95-8kpf2\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.202825 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c93f0461-fbb7-4446-99e2-d58df0038b16-logs\") pod \"barbican-api-7545f98668-s6xmj\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.202905 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-8kpf2\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.204126 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-8kpf2\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.205076 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-dns-svc\") pod \"dnsmasq-dns-85ff748b95-8kpf2\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.206617 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-8kpf2\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.207347 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-8kpf2\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.207480 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-config\") pod \"dnsmasq-dns-85ff748b95-8kpf2\" 
(UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.233235 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j6dh\" (UniqueName: \"kubernetes.io/projected/2ac54227-1acb-4bbb-8acb-5f7bba007274-kube-api-access-7j6dh\") pod \"dnsmasq-dns-85ff748b95-8kpf2\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.312547 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c93f0461-fbb7-4446-99e2-d58df0038b16-combined-ca-bundle\") pod \"barbican-api-7545f98668-s6xmj\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.313261 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c93f0461-fbb7-4446-99e2-d58df0038b16-config-data-custom\") pod \"barbican-api-7545f98668-s6xmj\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.314045 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c93f0461-fbb7-4446-99e2-d58df0038b16-config-data\") pod \"barbican-api-7545f98668-s6xmj\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.314132 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c93f0461-fbb7-4446-99e2-d58df0038b16-logs\") pod \"barbican-api-7545f98668-s6xmj\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.314219 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zltb9\" (UniqueName: \"kubernetes.io/projected/c93f0461-fbb7-4446-99e2-d58df0038b16-kube-api-access-zltb9\") pod \"barbican-api-7545f98668-s6xmj\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.315351 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c93f0461-fbb7-4446-99e2-d58df0038b16-logs\") pod \"barbican-api-7545f98668-s6xmj\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.323065 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c93f0461-fbb7-4446-99e2-d58df0038b16-combined-ca-bundle\") pod \"barbican-api-7545f98668-s6xmj\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.337726 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zltb9\" (UniqueName: \"kubernetes.io/projected/c93f0461-fbb7-4446-99e2-d58df0038b16-kube-api-access-zltb9\") pod \"barbican-api-7545f98668-s6xmj\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " 
pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.341726 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c93f0461-fbb7-4446-99e2-d58df0038b16-config-data\") pod \"barbican-api-7545f98668-s6xmj\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.344438 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c93f0461-fbb7-4446-99e2-d58df0038b16-config-data-custom\") pod \"barbican-api-7545f98668-s6xmj\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.410419 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36","Type":"ContainerStarted","Data":"3acff3975fa12250f8f9f134e07ffc456f32c94d5176a5472fb6c7b426065fbb"} Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.459644 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.460696 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-768454b56f-84xc8" event={"ID":"305e96f8-0597-4c64-9026-6d6f2aa454d4","Type":"ContainerStarted","Data":"7e4c8d5d756ae1f1f79baf6ab39a0885d96760214ce8bd8adfe3bf8d0265ce4e"} Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.480336 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qk422" event={"ID":"f960833c-3f07-4613-8d69-b7c563b3dd5d","Type":"ContainerStarted","Data":"943e338fab2cd272e835a178a99551fb42c61777d5c489b740c986254d67e962"} Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.506086 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.511126 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-qk422" podStartSLOduration=3.8381531559999997 podStartE2EDuration="44.51109587s" podCreationTimestamp="2025-12-01 10:22:18 +0000 UTC" firstStartedPulling="2025-12-01 10:22:19.750704443 +0000 UTC m=+1387.259493480" lastFinishedPulling="2025-12-01 10:23:00.423647167 +0000 UTC m=+1427.932436194" observedRunningTime="2025-12-01 10:23:02.503560903 +0000 UTC m=+1430.012349940" watchObservedRunningTime="2025-12-01 10:23:02.51109587 +0000 UTC m=+1430.019884907" Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.967061 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8d6888456-hv67t"] Dec 01 10:23:02 crc kubenswrapper[4958]: I1201 10:23:02.979093 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-85fc6f9f59-gdn47"] Dec 01 10:23:03 crc kubenswrapper[4958]: I1201 10:23:03.163015 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5f6877488-9nzb2"] Dec 01 10:23:03 crc kubenswrapper[4958]: I1201 10:23:03.342947 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd"] Dec 01 10:23:03 crc kubenswrapper[4958]: I1201 10:23:03.442930 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8kpf2"] Dec 01 10:23:03 crc kubenswrapper[4958]: I1201 10:23:03.555338 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d6888456-hv67t" event={"ID":"207148af-4b76-49b6-80cc-883ec14bb268","Type":"ContainerStarted","Data":"e8b4102e8e9184f8886030f4534753e5545e5fb8d5a2bf64b552ec540db373a3"} Dec 01 10:23:03 crc kubenswrapper[4958]: I1201 10:23:03.562065 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f6877488-9nzb2" event={"ID":"caf6f35f-9e82-4899-801f-3b5e94189f6d","Type":"ContainerStarted","Data":"b3d19fa61514911c622a80f5db474f7b28764084c4fc66e35c69fb5f73b5a004"} Dec 01 10:23:03 crc kubenswrapper[4958]: W1201 10:23:03.562770 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ac54227_1acb_4bbb_8acb_5f7bba007274.slice/crio-57e54fe237becf76dbbbf8c50414668dedfc7108dac8f4b05d59e6cf79665494 WatchSource:0}: Error finding container 57e54fe237becf76dbbbf8c50414668dedfc7108dac8f4b05d59e6cf79665494: Status 404 returned error can't find the container with id 57e54fe237becf76dbbbf8c50414668dedfc7108dac8f4b05d59e6cf79665494 Dec 01 10:23:03 crc kubenswrapper[4958]: I1201 10:23:03.573458 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c0a0701a-31c0-41fb-ba17-1919483e6248","Type":"ContainerStarted","Data":"229d8f5adb170e7f4f64563c1a6b88f5847b5a0c0a31393efebbb413c3b7d7ea"} Dec 01 10:23:03 crc kubenswrapper[4958]: I1201 10:23:03.593797 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-768454b56f-84xc8" event={"ID":"305e96f8-0597-4c64-9026-6d6f2aa454d4","Type":"ContainerStarted","Data":"b1011c74e73144076e25dd1c4df809bab366b279b9f967a7d28bfcadc81ae769"} Dec 01 10:23:03 crc kubenswrapper[4958]: I1201 10:23:03.594083 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:23:03 crc kubenswrapper[4958]: I1201 10:23:03.599148 
4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7545f98668-s6xmj"] Dec 01 10:23:03 crc kubenswrapper[4958]: I1201 10:23:03.617263 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" event={"ID":"ceca985e-bec2-42fe-9758-edad828586c2","Type":"ContainerStarted","Data":"ee16d5363237e063ef2898c63d232312cd6a180134043c69dc9de164d7e26bee"} Dec 01 10:23:03 crc kubenswrapper[4958]: I1201 10:23:03.632886 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-85fc6f9f59-gdn47" event={"ID":"81c0931c-8919-4128-97bb-21c5872d5cf0","Type":"ContainerStarted","Data":"e01ee9d2f2724ce03bb863b7e906c8b8b943bc2e3e319467052a9568c176065d"} Dec 01 10:23:03 crc kubenswrapper[4958]: I1201 10:23:03.636775 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-768454b56f-84xc8" podStartSLOduration=9.636745218 podStartE2EDuration="9.636745218s" podCreationTimestamp="2025-12-01 10:22:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:23:03.623900409 +0000 UTC m=+1431.132689446" watchObservedRunningTime="2025-12-01 10:23:03.636745218 +0000 UTC m=+1431.145534255" Dec 01 10:23:04 crc kubenswrapper[4958]: I1201 10:23:04.664901 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36","Type":"ContainerStarted","Data":"8b863f0e6c8d8b91a9d5704cb7d22e94530c2769f2f5ebdea63f43036fdbe22a"} Dec 01 10:23:04 crc kubenswrapper[4958]: I1201 10:23:04.673479 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d6888456-hv67t" event={"ID":"207148af-4b76-49b6-80cc-883ec14bb268","Type":"ContainerStarted","Data":"944f885a93b277fd5b446c26c7b9c7db505641ce99d66fc0eee7275265da1587"} Dec 01 10:23:04 crc kubenswrapper[4958]: I1201 10:23:04.673547 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d6888456-hv67t" event={"ID":"207148af-4b76-49b6-80cc-883ec14bb268","Type":"ContainerStarted","Data":"a5061e1758cadcbddcf9d7fd1235664eaefd5fcc99c2a0707ae4e34496238238"} Dec 01 10:23:04 crc kubenswrapper[4958]: I1201 10:23:04.674931 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:04 crc kubenswrapper[4958]: I1201 10:23:04.675220 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:04 crc kubenswrapper[4958]: I1201 10:23:04.683866 4958 generic.go:334] "Generic (PLEG): container finished" podID="2ac54227-1acb-4bbb-8acb-5f7bba007274" containerID="a39b05773206f5d96ff2086f904af274eedb5535fa669c57806f0de6c8dc5836" exitCode=0 Dec 01 10:23:04 crc kubenswrapper[4958]: I1201 10:23:04.683998 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" event={"ID":"2ac54227-1acb-4bbb-8acb-5f7bba007274","Type":"ContainerDied","Data":"a39b05773206f5d96ff2086f904af274eedb5535fa669c57806f0de6c8dc5836"} Dec 01 10:23:04 crc kubenswrapper[4958]: I1201 10:23:04.684035 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" event={"ID":"2ac54227-1acb-4bbb-8acb-5f7bba007274","Type":"ContainerStarted","Data":"57e54fe237becf76dbbbf8c50414668dedfc7108dac8f4b05d59e6cf79665494"} Dec 01 10:23:04 crc kubenswrapper[4958]: I1201 10:23:04.763556 
4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8d6888456-hv67t" podStartSLOduration=3.76352259 podStartE2EDuration="3.76352259s" podCreationTimestamp="2025-12-01 10:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:23:04.753175772 +0000 UTC m=+1432.261964829" watchObservedRunningTime="2025-12-01 10:23:04.76352259 +0000 UTC m=+1432.272311627" Dec 01 10:23:04 crc kubenswrapper[4958]: I1201 10:23:04.766378 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f6877488-9nzb2" event={"ID":"caf6f35f-9e82-4899-801f-3b5e94189f6d","Type":"ContainerStarted","Data":"f2be5b060fb4b1a90fb8d93662e78d565cae6c9dc122f1a9fe9e8cac346ac6b9"} Dec 01 10:23:04 crc kubenswrapper[4958]: I1201 10:23:04.766456 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:04 crc kubenswrapper[4958]: I1201 10:23:04.809256 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c0a0701a-31c0-41fb-ba17-1919483e6248","Type":"ContainerStarted","Data":"33988a8e2a0e68eb0848002243b370adc826f5db90d39158152d4c45fdea722a"} Dec 01 10:23:04 crc kubenswrapper[4958]: I1201 10:23:04.938914 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7545f98668-s6xmj" event={"ID":"c93f0461-fbb7-4446-99e2-d58df0038b16","Type":"ContainerStarted","Data":"2c9baee21cf0dd0a36c2ffd07d4ac94d36e9a976039c6ebfb865c18f327b475c"} Dec 01 10:23:04 crc kubenswrapper[4958]: I1201 10:23:04.939007 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7545f98668-s6xmj" event={"ID":"c93f0461-fbb7-4446-99e2-d58df0038b16","Type":"ContainerStarted","Data":"be0a0d7fb8fd5c9609451f9e8e0853556ed584f83f54c2de8f9b00736f34e31b"} Dec 01 10:23:04 crc kubenswrapper[4958]: I1201 10:23:04.940858 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5f6877488-9nzb2" podStartSLOduration=3.9408063909999997 podStartE2EDuration="3.940806391s" podCreationTimestamp="2025-12-01 10:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:23:04.884733907 +0000 UTC m=+1432.393522954" watchObservedRunningTime="2025-12-01 10:23:04.940806391 +0000 UTC m=+1432.449595428" Dec 01 10:23:05 crc kubenswrapper[4958]: I1201 10:23:05.008764 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=12.008738495 podStartE2EDuration="12.008738495s" podCreationTimestamp="2025-12-01 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:23:04.93279456 +0000 UTC m=+1432.441583617" watchObservedRunningTime="2025-12-01 10:23:05.008738495 +0000 UTC m=+1432.517527532" Dec 01 10:23:05 crc kubenswrapper[4958]: I1201 10:23:05.956073 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7545f98668-s6xmj" event={"ID":"c93f0461-fbb7-4446-99e2-d58df0038b16","Type":"ContainerStarted","Data":"798f3cd60d0490245b6acf53ddd3a10033120f92391a8807099d05badd95931c"} Dec 01 10:23:05 crc kubenswrapper[4958]: I1201 10:23:05.957254 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:05 crc kubenswrapper[4958]: I1201 10:23:05.957281 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:05 crc kubenswrapper[4958]: I1201 10:23:05.959533 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36","Type":"ContainerStarted","Data":"464ae8b776e0c5220b16bba7022f58ec4374bbec97ffa730004a713c7a084c74"} Dec 01 10:23:05 crc kubenswrapper[4958]: I1201 10:23:05.966863 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" event={"ID":"2ac54227-1acb-4bbb-8acb-5f7bba007274","Type":"ContainerStarted","Data":"79477e970a26f7574a64a41b020bf29cf4bf6946415de7a38c9aad3a1c2e7fe2"} Dec 01 10:23:05 crc kubenswrapper[4958]: I1201 10:23:05.968055 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.001475 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7545f98668-s6xmj" podStartSLOduration=5.001436689 podStartE2EDuration="5.001436689s" podCreationTimestamp="2025-12-01 10:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:23:05.98307324 +0000 UTC m=+1433.491862277" watchObservedRunningTime="2025-12-01 10:23:06.001436689 +0000 UTC m=+1433.510225726" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.029604 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=13.029577898 podStartE2EDuration="13.029577898s" podCreationTimestamp="2025-12-01 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:23:06.011393235 +0000 UTC m=+1433.520182272" watchObservedRunningTime="2025-12-01 10:23:06.029577898 +0000 UTC m=+1433.538366935" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.042460 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" podStartSLOduration=5.042432938 podStartE2EDuration="5.042432938s" podCreationTimestamp="2025-12-01 10:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:23:06.033726138 +0000 UTC m=+1433.542515185" watchObservedRunningTime="2025-12-01 10:23:06.042432938 +0000 UTC m=+1433.551221975" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.716554 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-744fbbd578-7c5pc"] Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.722951 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.727783 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.728088 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.752897 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-744fbbd578-7c5pc"] Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.819701 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-config-data-custom\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.819869 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-internal-tls-certs\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.819957 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8szj4\" (UniqueName: \"kubernetes.io/projected/b727a711-6b0b-44c6-917a-602f10dd0d6c-kube-api-access-8szj4\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.820231 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b727a711-6b0b-44c6-917a-602f10dd0d6c-logs\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.820418 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-combined-ca-bundle\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.820558 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-config-data\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.820622 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-public-tls-certs\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.923343 4958 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b727a711-6b0b-44c6-917a-602f10dd0d6c-logs\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.923502 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-combined-ca-bundle\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.923584 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-config-data\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.923626 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-public-tls-certs\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.923749 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-config-data-custom\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.923782 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-internal-tls-certs\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.923811 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8szj4\" (UniqueName: \"kubernetes.io/projected/b727a711-6b0b-44c6-917a-602f10dd0d6c-kube-api-access-8szj4\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.926314 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b727a711-6b0b-44c6-917a-602f10dd0d6c-logs\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.932002 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-combined-ca-bundle\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.933903 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-public-tls-certs\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.934175 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-config-data-custom\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.934430 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-config-data\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.936834 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-internal-tls-certs\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:06 crc kubenswrapper[4958]: I1201 10:23:06.958482 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8szj4\" (UniqueName: \"kubernetes.io/projected/b727a711-6b0b-44c6-917a-602f10dd0d6c-kube-api-access-8szj4\") pod \"barbican-api-744fbbd578-7c5pc\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") " pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:07 crc kubenswrapper[4958]: I1201 10:23:07.068295 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:07 crc kubenswrapper[4958]: I1201 10:23:07.823073 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-744fbbd578-7c5pc"] Dec 01 10:23:07 crc kubenswrapper[4958]: I1201 10:23:07.997301 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" event={"ID":"ceca985e-bec2-42fe-9758-edad828586c2","Type":"ContainerStarted","Data":"beb72f83d33471d431dae1e5fec0a30b928583fb28f59468f8008cfe43fce3bb"} Dec 01 10:23:07 crc kubenswrapper[4958]: I1201 10:23:07.998583 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" event={"ID":"ceca985e-bec2-42fe-9758-edad828586c2","Type":"ContainerStarted","Data":"d6bf9c119f4304f0d9ea18db7018ff4e63d2085fdfbdf9fc47deea0cf585ac96"} Dec 01 10:23:08 crc kubenswrapper[4958]: I1201 10:23:08.000712 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-85fc6f9f59-gdn47" event={"ID":"81c0931c-8919-4128-97bb-21c5872d5cf0","Type":"ContainerStarted","Data":"846a70f76927873aca913fa654d135f5d9f7402003f4d45aa4fb853953fe65db"} Dec 01 10:23:08 crc kubenswrapper[4958]: I1201 10:23:08.000795 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-85fc6f9f59-gdn47" event={"ID":"81c0931c-8919-4128-97bb-21c5872d5cf0","Type":"ContainerStarted","Data":"0e6afa27cac7b16ea18ffb082028d7d4a5d1af380112b0080ebda5f523c8d154"} Dec 01 10:23:08 crc kubenswrapper[4958]: I1201 10:23:08.004796 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-744fbbd578-7c5pc" event={"ID":"b727a711-6b0b-44c6-917a-602f10dd0d6c","Type":"ContainerStarted","Data":"8028a8027b503ff44704b40b230744482e49d1075b14eaa6e9f6e26b94442df3"} Dec 01 10:23:08 crc kubenswrapper[4958]: I1201 10:23:08.045470 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" podStartSLOduration=3.430925479 podStartE2EDuration="7.045442051s" podCreationTimestamp="2025-12-01 10:23:01 +0000 UTC" firstStartedPulling="2025-12-01 10:23:03.580196841 +0000 UTC m=+1431.088985878" lastFinishedPulling="2025-12-01 10:23:07.194713412 +0000 UTC m=+1434.703502450" observedRunningTime="2025-12-01 10:23:08.024794217 +0000 UTC m=+1435.533583264" watchObservedRunningTime="2025-12-01 10:23:08.045442051 +0000 UTC m=+1435.554231088" Dec 01 10:23:08 crc kubenswrapper[4958]: I1201 10:23:08.067133 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-85fc6f9f59-gdn47" podStartSLOduration=3.139353 podStartE2EDuration="7.067103994s" podCreationTimestamp="2025-12-01 10:23:01 +0000 UTC" firstStartedPulling="2025-12-01 10:23:03.269213433 +0000 UTC m=+1430.778002470" lastFinishedPulling="2025-12-01 10:23:07.196964427 +0000 UTC m=+1434.705753464" observedRunningTime="2025-12-01 10:23:08.054874142 +0000 UTC m=+1435.563663189" watchObservedRunningTime="2025-12-01 10:23:08.067103994 +0000 UTC m=+1435.575893031" Dec 01 10:23:09 crc kubenswrapper[4958]: I1201 10:23:09.026290 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-744fbbd578-7c5pc" event={"ID":"b727a711-6b0b-44c6-917a-602f10dd0d6c","Type":"ContainerStarted","Data":"8d7386e6d905cc7e682cbc9fcbfef1b78be385a0409c687cf999a6495aba6952"} Dec 01 10:23:09 crc kubenswrapper[4958]: I1201 10:23:09.026683 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-744fbbd578-7c5pc" event={"ID":"b727a711-6b0b-44c6-917a-602f10dd0d6c","Type":"ContainerStarted","Data":"2c69fba39bcc23fd51b0e44fe893c7a695b06edee1de7171ac930180b17f1ebe"} Dec 01 10:23:09 crc kubenswrapper[4958]: I1201 10:23:09.026729 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:09 crc kubenswrapper[4958]: I1201 10:23:09.026768 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:09 crc kubenswrapper[4958]: I1201 10:23:09.057452 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-744fbbd578-7c5pc" podStartSLOduration=3.057429239 podStartE2EDuration="3.057429239s" podCreationTimestamp="2025-12-01 10:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:23:09.05535633 +0000 UTC m=+1436.564145367" watchObservedRunningTime="2025-12-01 10:23:09.057429239 +0000 UTC m=+1436.566218276" Dec 01 10:23:10 crc kubenswrapper[4958]: I1201 10:23:10.036306 4958 generic.go:334] "Generic (PLEG): container finished" podID="f960833c-3f07-4613-8d69-b7c563b3dd5d" containerID="943e338fab2cd272e835a178a99551fb42c61777d5c489b740c986254d67e962" exitCode=0 Dec 01 10:23:10 crc kubenswrapper[4958]: I1201 10:23:10.037686 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qk422" event={"ID":"f960833c-3f07-4613-8d69-b7c563b3dd5d","Type":"ContainerDied","Data":"943e338fab2cd272e835a178a99551fb42c61777d5c489b740c986254d67e962"} Dec 01 10:23:12 crc kubenswrapper[4958]: I1201 10:23:12.462020 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:12 crc kubenswrapper[4958]: I1201 10:23:12.541710 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-q7fvg"] Dec 01 10:23:12 crc kubenswrapper[4958]: I1201 10:23:12.542855 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" podUID="d0ac5474-7911-4334-9a06-760d22e9d524" containerName="dnsmasq-dns" containerID="cri-o://88c650102fea04cd849b5aec8074db6efd829b1782dfbc53cab75f51c83e96f7" gracePeriod=10 Dec 01 10:23:13 crc kubenswrapper[4958]: I1201 10:23:13.116791 4958 generic.go:334] "Generic (PLEG): container finished" podID="d0ac5474-7911-4334-9a06-760d22e9d524" containerID="88c650102fea04cd849b5aec8074db6efd829b1782dfbc53cab75f51c83e96f7" exitCode=0 Dec 01 10:23:13 crc kubenswrapper[4958]: I1201 10:23:13.116892 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" event={"ID":"d0ac5474-7911-4334-9a06-760d22e9d524","Type":"ContainerDied","Data":"88c650102fea04cd849b5aec8074db6efd829b1782dfbc53cab75f51c83e96f7"} Dec 01 10:23:13 crc kubenswrapper[4958]: I1201 10:23:13.734862 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 10:23:13 crc kubenswrapper[4958]: I1201 10:23:13.736168 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 10:23:13 crc kubenswrapper[4958]: I1201 10:23:13.784000 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 10:23:13 crc kubenswrapper[4958]: 
I1201 10:23:13.786752 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 10:23:14 crc kubenswrapper[4958]: I1201 10:23:14.057168 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 10:23:14 crc kubenswrapper[4958]: I1201 10:23:14.057255 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 10:23:14 crc kubenswrapper[4958]: I1201 10:23:14.107974 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 10:23:14 crc kubenswrapper[4958]: I1201 10:23:14.121379 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 10:23:14 crc kubenswrapper[4958]: I1201 10:23:14.130091 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 10:23:14 crc kubenswrapper[4958]: I1201 10:23:14.130156 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 10:23:14 crc kubenswrapper[4958]: I1201 10:23:14.130171 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 10:23:14 crc kubenswrapper[4958]: I1201 10:23:14.130186 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 10:23:14 crc kubenswrapper[4958]: I1201 10:23:14.458013 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" podUID="d0ac5474-7911-4334-9a06-760d22e9d524" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: connect: connection refused" Dec 01 10:23:15 crc kubenswrapper[4958]: I1201 10:23:15.097465 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:15 crc kubenswrapper[4958]: I1201 10:23:15.187886 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:16 crc kubenswrapper[4958]: I1201 10:23:16.159051 4958 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:23:16 crc kubenswrapper[4958]: I1201 10:23:16.159092 4958 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.063947 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qk422" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.216334 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qk422" event={"ID":"f960833c-3f07-4613-8d69-b7c563b3dd5d","Type":"ContainerDied","Data":"e375a610c469db4dff9fe7d098ca72f4e065b4ddc97556d46fb8427ea8e02343"} Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.216783 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e375a610c469db4dff9fe7d098ca72f4e065b4ddc97556d46fb8427ea8e02343" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.216902 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-qk422" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.255407 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-scripts\") pod \"f960833c-3f07-4613-8d69-b7c563b3dd5d\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.255550 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f960833c-3f07-4613-8d69-b7c563b3dd5d-etc-machine-id\") pod \"f960833c-3f07-4613-8d69-b7c563b3dd5d\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.255602 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-config-data\") pod \"f960833c-3f07-4613-8d69-b7c563b3dd5d\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.255661 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-db-sync-config-data\") pod \"f960833c-3f07-4613-8d69-b7c563b3dd5d\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.255740 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-combined-ca-bundle\") pod \"f960833c-3f07-4613-8d69-b7c563b3dd5d\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.255800 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78nml\" (UniqueName: \"kubernetes.io/projected/f960833c-3f07-4613-8d69-b7c563b3dd5d-kube-api-access-78nml\") pod \"f960833c-3f07-4613-8d69-b7c563b3dd5d\" (UID: \"f960833c-3f07-4613-8d69-b7c563b3dd5d\") " Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.270021 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f960833c-3f07-4613-8d69-b7c563b3dd5d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f960833c-3f07-4613-8d69-b7c563b3dd5d" (UID: "f960833c-3f07-4613-8d69-b7c563b3dd5d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.279116 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f960833c-3f07-4613-8d69-b7c563b3dd5d-kube-api-access-78nml" (OuterVolumeSpecName: "kube-api-access-78nml") pod "f960833c-3f07-4613-8d69-b7c563b3dd5d" (UID: "f960833c-3f07-4613-8d69-b7c563b3dd5d"). InnerVolumeSpecName "kube-api-access-78nml". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.281445 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-scripts" (OuterVolumeSpecName: "scripts") pod "f960833c-3f07-4613-8d69-b7c563b3dd5d" (UID: "f960833c-3f07-4613-8d69-b7c563b3dd5d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.295267 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f960833c-3f07-4613-8d69-b7c563b3dd5d" (UID: "f960833c-3f07-4613-8d69-b7c563b3dd5d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.364172 4958 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.364232 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78nml\" (UniqueName: \"kubernetes.io/projected/f960833c-3f07-4613-8d69-b7c563b3dd5d-kube-api-access-78nml\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.364252 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.364261 4958 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f960833c-3f07-4613-8d69-b7c563b3dd5d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.372547 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.419021 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f960833c-3f07-4613-8d69-b7c563b3dd5d" (UID: "f960833c-3f07-4613-8d69-b7c563b3dd5d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.432240 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.432472 4958 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:23:17 crc kubenswrapper[4958]: E1201 10:23:17.435963 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="62be9aa8-5618-470f-990a-448f46a926cf" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.466582 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-dns-swift-storage-0\") pod \"d0ac5474-7911-4334-9a06-760d22e9d524\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.467244 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-przrv\" (UniqueName: \"kubernetes.io/projected/d0ac5474-7911-4334-9a06-760d22e9d524-kube-api-access-przrv\") pod \"d0ac5474-7911-4334-9a06-760d22e9d524\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.467340 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-dns-svc\") pod \"d0ac5474-7911-4334-9a06-760d22e9d524\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.467556 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-config\") pod \"d0ac5474-7911-4334-9a06-760d22e9d524\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.467749 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-ovsdbserver-nb\") pod \"d0ac5474-7911-4334-9a06-760d22e9d524\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.467810 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-ovsdbserver-sb\") pod \"d0ac5474-7911-4334-9a06-760d22e9d524\" (UID: \"d0ac5474-7911-4334-9a06-760d22e9d524\") " Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.468653 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.493258 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0ac5474-7911-4334-9a06-760d22e9d524-kube-api-access-przrv" (OuterVolumeSpecName: "kube-api-access-przrv") pod "d0ac5474-7911-4334-9a06-760d22e9d524" (UID: "d0ac5474-7911-4334-9a06-760d22e9d524"). InnerVolumeSpecName "kube-api-access-przrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.512086 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-config-data" (OuterVolumeSpecName: "config-data") pod "f960833c-3f07-4613-8d69-b7c563b3dd5d" (UID: "f960833c-3f07-4613-8d69-b7c563b3dd5d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.551779 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d0ac5474-7911-4334-9a06-760d22e9d524" (UID: "d0ac5474-7911-4334-9a06-760d22e9d524"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.568493 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d0ac5474-7911-4334-9a06-760d22e9d524" (UID: "d0ac5474-7911-4334-9a06-760d22e9d524"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.574061 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-przrv\" (UniqueName: \"kubernetes.io/projected/d0ac5474-7911-4334-9a06-760d22e9d524-kube-api-access-przrv\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.574110 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.574126 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f960833c-3f07-4613-8d69-b7c563b3dd5d-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.574140 4958 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.581612 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-config" (OuterVolumeSpecName: "config") pod "d0ac5474-7911-4334-9a06-760d22e9d524" (UID: "d0ac5474-7911-4334-9a06-760d22e9d524"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.582894 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d0ac5474-7911-4334-9a06-760d22e9d524" (UID: "d0ac5474-7911-4334-9a06-760d22e9d524"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.592519 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d0ac5474-7911-4334-9a06-760d22e9d524" (UID: "d0ac5474-7911-4334-9a06-760d22e9d524"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.676285 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.676327 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.676344 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0ac5474-7911-4334-9a06-760d22e9d524-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.707682 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 10:23:17 crc kubenswrapper[4958]: I1201 10:23:17.707809 4958 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.002705 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.051300 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.234163 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62be9aa8-5618-470f-990a-448f46a926cf","Type":"ContainerStarted","Data":"0be5da95d523d4688f97c6c6d80b61af66811292e61f7ed946307a05183db543"} Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.234314 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="62be9aa8-5618-470f-990a-448f46a926cf" containerName="ceilometer-notification-agent" containerID="cri-o://00ef342e3a064c635a6542e8f1a996239ceaac9038dc1cf0ab933766181e165d" gracePeriod=30 Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.234428 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.234469 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="62be9aa8-5618-470f-990a-448f46a926cf" containerName="sg-core" containerID="cri-o://b2370710dbb51466cc749b9978dd4f16bc4f2f3a1e9e466c2fa25742b461d275" gracePeriod=30 Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.234431 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="62be9aa8-5618-470f-990a-448f46a926cf" containerName="proxy-httpd" containerID="cri-o://0be5da95d523d4688f97c6c6d80b61af66811292e61f7ed946307a05183db543" gracePeriod=30 Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.254122 4958 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.254615 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-q7fvg" event={"ID":"d0ac5474-7911-4334-9a06-760d22e9d524","Type":"ContainerDied","Data":"7ba77042a730599f93e4bce1f5c9db6158e3a0737e9e08ac822d935a0a8436ff"} Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.254655 4958 scope.go:117] "RemoveContainer" containerID="88c650102fea04cd849b5aec8074db6efd829b1782dfbc53cab75f51c83e96f7" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.304136 4958 scope.go:117] "RemoveContainer" containerID="b8c6d02f07e1ac6fb7fb78825570ffb5d59f66d76818bc92450f9f78d930a90a" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.326930 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-q7fvg"] Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.364574 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-q7fvg"] Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.546075 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:23:18 crc kubenswrapper[4958]: E1201 10:23:18.548690 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f960833c-3f07-4613-8d69-b7c563b3dd5d" containerName="cinder-db-sync" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.548715 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f960833c-3f07-4613-8d69-b7c563b3dd5d" containerName="cinder-db-sync" Dec 01 10:23:18 crc kubenswrapper[4958]: E1201 10:23:18.548754 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0ac5474-7911-4334-9a06-760d22e9d524" containerName="init" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.548765 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0ac5474-7911-4334-9a06-760d22e9d524" containerName="init" Dec 01 10:23:18 crc kubenswrapper[4958]: E1201 10:23:18.548784 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0ac5474-7911-4334-9a06-760d22e9d524" containerName="dnsmasq-dns" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.548793 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0ac5474-7911-4334-9a06-760d22e9d524" containerName="dnsmasq-dns" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.567562 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f960833c-3f07-4613-8d69-b7c563b3dd5d" containerName="cinder-db-sync" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.567645 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0ac5474-7911-4334-9a06-760d22e9d524" containerName="dnsmasq-dns" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.570272 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.576396 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.576713 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.579476 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.596352 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-bbpvk" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.683197 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.723147 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-v4zqt"] Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.725553 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.743239 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.743303 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-scripts\") pod \"cinder-scheduler-0\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.743334 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-config-data\") pod \"cinder-scheduler-0\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.743360 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.743416 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.743449 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnmds\" (UniqueName: \"kubernetes.io/projected/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-kube-api-access-gnmds\") pod \"cinder-scheduler-0\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " pod="openstack/cinder-scheduler-0" 
Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.811968 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-v4zqt"] Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.832408 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.835988 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.846439 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.846543 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-scripts\") pod \"cinder-scheduler-0\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.846580 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-config\") pod \"dnsmasq-dns-5c9776ccc5-v4zqt\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.846612 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-config-data\") pod \"cinder-scheduler-0\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.846675 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.846721 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-v4zqt\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.846766 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.846813 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnmds\" (UniqueName: \"kubernetes.io/projected/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-kube-api-access-gnmds\") pod \"cinder-scheduler-0\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.846835 
4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-v4zqt\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.846884 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxjb7\" (UniqueName: \"kubernetes.io/projected/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-kube-api-access-nxjb7\") pod \"dnsmasq-dns-5c9776ccc5-v4zqt\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.846925 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-v4zqt\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.847033 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-v4zqt\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.849243 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.853053 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.858890 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-scripts\") pod \"cinder-scheduler-0\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.860591 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.869186 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.880459 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-config-data\") pod \"cinder-scheduler-0\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc 
kubenswrapper[4958]: I1201 10:23:18.884556 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnmds\" (UniqueName: \"kubernetes.io/projected/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-kube-api-access-gnmds\") pod \"cinder-scheduler-0\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.902988 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.941180 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.948655 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a82fc053-afd2-45aa-86ef-fa58c3e849c2-logs\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.948732 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-v4zqt\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.948753 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxjb7\" (UniqueName: \"kubernetes.io/projected/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-kube-api-access-nxjb7\") pod \"dnsmasq-dns-5c9776ccc5-v4zqt\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.948792 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-config-data-custom\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.948813 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-v4zqt\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.948877 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-v4zqt\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.948914 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.948970 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7bjz\" 
(UniqueName: \"kubernetes.io/projected/a82fc053-afd2-45aa-86ef-fa58c3e849c2-kube-api-access-l7bjz\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.948993 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-scripts\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.949015 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-config\") pod \"dnsmasq-dns-5c9776ccc5-v4zqt\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.949078 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-v4zqt\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.949125 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-config-data\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.949142 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a82fc053-afd2-45aa-86ef-fa58c3e849c2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.951132 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-v4zqt\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.952476 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-v4zqt\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.953116 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-v4zqt\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.954211 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-config\") pod \"dnsmasq-dns-5c9776ccc5-v4zqt\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.959027 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-v4zqt\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:18 crc kubenswrapper[4958]: I1201 10:23:18.971763 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxjb7\" (UniqueName: \"kubernetes.io/projected/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-kube-api-access-nxjb7\") pod \"dnsmasq-dns-5c9776ccc5-v4zqt\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.051675 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-config-data-custom\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.052280 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.052344 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7bjz\" (UniqueName: \"kubernetes.io/projected/a82fc053-afd2-45aa-86ef-fa58c3e849c2-kube-api-access-l7bjz\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.052384 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-scripts\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.052490 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-config-data\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.052518 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a82fc053-afd2-45aa-86ef-fa58c3e849c2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.052564 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a82fc053-afd2-45aa-86ef-fa58c3e849c2-logs\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.053459 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a82fc053-afd2-45aa-86ef-fa58c3e849c2-logs\") pod \"cinder-api-0\" (UID: 
\"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.053548 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a82fc053-afd2-45aa-86ef-fa58c3e849c2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.060446 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-config-data-custom\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.064189 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.080568 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-scripts\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.090754 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.093376 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-config-data\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.106720 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7bjz\" (UniqueName: \"kubernetes.io/projected/a82fc053-afd2-45aa-86ef-fa58c3e849c2-kube-api-access-l7bjz\") pod \"cinder-api-0\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " pod="openstack/cinder-api-0" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.118263 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.276208 4958 generic.go:334] "Generic (PLEG): container finished" podID="62be9aa8-5618-470f-990a-448f46a926cf" containerID="0be5da95d523d4688f97c6c6d80b61af66811292e61f7ed946307a05183db543" exitCode=0 Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.276265 4958 generic.go:334] "Generic (PLEG): container finished" podID="62be9aa8-5618-470f-990a-448f46a926cf" containerID="b2370710dbb51466cc749b9978dd4f16bc4f2f3a1e9e466c2fa25742b461d275" exitCode=2 Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.276328 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62be9aa8-5618-470f-990a-448f46a926cf","Type":"ContainerDied","Data":"0be5da95d523d4688f97c6c6d80b61af66811292e61f7ed946307a05183db543"} Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.276373 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62be9aa8-5618-470f-990a-448f46a926cf","Type":"ContainerDied","Data":"b2370710dbb51466cc749b9978dd4f16bc4f2f3a1e9e466c2fa25742b461d275"} Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.506510 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.732139 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.822825 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0ac5474-7911-4334-9a06-760d22e9d524" path="/var/lib/kubelet/pods/d0ac5474-7911-4334-9a06-760d22e9d524/volumes" Dec 01 10:23:19 crc kubenswrapper[4958]: I1201 10:23:19.976922 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:23:20 crc kubenswrapper[4958]: W1201 10:23:20.028504 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda82fc053_afd2_45aa_86ef_fa58c3e849c2.slice/crio-1bbbec250221cd8f5d3151e81bde35ff4927f6af5cbc3aa0a402d98ce808e890 WatchSource:0}: Error finding container 1bbbec250221cd8f5d3151e81bde35ff4927f6af5cbc3aa0a402d98ce808e890: Status 404 returned error can't find the container with id 1bbbec250221cd8f5d3151e81bde35ff4927f6af5cbc3aa0a402d98ce808e890 Dec 01 10:23:20 crc kubenswrapper[4958]: I1201 10:23:20.065485 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-v4zqt"] Dec 01 10:23:20 crc kubenswrapper[4958]: I1201 10:23:20.206651 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:20 crc kubenswrapper[4958]: I1201 10:23:20.324006 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3","Type":"ContainerStarted","Data":"279ee34b8262d8e61ed7519ea3a4e2251082f941fab4186b62278704f17b64f1"} Dec 01 10:23:20 crc kubenswrapper[4958]: I1201 10:23:20.361320 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" event={"ID":"7e2e8cef-b53d-4690-8d80-0a05cc2f3905","Type":"ContainerStarted","Data":"0da1c7417112852f1973f939b9d47f642ec869749bae54733c95ea7d761fc752"} Dec 01 10:23:20 crc kubenswrapper[4958]: I1201 10:23:20.392749 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"a82fc053-afd2-45aa-86ef-fa58c3e849c2","Type":"ContainerStarted","Data":"1bbbec250221cd8f5d3151e81bde35ff4927f6af5cbc3aa0a402d98ce808e890"} Dec 01 10:23:20 crc kubenswrapper[4958]: I1201 10:23:20.755415 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:23:20 crc kubenswrapper[4958]: I1201 10:23:20.841096 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7545f98668-s6xmj"] Dec 01 10:23:20 crc kubenswrapper[4958]: I1201 10:23:20.841536 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7545f98668-s6xmj" podUID="c93f0461-fbb7-4446-99e2-d58df0038b16" containerName="barbican-api-log" containerID="cri-o://2c9baee21cf0dd0a36c2ffd07d4ac94d36e9a976039c6ebfb865c18f327b475c" gracePeriod=30 Dec 01 10:23:20 crc kubenswrapper[4958]: I1201 10:23:20.841959 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7545f98668-s6xmj" podUID="c93f0461-fbb7-4446-99e2-d58df0038b16" containerName="barbican-api" containerID="cri-o://798f3cd60d0490245b6acf53ddd3a10033120f92391a8807099d05badd95931c" gracePeriod=30 Dec 01 10:23:20 crc kubenswrapper[4958]: I1201 10:23:20.862939 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7545f98668-s6xmj" podUID="c93f0461-fbb7-4446-99e2-d58df0038b16" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": EOF" Dec 01 10:23:21 crc kubenswrapper[4958]: I1201 10:23:21.442328 4958 generic.go:334] "Generic (PLEG): container finished" podID="7e2e8cef-b53d-4690-8d80-0a05cc2f3905" containerID="69ca25c6ecb718cd720794af4ec2982911438faa56d74bf694f4173e0735c7c4" exitCode=0 Dec 01 10:23:21 crc kubenswrapper[4958]: I1201 10:23:21.442969 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" event={"ID":"7e2e8cef-b53d-4690-8d80-0a05cc2f3905","Type":"ContainerDied","Data":"69ca25c6ecb718cd720794af4ec2982911438faa56d74bf694f4173e0735c7c4"} Dec 01 10:23:21 crc kubenswrapper[4958]: I1201 10:23:21.461295 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a82fc053-afd2-45aa-86ef-fa58c3e849c2","Type":"ContainerStarted","Data":"04dddd8be1d48a32405350e6c2a53a9ba65f9c8a0fd42887cea154b25911963c"} Dec 01 10:23:21 crc kubenswrapper[4958]: I1201 10:23:21.487314 4958 generic.go:334] "Generic (PLEG): container finished" podID="c93f0461-fbb7-4446-99e2-d58df0038b16" containerID="2c9baee21cf0dd0a36c2ffd07d4ac94d36e9a976039c6ebfb865c18f327b475c" exitCode=143 Dec 01 10:23:21 crc kubenswrapper[4958]: I1201 10:23:21.487401 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7545f98668-s6xmj" event={"ID":"c93f0461-fbb7-4446-99e2-d58df0038b16","Type":"ContainerDied","Data":"2c9baee21cf0dd0a36c2ffd07d4ac94d36e9a976039c6ebfb865c18f327b475c"} Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.335882 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.486821 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62be9aa8-5618-470f-990a-448f46a926cf-log-httpd\") pod \"62be9aa8-5618-470f-990a-448f46a926cf\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.486995 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrnzm\" (UniqueName: \"kubernetes.io/projected/62be9aa8-5618-470f-990a-448f46a926cf-kube-api-access-zrnzm\") pod \"62be9aa8-5618-470f-990a-448f46a926cf\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.487087 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-combined-ca-bundle\") pod \"62be9aa8-5618-470f-990a-448f46a926cf\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.487113 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62be9aa8-5618-470f-990a-448f46a926cf-run-httpd\") pod \"62be9aa8-5618-470f-990a-448f46a926cf\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.487192 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-sg-core-conf-yaml\") pod \"62be9aa8-5618-470f-990a-448f46a926cf\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.487254 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-config-data\") pod \"62be9aa8-5618-470f-990a-448f46a926cf\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.487389 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-scripts\") pod \"62be9aa8-5618-470f-990a-448f46a926cf\" (UID: \"62be9aa8-5618-470f-990a-448f46a926cf\") " Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.493222 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62be9aa8-5618-470f-990a-448f46a926cf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "62be9aa8-5618-470f-990a-448f46a926cf" (UID: "62be9aa8-5618-470f-990a-448f46a926cf"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.493726 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62be9aa8-5618-470f-990a-448f46a926cf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "62be9aa8-5618-470f-990a-448f46a926cf" (UID: "62be9aa8-5618-470f-990a-448f46a926cf"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.506615 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62be9aa8-5618-470f-990a-448f46a926cf-kube-api-access-zrnzm" (OuterVolumeSpecName: "kube-api-access-zrnzm") pod "62be9aa8-5618-470f-990a-448f46a926cf" (UID: "62be9aa8-5618-470f-990a-448f46a926cf"). InnerVolumeSpecName "kube-api-access-zrnzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.506779 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-scripts" (OuterVolumeSpecName: "scripts") pod "62be9aa8-5618-470f-990a-448f46a926cf" (UID: "62be9aa8-5618-470f-990a-448f46a926cf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.539832 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.551681 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" event={"ID":"7e2e8cef-b53d-4690-8d80-0a05cc2f3905","Type":"ContainerStarted","Data":"9cd77135c9dae4d1d8605178d1a6e703cb37d33184ff45251860dbf6725f2a91"} Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.552974 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.558251 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a82fc053-afd2-45aa-86ef-fa58c3e849c2","Type":"ContainerStarted","Data":"6d13f2d2b39d1f89a13a2bd14ab7e478d9d00f867bf9ec26c2ae51e071a52d63"} Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.558915 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.567179 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "62be9aa8-5618-470f-990a-448f46a926cf" (UID: "62be9aa8-5618-470f-990a-448f46a926cf"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.570580 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3","Type":"ContainerStarted","Data":"e4cedd3f84ca670706b3ada0f0b72ef56be1156fed7f4d24577329ae7796f8e8"} Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.587716 4958 generic.go:334] "Generic (PLEG): container finished" podID="62be9aa8-5618-470f-990a-448f46a926cf" containerID="00ef342e3a064c635a6542e8f1a996239ceaac9038dc1cf0ab933766181e165d" exitCode=0 Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.587794 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62be9aa8-5618-470f-990a-448f46a926cf","Type":"ContainerDied","Data":"00ef342e3a064c635a6542e8f1a996239ceaac9038dc1cf0ab933766181e165d"} Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.587861 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62be9aa8-5618-470f-990a-448f46a926cf","Type":"ContainerDied","Data":"bc77f0f8dd60f6d0fe868e7ee78f7db2c867f12ba469932074ce6643205e195e"} Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.587893 4958 scope.go:117] "RemoveContainer" containerID="0be5da95d523d4688f97c6c6d80b61af66811292e61f7ed946307a05183db543" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.587888 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.589945 4958 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62be9aa8-5618-470f-990a-448f46a926cf-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.589978 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrnzm\" (UniqueName: \"kubernetes.io/projected/62be9aa8-5618-470f-990a-448f46a926cf-kube-api-access-zrnzm\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.589992 4958 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62be9aa8-5618-470f-990a-448f46a926cf-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.590003 4958 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.590014 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.597029 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" podStartSLOduration=4.597000027 podStartE2EDuration="4.597000027s" podCreationTimestamp="2025-12-01 10:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:23:22.586217066 +0000 UTC m=+1450.095006103" watchObservedRunningTime="2025-12-01 10:23:22.597000027 +0000 UTC m=+1450.105789064" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.646638 4958 scope.go:117] "RemoveContainer" 
containerID="b2370710dbb51466cc749b9978dd4f16bc4f2f3a1e9e466c2fa25742b461d275" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.662966 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.662939184 podStartE2EDuration="4.662939184s" podCreationTimestamp="2025-12-01 10:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:23:22.630697636 +0000 UTC m=+1450.139486683" watchObservedRunningTime="2025-12-01 10:23:22.662939184 +0000 UTC m=+1450.171728221" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.666760 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-config-data" (OuterVolumeSpecName: "config-data") pod "62be9aa8-5618-470f-990a-448f46a926cf" (UID: "62be9aa8-5618-470f-990a-448f46a926cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.668307 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62be9aa8-5618-470f-990a-448f46a926cf" (UID: "62be9aa8-5618-470f-990a-448f46a926cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.693672 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.693726 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62be9aa8-5618-470f-990a-448f46a926cf-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.706674 4958 scope.go:117] "RemoveContainer" containerID="00ef342e3a064c635a6542e8f1a996239ceaac9038dc1cf0ab933766181e165d" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.739018 4958 scope.go:117] "RemoveContainer" containerID="0be5da95d523d4688f97c6c6d80b61af66811292e61f7ed946307a05183db543" Dec 01 10:23:22 crc kubenswrapper[4958]: E1201 10:23:22.741070 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0be5da95d523d4688f97c6c6d80b61af66811292e61f7ed946307a05183db543\": container with ID starting with 0be5da95d523d4688f97c6c6d80b61af66811292e61f7ed946307a05183db543 not found: ID does not exist" containerID="0be5da95d523d4688f97c6c6d80b61af66811292e61f7ed946307a05183db543" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.741147 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0be5da95d523d4688f97c6c6d80b61af66811292e61f7ed946307a05183db543"} err="failed to get container status \"0be5da95d523d4688f97c6c6d80b61af66811292e61f7ed946307a05183db543\": rpc error: code = NotFound desc = could not find container \"0be5da95d523d4688f97c6c6d80b61af66811292e61f7ed946307a05183db543\": container with ID starting with 0be5da95d523d4688f97c6c6d80b61af66811292e61f7ed946307a05183db543 not found: ID does not exist" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.741201 4958 scope.go:117] 
"RemoveContainer" containerID="b2370710dbb51466cc749b9978dd4f16bc4f2f3a1e9e466c2fa25742b461d275" Dec 01 10:23:22 crc kubenswrapper[4958]: E1201 10:23:22.747165 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2370710dbb51466cc749b9978dd4f16bc4f2f3a1e9e466c2fa25742b461d275\": container with ID starting with b2370710dbb51466cc749b9978dd4f16bc4f2f3a1e9e466c2fa25742b461d275 not found: ID does not exist" containerID="b2370710dbb51466cc749b9978dd4f16bc4f2f3a1e9e466c2fa25742b461d275" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.747221 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2370710dbb51466cc749b9978dd4f16bc4f2f3a1e9e466c2fa25742b461d275"} err="failed to get container status \"b2370710dbb51466cc749b9978dd4f16bc4f2f3a1e9e466c2fa25742b461d275\": rpc error: code = NotFound desc = could not find container \"b2370710dbb51466cc749b9978dd4f16bc4f2f3a1e9e466c2fa25742b461d275\": container with ID starting with b2370710dbb51466cc749b9978dd4f16bc4f2f3a1e9e466c2fa25742b461d275 not found: ID does not exist" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.747256 4958 scope.go:117] "RemoveContainer" containerID="00ef342e3a064c635a6542e8f1a996239ceaac9038dc1cf0ab933766181e165d" Dec 01 10:23:22 crc kubenswrapper[4958]: E1201 10:23:22.763878 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00ef342e3a064c635a6542e8f1a996239ceaac9038dc1cf0ab933766181e165d\": container with ID starting with 00ef342e3a064c635a6542e8f1a996239ceaac9038dc1cf0ab933766181e165d not found: ID does not exist" containerID="00ef342e3a064c635a6542e8f1a996239ceaac9038dc1cf0ab933766181e165d" Dec 01 10:23:22 crc kubenswrapper[4958]: I1201 10:23:22.763938 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00ef342e3a064c635a6542e8f1a996239ceaac9038dc1cf0ab933766181e165d"} err="failed to get container status \"00ef342e3a064c635a6542e8f1a996239ceaac9038dc1cf0ab933766181e165d\": rpc error: code = NotFound desc = could not find container \"00ef342e3a064c635a6542e8f1a996239ceaac9038dc1cf0ab933766181e165d\": container with ID starting with 00ef342e3a064c635a6542e8f1a996239ceaac9038dc1cf0ab933766181e165d not found: ID does not exist" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.038910 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.155916 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.193513 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:23:23 crc kubenswrapper[4958]: E1201 10:23:23.201005 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62be9aa8-5618-470f-990a-448f46a926cf" containerName="ceilometer-notification-agent" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.201044 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="62be9aa8-5618-470f-990a-448f46a926cf" containerName="ceilometer-notification-agent" Dec 01 10:23:23 crc kubenswrapper[4958]: E1201 10:23:23.201056 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62be9aa8-5618-470f-990a-448f46a926cf" containerName="sg-core" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.201063 4958 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="62be9aa8-5618-470f-990a-448f46a926cf" containerName="sg-core" Dec 01 10:23:23 crc kubenswrapper[4958]: E1201 10:23:23.201101 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62be9aa8-5618-470f-990a-448f46a926cf" containerName="proxy-httpd" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.201108 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="62be9aa8-5618-470f-990a-448f46a926cf" containerName="proxy-httpd" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.201324 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="62be9aa8-5618-470f-990a-448f46a926cf" containerName="ceilometer-notification-agent" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.201333 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="62be9aa8-5618-470f-990a-448f46a926cf" containerName="sg-core" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.201345 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="62be9aa8-5618-470f-990a-448f46a926cf" containerName="proxy-httpd" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.203579 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.212199 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.212484 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.234651 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.351917 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-config-data\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.352010 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-scripts\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.352063 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-run-httpd\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.352105 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.352343 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.352476 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-log-httpd\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.352520 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcl7f\" (UniqueName: \"kubernetes.io/projected/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-kube-api-access-dcl7f\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.454812 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.454907 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.454941 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-log-httpd\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.454966 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcl7f\" (UniqueName: \"kubernetes.io/projected/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-kube-api-access-dcl7f\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.455038 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-config-data\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.455075 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-scripts\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.455111 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-run-httpd\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.455638 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-run-httpd\") pod 
\"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.456934 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-log-httpd\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.464627 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.464802 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.465675 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-scripts\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.465794 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-config-data\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.477120 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcl7f\" (UniqueName: \"kubernetes.io/projected/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-kube-api-access-dcl7f\") pod \"ceilometer-0\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.538672 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.613393 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3","Type":"ContainerStarted","Data":"fe653af11521c51b7c497e7905d2a58b5db37ebd7b3bbdd4ad6695650c72357e"} Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.617959 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a82fc053-afd2-45aa-86ef-fa58c3e849c2" containerName="cinder-api-log" containerID="cri-o://04dddd8be1d48a32405350e6c2a53a9ba65f9c8a0fd42887cea154b25911963c" gracePeriod=30 Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.618324 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a82fc053-afd2-45aa-86ef-fa58c3e849c2" containerName="cinder-api" containerID="cri-o://6d13f2d2b39d1f89a13a2bd14ab7e478d9d00f867bf9ec26c2ae51e071a52d63" gracePeriod=30 Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.661540 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.626061934 podStartE2EDuration="5.661505496s" podCreationTimestamp="2025-12-01 10:23:18 +0000 UTC" firstStartedPulling="2025-12-01 10:23:19.733647259 +0000 UTC m=+1447.242436286" lastFinishedPulling="2025-12-01 10:23:20.769090811 +0000 UTC m=+1448.277879848" observedRunningTime="2025-12-01 10:23:23.641033837 +0000 UTC m=+1451.149822884" watchObservedRunningTime="2025-12-01 10:23:23.661505496 +0000 UTC m=+1451.170294533" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.811149 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62be9aa8-5618-470f-990a-448f46a926cf" path="/var/lib/kubelet/pods/62be9aa8-5618-470f-990a-448f46a926cf/volumes" Dec 01 10:23:23 crc kubenswrapper[4958]: I1201 10:23:23.943361 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.237098 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.359604 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7545f98668-s6xmj" podUID="c93f0461-fbb7-4446-99e2-d58df0038b16" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": read tcp 10.217.0.2:48340->10.217.0.156:9311: read: connection reset by peer" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.359712 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7545f98668-s6xmj" podUID="c93f0461-fbb7-4446-99e2-d58df0038b16" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": read tcp 10.217.0.2:48326->10.217.0.156:9311: read: connection reset by peer" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.587394 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.598486 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.605088 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a82fc053-afd2-45aa-86ef-fa58c3e849c2-logs\") pod \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.605163 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-config-data\") pod \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.605229 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-scripts\") pod \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.605526 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7bjz\" (UniqueName: \"kubernetes.io/projected/a82fc053-afd2-45aa-86ef-fa58c3e849c2-kube-api-access-l7bjz\") pod \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.605614 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a82fc053-afd2-45aa-86ef-fa58c3e849c2-etc-machine-id\") pod \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.605664 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-config-data-custom\") pod \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.605698 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-combined-ca-bundle\") pod \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\" (UID: \"a82fc053-afd2-45aa-86ef-fa58c3e849c2\") " Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.605645 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a82fc053-afd2-45aa-86ef-fa58c3e849c2-logs" (OuterVolumeSpecName: "logs") pod "a82fc053-afd2-45aa-86ef-fa58c3e849c2" (UID: "a82fc053-afd2-45aa-86ef-fa58c3e849c2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.606283 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a82fc053-afd2-45aa-86ef-fa58c3e849c2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a82fc053-afd2-45aa-86ef-fa58c3e849c2" (UID: "a82fc053-afd2-45aa-86ef-fa58c3e849c2"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.606470 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a82fc053-afd2-45aa-86ef-fa58c3e849c2-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.606503 4958 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a82fc053-afd2-45aa-86ef-fa58c3e849c2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.620387 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a82fc053-afd2-45aa-86ef-fa58c3e849c2-kube-api-access-l7bjz" (OuterVolumeSpecName: "kube-api-access-l7bjz") pod "a82fc053-afd2-45aa-86ef-fa58c3e849c2" (UID: "a82fc053-afd2-45aa-86ef-fa58c3e849c2"). InnerVolumeSpecName "kube-api-access-l7bjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.620554 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-scripts" (OuterVolumeSpecName: "scripts") pod "a82fc053-afd2-45aa-86ef-fa58c3e849c2" (UID: "a82fc053-afd2-45aa-86ef-fa58c3e849c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.621371 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a82fc053-afd2-45aa-86ef-fa58c3e849c2" (UID: "a82fc053-afd2-45aa-86ef-fa58c3e849c2"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.651957 4958 generic.go:334] "Generic (PLEG): container finished" podID="a82fc053-afd2-45aa-86ef-fa58c3e849c2" containerID="6d13f2d2b39d1f89a13a2bd14ab7e478d9d00f867bf9ec26c2ae51e071a52d63" exitCode=0 Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.652000 4958 generic.go:334] "Generic (PLEG): container finished" podID="a82fc053-afd2-45aa-86ef-fa58c3e849c2" containerID="04dddd8be1d48a32405350e6c2a53a9ba65f9c8a0fd42887cea154b25911963c" exitCode=143 Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.652104 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a82fc053-afd2-45aa-86ef-fa58c3e849c2","Type":"ContainerDied","Data":"6d13f2d2b39d1f89a13a2bd14ab7e478d9d00f867bf9ec26c2ae51e071a52d63"} Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.652150 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a82fc053-afd2-45aa-86ef-fa58c3e849c2","Type":"ContainerDied","Data":"04dddd8be1d48a32405350e6c2a53a9ba65f9c8a0fd42887cea154b25911963c"} Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.652161 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a82fc053-afd2-45aa-86ef-fa58c3e849c2","Type":"ContainerDied","Data":"1bbbec250221cd8f5d3151e81bde35ff4927f6af5cbc3aa0a402d98ce808e890"} Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.652202 4958 scope.go:117] "RemoveContainer" containerID="6d13f2d2b39d1f89a13a2bd14ab7e478d9d00f867bf9ec26c2ae51e071a52d63" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.652522 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.655230 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb79c046-eebf-48b2-b2d0-e76b343f0d5a","Type":"ContainerStarted","Data":"87f539ca23b337960dee907c8d7ef88109496c24f96ab8f43aa828a5968c5317"} Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.666959 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a82fc053-afd2-45aa-86ef-fa58c3e849c2" (UID: "a82fc053-afd2-45aa-86ef-fa58c3e849c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.690406 4958 generic.go:334] "Generic (PLEG): container finished" podID="c93f0461-fbb7-4446-99e2-d58df0038b16" containerID="798f3cd60d0490245b6acf53ddd3a10033120f92391a8807099d05badd95931c" exitCode=0 Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.692246 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7545f98668-s6xmj" event={"ID":"c93f0461-fbb7-4446-99e2-d58df0038b16","Type":"ContainerDied","Data":"798f3cd60d0490245b6acf53ddd3a10033120f92391a8807099d05badd95931c"} Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.695466 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-config-data" (OuterVolumeSpecName: "config-data") pod "a82fc053-afd2-45aa-86ef-fa58c3e849c2" (UID: "a82fc053-afd2-45aa-86ef-fa58c3e849c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.708109 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.708158 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.708172 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7bjz\" (UniqueName: \"kubernetes.io/projected/a82fc053-afd2-45aa-86ef-fa58c3e849c2-kube-api-access-l7bjz\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.708183 4958 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.708193 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a82fc053-afd2-45aa-86ef-fa58c3e849c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.725572 4958 scope.go:117] "RemoveContainer" containerID="04dddd8be1d48a32405350e6c2a53a9ba65f9c8a0fd42887cea154b25911963c" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.747679 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bff445bfd-mg9pk"] Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.749376 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-bff445bfd-mg9pk" podUID="4777ce5e-b8d9-4486-9ebb-a8521c18dd15" containerName="neutron-api" containerID="cri-o://0800009f1f87e2b0e639aa1625f201a95856480de1a188eb073dc4dc47524d2a" gracePeriod=30 Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.751284 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-bff445bfd-mg9pk" podUID="4777ce5e-b8d9-4486-9ebb-a8521c18dd15" containerName="neutron-httpd" containerID="cri-o://c4e4a17cabd5aa2978dceb71175d7f962d7c7ec975783f5fd25c57bb34cf6b2f" gracePeriod=30 Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.772172 4958 scope.go:117] "RemoveContainer" containerID="6d13f2d2b39d1f89a13a2bd14ab7e478d9d00f867bf9ec26c2ae51e071a52d63" Dec 01 10:23:24 crc kubenswrapper[4958]: E1201 10:23:24.772820 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d13f2d2b39d1f89a13a2bd14ab7e478d9d00f867bf9ec26c2ae51e071a52d63\": container with ID starting with 6d13f2d2b39d1f89a13a2bd14ab7e478d9d00f867bf9ec26c2ae51e071a52d63 not found: ID does not exist" containerID="6d13f2d2b39d1f89a13a2bd14ab7e478d9d00f867bf9ec26c2ae51e071a52d63" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.772914 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d13f2d2b39d1f89a13a2bd14ab7e478d9d00f867bf9ec26c2ae51e071a52d63"} err="failed to get container status \"6d13f2d2b39d1f89a13a2bd14ab7e478d9d00f867bf9ec26c2ae51e071a52d63\": rpc error: code = NotFound desc = could not find container \"6d13f2d2b39d1f89a13a2bd14ab7e478d9d00f867bf9ec26c2ae51e071a52d63\": 
container with ID starting with 6d13f2d2b39d1f89a13a2bd14ab7e478d9d00f867bf9ec26c2ae51e071a52d63 not found: ID does not exist" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.772962 4958 scope.go:117] "RemoveContainer" containerID="04dddd8be1d48a32405350e6c2a53a9ba65f9c8a0fd42887cea154b25911963c" Dec 01 10:23:24 crc kubenswrapper[4958]: E1201 10:23:24.773355 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04dddd8be1d48a32405350e6c2a53a9ba65f9c8a0fd42887cea154b25911963c\": container with ID starting with 04dddd8be1d48a32405350e6c2a53a9ba65f9c8a0fd42887cea154b25911963c not found: ID does not exist" containerID="04dddd8be1d48a32405350e6c2a53a9ba65f9c8a0fd42887cea154b25911963c" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.773389 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04dddd8be1d48a32405350e6c2a53a9ba65f9c8a0fd42887cea154b25911963c"} err="failed to get container status \"04dddd8be1d48a32405350e6c2a53a9ba65f9c8a0fd42887cea154b25911963c\": rpc error: code = NotFound desc = could not find container \"04dddd8be1d48a32405350e6c2a53a9ba65f9c8a0fd42887cea154b25911963c\": container with ID starting with 04dddd8be1d48a32405350e6c2a53a9ba65f9c8a0fd42887cea154b25911963c not found: ID does not exist" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.773412 4958 scope.go:117] "RemoveContainer" containerID="6d13f2d2b39d1f89a13a2bd14ab7e478d9d00f867bf9ec26c2ae51e071a52d63" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.774497 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d13f2d2b39d1f89a13a2bd14ab7e478d9d00f867bf9ec26c2ae51e071a52d63"} err="failed to get container status \"6d13f2d2b39d1f89a13a2bd14ab7e478d9d00f867bf9ec26c2ae51e071a52d63\": rpc error: code = NotFound desc = could not find container \"6d13f2d2b39d1f89a13a2bd14ab7e478d9d00f867bf9ec26c2ae51e071a52d63\": container with ID starting with 6d13f2d2b39d1f89a13a2bd14ab7e478d9d00f867bf9ec26c2ae51e071a52d63 not found: ID does not exist" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.774526 4958 scope.go:117] "RemoveContainer" containerID="04dddd8be1d48a32405350e6c2a53a9ba65f9c8a0fd42887cea154b25911963c" Dec 01 10:23:24 crc kubenswrapper[4958]: I1201 10:23:24.775043 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04dddd8be1d48a32405350e6c2a53a9ba65f9c8a0fd42887cea154b25911963c"} err="failed to get container status \"04dddd8be1d48a32405350e6c2a53a9ba65f9c8a0fd42887cea154b25911963c\": rpc error: code = NotFound desc = could not find container \"04dddd8be1d48a32405350e6c2a53a9ba65f9c8a0fd42887cea154b25911963c\": container with ID starting with 04dddd8be1d48a32405350e6c2a53a9ba65f9c8a0fd42887cea154b25911963c not found: ID does not exist" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.118751 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.178008 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.189911 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.230480 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zltb9\" (UniqueName: \"kubernetes.io/projected/c93f0461-fbb7-4446-99e2-d58df0038b16-kube-api-access-zltb9\") pod \"c93f0461-fbb7-4446-99e2-d58df0038b16\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.231028 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c93f0461-fbb7-4446-99e2-d58df0038b16-config-data\") pod \"c93f0461-fbb7-4446-99e2-d58df0038b16\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.231201 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c93f0461-fbb7-4446-99e2-d58df0038b16-config-data-custom\") pod \"c93f0461-fbb7-4446-99e2-d58df0038b16\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.231276 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c93f0461-fbb7-4446-99e2-d58df0038b16-combined-ca-bundle\") pod \"c93f0461-fbb7-4446-99e2-d58df0038b16\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.231477 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c93f0461-fbb7-4446-99e2-d58df0038b16-logs\") pod \"c93f0461-fbb7-4446-99e2-d58df0038b16\" (UID: \"c93f0461-fbb7-4446-99e2-d58df0038b16\") " Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.233125 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c93f0461-fbb7-4446-99e2-d58df0038b16-logs" (OuterVolumeSpecName: "logs") pod "c93f0461-fbb7-4446-99e2-d58df0038b16" (UID: "c93f0461-fbb7-4446-99e2-d58df0038b16"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.242965 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:23:25 crc kubenswrapper[4958]: E1201 10:23:25.243587 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a82fc053-afd2-45aa-86ef-fa58c3e849c2" containerName="cinder-api-log" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.243615 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a82fc053-afd2-45aa-86ef-fa58c3e849c2" containerName="cinder-api-log" Dec 01 10:23:25 crc kubenswrapper[4958]: E1201 10:23:25.243638 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c93f0461-fbb7-4446-99e2-d58df0038b16" containerName="barbican-api" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.243646 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c93f0461-fbb7-4446-99e2-d58df0038b16" containerName="barbican-api" Dec 01 10:23:25 crc kubenswrapper[4958]: E1201 10:23:25.243673 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a82fc053-afd2-45aa-86ef-fa58c3e849c2" containerName="cinder-api" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.243682 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a82fc053-afd2-45aa-86ef-fa58c3e849c2" containerName="cinder-api" Dec 01 10:23:25 crc kubenswrapper[4958]: E1201 10:23:25.243718 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c93f0461-fbb7-4446-99e2-d58df0038b16" containerName="barbican-api-log" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.243734 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c93f0461-fbb7-4446-99e2-d58df0038b16" containerName="barbican-api-log" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.243969 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93f0461-fbb7-4446-99e2-d58df0038b16-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c93f0461-fbb7-4446-99e2-d58df0038b16" (UID: "c93f0461-fbb7-4446-99e2-d58df0038b16"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.245449 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c93f0461-fbb7-4446-99e2-d58df0038b16" containerName="barbican-api-log" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.245482 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a82fc053-afd2-45aa-86ef-fa58c3e849c2" containerName="cinder-api" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.245491 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a82fc053-afd2-45aa-86ef-fa58c3e849c2" containerName="cinder-api-log" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.245512 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c93f0461-fbb7-4446-99e2-d58df0038b16" containerName="barbican-api" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.246887 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.251605 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.251709 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.252122 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.257320 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c93f0461-fbb7-4446-99e2-d58df0038b16-kube-api-access-zltb9" (OuterVolumeSpecName: "kube-api-access-zltb9") pod "c93f0461-fbb7-4446-99e2-d58df0038b16" (UID: "c93f0461-fbb7-4446-99e2-d58df0038b16"). InnerVolumeSpecName "kube-api-access-zltb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.281405 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.321030 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93f0461-fbb7-4446-99e2-d58df0038b16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c93f0461-fbb7-4446-99e2-d58df0038b16" (UID: "c93f0461-fbb7-4446-99e2-d58df0038b16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.339679 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tktm\" (UniqueName: \"kubernetes.io/projected/137d864e-34d9-452c-91b0-179a93198b0f-kube-api-access-4tktm\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.339738 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/137d864e-34d9-452c-91b0-179a93198b0f-logs\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.339787 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-scripts\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.342163 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.343362 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-config-data\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.343439 4958 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.344369 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.344627 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-config-data-custom\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.345080 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/137d864e-34d9-452c-91b0-179a93198b0f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.345461 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c93f0461-fbb7-4446-99e2-d58df0038b16-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.345482 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zltb9\" (UniqueName: \"kubernetes.io/projected/c93f0461-fbb7-4446-99e2-d58df0038b16-kube-api-access-zltb9\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.345500 4958 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c93f0461-fbb7-4446-99e2-d58df0038b16-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.345513 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c93f0461-fbb7-4446-99e2-d58df0038b16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.359790 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93f0461-fbb7-4446-99e2-d58df0038b16-config-data" (OuterVolumeSpecName: "config-data") pod "c93f0461-fbb7-4446-99e2-d58df0038b16" (UID: "c93f0461-fbb7-4446-99e2-d58df0038b16"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.451004 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/137d864e-34d9-452c-91b0-179a93198b0f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.451099 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tktm\" (UniqueName: \"kubernetes.io/projected/137d864e-34d9-452c-91b0-179a93198b0f-kube-api-access-4tktm\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.451127 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/137d864e-34d9-452c-91b0-179a93198b0f-logs\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.451147 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-scripts\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.451218 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.451241 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-config-data\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.451261 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.451297 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.451324 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-config-data-custom\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.451385 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c93f0461-fbb7-4446-99e2-d58df0038b16-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:25 crc kubenswrapper[4958]: 
I1201 10:23:25.452120 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/137d864e-34d9-452c-91b0-179a93198b0f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.453039 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/137d864e-34d9-452c-91b0-179a93198b0f-logs\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.459225 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-config-data-custom\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.462758 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.464409 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.464551 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-scripts\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.465602 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.466751 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-config-data\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.475759 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tktm\" (UniqueName: \"kubernetes.io/projected/137d864e-34d9-452c-91b0-179a93198b0f-kube-api-access-4tktm\") pod \"cinder-api-0\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.633978 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.706881 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7545f98668-s6xmj" event={"ID":"c93f0461-fbb7-4446-99e2-d58df0038b16","Type":"ContainerDied","Data":"be0a0d7fb8fd5c9609451f9e8e0853556ed584f83f54c2de8f9b00736f34e31b"} Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.707309 4958 scope.go:117] "RemoveContainer" containerID="798f3cd60d0490245b6acf53ddd3a10033120f92391a8807099d05badd95931c" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.707138 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7545f98668-s6xmj" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.713744 4958 generic.go:334] "Generic (PLEG): container finished" podID="4777ce5e-b8d9-4486-9ebb-a8521c18dd15" containerID="c4e4a17cabd5aa2978dceb71175d7f962d7c7ec975783f5fd25c57bb34cf6b2f" exitCode=0 Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.713955 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bff445bfd-mg9pk" event={"ID":"4777ce5e-b8d9-4486-9ebb-a8521c18dd15","Type":"ContainerDied","Data":"c4e4a17cabd5aa2978dceb71175d7f962d7c7ec975783f5fd25c57bb34cf6b2f"} Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.727302 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb79c046-eebf-48b2-b2d0-e76b343f0d5a","Type":"ContainerStarted","Data":"04a9513e76310c8aa9a53be9080554fdae6410de5597747d6f820f942551f7ae"} Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.757065 4958 scope.go:117] "RemoveContainer" containerID="2c9baee21cf0dd0a36c2ffd07d4ac94d36e9a976039c6ebfb865c18f327b475c" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.774914 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7545f98668-s6xmj"] Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.792375 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7545f98668-s6xmj"] Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.816908 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a82fc053-afd2-45aa-86ef-fa58c3e849c2" path="/var/lib/kubelet/pods/a82fc053-afd2-45aa-86ef-fa58c3e849c2/volumes" Dec 01 10:23:25 crc kubenswrapper[4958]: I1201 10:23:25.818294 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c93f0461-fbb7-4446-99e2-d58df0038b16" path="/var/lib/kubelet/pods/c93f0461-fbb7-4446-99e2-d58df0038b16/volumes" Dec 01 10:23:26 crc kubenswrapper[4958]: I1201 10:23:26.260966 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:23:26 crc kubenswrapper[4958]: W1201 10:23:26.284248 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod137d864e_34d9_452c_91b0_179a93198b0f.slice/crio-243b5df147c66a046aab1ae08bb1828983fbea8e9dcb142a19c667c9ae774be5 WatchSource:0}: Error finding container 243b5df147c66a046aab1ae08bb1828983fbea8e9dcb142a19c667c9ae774be5: Status 404 returned error can't find the container with id 243b5df147c66a046aab1ae08bb1828983fbea8e9dcb142a19c667c9ae774be5 Dec 01 10:23:26 crc kubenswrapper[4958]: I1201 10:23:26.739370 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"137d864e-34d9-452c-91b0-179a93198b0f","Type":"ContainerStarted","Data":"243b5df147c66a046aab1ae08bb1828983fbea8e9dcb142a19c667c9ae774be5"} Dec 01 10:23:26 crc kubenswrapper[4958]: I1201 10:23:26.742171 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb79c046-eebf-48b2-b2d0-e76b343f0d5a","Type":"ContainerStarted","Data":"fd7676d3d11d0fb0ddfceb74709863fdef5528c7a8543875b4942b203b552cf0"} Dec 01 10:23:27 crc kubenswrapper[4958]: I1201 10:23:27.760539 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb79c046-eebf-48b2-b2d0-e76b343f0d5a","Type":"ContainerStarted","Data":"76790b81bc19590d0a5800b9660e76d2f0876a731e941b4080087e7baa5c0d9f"} Dec 01 10:23:27 crc kubenswrapper[4958]: I1201 10:23:27.772909 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"137d864e-34d9-452c-91b0-179a93198b0f","Type":"ContainerStarted","Data":"0384d0b99f1562b0d40019ea6259c6fc06fa88bdab2bd09c91cd469a8040914c"} Dec 01 10:23:28 crc kubenswrapper[4958]: I1201 10:23:28.801155 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"137d864e-34d9-452c-91b0-179a93198b0f","Type":"ContainerStarted","Data":"65331563d699edd62e898ec079b7208d63c4336ed333b4a2cad00c932ee16ea5"} Dec 01 10:23:28 crc kubenswrapper[4958]: I1201 10:23:28.801494 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 10:23:28 crc kubenswrapper[4958]: I1201 10:23:28.843648 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.843616662 podStartE2EDuration="3.843616662s" podCreationTimestamp="2025-12-01 10:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:23:28.834058237 +0000 UTC m=+1456.342847294" watchObservedRunningTime="2025-12-01 10:23:28.843616662 +0000 UTC m=+1456.352405699" Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.093867 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.187899 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8kpf2"] Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.188217 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" podUID="2ac54227-1acb-4bbb-8acb-5f7bba007274" containerName="dnsmasq-dns" containerID="cri-o://79477e970a26f7574a64a41b020bf29cf4bf6946415de7a38c9aad3a1c2e7fe2" gracePeriod=10 Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.235555 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.298823 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.646223 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.746864 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-dns-svc\") pod \"2ac54227-1acb-4bbb-8acb-5f7bba007274\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.747059 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j6dh\" (UniqueName: \"kubernetes.io/projected/2ac54227-1acb-4bbb-8acb-5f7bba007274-kube-api-access-7j6dh\") pod \"2ac54227-1acb-4bbb-8acb-5f7bba007274\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.747145 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-ovsdbserver-sb\") pod \"2ac54227-1acb-4bbb-8acb-5f7bba007274\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.747177 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-ovsdbserver-nb\") pod \"2ac54227-1acb-4bbb-8acb-5f7bba007274\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.747280 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-config\") pod \"2ac54227-1acb-4bbb-8acb-5f7bba007274\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.747324 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-dns-swift-storage-0\") pod \"2ac54227-1acb-4bbb-8acb-5f7bba007274\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.753984 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ac54227-1acb-4bbb-8acb-5f7bba007274-kube-api-access-7j6dh" (OuterVolumeSpecName: "kube-api-access-7j6dh") pod "2ac54227-1acb-4bbb-8acb-5f7bba007274" (UID: "2ac54227-1acb-4bbb-8acb-5f7bba007274"). InnerVolumeSpecName "kube-api-access-7j6dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.834207 4958 generic.go:334] "Generic (PLEG): container finished" podID="2ac54227-1acb-4bbb-8acb-5f7bba007274" containerID="79477e970a26f7574a64a41b020bf29cf4bf6946415de7a38c9aad3a1c2e7fe2" exitCode=0 Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.835230 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.837203 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ac54227-1acb-4bbb-8acb-5f7bba007274" (UID: "2ac54227-1acb-4bbb-8acb-5f7bba007274"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.845696 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2ac54227-1acb-4bbb-8acb-5f7bba007274" (UID: "2ac54227-1acb-4bbb-8acb-5f7bba007274"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.849458 4958 generic.go:334] "Generic (PLEG): container finished" podID="4777ce5e-b8d9-4486-9ebb-a8521c18dd15" containerID="0800009f1f87e2b0e639aa1625f201a95856480de1a188eb073dc4dc47524d2a" exitCode=0 Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.849486 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2ac54227-1acb-4bbb-8acb-5f7bba007274" (UID: "2ac54227-1acb-4bbb-8acb-5f7bba007274"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.849978 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-ovsdbserver-sb\") pod \"2ac54227-1acb-4bbb-8acb-5f7bba007274\" (UID: \"2ac54227-1acb-4bbb-8acb-5f7bba007274\") " Dec 01 10:23:29 crc kubenswrapper[4958]: W1201 10:23:29.850153 4958 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/2ac54227-1acb-4bbb-8acb-5f7bba007274/volumes/kubernetes.io~configmap/ovsdbserver-sb Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.850220 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2ac54227-1acb-4bbb-8acb-5f7bba007274" (UID: "2ac54227-1acb-4bbb-8acb-5f7bba007274"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.850457 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="12b7b6f7-0d8a-42eb-a993-7c0bae6952f3" containerName="cinder-scheduler" containerID="cri-o://e4cedd3f84ca670706b3ada0f0b72ef56be1156fed7f4d24577329ae7796f8e8" gracePeriod=30 Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.850557 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="12b7b6f7-0d8a-42eb-a993-7c0bae6952f3" containerName="probe" containerID="cri-o://fe653af11521c51b7c497e7905d2a58b5db37ebd7b3bbdd4ad6695650c72357e" gracePeriod=30 Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.851690 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.851714 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j6dh\" (UniqueName: \"kubernetes.io/projected/2ac54227-1acb-4bbb-8acb-5f7bba007274-kube-api-access-7j6dh\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.851802 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.851816 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:29 crc kubenswrapper[4958]: I1201 10:23:29.864593 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2ac54227-1acb-4bbb-8acb-5f7bba007274" (UID: "2ac54227-1acb-4bbb-8acb-5f7bba007274"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.071186 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.550748215 podStartE2EDuration="7.071151802s" podCreationTimestamp="2025-12-01 10:23:23 +0000 UTC" firstStartedPulling="2025-12-01 10:23:24.274317368 +0000 UTC m=+1451.783106405" lastFinishedPulling="2025-12-01 10:23:28.794720955 +0000 UTC m=+1456.303509992" observedRunningTime="2025-12-01 10:23:29.872224769 +0000 UTC m=+1457.381013816" watchObservedRunningTime="2025-12-01 10:23:30.071151802 +0000 UTC m=+1457.579940839" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.079598 4958 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.084454 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-config" (OuterVolumeSpecName: "config") pod "2ac54227-1acb-4bbb-8acb-5f7bba007274" (UID: "2ac54227-1acb-4bbb-8acb-5f7bba007274"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.160779 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.160903 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" event={"ID":"2ac54227-1acb-4bbb-8acb-5f7bba007274","Type":"ContainerDied","Data":"79477e970a26f7574a64a41b020bf29cf4bf6946415de7a38c9aad3a1c2e7fe2"} Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.161001 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8kpf2" event={"ID":"2ac54227-1acb-4bbb-8acb-5f7bba007274","Type":"ContainerDied","Data":"57e54fe237becf76dbbbf8c50414668dedfc7108dac8f4b05d59e6cf79665494"} Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.161139 4958 scope.go:117] "RemoveContainer" containerID="79477e970a26f7574a64a41b020bf29cf4bf6946415de7a38c9aad3a1c2e7fe2" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.161090 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb79c046-eebf-48b2-b2d0-e76b343f0d5a","Type":"ContainerStarted","Data":"1a881e77057e5f425992f2f22218f2714109b50ea0c3077388809c62705c22e8"} Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.161606 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bff445bfd-mg9pk" event={"ID":"4777ce5e-b8d9-4486-9ebb-a8521c18dd15","Type":"ContainerDied","Data":"0800009f1f87e2b0e639aa1625f201a95856480de1a188eb073dc4dc47524d2a"} Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.186663 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ac54227-1acb-4bbb-8acb-5f7bba007274-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.199949 4958 scope.go:117] "RemoveContainer" containerID="a39b05773206f5d96ff2086f904af274eedb5535fa669c57806f0de6c8dc5836" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.206970 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8kpf2"] Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.217869 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8kpf2"] Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.222815 4958 scope.go:117] "RemoveContainer" containerID="79477e970a26f7574a64a41b020bf29cf4bf6946415de7a38c9aad3a1c2e7fe2" Dec 01 10:23:30 crc kubenswrapper[4958]: E1201 10:23:30.223410 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79477e970a26f7574a64a41b020bf29cf4bf6946415de7a38c9aad3a1c2e7fe2\": container with ID starting with 79477e970a26f7574a64a41b020bf29cf4bf6946415de7a38c9aad3a1c2e7fe2 not found: ID does not exist" containerID="79477e970a26f7574a64a41b020bf29cf4bf6946415de7a38c9aad3a1c2e7fe2" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.223454 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79477e970a26f7574a64a41b020bf29cf4bf6946415de7a38c9aad3a1c2e7fe2"} err="failed to get container status \"79477e970a26f7574a64a41b020bf29cf4bf6946415de7a38c9aad3a1c2e7fe2\": rpc error: code = NotFound desc = could not find container \"79477e970a26f7574a64a41b020bf29cf4bf6946415de7a38c9aad3a1c2e7fe2\": container with ID starting with 
79477e970a26f7574a64a41b020bf29cf4bf6946415de7a38c9aad3a1c2e7fe2 not found: ID does not exist" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.223481 4958 scope.go:117] "RemoveContainer" containerID="a39b05773206f5d96ff2086f904af274eedb5535fa669c57806f0de6c8dc5836" Dec 01 10:23:30 crc kubenswrapper[4958]: E1201 10:23:30.223897 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a39b05773206f5d96ff2086f904af274eedb5535fa669c57806f0de6c8dc5836\": container with ID starting with a39b05773206f5d96ff2086f904af274eedb5535fa669c57806f0de6c8dc5836 not found: ID does not exist" containerID="a39b05773206f5d96ff2086f904af274eedb5535fa669c57806f0de6c8dc5836" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.223982 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a39b05773206f5d96ff2086f904af274eedb5535fa669c57806f0de6c8dc5836"} err="failed to get container status \"a39b05773206f5d96ff2086f904af274eedb5535fa669c57806f0de6c8dc5836\": rpc error: code = NotFound desc = could not find container \"a39b05773206f5d96ff2086f904af274eedb5535fa669c57806f0de6c8dc5836\": container with ID starting with a39b05773206f5d96ff2086f904af274eedb5535fa669c57806f0de6c8dc5836 not found: ID does not exist" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.350373 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.493988 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crgfd\" (UniqueName: \"kubernetes.io/projected/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-kube-api-access-crgfd\") pod \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.494210 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-ovndb-tls-certs\") pod \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.494267 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-httpd-config\") pod \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.494318 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-combined-ca-bundle\") pod \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.494387 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-config\") pod \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\" (UID: \"4777ce5e-b8d9-4486-9ebb-a8521c18dd15\") " Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.499642 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-httpd-config" (OuterVolumeSpecName: "httpd-config") pod 
"4777ce5e-b8d9-4486-9ebb-a8521c18dd15" (UID: "4777ce5e-b8d9-4486-9ebb-a8521c18dd15"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.499771 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-kube-api-access-crgfd" (OuterVolumeSpecName: "kube-api-access-crgfd") pod "4777ce5e-b8d9-4486-9ebb-a8521c18dd15" (UID: "4777ce5e-b8d9-4486-9ebb-a8521c18dd15"). InnerVolumeSpecName "kube-api-access-crgfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.552667 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4777ce5e-b8d9-4486-9ebb-a8521c18dd15" (UID: "4777ce5e-b8d9-4486-9ebb-a8521c18dd15"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.554242 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-config" (OuterVolumeSpecName: "config") pod "4777ce5e-b8d9-4486-9ebb-a8521c18dd15" (UID: "4777ce5e-b8d9-4486-9ebb-a8521c18dd15"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.580180 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4777ce5e-b8d9-4486-9ebb-a8521c18dd15" (UID: "4777ce5e-b8d9-4486-9ebb-a8521c18dd15"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.596706 4958 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.596744 4958 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.596754 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.596765 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.596775 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crgfd\" (UniqueName: \"kubernetes.io/projected/4777ce5e-b8d9-4486-9ebb-a8521c18dd15-kube-api-access-crgfd\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.869868 4958 generic.go:334] "Generic (PLEG): container finished" podID="12b7b6f7-0d8a-42eb-a993-7c0bae6952f3" containerID="fe653af11521c51b7c497e7905d2a58b5db37ebd7b3bbdd4ad6695650c72357e" exitCode=0 Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.869973 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3","Type":"ContainerDied","Data":"fe653af11521c51b7c497e7905d2a58b5db37ebd7b3bbdd4ad6695650c72357e"} Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.873290 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bff445bfd-mg9pk" event={"ID":"4777ce5e-b8d9-4486-9ebb-a8521c18dd15","Type":"ContainerDied","Data":"f25d73ad8101d9d94fbe58d238fad7c269a1d2a06d0269d8951bb908cc344ee5"} Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.873372 4958 scope.go:117] "RemoveContainer" containerID="c4e4a17cabd5aa2978dceb71175d7f962d7c7ec975783f5fd25c57bb34cf6b2f" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.873481 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bff445bfd-mg9pk" Dec 01 10:23:30 crc kubenswrapper[4958]: E1201 10:23:30.902275 4958 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12b7b6f7_0d8a_42eb_a993_7c0bae6952f3.slice/crio-fe653af11521c51b7c497e7905d2a58b5db37ebd7b3bbdd4ad6695650c72357e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12b7b6f7_0d8a_42eb_a993_7c0bae6952f3.slice/crio-conmon-fe653af11521c51b7c497e7905d2a58b5db37ebd7b3bbdd4ad6695650c72357e.scope\": RecentStats: unable to find data in memory cache]" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.907481 4958 scope.go:117] "RemoveContainer" containerID="0800009f1f87e2b0e639aa1625f201a95856480de1a188eb073dc4dc47524d2a" Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.931074 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bff445bfd-mg9pk"] Dec 01 10:23:30 crc kubenswrapper[4958]: I1201 10:23:30.943237 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-bff445bfd-mg9pk"] Dec 01 10:23:31 crc kubenswrapper[4958]: I1201 10:23:31.809192 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ac54227-1acb-4bbb-8acb-5f7bba007274" path="/var/lib/kubelet/pods/2ac54227-1acb-4bbb-8acb-5f7bba007274/volumes" Dec 01 10:23:31 crc kubenswrapper[4958]: I1201 10:23:31.810419 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4777ce5e-b8d9-4486-9ebb-a8521c18dd15" path="/var/lib/kubelet/pods/4777ce5e-b8d9-4486-9ebb-a8521c18dd15/volumes" Dec 01 10:23:31 crc kubenswrapper[4958]: I1201 10:23:31.912347 4958 generic.go:334] "Generic (PLEG): container finished" podID="12b7b6f7-0d8a-42eb-a993-7c0bae6952f3" containerID="e4cedd3f84ca670706b3ada0f0b72ef56be1156fed7f4d24577329ae7796f8e8" exitCode=0 Dec 01 10:23:31 crc kubenswrapper[4958]: I1201 10:23:31.913547 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3","Type":"ContainerDied","Data":"e4cedd3f84ca670706b3ada0f0b72ef56be1156fed7f4d24577329ae7796f8e8"} Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.226271 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.338051 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-combined-ca-bundle\") pod \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.338197 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-config-data\") pod \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.338227 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-etc-machine-id\") pod \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.338335 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnmds\" (UniqueName: \"kubernetes.io/projected/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-kube-api-access-gnmds\") pod \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.338400 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-config-data-custom\") pod \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.338671 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-scripts\") pod \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\" (UID: \"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3\") " Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.338476 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "12b7b6f7-0d8a-42eb-a993-7c0bae6952f3" (UID: "12b7b6f7-0d8a-42eb-a993-7c0bae6952f3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.346965 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-scripts" (OuterVolumeSpecName: "scripts") pod "12b7b6f7-0d8a-42eb-a993-7c0bae6952f3" (UID: "12b7b6f7-0d8a-42eb-a993-7c0bae6952f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.355122 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-kube-api-access-gnmds" (OuterVolumeSpecName: "kube-api-access-gnmds") pod "12b7b6f7-0d8a-42eb-a993-7c0bae6952f3" (UID: "12b7b6f7-0d8a-42eb-a993-7c0bae6952f3"). InnerVolumeSpecName "kube-api-access-gnmds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.378114 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "12b7b6f7-0d8a-42eb-a993-7c0bae6952f3" (UID: "12b7b6f7-0d8a-42eb-a993-7c0bae6952f3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.404991 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12b7b6f7-0d8a-42eb-a993-7c0bae6952f3" (UID: "12b7b6f7-0d8a-42eb-a993-7c0bae6952f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.444771 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnmds\" (UniqueName: \"kubernetes.io/projected/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-kube-api-access-gnmds\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.444836 4958 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.444870 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.444882 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.444894 4958 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.554087 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-config-data" (OuterVolumeSpecName: "config-data") pod "12b7b6f7-0d8a-42eb-a993-7c0bae6952f3" (UID: "12b7b6f7-0d8a-42eb-a993-7c0bae6952f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.650810 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.926493 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"12b7b6f7-0d8a-42eb-a993-7c0bae6952f3","Type":"ContainerDied","Data":"279ee34b8262d8e61ed7519ea3a4e2251082f941fab4186b62278704f17b64f1"} Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.926828 4958 scope.go:117] "RemoveContainer" containerID="fe653af11521c51b7c497e7905d2a58b5db37ebd7b3bbdd4ad6695650c72357e" Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.926723 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.972271 4958 scope.go:117] "RemoveContainer" containerID="e4cedd3f84ca670706b3ada0f0b72ef56be1156fed7f4d24577329ae7796f8e8" Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.976332 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:23:32 crc kubenswrapper[4958]: I1201 10:23:32.987870 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.020250 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:23:33 crc kubenswrapper[4958]: E1201 10:23:33.020764 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4777ce5e-b8d9-4486-9ebb-a8521c18dd15" containerName="neutron-httpd" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.020794 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="4777ce5e-b8d9-4486-9ebb-a8521c18dd15" containerName="neutron-httpd" Dec 01 10:23:33 crc kubenswrapper[4958]: E1201 10:23:33.020811 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b7b6f7-0d8a-42eb-a993-7c0bae6952f3" containerName="probe" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.020820 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b7b6f7-0d8a-42eb-a993-7c0bae6952f3" containerName="probe" Dec 01 10:23:33 crc kubenswrapper[4958]: E1201 10:23:33.020875 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac54227-1acb-4bbb-8acb-5f7bba007274" containerName="dnsmasq-dns" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.020882 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac54227-1acb-4bbb-8acb-5f7bba007274" containerName="dnsmasq-dns" Dec 01 10:23:33 crc kubenswrapper[4958]: E1201 10:23:33.020898 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4777ce5e-b8d9-4486-9ebb-a8521c18dd15" containerName="neutron-api" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.020904 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="4777ce5e-b8d9-4486-9ebb-a8521c18dd15" containerName="neutron-api" Dec 01 10:23:33 crc kubenswrapper[4958]: E1201 10:23:33.020917 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b7b6f7-0d8a-42eb-a993-7c0bae6952f3" containerName="cinder-scheduler" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.020925 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b7b6f7-0d8a-42eb-a993-7c0bae6952f3" 
containerName="cinder-scheduler" Dec 01 10:23:33 crc kubenswrapper[4958]: E1201 10:23:33.020941 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac54227-1acb-4bbb-8acb-5f7bba007274" containerName="init" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.020948 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac54227-1acb-4bbb-8acb-5f7bba007274" containerName="init" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.021167 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="4777ce5e-b8d9-4486-9ebb-a8521c18dd15" containerName="neutron-api" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.021192 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="12b7b6f7-0d8a-42eb-a993-7c0bae6952f3" containerName="cinder-scheduler" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.021205 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="4777ce5e-b8d9-4486-9ebb-a8521c18dd15" containerName="neutron-httpd" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.021219 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="12b7b6f7-0d8a-42eb-a993-7c0bae6952f3" containerName="probe" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.021232 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ac54227-1acb-4bbb-8acb-5f7bba007274" containerName="dnsmasq-dns" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.022321 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.025646 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.045037 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.055605 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.157316 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8d6888456-hv67t" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.168121 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa361886-e7eb-413c-a4b4-cedaf1c86983-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.168216 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.168311 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.169380 4958 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-config-data\") pod \"cinder-scheduler-0\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.169559 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-scripts\") pod \"cinder-scheduler-0\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.169708 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wtjp\" (UniqueName: \"kubernetes.io/projected/aa361886-e7eb-413c-a4b4-cedaf1c86983-kube-api-access-8wtjp\") pod \"cinder-scheduler-0\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.272326 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.272408 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.272493 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-config-data\") pod \"cinder-scheduler-0\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.272556 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-scripts\") pod \"cinder-scheduler-0\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.272583 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wtjp\" (UniqueName: \"kubernetes.io/projected/aa361886-e7eb-413c-a4b4-cedaf1c86983-kube-api-access-8wtjp\") pod \"cinder-scheduler-0\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.272642 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa361886-e7eb-413c-a4b4-cedaf1c86983-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.272731 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa361886-e7eb-413c-a4b4-cedaf1c86983-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"aa361886-e7eb-413c-a4b4-cedaf1c86983\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.278604 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-config-data\") pod \"cinder-scheduler-0\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.278781 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.279612 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-scripts\") pod \"cinder-scheduler-0\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.282418 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.322172 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wtjp\" (UniqueName: \"kubernetes.io/projected/aa361886-e7eb-413c-a4b4-cedaf1c86983-kube-api-access-8wtjp\") pod \"cinder-scheduler-0\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") " pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.344628 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.751114 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.818211 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12b7b6f7-0d8a-42eb-a993-7c0bae6952f3" path="/var/lib/kubelet/pods/12b7b6f7-0d8a-42eb-a993-7c0bae6952f3/volumes" Dec 01 10:23:33 crc kubenswrapper[4958]: I1201 10:23:33.968769 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:23:33 crc kubenswrapper[4958]: W1201 10:23:33.972639 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa361886_e7eb_413c_a4b4_cedaf1c86983.slice/crio-c852bf51f34c89850c8ffbbf2aea50bb85f5125da81ea0979ef51563645afa6a WatchSource:0}: Error finding container c852bf51f34c89850c8ffbbf2aea50bb85f5125da81ea0979ef51563645afa6a: Status 404 returned error can't find the container with id c852bf51f34c89850c8ffbbf2aea50bb85f5125da81ea0979ef51563645afa6a Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.421086 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.423472 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.425941 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.426441 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-4qf4d" Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.429744 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.455930 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.511295 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\") " pod="openstack/openstackclient" Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.511408 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-openstack-config\") pod \"openstackclient\" (UID: \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\") " pod="openstack/openstackclient" Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.511513 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlks4\" (UniqueName: \"kubernetes.io/projected/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-kube-api-access-wlks4\") pod \"openstackclient\" (UID: \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\") " pod="openstack/openstackclient" Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.511552 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-openstack-config-secret\") pod \"openstackclient\" (UID: \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\") " pod="openstack/openstackclient" Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.613815 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlks4\" (UniqueName: \"kubernetes.io/projected/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-kube-api-access-wlks4\") pod \"openstackclient\" (UID: \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\") " pod="openstack/openstackclient" Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.613963 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-openstack-config-secret\") pod \"openstackclient\" (UID: \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\") " pod="openstack/openstackclient" Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.614088 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\") " pod="openstack/openstackclient" Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.614192 4958 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-openstack-config\") pod \"openstackclient\" (UID: \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\") " pod="openstack/openstackclient" Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.615555 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-openstack-config\") pod \"openstackclient\" (UID: \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\") " pod="openstack/openstackclient" Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.620774 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\") " pod="openstack/openstackclient" Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.622560 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-openstack-config-secret\") pod \"openstackclient\" (UID: \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\") " pod="openstack/openstackclient" Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.631780 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlks4\" (UniqueName: \"kubernetes.io/projected/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-kube-api-access-wlks4\") pod \"openstackclient\" (UID: \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\") " pod="openstack/openstackclient" Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.755662 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.964830 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"aa361886-e7eb-413c-a4b4-cedaf1c86983","Type":"ContainerStarted","Data":"9edf6c1c15c8c61a73a71e42e83b06f5d03b0b267c8f2a39955fe70dd308453e"} Dec 01 10:23:34 crc kubenswrapper[4958]: I1201 10:23:34.965251 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"aa361886-e7eb-413c-a4b4-cedaf1c86983","Type":"ContainerStarted","Data":"c852bf51f34c89850c8ffbbf2aea50bb85f5125da81ea0979ef51563645afa6a"} Dec 01 10:23:35 crc kubenswrapper[4958]: I1201 10:23:35.307392 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 10:23:35 crc kubenswrapper[4958]: I1201 10:23:35.981265 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"aa361886-e7eb-413c-a4b4-cedaf1c86983","Type":"ContainerStarted","Data":"41cd43dae598fc0fb9dde4e0161ce7d640d805585cd73ef9e8521ddf5b485506"} Dec 01 10:23:35 crc kubenswrapper[4958]: I1201 10:23:35.982592 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a2bd3b82-a2cb-40ac-8a52-6016d62a4040","Type":"ContainerStarted","Data":"754f1a3a1b81bbb99fab07e25ff084d7a395534470317c74d8731728f4246521"} Dec 01 10:23:36 crc kubenswrapper[4958]: I1201 10:23:36.015399 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.015361927 podStartE2EDuration="4.015361927s" podCreationTimestamp="2025-12-01 10:23:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:23:35.999775488 +0000 UTC m=+1463.508564535" watchObservedRunningTime="2025-12-01 10:23:36.015361927 +0000 UTC m=+1463.524150974" Dec 01 10:23:37 crc kubenswrapper[4958]: I1201 10:23:37.985868 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 01 10:23:38 crc kubenswrapper[4958]: I1201 10:23:38.345417 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.098613 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5ffd4b498c-jxtxd"] Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.102871 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.109968 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.111406 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.117042 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.119085 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5ffd4b498c-jxtxd"] Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.258320 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-config-data\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.258382 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d93b36ff-312b-40e1-9e3a-b1981800da66-log-httpd\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.258427 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-internal-tls-certs\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.258468 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d93b36ff-312b-40e1-9e3a-b1981800da66-run-httpd\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.258508 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-combined-ca-bundle\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.258570 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85nf7\" (UniqueName: \"kubernetes.io/projected/d93b36ff-312b-40e1-9e3a-b1981800da66-kube-api-access-85nf7\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.258646 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-public-tls-certs\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " 
pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.258712 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d93b36ff-312b-40e1-9e3a-b1981800da66-etc-swift\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.361505 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-public-tls-certs\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.361666 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d93b36ff-312b-40e1-9e3a-b1981800da66-etc-swift\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.361711 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-config-data\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.361734 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d93b36ff-312b-40e1-9e3a-b1981800da66-log-httpd\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.361770 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-internal-tls-certs\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.361810 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d93b36ff-312b-40e1-9e3a-b1981800da66-run-httpd\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.361871 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-combined-ca-bundle\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.361934 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85nf7\" (UniqueName: \"kubernetes.io/projected/d93b36ff-312b-40e1-9e3a-b1981800da66-kube-api-access-85nf7\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 
10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.363155 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d93b36ff-312b-40e1-9e3a-b1981800da66-run-httpd\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.365613 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d93b36ff-312b-40e1-9e3a-b1981800da66-log-httpd\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.370086 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-internal-tls-certs\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.373293 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-combined-ca-bundle\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.378595 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-config-data\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.384043 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d93b36ff-312b-40e1-9e3a-b1981800da66-etc-swift\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.386881 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85nf7\" (UniqueName: \"kubernetes.io/projected/d93b36ff-312b-40e1-9e3a-b1981800da66-kube-api-access-85nf7\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.387435 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-public-tls-certs\") pod \"swift-proxy-5ffd4b498c-jxtxd\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.440412 4958 util.go:30] "No sandbox for pod can be found. 
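Each swift-proxy volume above walks the same reconciler ladder, keyed by its UniqueName: "VerifyControllerAttachedVolume started", then "MountVolume started", then "MountVolume.SetUp succeeded". A small sketch in the same spirit as the PLEG parser that flags volumes which never reach the final rung; the regex over the journal's escaped quoting is an assumption based on the lines above:

```python
import re
from collections import defaultdict

# The journal escapes quotes, so names appear as \"kubernetes.io/...\".
UNIQUE = re.compile(r'UniqueName: \\"([^\\"]+)\\"')
STAGES = ["VerifyControllerAttachedVolume started",
          "MountVolume started",          # case-sensitive: does not match "UnmountVolume started"
          "MountVolume.SetUp succeeded"]

def unmounted(lines):
    """Return volumes seen in earlier stages that never logged SetUp succeeded."""
    seen = defaultdict(set)
    for line in lines:
        m = UNIQUE.search(line)
        if not m:
            continue
        for stage in STAGES:
            if stage in line:
                seen[m.group(1)].add(stage)
    return [vol for vol, stages in seen.items() if STAGES[2] not in stages]
```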
Need to start a new one" pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.772643 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.773159 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerName="proxy-httpd" containerID="cri-o://1a881e77057e5f425992f2f22218f2714109b50ea0c3077388809c62705c22e8" gracePeriod=30 Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.773279 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerName="ceilometer-notification-agent" containerID="cri-o://fd7676d3d11d0fb0ddfceb74709863fdef5528c7a8543875b4942b203b552cf0" gracePeriod=30 Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.773080 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerName="ceilometer-central-agent" containerID="cri-o://04a9513e76310c8aa9a53be9080554fdae6410de5597747d6f820f942551f7ae" gracePeriod=30 Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.773613 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerName="sg-core" containerID="cri-o://76790b81bc19590d0a5800b9660e76d2f0876a731e941b4080087e7baa5c0d9f" gracePeriod=30 Dec 01 10:23:40 crc kubenswrapper[4958]: I1201 10:23:40.881149 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.161:3000/\": read tcp 10.217.0.2:46682->10.217.0.161:3000: read: connection reset by peer" Dec 01 10:23:41 crc kubenswrapper[4958]: I1201 10:23:41.109487 4958 generic.go:334] "Generic (PLEG): container finished" podID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerID="1a881e77057e5f425992f2f22218f2714109b50ea0c3077388809c62705c22e8" exitCode=0 Dec 01 10:23:41 crc kubenswrapper[4958]: I1201 10:23:41.109538 4958 generic.go:334] "Generic (PLEG): container finished" podID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerID="76790b81bc19590d0a5800b9660e76d2f0876a731e941b4080087e7baa5c0d9f" exitCode=2 Dec 01 10:23:41 crc kubenswrapper[4958]: I1201 10:23:41.109569 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb79c046-eebf-48b2-b2d0-e76b343f0d5a","Type":"ContainerDied","Data":"1a881e77057e5f425992f2f22218f2714109b50ea0c3077388809c62705c22e8"} Dec 01 10:23:41 crc kubenswrapper[4958]: I1201 10:23:41.109607 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb79c046-eebf-48b2-b2d0-e76b343f0d5a","Type":"ContainerDied","Data":"76790b81bc19590d0a5800b9660e76d2f0876a731e941b4080087e7baa5c0d9f"} Dec 01 10:23:42 crc kubenswrapper[4958]: I1201 10:23:42.126215 4958 generic.go:334] "Generic (PLEG): container finished" podID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerID="04a9513e76310c8aa9a53be9080554fdae6410de5597747d6f820f942551f7ae" exitCode=0 Dec 01 10:23:42 crc kubenswrapper[4958]: I1201 10:23:42.126316 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bb79c046-eebf-48b2-b2d0-e76b343f0d5a","Type":"ContainerDied","Data":"04a9513e76310c8aa9a53be9080554fdae6410de5597747d6f820f942551f7ae"} Dec 01 10:23:42 crc kubenswrapper[4958]: I1201 10:23:42.681696 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:23:42 crc kubenswrapper[4958]: I1201 10:23:42.682030 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c0a0701a-31c0-41fb-ba17-1919483e6248" containerName="glance-log" containerID="cri-o://229d8f5adb170e7f4f64563c1a6b88f5847b5a0c0a31393efebbb413c3b7d7ea" gracePeriod=30 Dec 01 10:23:42 crc kubenswrapper[4958]: I1201 10:23:42.682315 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c0a0701a-31c0-41fb-ba17-1919483e6248" containerName="glance-httpd" containerID="cri-o://33988a8e2a0e68eb0848002243b370adc826f5db90d39158152d4c45fdea722a" gracePeriod=30 Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.075353 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-xxjrm"] Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.079811 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xxjrm" Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.114936 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xxjrm"] Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.147364 4958 generic.go:334] "Generic (PLEG): container finished" podID="c0a0701a-31c0-41fb-ba17-1919483e6248" containerID="229d8f5adb170e7f4f64563c1a6b88f5847b5a0c0a31393efebbb413c3b7d7ea" exitCode=143 Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.147435 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c0a0701a-31c0-41fb-ba17-1919483e6248","Type":"ContainerDied","Data":"229d8f5adb170e7f4f64563c1a6b88f5847b5a0c0a31393efebbb413c3b7d7ea"} Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.168696 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-bnhb6"] Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.170985 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-bnhb6" Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.181215 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-bnhb6"] Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.272145 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-647k8\" (UniqueName: \"kubernetes.io/projected/9324a903-818e-4d56-a4cc-2d8d68994c39-kube-api-access-647k8\") pod \"nova-api-db-create-xxjrm\" (UID: \"9324a903-818e-4d56-a4cc-2d8d68994c39\") " pod="openstack/nova-api-db-create-xxjrm" Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.277273 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-w85js"] Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.279028 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-w85js" Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.287202 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-w85js"] Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.375447 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-647k8\" (UniqueName: \"kubernetes.io/projected/9324a903-818e-4d56-a4cc-2d8d68994c39-kube-api-access-647k8\") pod \"nova-api-db-create-xxjrm\" (UID: \"9324a903-818e-4d56-a4cc-2d8d68994c39\") " pod="openstack/nova-api-db-create-xxjrm" Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.375559 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bsmh\" (UniqueName: \"kubernetes.io/projected/3ee9c21e-3f59-4544-b5b9-f91bfdb74aee-kube-api-access-4bsmh\") pod \"nova-cell1-db-create-w85js\" (UID: \"3ee9c21e-3f59-4544-b5b9-f91bfdb74aee\") " pod="openstack/nova-cell1-db-create-w85js" Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.375632 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxshx\" (UniqueName: \"kubernetes.io/projected/a5588e33-3bf3-4ade-99d6-f0f5c26c62b5-kube-api-access-cxshx\") pod \"nova-cell0-db-create-bnhb6\" (UID: \"a5588e33-3bf3-4ade-99d6-f0f5c26c62b5\") " pod="openstack/nova-cell0-db-create-bnhb6" Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.417212 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-647k8\" (UniqueName: \"kubernetes.io/projected/9324a903-818e-4d56-a4cc-2d8d68994c39-kube-api-access-647k8\") pod \"nova-api-db-create-xxjrm\" (UID: \"9324a903-818e-4d56-a4cc-2d8d68994c39\") " pod="openstack/nova-api-db-create-xxjrm" Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.479662 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bsmh\" (UniqueName: \"kubernetes.io/projected/3ee9c21e-3f59-4544-b5b9-f91bfdb74aee-kube-api-access-4bsmh\") pod \"nova-cell1-db-create-w85js\" (UID: \"3ee9c21e-3f59-4544-b5b9-f91bfdb74aee\") " pod="openstack/nova-cell1-db-create-w85js" Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.479916 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxshx\" (UniqueName: \"kubernetes.io/projected/a5588e33-3bf3-4ade-99d6-f0f5c26c62b5-kube-api-access-cxshx\") pod \"nova-cell0-db-create-bnhb6\" (UID: \"a5588e33-3bf3-4ade-99d6-f0f5c26c62b5\") " pod="openstack/nova-cell0-db-create-bnhb6" Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.518036 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bsmh\" (UniqueName: \"kubernetes.io/projected/3ee9c21e-3f59-4544-b5b9-f91bfdb74aee-kube-api-access-4bsmh\") pod \"nova-cell1-db-create-w85js\" (UID: \"3ee9c21e-3f59-4544-b5b9-f91bfdb74aee\") " pod="openstack/nova-cell1-db-create-w85js" Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.526675 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxshx\" (UniqueName: \"kubernetes.io/projected/a5588e33-3bf3-4ade-99d6-f0f5c26c62b5-kube-api-access-cxshx\") pod \"nova-cell0-db-create-bnhb6\" (UID: \"a5588e33-3bf3-4ade-99d6-f0f5c26c62b5\") " pod="openstack/nova-cell0-db-create-bnhb6" Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.601133 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-w85js" Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.716905 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xxjrm" Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.760180 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 01 10:23:43 crc kubenswrapper[4958]: I1201 10:23:43.798181 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-bnhb6" Dec 01 10:23:46 crc kubenswrapper[4958]: I1201 10:23:46.007209 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:23:46 crc kubenswrapper[4958]: I1201 10:23:46.007559 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" containerName="glance-log" containerID="cri-o://8b863f0e6c8d8b91a9d5704cb7d22e94530c2769f2f5ebdea63f43036fdbe22a" gracePeriod=30 Dec 01 10:23:46 crc kubenswrapper[4958]: I1201 10:23:46.008312 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" containerName="glance-httpd" containerID="cri-o://464ae8b776e0c5220b16bba7022f58ec4374bbec97ffa730004a713c7a084c74" gracePeriod=30 Dec 01 10:23:46 crc kubenswrapper[4958]: I1201 10:23:46.190720 4958 generic.go:334] "Generic (PLEG): container finished" podID="c0a0701a-31c0-41fb-ba17-1919483e6248" containerID="33988a8e2a0e68eb0848002243b370adc826f5db90d39158152d4c45fdea722a" exitCode=0 Dec 01 10:23:46 crc kubenswrapper[4958]: I1201 10:23:46.190873 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c0a0701a-31c0-41fb-ba17-1919483e6248","Type":"ContainerDied","Data":"33988a8e2a0e68eb0848002243b370adc826f5db90d39158152d4c45fdea722a"} Dec 01 10:23:46 crc kubenswrapper[4958]: I1201 10:23:46.197411 4958 generic.go:334] "Generic (PLEG): container finished" podID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerID="fd7676d3d11d0fb0ddfceb74709863fdef5528c7a8543875b4942b203b552cf0" exitCode=0 Dec 01 10:23:46 crc kubenswrapper[4958]: I1201 10:23:46.197571 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb79c046-eebf-48b2-b2d0-e76b343f0d5a","Type":"ContainerDied","Data":"fd7676d3d11d0fb0ddfceb74709863fdef5528c7a8543875b4942b203b552cf0"} Dec 01 10:23:46 crc kubenswrapper[4958]: I1201 10:23:46.203083 4958 generic.go:334] "Generic (PLEG): container finished" podID="fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" containerID="8b863f0e6c8d8b91a9d5704cb7d22e94530c2769f2f5ebdea63f43036fdbe22a" exitCode=143 Dec 01 10:23:46 crc kubenswrapper[4958]: I1201 10:23:46.203155 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36","Type":"ContainerDied","Data":"8b863f0e6c8d8b91a9d5704cb7d22e94530c2769f2f5ebdea63f43036fdbe22a"} Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.081627 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.167945 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-config-data\") pod \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.168041 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-run-httpd\") pod \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.168095 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcl7f\" (UniqueName: \"kubernetes.io/projected/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-kube-api-access-dcl7f\") pod \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.168126 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-log-httpd\") pod \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.168165 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-combined-ca-bundle\") pod \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.168257 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-scripts\") pod \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.168421 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-sg-core-conf-yaml\") pod \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\" (UID: \"bb79c046-eebf-48b2-b2d0-e76b343f0d5a\") " Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.169120 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bb79c046-eebf-48b2-b2d0-e76b343f0d5a" (UID: "bb79c046-eebf-48b2-b2d0-e76b343f0d5a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.171690 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bb79c046-eebf-48b2-b2d0-e76b343f0d5a" (UID: "bb79c046-eebf-48b2-b2d0-e76b343f0d5a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.187644 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-kube-api-access-dcl7f" (OuterVolumeSpecName: "kube-api-access-dcl7f") pod "bb79c046-eebf-48b2-b2d0-e76b343f0d5a" (UID: "bb79c046-eebf-48b2-b2d0-e76b343f0d5a"). InnerVolumeSpecName "kube-api-access-dcl7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.192076 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-scripts" (OuterVolumeSpecName: "scripts") pod "bb79c046-eebf-48b2-b2d0-e76b343f0d5a" (UID: "bb79c046-eebf-48b2-b2d0-e76b343f0d5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.273112 4958 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.273163 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcl7f\" (UniqueName: \"kubernetes.io/projected/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-kube-api-access-dcl7f\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.273185 4958 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.273211 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.274043 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb79c046-eebf-48b2-b2d0-e76b343f0d5a","Type":"ContainerDied","Data":"87f539ca23b337960dee907c8d7ef88109496c24f96ab8f43aa828a5968c5317"} Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.274105 4958 scope.go:117] "RemoveContainer" containerID="1a881e77057e5f425992f2f22218f2714109b50ea0c3077388809c62705c22e8" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.274319 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.344578 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bb79c046-eebf-48b2-b2d0-e76b343f0d5a" (UID: "bb79c046-eebf-48b2-b2d0-e76b343f0d5a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.372101 4958 scope.go:117] "RemoveContainer" containerID="76790b81bc19590d0a5800b9660e76d2f0876a731e941b4080087e7baa5c0d9f" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.375377 4958 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.424615 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb79c046-eebf-48b2-b2d0-e76b343f0d5a" (UID: "bb79c046-eebf-48b2-b2d0-e76b343f0d5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.430255 4958 scope.go:117] "RemoveContainer" containerID="fd7676d3d11d0fb0ddfceb74709863fdef5528c7a8543875b4942b203b552cf0" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.438112 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-config-data" (OuterVolumeSpecName: "config-data") pod "bb79c046-eebf-48b2-b2d0-e76b343f0d5a" (UID: "bb79c046-eebf-48b2-b2d0-e76b343f0d5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.465274 4958 scope.go:117] "RemoveContainer" containerID="04a9513e76310c8aa9a53be9080554fdae6410de5597747d6f820f942551f7ae" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.479890 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.480159 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb79c046-eebf-48b2-b2d0-e76b343f0d5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.521830 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.582433 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-public-tls-certs\") pod \"c0a0701a-31c0-41fb-ba17-1919483e6248\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.582542 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"c0a0701a-31c0-41fb-ba17-1919483e6248\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.582587 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-combined-ca-bundle\") pod \"c0a0701a-31c0-41fb-ba17-1919483e6248\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.582656 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0a0701a-31c0-41fb-ba17-1919483e6248-httpd-run\") pod \"c0a0701a-31c0-41fb-ba17-1919483e6248\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.582713 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0a0701a-31c0-41fb-ba17-1919483e6248-logs\") pod \"c0a0701a-31c0-41fb-ba17-1919483e6248\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.582770 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjqk7\" (UniqueName: \"kubernetes.io/projected/c0a0701a-31c0-41fb-ba17-1919483e6248-kube-api-access-wjqk7\") pod \"c0a0701a-31c0-41fb-ba17-1919483e6248\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.582872 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-config-data\") pod \"c0a0701a-31c0-41fb-ba17-1919483e6248\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.582951 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-scripts\") pod \"c0a0701a-31c0-41fb-ba17-1919483e6248\" (UID: \"c0a0701a-31c0-41fb-ba17-1919483e6248\") " Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.583799 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0a0701a-31c0-41fb-ba17-1919483e6248-logs" (OuterVolumeSpecName: "logs") pod "c0a0701a-31c0-41fb-ba17-1919483e6248" (UID: "c0a0701a-31c0-41fb-ba17-1919483e6248"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.589202 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0a0701a-31c0-41fb-ba17-1919483e6248-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c0a0701a-31c0-41fb-ba17-1919483e6248" (UID: "c0a0701a-31c0-41fb-ba17-1919483e6248"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.589885 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "c0a0701a-31c0-41fb-ba17-1919483e6248" (UID: "c0a0701a-31c0-41fb-ba17-1919483e6248"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.590635 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0a0701a-31c0-41fb-ba17-1919483e6248-kube-api-access-wjqk7" (OuterVolumeSpecName: "kube-api-access-wjqk7") pod "c0a0701a-31c0-41fb-ba17-1919483e6248" (UID: "c0a0701a-31c0-41fb-ba17-1919483e6248"). InnerVolumeSpecName "kube-api-access-wjqk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.593546 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-scripts" (OuterVolumeSpecName: "scripts") pod "c0a0701a-31c0-41fb-ba17-1919483e6248" (UID: "c0a0701a-31c0-41fb-ba17-1919483e6248"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.678593 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0a0701a-31c0-41fb-ba17-1919483e6248" (UID: "c0a0701a-31c0-41fb-ba17-1919483e6248"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.679338 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c0a0701a-31c0-41fb-ba17-1919483e6248" (UID: "c0a0701a-31c0-41fb-ba17-1919483e6248"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.685893 4958 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.685975 4958 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.685989 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.686002 4958 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0a0701a-31c0-41fb-ba17-1919483e6248-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.686014 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0a0701a-31c0-41fb-ba17-1919483e6248-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.686025 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjqk7\" (UniqueName: \"kubernetes.io/projected/c0a0701a-31c0-41fb-ba17-1919483e6248-kube-api-access-wjqk7\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.686039 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.706780 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-config-data" (OuterVolumeSpecName: "config-data") pod "c0a0701a-31c0-41fb-ba17-1919483e6248" (UID: "c0a0701a-31c0-41fb-ba17-1919483e6248"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.720271 4958 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.788025 4958 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.788427 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a0701a-31c0-41fb-ba17-1919483e6248-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.823703 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.829869 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.863314 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:23:48 crc kubenswrapper[4958]: E1201 10:23:48.864194 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerName="proxy-httpd" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.864228 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerName="proxy-httpd" Dec 01 10:23:48 crc kubenswrapper[4958]: E1201 10:23:48.864250 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerName="ceilometer-notification-agent" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.864260 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerName="ceilometer-notification-agent" Dec 01 10:23:48 crc kubenswrapper[4958]: E1201 10:23:48.864290 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a0701a-31c0-41fb-ba17-1919483e6248" containerName="glance-httpd" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.864298 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a0701a-31c0-41fb-ba17-1919483e6248" containerName="glance-httpd" Dec 01 10:23:48 crc kubenswrapper[4958]: E1201 10:23:48.864317 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerName="ceilometer-central-agent" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.864325 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerName="ceilometer-central-agent" Dec 01 10:23:48 crc kubenswrapper[4958]: E1201 10:23:48.864354 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerName="sg-core" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.864364 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerName="sg-core" Dec 01 10:23:48 crc kubenswrapper[4958]: E1201 10:23:48.864376 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a0701a-31c0-41fb-ba17-1919483e6248" containerName="glance-log" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.864384 4958 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c0a0701a-31c0-41fb-ba17-1919483e6248" containerName="glance-log" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.864879 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0a0701a-31c0-41fb-ba17-1919483e6248" containerName="glance-log" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.864910 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerName="proxy-httpd" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.864924 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerName="ceilometer-notification-agent" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.864937 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerName="ceilometer-central-agent" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.864950 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" containerName="sg-core" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.864972 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0a0701a-31c0-41fb-ba17-1919483e6248" containerName="glance-httpd" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.870437 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.877957 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.878810 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.887581 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.901289 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-w85js"] Dec 01 10:23:48 crc kubenswrapper[4958]: W1201 10:23:48.974932 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5588e33_3bf3_4ade_99d6_f0f5c26c62b5.slice/crio-abe086e2554ee1cde93909c2b3abae46d83b229b8d09c345b693f17707c8a1c3 WatchSource:0}: Error finding container abe086e2554ee1cde93909c2b3abae46d83b229b8d09c345b693f17707c8a1c3: Status 404 returned error can't find the container with id abe086e2554ee1cde93909c2b3abae46d83b229b8d09c345b693f17707c8a1c3 Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.975040 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-bnhb6"] Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.994131 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.994237 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxz5t\" (UniqueName: \"kubernetes.io/projected/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-kube-api-access-mxz5t\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " 
pod="openstack/ceilometer-0" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.994298 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-log-httpd\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.994329 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.994623 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-config-data\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.995996 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-run-httpd\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.996125 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-scripts\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:48 crc kubenswrapper[4958]: I1201 10:23:48.998766 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xxjrm"] Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.065683 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5ffd4b498c-jxtxd"] Dec 01 10:23:49 crc kubenswrapper[4958]: W1201 10:23:49.079603 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd93b36ff_312b_40e1_9e3a_b1981800da66.slice/crio-ffd94f974b28f659374220632a6bea30500f988fba858f8cb32a3bb61aefd56b WatchSource:0}: Error finding container ffd94f974b28f659374220632a6bea30500f988fba858f8cb32a3bb61aefd56b: Status 404 returned error can't find the container with id ffd94f974b28f659374220632a6bea30500f988fba858f8cb32a3bb61aefd56b Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.098522 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-scripts\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.098642 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.098700 4958 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mxz5t\" (UniqueName: \"kubernetes.io/projected/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-kube-api-access-mxz5t\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.098731 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-log-httpd\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.098758 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.098805 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-config-data\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.100755 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-log-httpd\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.101390 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-run-httpd\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.102365 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-run-httpd\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.110870 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-config-data\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.112219 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.118800 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-scripts\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.119317 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.125579 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxz5t\" (UniqueName: \"kubernetes.io/projected/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-kube-api-access-mxz5t\") pod \"ceilometer-0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " pod="openstack/ceilometer-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.267711 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.316975 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a2bd3b82-a2cb-40ac-8a52-6016d62a4040","Type":"ContainerStarted","Data":"1118e86476db84520b916ab9171d00c9ca74a0fda0b9bbab6d99166ab5c31c33"} Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.320995 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.328222 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c0a0701a-31c0-41fb-ba17-1919483e6248","Type":"ContainerDied","Data":"dec08121b3df3ca5ed69f73b62573608d9253d17dcc314f15f7399a9134328f5"} Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.328304 4958 scope.go:117] "RemoveContainer" containerID="33988a8e2a0e68eb0848002243b370adc826f5db90d39158152d4c45fdea722a" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.328515 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.342420 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.000323926 podStartE2EDuration="15.342388888s" podCreationTimestamp="2025-12-01 10:23:34 +0000 UTC" firstStartedPulling="2025-12-01 10:23:35.399107255 +0000 UTC m=+1462.907896292" lastFinishedPulling="2025-12-01 10:23:47.741172217 +0000 UTC m=+1475.249961254" observedRunningTime="2025-12-01 10:23:49.341625846 +0000 UTC m=+1476.850414883" watchObservedRunningTime="2025-12-01 10:23:49.342388888 +0000 UTC m=+1476.851177925" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.349617 4958 generic.go:334] "Generic (PLEG): container finished" podID="fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" containerID="464ae8b776e0c5220b16bba7022f58ec4374bbec97ffa730004a713c7a084c74" exitCode=0 Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.349762 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36","Type":"ContainerDied","Data":"464ae8b776e0c5220b16bba7022f58ec4374bbec97ffa730004a713c7a084c74"} Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.356046 4958 generic.go:334] "Generic (PLEG): container finished" podID="9324a903-818e-4d56-a4cc-2d8d68994c39" containerID="5bda70cb089e79ebe24f795c82ed089dd6bc77e03b0ab3ec28946a52904bf64b" exitCode=0 Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.356177 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xxjrm" 
event={"ID":"9324a903-818e-4d56-a4cc-2d8d68994c39","Type":"ContainerDied","Data":"5bda70cb089e79ebe24f795c82ed089dd6bc77e03b0ab3ec28946a52904bf64b"} Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.356218 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xxjrm" event={"ID":"9324a903-818e-4d56-a4cc-2d8d68994c39","Type":"ContainerStarted","Data":"e51f40a09a93250ea6a947c2e6263675c1fcc85e5db68d3f9b841c31a760f951"} Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.360958 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5ffd4b498c-jxtxd" event={"ID":"d93b36ff-312b-40e1-9e3a-b1981800da66","Type":"ContainerStarted","Data":"ffd94f974b28f659374220632a6bea30500f988fba858f8cb32a3bb61aefd56b"} Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.365451 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bnhb6" event={"ID":"a5588e33-3bf3-4ade-99d6-f0f5c26c62b5","Type":"ContainerStarted","Data":"7d7a5225745487604efb9f143377215aa48f67ee4250f1a8e483b04bf43b8176"} Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.365518 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bnhb6" event={"ID":"a5588e33-3bf3-4ade-99d6-f0f5c26c62b5","Type":"ContainerStarted","Data":"abe086e2554ee1cde93909c2b3abae46d83b229b8d09c345b693f17707c8a1c3"} Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.369161 4958 generic.go:334] "Generic (PLEG): container finished" podID="3ee9c21e-3f59-4544-b5b9-f91bfdb74aee" containerID="e80dc32b41343dbfc30bfd6b78f32bf55eb0ee1658dc3ebf612e8b8f2e8503f1" exitCode=0 Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.369223 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w85js" event={"ID":"3ee9c21e-3f59-4544-b5b9-f91bfdb74aee","Type":"ContainerDied","Data":"e80dc32b41343dbfc30bfd6b78f32bf55eb0ee1658dc3ebf612e8b8f2e8503f1"} Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.369257 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w85js" event={"ID":"3ee9c21e-3f59-4544-b5b9-f91bfdb74aee","Type":"ContainerStarted","Data":"2469a1b64e9e4bdc893968d52218cf5df9bcfc6ed4651d1ee50915dd25ddba83"} Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.433495 4958 scope.go:117] "RemoveContainer" containerID="229d8f5adb170e7f4f64563c1a6b88f5847b5a0c0a31393efebbb413c3b7d7ea" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.527027 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.561287 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.616268 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.626927 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.632944 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.634432 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.638649 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.720005 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t556\" (UniqueName: \"kubernetes.io/projected/30e8c723-a7a0-4697-8369-bd224fcfdf3f-kube-api-access-7t556\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.720077 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-scripts\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.720129 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.720252 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-config-data\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.720308 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.720371 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.720400 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30e8c723-a7a0-4697-8369-bd224fcfdf3f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.720452 4958 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30e8c723-a7a0-4697-8369-bd224fcfdf3f-logs\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.879927 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb79c046-eebf-48b2-b2d0-e76b343f0d5a" path="/var/lib/kubelet/pods/bb79c046-eebf-48b2-b2d0-e76b343f0d5a/volumes" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.883079 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.883182 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.883271 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30e8c723-a7a0-4697-8369-bd224fcfdf3f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.883398 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30e8c723-a7a0-4697-8369-bd224fcfdf3f-logs\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.883443 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t556\" (UniqueName: \"kubernetes.io/projected/30e8c723-a7a0-4697-8369-bd224fcfdf3f-kube-api-access-7t556\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.883481 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-scripts\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.884009 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30e8c723-a7a0-4697-8369-bd224fcfdf3f-logs\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.885574 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.885943 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-config-data\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.886109 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.886541 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0a0701a-31c0-41fb-ba17-1919483e6248" path="/var/lib/kubelet/pods/c0a0701a-31c0-41fb-ba17-1919483e6248/volumes" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.888246 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30e8c723-a7a0-4697-8369-bd224fcfdf3f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.899647 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-scripts\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.900082 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-config-data\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.904431 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.916813 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.926061 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t556\" (UniqueName: \"kubernetes.io/projected/30e8c723-a7a0-4697-8369-bd224fcfdf3f-kube-api-access-7t556\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:49 crc kubenswrapper[4958]: I1201 10:23:49.964265 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") " pod="openstack/glance-default-external-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.004512 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.016518 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.171251 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.278966 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-logs\") pod \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.279106 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-combined-ca-bundle\") pod \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.279233 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-internal-tls-certs\") pod \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.279274 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-scripts\") pod \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.279302 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.279331 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj22c\" (UniqueName: \"kubernetes.io/projected/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-kube-api-access-mj22c\") pod \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.279395 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-httpd-run\") pod \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.279565 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-config-data\") pod \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\" (UID: \"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36\") " Dec 01 10:23:50 crc kubenswrapper[4958]: 
Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.279743 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-logs" (OuterVolumeSpecName: "logs") pod "fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" (UID: "fb8ee44f-e47e-4c73-b49c-cfa77e57cf36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.281516 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.285435 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" (UID: "fb8ee44f-e47e-4c73-b49c-cfa77e57cf36"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.287254 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-scripts" (OuterVolumeSpecName: "scripts") pod "fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" (UID: "fb8ee44f-e47e-4c73-b49c-cfa77e57cf36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.287328 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-kube-api-access-mj22c" (OuterVolumeSpecName: "kube-api-access-mj22c") pod "fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" (UID: "fb8ee44f-e47e-4c73-b49c-cfa77e57cf36"). InnerVolumeSpecName "kube-api-access-mj22c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.287362 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" (UID: "fb8ee44f-e47e-4c73-b49c-cfa77e57cf36"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.331352 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" (UID: "fb8ee44f-e47e-4c73-b49c-cfa77e57cf36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.379909 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-config-data" (OuterVolumeSpecName: "config-data") pod "fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" (UID: "fb8ee44f-e47e-4c73-b49c-cfa77e57cf36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.385467 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.385525 4958 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.385540 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj22c\" (UniqueName: \"kubernetes.io/projected/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-kube-api-access-mj22c\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.385552 4958 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.385565 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.385580 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.400177 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" (UID: "fb8ee44f-e47e-4c73-b49c-cfa77e57cf36"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.416825 4958 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.418754 4958 generic.go:334] "Generic (PLEG): container finished" podID="a5588e33-3bf3-4ade-99d6-f0f5c26c62b5" containerID="7d7a5225745487604efb9f143377215aa48f67ee4250f1a8e483b04bf43b8176" exitCode=0 Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.418899 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bnhb6" event={"ID":"a5588e33-3bf3-4ade-99d6-f0f5c26c62b5","Type":"ContainerDied","Data":"7d7a5225745487604efb9f143377215aa48f67ee4250f1a8e483b04bf43b8176"} Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.427906 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c25f47c-6fde-4589-86ff-f3bb3452e4f0","Type":"ContainerStarted","Data":"89bde2c62bd47b45cc7e094321d9e456abd6a7620ed60244f791217c891a3256"} Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.452835 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fb8ee44f-e47e-4c73-b49c-cfa77e57cf36","Type":"ContainerDied","Data":"3acff3975fa12250f8f9f134e07ffc456f32c94d5176a5472fb6c7b426065fbb"} Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.453539 4958 scope.go:117] "RemoveContainer" containerID="464ae8b776e0c5220b16bba7022f58ec4374bbec97ffa730004a713c7a084c74" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.453217 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.470822 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5ffd4b498c-jxtxd" event={"ID":"d93b36ff-312b-40e1-9e3a-b1981800da66","Type":"ContainerStarted","Data":"cfeb39b686f44466325e4103c9715785337ae56a4d6010ba306c05d7d1433916"} Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.470900 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5ffd4b498c-jxtxd" event={"ID":"d93b36ff-312b-40e1-9e3a-b1981800da66","Type":"ContainerStarted","Data":"912676adc0639820ea8a098452e8e4b85618ac2b2c6eda07e38c9a1e111eefc8"} Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.471076 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.471099 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.489832 4958 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.489901 4958 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.533656 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5ffd4b498c-jxtxd" podStartSLOduration=10.533627705 
podStartE2EDuration="10.533627705s" podCreationTimestamp="2025-12-01 10:23:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:23:50.504420514 +0000 UTC m=+1478.013209551" watchObservedRunningTime="2025-12-01 10:23:50.533627705 +0000 UTC m=+1478.042416742" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.548885 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.588314 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.612425 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:23:50 crc kubenswrapper[4958]: E1201 10:23:50.612965 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" containerName="glance-httpd" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.612981 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" containerName="glance-httpd" Dec 01 10:23:50 crc kubenswrapper[4958]: E1201 10:23:50.613523 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" containerName="glance-log" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.613539 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" containerName="glance-log" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.613773 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" containerName="glance-httpd" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.613790 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" containerName="glance-log" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.616081 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.623446 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.623715 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.640780 4958 scope.go:117] "RemoveContainer" containerID="8b863f0e6c8d8b91a9d5704cb7d22e94530c2769f2f5ebdea63f43036fdbe22a" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.641937 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.702380 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.702548 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.702657 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.702787 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0353bee2-4033-4493-9217-b5c4600d3d90-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.703066 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.703169 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz5rb\" (UniqueName: \"kubernetes.io/projected/0353bee2-4033-4493-9217-b5c4600d3d90-kube-api-access-cz5rb\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.703219 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0353bee2-4033-4493-9217-b5c4600d3d90-logs\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " 
pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.703267 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.806658 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz5rb\" (UniqueName: \"kubernetes.io/projected/0353bee2-4033-4493-9217-b5c4600d3d90-kube-api-access-cz5rb\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.807103 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0353bee2-4033-4493-9217-b5c4600d3d90-logs\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.807160 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.807568 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.807726 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.807821 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.807875 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0353bee2-4033-4493-9217-b5c4600d3d90-logs\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.807953 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0353bee2-4033-4493-9217-b5c4600d3d90-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 
10:23:50.808005 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.808308 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.810120 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0353bee2-4033-4493-9217-b5c4600d3d90-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.817034 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.824716 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.825293 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.834659 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.836544 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz5rb\" (UniqueName: \"kubernetes.io/projected/0353bee2-4033-4493-9217-b5c4600d3d90-kube-api-access-cz5rb\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.876064 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:23:50 crc kubenswrapper[4958]: I1201 10:23:50.978879 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.147631 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w85js" Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.181739 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xxjrm" Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.216176 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-647k8\" (UniqueName: \"kubernetes.io/projected/9324a903-818e-4d56-a4cc-2d8d68994c39-kube-api-access-647k8\") pod \"9324a903-818e-4d56-a4cc-2d8d68994c39\" (UID: \"9324a903-818e-4d56-a4cc-2d8d68994c39\") " Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.216220 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bsmh\" (UniqueName: \"kubernetes.io/projected/3ee9c21e-3f59-4544-b5b9-f91bfdb74aee-kube-api-access-4bsmh\") pod \"3ee9c21e-3f59-4544-b5b9-f91bfdb74aee\" (UID: \"3ee9c21e-3f59-4544-b5b9-f91bfdb74aee\") " Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.224363 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9324a903-818e-4d56-a4cc-2d8d68994c39-kube-api-access-647k8" (OuterVolumeSpecName: "kube-api-access-647k8") pod "9324a903-818e-4d56-a4cc-2d8d68994c39" (UID: "9324a903-818e-4d56-a4cc-2d8d68994c39"). InnerVolumeSpecName "kube-api-access-647k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.226799 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ee9c21e-3f59-4544-b5b9-f91bfdb74aee-kube-api-access-4bsmh" (OuterVolumeSpecName: "kube-api-access-4bsmh") pod "3ee9c21e-3f59-4544-b5b9-f91bfdb74aee" (UID: "3ee9c21e-3f59-4544-b5b9-f91bfdb74aee"). InnerVolumeSpecName "kube-api-access-4bsmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.240418 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.250068 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-bnhb6" Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.321708 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxshx\" (UniqueName: \"kubernetes.io/projected/a5588e33-3bf3-4ade-99d6-f0f5c26c62b5-kube-api-access-cxshx\") pod \"a5588e33-3bf3-4ade-99d6-f0f5c26c62b5\" (UID: \"a5588e33-3bf3-4ade-99d6-f0f5c26c62b5\") " Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.323929 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-647k8\" (UniqueName: \"kubernetes.io/projected/9324a903-818e-4d56-a4cc-2d8d68994c39-kube-api-access-647k8\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.323963 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bsmh\" (UniqueName: \"kubernetes.io/projected/3ee9c21e-3f59-4544-b5b9-f91bfdb74aee-kube-api-access-4bsmh\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.326885 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5588e33-3bf3-4ade-99d6-f0f5c26c62b5-kube-api-access-cxshx" (OuterVolumeSpecName: "kube-api-access-cxshx") pod "a5588e33-3bf3-4ade-99d6-f0f5c26c62b5" (UID: "a5588e33-3bf3-4ade-99d6-f0f5c26c62b5"). InnerVolumeSpecName "kube-api-access-cxshx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.437625 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxshx\" (UniqueName: \"kubernetes.io/projected/a5588e33-3bf3-4ade-99d6-f0f5c26c62b5-kube-api-access-cxshx\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.632268 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xxjrm" event={"ID":"9324a903-818e-4d56-a4cc-2d8d68994c39","Type":"ContainerDied","Data":"e51f40a09a93250ea6a947c2e6263675c1fcc85e5db68d3f9b841c31a760f951"} Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.632332 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e51f40a09a93250ea6a947c2e6263675c1fcc85e5db68d3f9b841c31a760f951" Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.632436 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xxjrm" Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.677285 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bnhb6" event={"ID":"a5588e33-3bf3-4ade-99d6-f0f5c26c62b5","Type":"ContainerDied","Data":"abe086e2554ee1cde93909c2b3abae46d83b229b8d09c345b693f17707c8a1c3"} Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.677678 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abe086e2554ee1cde93909c2b3abae46d83b229b8d09c345b693f17707c8a1c3" Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.677945 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-bnhb6" Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.767708 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w85js" event={"ID":"3ee9c21e-3f59-4544-b5b9-f91bfdb74aee","Type":"ContainerDied","Data":"2469a1b64e9e4bdc893968d52218cf5df9bcfc6ed4651d1ee50915dd25ddba83"} Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.767798 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2469a1b64e9e4bdc893968d52218cf5df9bcfc6ed4651d1ee50915dd25ddba83" Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.767985 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w85js" Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.939346 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb8ee44f-e47e-4c73-b49c-cfa77e57cf36" path="/var/lib/kubelet/pods/fb8ee44f-e47e-4c73-b49c-cfa77e57cf36/volumes" Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.944173 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c25f47c-6fde-4589-86ff-f3bb3452e4f0","Type":"ContainerStarted","Data":"85ebe13881c71a02a5528b814afad3a47c9155fc0cc046e2c557a5988df56704"} Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.944234 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30e8c723-a7a0-4697-8369-bd224fcfdf3f","Type":"ContainerStarted","Data":"59f7d5726e3c9d6cd29a48f19e5d46961290d8f324c2aabd710ec135f667e5f3"} Dec 01 10:23:51 crc kubenswrapper[4958]: I1201 10:23:51.966840 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:23:52 crc kubenswrapper[4958]: I1201 10:23:52.864498 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30e8c723-a7a0-4697-8369-bd224fcfdf3f","Type":"ContainerStarted","Data":"5df0d7f8f330d79dac217b44ff6f358c9415547f0327ca2889719598f61e6c85"} Dec 01 10:23:52 crc kubenswrapper[4958]: I1201 10:23:52.872991 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0353bee2-4033-4493-9217-b5c4600d3d90","Type":"ContainerStarted","Data":"c91bcad9a74692806a75938fd71bbdd6bb01e025193303b768a42f67d9a38201"} Dec 01 10:23:52 crc kubenswrapper[4958]: I1201 10:23:52.878936 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c25f47c-6fde-4589-86ff-f3bb3452e4f0","Type":"ContainerStarted","Data":"779133432c09a5ac0a28000e695242c5c1586688638fa77aad08eb7bd341cc55"} Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.272315 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1865-account-create-gmczv"] Dec 01 10:23:53 crc kubenswrapper[4958]: E1201 10:23:53.273343 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5588e33-3bf3-4ade-99d6-f0f5c26c62b5" containerName="mariadb-database-create" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.273373 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5588e33-3bf3-4ade-99d6-f0f5c26c62b5" containerName="mariadb-database-create" Dec 01 10:23:53 crc kubenswrapper[4958]: E1201 10:23:53.273431 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9324a903-818e-4d56-a4cc-2d8d68994c39" containerName="mariadb-database-create" Dec 01 10:23:53 
Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.273439 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9324a903-818e-4d56-a4cc-2d8d68994c39" containerName="mariadb-database-create" Dec 01 10:23:53 crc kubenswrapper[4958]: E1201 10:23:53.273452 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ee9c21e-3f59-4544-b5b9-f91bfdb74aee" containerName="mariadb-database-create" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.273460 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ee9c21e-3f59-4544-b5b9-f91bfdb74aee" containerName="mariadb-database-create" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.273666 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ee9c21e-3f59-4544-b5b9-f91bfdb74aee" containerName="mariadb-database-create" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.273699 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="9324a903-818e-4d56-a4cc-2d8d68994c39" containerName="mariadb-database-create" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.273714 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5588e33-3bf3-4ade-99d6-f0f5c26c62b5" containerName="mariadb-database-create" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.275099 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1865-account-create-gmczv" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.278570 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.288208 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1865-account-create-gmczv"] Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.421309 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvxt7\" (UniqueName: \"kubernetes.io/projected/db5bc0e8-47e9-4229-87df-b605ccc638b1-kube-api-access-hvxt7\") pod \"nova-api-1865-account-create-gmczv\" (UID: \"db5bc0e8-47e9-4229-87df-b605ccc638b1\") " pod="openstack/nova-api-1865-account-create-gmczv" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.465957 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-59ce-account-create-2skt7"] Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.468057 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-59ce-account-create-2skt7" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.472351 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-59ce-account-create-2skt7"] Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.474364 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.524268 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvxt7\" (UniqueName: \"kubernetes.io/projected/db5bc0e8-47e9-4229-87df-b605ccc638b1-kube-api-access-hvxt7\") pod \"nova-api-1865-account-create-gmczv\" (UID: \"db5bc0e8-47e9-4229-87df-b605ccc638b1\") " pod="openstack/nova-api-1865-account-create-gmczv" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.548889 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvxt7\" (UniqueName: \"kubernetes.io/projected/db5bc0e8-47e9-4229-87df-b605ccc638b1-kube-api-access-hvxt7\") pod \"nova-api-1865-account-create-gmczv\" (UID: \"db5bc0e8-47e9-4229-87df-b605ccc638b1\") " pod="openstack/nova-api-1865-account-create-gmczv" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.626751 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jpbb\" (UniqueName: \"kubernetes.io/projected/58c5b092-3de8-439d-a3aa-8aabc962d87a-kube-api-access-7jpbb\") pod \"nova-cell0-59ce-account-create-2skt7\" (UID: \"58c5b092-3de8-439d-a3aa-8aabc962d87a\") " pod="openstack/nova-cell0-59ce-account-create-2skt7" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.628271 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1865-account-create-gmczv" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.680156 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-6600-account-create-48jzj"] Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.682308 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6600-account-create-48jzj" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.687011 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.714071 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6600-account-create-48jzj"] Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.732682 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jpbb\" (UniqueName: \"kubernetes.io/projected/58c5b092-3de8-439d-a3aa-8aabc962d87a-kube-api-access-7jpbb\") pod \"nova-cell0-59ce-account-create-2skt7\" (UID: \"58c5b092-3de8-439d-a3aa-8aabc962d87a\") " pod="openstack/nova-cell0-59ce-account-create-2skt7" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.757186 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jpbb\" (UniqueName: \"kubernetes.io/projected/58c5b092-3de8-439d-a3aa-8aabc962d87a-kube-api-access-7jpbb\") pod \"nova-cell0-59ce-account-create-2skt7\" (UID: \"58c5b092-3de8-439d-a3aa-8aabc962d87a\") " pod="openstack/nova-cell0-59ce-account-create-2skt7" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.816765 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-59ce-account-create-2skt7" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.835275 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfg4r\" (UniqueName: \"kubernetes.io/projected/d3fb3017-de2a-4213-b8ae-46dbc306cbe7-kube-api-access-nfg4r\") pod \"nova-cell1-6600-account-create-48jzj\" (UID: \"d3fb3017-de2a-4213-b8ae-46dbc306cbe7\") " pod="openstack/nova-cell1-6600-account-create-48jzj" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.930107 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c25f47c-6fde-4589-86ff-f3bb3452e4f0","Type":"ContainerStarted","Data":"a560e0aeda3d3a15ea1ed0b1acd1f4a0c67e1b2e32b5828d9f7223ba1ba083f4"} Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.933442 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30e8c723-a7a0-4697-8369-bd224fcfdf3f","Type":"ContainerStarted","Data":"9144cb56a423613b84457fe40b192f906f9bb58de4d5fb7b219e438f9744439a"} Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.936501 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfg4r\" (UniqueName: \"kubernetes.io/projected/d3fb3017-de2a-4213-b8ae-46dbc306cbe7-kube-api-access-nfg4r\") pod \"nova-cell1-6600-account-create-48jzj\" (UID: \"d3fb3017-de2a-4213-b8ae-46dbc306cbe7\") " pod="openstack/nova-cell1-6600-account-create-48jzj" Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.943001 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0353bee2-4033-4493-9217-b5c4600d3d90","Type":"ContainerStarted","Data":"7f2380f1568f2f9c175fd6ffac5bc9a4006ac5eca2dd568b473ca89bfac956f6"} Dec 01 10:23:53 crc kubenswrapper[4958]: I1201 10:23:53.978443 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.978405702 podStartE2EDuration="4.978405702s" podCreationTimestamp="2025-12-01 10:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:23:53.970823954 +0000 UTC m=+1481.479612991" watchObservedRunningTime="2025-12-01 10:23:53.978405702 +0000 UTC m=+1481.487194739" Dec 01 10:23:54 crc kubenswrapper[4958]: I1201 10:23:54.010996 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfg4r\" (UniqueName: \"kubernetes.io/projected/d3fb3017-de2a-4213-b8ae-46dbc306cbe7-kube-api-access-nfg4r\") pod \"nova-cell1-6600-account-create-48jzj\" (UID: \"d3fb3017-de2a-4213-b8ae-46dbc306cbe7\") " pod="openstack/nova-cell1-6600-account-create-48jzj" Dec 01 10:23:54 crc kubenswrapper[4958]: I1201 10:23:54.277759 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6600-account-create-48jzj" Dec 01 10:23:54 crc kubenswrapper[4958]: I1201 10:23:54.355587 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1865-account-create-gmczv"] Dec 01 10:23:54 crc kubenswrapper[4958]: W1201 10:23:54.356653 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb5bc0e8_47e9_4229_87df_b605ccc638b1.slice/crio-25f5db4515e086287032a4e64e2c13d523edc7d9b6d7d64b6b3913c2cf356d3c WatchSource:0}: Error finding container 25f5db4515e086287032a4e64e2c13d523edc7d9b6d7d64b6b3913c2cf356d3c: Status 404 returned error can't find the container with id 25f5db4515e086287032a4e64e2c13d523edc7d9b6d7d64b6b3913c2cf356d3c Dec 01 10:23:54 crc kubenswrapper[4958]: I1201 10:23:54.480510 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-59ce-account-create-2skt7"] Dec 01 10:23:54 crc kubenswrapper[4958]: I1201 10:23:54.879660 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6600-account-create-48jzj"] Dec 01 10:23:54 crc kubenswrapper[4958]: W1201 10:23:54.887905 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3fb3017_de2a_4213_b8ae_46dbc306cbe7.slice/crio-7fe15fb37a596b1316f539c9438e1085518985cb193d4eaf1118c9ffe0b3360e WatchSource:0}: Error finding container 7fe15fb37a596b1316f539c9438e1085518985cb193d4eaf1118c9ffe0b3360e: Status 404 returned error can't find the container with id 7fe15fb37a596b1316f539c9438e1085518985cb193d4eaf1118c9ffe0b3360e Dec 01 10:23:54 crc kubenswrapper[4958]: I1201 10:23:54.958972 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0353bee2-4033-4493-9217-b5c4600d3d90","Type":"ContainerStarted","Data":"2f99a0981f6f8c18ad3b38420c231b904981b9a55694320467486987165f628d"} Dec 01 10:23:54 crc kubenswrapper[4958]: I1201 10:23:54.963702 4958 generic.go:334] "Generic (PLEG): container finished" podID="db5bc0e8-47e9-4229-87df-b605ccc638b1" containerID="5ba1d7fe6b72c2cd6f3b2f648908da870b194812c52cf3a428bff1214abe4914" exitCode=0 Dec 01 10:23:54 crc kubenswrapper[4958]: I1201 10:23:54.963810 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1865-account-create-gmczv" event={"ID":"db5bc0e8-47e9-4229-87df-b605ccc638b1","Type":"ContainerDied","Data":"5ba1d7fe6b72c2cd6f3b2f648908da870b194812c52cf3a428bff1214abe4914"} Dec 01 10:23:54 crc kubenswrapper[4958]: I1201 10:23:54.963871 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1865-account-create-gmczv" event={"ID":"db5bc0e8-47e9-4229-87df-b605ccc638b1","Type":"ContainerStarted","Data":"25f5db4515e086287032a4e64e2c13d523edc7d9b6d7d64b6b3913c2cf356d3c"} Dec 01 10:23:54 crc kubenswrapper[4958]: I1201 10:23:54.970268 4958 generic.go:334] "Generic (PLEG): container finished" podID="58c5b092-3de8-439d-a3aa-8aabc962d87a" containerID="10ffeb225a861504e1ad9f727e8c89225a65e1049e0b9c646991e351339be630" exitCode=0 Dec 01 10:23:54 crc kubenswrapper[4958]: I1201 10:23:54.970433 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-59ce-account-create-2skt7" event={"ID":"58c5b092-3de8-439d-a3aa-8aabc962d87a","Type":"ContainerDied","Data":"10ffeb225a861504e1ad9f727e8c89225a65e1049e0b9c646991e351339be630"} Dec 01 10:23:54 crc kubenswrapper[4958]: I1201 10:23:54.970768 4958 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-59ce-account-create-2skt7" event={"ID":"58c5b092-3de8-439d-a3aa-8aabc962d87a","Type":"ContainerStarted","Data":"fee3efcf63607108a54f4495bc3be702687138f4fdc69eb1b66f2e00ac92cc1b"} Dec 01 10:23:54 crc kubenswrapper[4958]: I1201 10:23:54.974495 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6600-account-create-48jzj" event={"ID":"d3fb3017-de2a-4213-b8ae-46dbc306cbe7","Type":"ContainerStarted","Data":"7fe15fb37a596b1316f539c9438e1085518985cb193d4eaf1118c9ffe0b3360e"} Dec 01 10:23:55 crc kubenswrapper[4958]: I1201 10:23:55.000122 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.00009147 podStartE2EDuration="5.00009147s" podCreationTimestamp="2025-12-01 10:23:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:23:54.985229012 +0000 UTC m=+1482.494018069" watchObservedRunningTime="2025-12-01 10:23:55.00009147 +0000 UTC m=+1482.508880507" Dec 01 10:23:55 crc kubenswrapper[4958]: I1201 10:23:55.460826 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:55 crc kubenswrapper[4958]: I1201 10:23:55.463548 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:23:55 crc kubenswrapper[4958]: I1201 10:23:55.988802 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c25f47c-6fde-4589-86ff-f3bb3452e4f0","Type":"ContainerStarted","Data":"820be25e9b0afc783b601cab7b34a0623af171f49ab3dc9035de5afe68ec7be6"} Dec 01 10:23:55 crc kubenswrapper[4958]: I1201 10:23:55.989181 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerName="ceilometer-notification-agent" containerID="cri-o://779133432c09a5ac0a28000e695242c5c1586688638fa77aad08eb7bd341cc55" gracePeriod=30 Dec 01 10:23:55 crc kubenswrapper[4958]: I1201 10:23:55.989178 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerName="proxy-httpd" containerID="cri-o://820be25e9b0afc783b601cab7b34a0623af171f49ab3dc9035de5afe68ec7be6" gracePeriod=30 Dec 01 10:23:55 crc kubenswrapper[4958]: I1201 10:23:55.989212 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 10:23:55 crc kubenswrapper[4958]: I1201 10:23:55.989158 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerName="sg-core" containerID="cri-o://a560e0aeda3d3a15ea1ed0b1acd1f4a0c67e1b2e32b5828d9f7223ba1ba083f4" gracePeriod=30 Dec 01 10:23:55 crc kubenswrapper[4958]: I1201 10:23:55.992137 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerName="ceilometer-central-agent" containerID="cri-o://85ebe13881c71a02a5528b814afad3a47c9155fc0cc046e2c557a5988df56704" gracePeriod=30 Dec 01 10:23:55 crc kubenswrapper[4958]: I1201 10:23:55.999506 4958 generic.go:334] "Generic (PLEG): container finished" podID="d3fb3017-de2a-4213-b8ae-46dbc306cbe7" 
containerID="077cb1d015bf3a6a753139b2136a56d260fff0902dafa0b2362075ef162e2616" exitCode=0 Dec 01 10:23:55 crc kubenswrapper[4958]: I1201 10:23:55.999812 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6600-account-create-48jzj" event={"ID":"d3fb3017-de2a-4213-b8ae-46dbc306cbe7","Type":"ContainerDied","Data":"077cb1d015bf3a6a753139b2136a56d260fff0902dafa0b2362075ef162e2616"} Dec 01 10:23:56 crc kubenswrapper[4958]: I1201 10:23:56.022186 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.428233952 podStartE2EDuration="8.022144758s" podCreationTimestamp="2025-12-01 10:23:48 +0000 UTC" firstStartedPulling="2025-12-01 10:23:50.009264907 +0000 UTC m=+1477.518053944" lastFinishedPulling="2025-12-01 10:23:55.603175713 +0000 UTC m=+1483.111964750" observedRunningTime="2025-12-01 10:23:56.017975918 +0000 UTC m=+1483.526764955" watchObservedRunningTime="2025-12-01 10:23:56.022144758 +0000 UTC m=+1483.530933795" Dec 01 10:23:56 crc kubenswrapper[4958]: I1201 10:23:56.567234 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-59ce-account-create-2skt7" Dec 01 10:23:56 crc kubenswrapper[4958]: I1201 10:23:56.735994 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1865-account-create-gmczv" Dec 01 10:23:56 crc kubenswrapper[4958]: I1201 10:23:56.752738 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jpbb\" (UniqueName: \"kubernetes.io/projected/58c5b092-3de8-439d-a3aa-8aabc962d87a-kube-api-access-7jpbb\") pod \"58c5b092-3de8-439d-a3aa-8aabc962d87a\" (UID: \"58c5b092-3de8-439d-a3aa-8aabc962d87a\") " Dec 01 10:23:56 crc kubenswrapper[4958]: I1201 10:23:56.763069 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58c5b092-3de8-439d-a3aa-8aabc962d87a-kube-api-access-7jpbb" (OuterVolumeSpecName: "kube-api-access-7jpbb") pod "58c5b092-3de8-439d-a3aa-8aabc962d87a" (UID: "58c5b092-3de8-439d-a3aa-8aabc962d87a"). InnerVolumeSpecName "kube-api-access-7jpbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:56 crc kubenswrapper[4958]: I1201 10:23:56.858327 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvxt7\" (UniqueName: \"kubernetes.io/projected/db5bc0e8-47e9-4229-87df-b605ccc638b1-kube-api-access-hvxt7\") pod \"db5bc0e8-47e9-4229-87df-b605ccc638b1\" (UID: \"db5bc0e8-47e9-4229-87df-b605ccc638b1\") " Dec 01 10:23:56 crc kubenswrapper[4958]: I1201 10:23:56.860145 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jpbb\" (UniqueName: \"kubernetes.io/projected/58c5b092-3de8-439d-a3aa-8aabc962d87a-kube-api-access-7jpbb\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:56 crc kubenswrapper[4958]: I1201 10:23:56.862930 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db5bc0e8-47e9-4229-87df-b605ccc638b1-kube-api-access-hvxt7" (OuterVolumeSpecName: "kube-api-access-hvxt7") pod "db5bc0e8-47e9-4229-87df-b605ccc638b1" (UID: "db5bc0e8-47e9-4229-87df-b605ccc638b1"). InnerVolumeSpecName "kube-api-access-hvxt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:56 crc kubenswrapper[4958]: I1201 10:23:56.962321 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvxt7\" (UniqueName: \"kubernetes.io/projected/db5bc0e8-47e9-4229-87df-b605ccc638b1-kube-api-access-hvxt7\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:57 crc kubenswrapper[4958]: I1201 10:23:57.015404 4958 generic.go:334] "Generic (PLEG): container finished" podID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerID="a560e0aeda3d3a15ea1ed0b1acd1f4a0c67e1b2e32b5828d9f7223ba1ba083f4" exitCode=2 Dec 01 10:23:57 crc kubenswrapper[4958]: I1201 10:23:57.015519 4958 generic.go:334] "Generic (PLEG): container finished" podID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerID="779133432c09a5ac0a28000e695242c5c1586688638fa77aad08eb7bd341cc55" exitCode=0 Dec 01 10:23:57 crc kubenswrapper[4958]: I1201 10:23:57.015591 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c25f47c-6fde-4589-86ff-f3bb3452e4f0","Type":"ContainerDied","Data":"a560e0aeda3d3a15ea1ed0b1acd1f4a0c67e1b2e32b5828d9f7223ba1ba083f4"} Dec 01 10:23:57 crc kubenswrapper[4958]: I1201 10:23:57.015638 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c25f47c-6fde-4589-86ff-f3bb3452e4f0","Type":"ContainerDied","Data":"779133432c09a5ac0a28000e695242c5c1586688638fa77aad08eb7bd341cc55"} Dec 01 10:23:57 crc kubenswrapper[4958]: I1201 10:23:57.017864 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1865-account-create-gmczv" event={"ID":"db5bc0e8-47e9-4229-87df-b605ccc638b1","Type":"ContainerDied","Data":"25f5db4515e086287032a4e64e2c13d523edc7d9b6d7d64b6b3913c2cf356d3c"} Dec 01 10:23:57 crc kubenswrapper[4958]: I1201 10:23:57.017893 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25f5db4515e086287032a4e64e2c13d523edc7d9b6d7d64b6b3913c2cf356d3c" Dec 01 10:23:57 crc kubenswrapper[4958]: I1201 10:23:57.017930 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1865-account-create-gmczv" Dec 01 10:23:57 crc kubenswrapper[4958]: I1201 10:23:57.022280 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-59ce-account-create-2skt7" event={"ID":"58c5b092-3de8-439d-a3aa-8aabc962d87a","Type":"ContainerDied","Data":"fee3efcf63607108a54f4495bc3be702687138f4fdc69eb1b66f2e00ac92cc1b"} Dec 01 10:23:57 crc kubenswrapper[4958]: I1201 10:23:57.022348 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fee3efcf63607108a54f4495bc3be702687138f4fdc69eb1b66f2e00ac92cc1b" Dec 01 10:23:57 crc kubenswrapper[4958]: I1201 10:23:57.022399 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-59ce-account-create-2skt7" Dec 01 10:23:57 crc kubenswrapper[4958]: I1201 10:23:57.337009 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6600-account-create-48jzj" Dec 01 10:23:57 crc kubenswrapper[4958]: I1201 10:23:57.474764 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfg4r\" (UniqueName: \"kubernetes.io/projected/d3fb3017-de2a-4213-b8ae-46dbc306cbe7-kube-api-access-nfg4r\") pod \"d3fb3017-de2a-4213-b8ae-46dbc306cbe7\" (UID: \"d3fb3017-de2a-4213-b8ae-46dbc306cbe7\") " Dec 01 10:23:57 crc kubenswrapper[4958]: I1201 10:23:57.483185 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3fb3017-de2a-4213-b8ae-46dbc306cbe7-kube-api-access-nfg4r" (OuterVolumeSpecName: "kube-api-access-nfg4r") pod "d3fb3017-de2a-4213-b8ae-46dbc306cbe7" (UID: "d3fb3017-de2a-4213-b8ae-46dbc306cbe7"). InnerVolumeSpecName "kube-api-access-nfg4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:23:57 crc kubenswrapper[4958]: I1201 10:23:57.577904 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfg4r\" (UniqueName: \"kubernetes.io/projected/d3fb3017-de2a-4213-b8ae-46dbc306cbe7-kube-api-access-nfg4r\") on node \"crc\" DevicePath \"\"" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.038296 4958 generic.go:334] "Generic (PLEG): container finished" podID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerID="85ebe13881c71a02a5528b814afad3a47c9155fc0cc046e2c557a5988df56704" exitCode=0 Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.038382 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c25f47c-6fde-4589-86ff-f3bb3452e4f0","Type":"ContainerDied","Data":"85ebe13881c71a02a5528b814afad3a47c9155fc0cc046e2c557a5988df56704"} Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.043559 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6600-account-create-48jzj" event={"ID":"d3fb3017-de2a-4213-b8ae-46dbc306cbe7","Type":"ContainerDied","Data":"7fe15fb37a596b1316f539c9438e1085518985cb193d4eaf1118c9ffe0b3360e"} Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.043627 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fe15fb37a596b1316f539c9438e1085518985cb193d4eaf1118c9ffe0b3360e" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.043700 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6600-account-create-48jzj" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.766009 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7zxqp"] Dec 01 10:23:58 crc kubenswrapper[4958]: E1201 10:23:58.767077 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fb3017-de2a-4213-b8ae-46dbc306cbe7" containerName="mariadb-account-create" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.767106 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fb3017-de2a-4213-b8ae-46dbc306cbe7" containerName="mariadb-account-create" Dec 01 10:23:58 crc kubenswrapper[4958]: E1201 10:23:58.767129 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db5bc0e8-47e9-4229-87df-b605ccc638b1" containerName="mariadb-account-create" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.767138 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="db5bc0e8-47e9-4229-87df-b605ccc638b1" containerName="mariadb-account-create" Dec 01 10:23:58 crc kubenswrapper[4958]: E1201 10:23:58.767174 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c5b092-3de8-439d-a3aa-8aabc962d87a" containerName="mariadb-account-create" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.767183 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c5b092-3de8-439d-a3aa-8aabc962d87a" containerName="mariadb-account-create" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.767455 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fb3017-de2a-4213-b8ae-46dbc306cbe7" containerName="mariadb-account-create" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.767483 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="58c5b092-3de8-439d-a3aa-8aabc962d87a" containerName="mariadb-account-create" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.767514 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="db5bc0e8-47e9-4229-87df-b605ccc638b1" containerName="mariadb-account-create" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.768396 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7zxqp" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.772948 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.773362 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.776524 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vshrg" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.778451 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7zxqp"] Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.806015 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7zxqp\" (UID: \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\") " pod="openstack/nova-cell0-conductor-db-sync-7zxqp" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.806087 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-scripts\") pod \"nova-cell0-conductor-db-sync-7zxqp\" (UID: \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\") " pod="openstack/nova-cell0-conductor-db-sync-7zxqp" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.806164 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-config-data\") pod \"nova-cell0-conductor-db-sync-7zxqp\" (UID: \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\") " pod="openstack/nova-cell0-conductor-db-sync-7zxqp" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.806361 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqmg2\" (UniqueName: \"kubernetes.io/projected/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-kube-api-access-tqmg2\") pod \"nova-cell0-conductor-db-sync-7zxqp\" (UID: \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\") " pod="openstack/nova-cell0-conductor-db-sync-7zxqp" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.907810 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqmg2\" (UniqueName: \"kubernetes.io/projected/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-kube-api-access-tqmg2\") pod \"nova-cell0-conductor-db-sync-7zxqp\" (UID: \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\") " pod="openstack/nova-cell0-conductor-db-sync-7zxqp" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.907939 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7zxqp\" (UID: \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\") " pod="openstack/nova-cell0-conductor-db-sync-7zxqp" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.907992 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-scripts\") pod \"nova-cell0-conductor-db-sync-7zxqp\" (UID: 
\"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\") " pod="openstack/nova-cell0-conductor-db-sync-7zxqp" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.908049 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-config-data\") pod \"nova-cell0-conductor-db-sync-7zxqp\" (UID: \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\") " pod="openstack/nova-cell0-conductor-db-sync-7zxqp" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.914483 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-scripts\") pod \"nova-cell0-conductor-db-sync-7zxqp\" (UID: \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\") " pod="openstack/nova-cell0-conductor-db-sync-7zxqp" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.914522 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7zxqp\" (UID: \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\") " pod="openstack/nova-cell0-conductor-db-sync-7zxqp" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.915155 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-config-data\") pod \"nova-cell0-conductor-db-sync-7zxqp\" (UID: \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\") " pod="openstack/nova-cell0-conductor-db-sync-7zxqp" Dec 01 10:23:58 crc kubenswrapper[4958]: I1201 10:23:58.927714 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqmg2\" (UniqueName: \"kubernetes.io/projected/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-kube-api-access-tqmg2\") pod \"nova-cell0-conductor-db-sync-7zxqp\" (UID: \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\") " pod="openstack/nova-cell0-conductor-db-sync-7zxqp" Dec 01 10:23:59 crc kubenswrapper[4958]: I1201 10:23:59.105204 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7zxqp" Dec 01 10:23:59 crc kubenswrapper[4958]: I1201 10:23:59.667970 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7zxqp"] Dec 01 10:24:00 crc kubenswrapper[4958]: I1201 10:24:00.067667 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7zxqp" event={"ID":"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9","Type":"ContainerStarted","Data":"f8816a8b69f8fafe49bb47e015c918edd0407728acafa294148b64f1c1ac05cb"} Dec 01 10:24:00 crc kubenswrapper[4958]: I1201 10:24:00.172342 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 10:24:00 crc kubenswrapper[4958]: I1201 10:24:00.172948 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 10:24:00 crc kubenswrapper[4958]: I1201 10:24:00.223786 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 10:24:00 crc kubenswrapper[4958]: I1201 10:24:00.236832 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 10:24:00 crc kubenswrapper[4958]: I1201 10:24:00.979824 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 10:24:00 crc kubenswrapper[4958]: I1201 10:24:00.979927 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 10:24:01 crc kubenswrapper[4958]: I1201 10:24:01.032386 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 10:24:01 crc kubenswrapper[4958]: I1201 10:24:01.051499 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 10:24:01 crc kubenswrapper[4958]: I1201 10:24:01.103504 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 10:24:01 crc kubenswrapper[4958]: I1201 10:24:01.103572 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 10:24:01 crc kubenswrapper[4958]: I1201 10:24:01.103589 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 10:24:01 crc kubenswrapper[4958]: I1201 10:24:01.103599 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 10:24:03 crc kubenswrapper[4958]: I1201 10:24:03.126952 4958 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:24:03 crc kubenswrapper[4958]: I1201 10:24:03.127623 4958 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:24:03 crc kubenswrapper[4958]: I1201 10:24:03.412114 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 10:24:03 crc kubenswrapper[4958]: I1201 10:24:03.413832 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 10:24:03 crc kubenswrapper[4958]: I1201 10:24:03.910922 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Dec 01 10:24:03 crc kubenswrapper[4958]: I1201 10:24:03.911514 4958 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:24:03 crc kubenswrapper[4958]: I1201 10:24:03.916125 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 10:24:12 crc kubenswrapper[4958]: I1201 10:24:12.251201 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7zxqp" event={"ID":"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9","Type":"ContainerStarted","Data":"75ef5854c112506bddc38901be3f8f0d961a7821a0620ac683dbab287bec31b2"} Dec 01 10:24:12 crc kubenswrapper[4958]: I1201 10:24:12.280521 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-7zxqp" podStartSLOduration=2.565436094 podStartE2EDuration="14.280494963s" podCreationTimestamp="2025-12-01 10:23:58 +0000 UTC" firstStartedPulling="2025-12-01 10:23:59.677053902 +0000 UTC m=+1487.185842939" lastFinishedPulling="2025-12-01 10:24:11.392112771 +0000 UTC m=+1498.900901808" observedRunningTime="2025-12-01 10:24:12.274232473 +0000 UTC m=+1499.783021510" watchObservedRunningTime="2025-12-01 10:24:12.280494963 +0000 UTC m=+1499.789284000" Dec 01 10:24:19 crc kubenswrapper[4958]: I1201 10:24:19.275303 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 01 10:24:24 crc kubenswrapper[4958]: I1201 10:24:24.402689 4958 generic.go:334] "Generic (PLEG): container finished" podID="6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9" containerID="75ef5854c112506bddc38901be3f8f0d961a7821a0620ac683dbab287bec31b2" exitCode=0 Dec 01 10:24:24 crc kubenswrapper[4958]: I1201 10:24:24.402789 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7zxqp" event={"ID":"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9","Type":"ContainerDied","Data":"75ef5854c112506bddc38901be3f8f0d961a7821a0620ac683dbab287bec31b2"} Dec 01 10:24:25 crc kubenswrapper[4958]: I1201 10:24:25.830585 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7zxqp" Dec 01 10:24:25 crc kubenswrapper[4958]: I1201 10:24:25.944580 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-config-data\") pod \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\" (UID: \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\") " Dec 01 10:24:25 crc kubenswrapper[4958]: I1201 10:24:25.944890 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-scripts\") pod \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\" (UID: \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\") " Dec 01 10:24:25 crc kubenswrapper[4958]: I1201 10:24:25.944921 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-combined-ca-bundle\") pod \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\" (UID: \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\") " Dec 01 10:24:25 crc kubenswrapper[4958]: I1201 10:24:25.945050 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqmg2\" (UniqueName: \"kubernetes.io/projected/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-kube-api-access-tqmg2\") pod \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\" (UID: \"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9\") " Dec 01 10:24:25 crc kubenswrapper[4958]: I1201 10:24:25.964764 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-scripts" (OuterVolumeSpecName: "scripts") pod "6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9" (UID: "6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:25 crc kubenswrapper[4958]: I1201 10:24:25.968314 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-kube-api-access-tqmg2" (OuterVolumeSpecName: "kube-api-access-tqmg2") pod "6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9" (UID: "6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9"). InnerVolumeSpecName "kube-api-access-tqmg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:24:25 crc kubenswrapper[4958]: I1201 10:24:25.977818 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9" (UID: "6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:25 crc kubenswrapper[4958]: I1201 10:24:25.979798 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-config-data" (OuterVolumeSpecName: "config-data") pod "6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9" (UID: "6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.047337 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqmg2\" (UniqueName: \"kubernetes.io/projected/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-kube-api-access-tqmg2\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.047383 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.047404 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.047415 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.470107 4958 generic.go:334] "Generic (PLEG): container finished" podID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerID="820be25e9b0afc783b601cab7b34a0623af171f49ab3dc9035de5afe68ec7be6" exitCode=137 Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.470336 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c25f47c-6fde-4589-86ff-f3bb3452e4f0","Type":"ContainerDied","Data":"820be25e9b0afc783b601cab7b34a0623af171f49ab3dc9035de5afe68ec7be6"} Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.477559 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7zxqp" event={"ID":"6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9","Type":"ContainerDied","Data":"f8816a8b69f8fafe49bb47e015c918edd0407728acafa294148b64f1c1ac05cb"} Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.477618 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8816a8b69f8fafe49bb47e015c918edd0407728acafa294148b64f1c1ac05cb" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.477753 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7zxqp" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.624258 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 10:24:26 crc kubenswrapper[4958]: E1201 10:24:26.625312 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9" containerName="nova-cell0-conductor-db-sync" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.625338 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9" containerName="nova-cell0-conductor-db-sync" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.625656 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9" containerName="nova-cell0-conductor-db-sync" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.626581 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.629894 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vshrg" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.630167 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.655040 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.771144 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dfpf\" (UniqueName: \"kubernetes.io/projected/fb4735ed-5c37-442e-8c66-cab5633a4baa-kube-api-access-9dfpf\") pod \"nova-cell0-conductor-0\" (UID: \"fb4735ed-5c37-442e-8c66-cab5633a4baa\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.771244 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb4735ed-5c37-442e-8c66-cab5633a4baa-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fb4735ed-5c37-442e-8c66-cab5633a4baa\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.771362 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb4735ed-5c37-442e-8c66-cab5633a4baa-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fb4735ed-5c37-442e-8c66-cab5633a4baa\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.795382 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 10:24:26 crc kubenswrapper[4958]: E1201 10:24:26.796460 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-9dfpf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/nova-cell0-conductor-0" podUID="fb4735ed-5c37-442e-8c66-cab5633a4baa" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.873391 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dfpf\" (UniqueName: \"kubernetes.io/projected/fb4735ed-5c37-442e-8c66-cab5633a4baa-kube-api-access-9dfpf\") pod \"nova-cell0-conductor-0\" (UID: \"fb4735ed-5c37-442e-8c66-cab5633a4baa\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.873462 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb4735ed-5c37-442e-8c66-cab5633a4baa-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fb4735ed-5c37-442e-8c66-cab5633a4baa\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.873551 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb4735ed-5c37-442e-8c66-cab5633a4baa-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fb4735ed-5c37-442e-8c66-cab5633a4baa\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.883887 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb4735ed-5c37-442e-8c66-cab5633a4baa-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fb4735ed-5c37-442e-8c66-cab5633a4baa\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.884487 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb4735ed-5c37-442e-8c66-cab5633a4baa-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fb4735ed-5c37-442e-8c66-cab5633a4baa\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.895556 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dfpf\" (UniqueName: \"kubernetes.io/projected/fb4735ed-5c37-442e-8c66-cab5633a4baa-kube-api-access-9dfpf\") pod \"nova-cell0-conductor-0\" (UID: \"fb4735ed-5c37-442e-8c66-cab5633a4baa\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:26 crc kubenswrapper[4958]: I1201 10:24:26.981157 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.077858 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-scripts\") pod \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.078008 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-combined-ca-bundle\") pod \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.078064 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-run-httpd\") pod \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.078127 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxz5t\" (UniqueName: \"kubernetes.io/projected/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-kube-api-access-mxz5t\") pod \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.078146 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-log-httpd\") pod \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.078196 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-config-data\") pod \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.078242 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-sg-core-conf-yaml\") pod 
\"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\" (UID: \"5c25f47c-6fde-4589-86ff-f3bb3452e4f0\") " Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.079396 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5c25f47c-6fde-4589-86ff-f3bb3452e4f0" (UID: "5c25f47c-6fde-4589-86ff-f3bb3452e4f0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.080115 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5c25f47c-6fde-4589-86ff-f3bb3452e4f0" (UID: "5c25f47c-6fde-4589-86ff-f3bb3452e4f0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.085329 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-scripts" (OuterVolumeSpecName: "scripts") pod "5c25f47c-6fde-4589-86ff-f3bb3452e4f0" (UID: "5c25f47c-6fde-4589-86ff-f3bb3452e4f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.088767 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-kube-api-access-mxz5t" (OuterVolumeSpecName: "kube-api-access-mxz5t") pod "5c25f47c-6fde-4589-86ff-f3bb3452e4f0" (UID: "5c25f47c-6fde-4589-86ff-f3bb3452e4f0"). InnerVolumeSpecName "kube-api-access-mxz5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.114775 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5c25f47c-6fde-4589-86ff-f3bb3452e4f0" (UID: "5c25f47c-6fde-4589-86ff-f3bb3452e4f0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.181355 4958 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.181405 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxz5t\" (UniqueName: \"kubernetes.io/projected/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-kube-api-access-mxz5t\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.181421 4958 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.181433 4958 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.181447 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.194518 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c25f47c-6fde-4589-86ff-f3bb3452e4f0" (UID: "5c25f47c-6fde-4589-86ff-f3bb3452e4f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.201073 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-config-data" (OuterVolumeSpecName: "config-data") pod "5c25f47c-6fde-4589-86ff-f3bb3452e4f0" (UID: "5c25f47c-6fde-4589-86ff-f3bb3452e4f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.285788 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.285865 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c25f47c-6fde-4589-86ff-f3bb3452e4f0-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.493431 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.493459 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5c25f47c-6fde-4589-86ff-f3bb3452e4f0","Type":"ContainerDied","Data":"89bde2c62bd47b45cc7e094321d9e456abd6a7620ed60244f791217c891a3256"} Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.493936 4958 scope.go:117] "RemoveContainer" containerID="820be25e9b0afc783b601cab7b34a0623af171f49ab3dc9035de5afe68ec7be6" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.493483 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.509193 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.526243 4958 scope.go:117] "RemoveContainer" containerID="a560e0aeda3d3a15ea1ed0b1acd1f4a0c67e1b2e32b5828d9f7223ba1ba083f4" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.544754 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.562225 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.562742 4958 scope.go:117] "RemoveContainer" containerID="779133432c09a5ac0a28000e695242c5c1586688638fa77aad08eb7bd341cc55" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.583967 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:24:27 crc kubenswrapper[4958]: E1201 10:24:27.584620 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerName="ceilometer-notification-agent" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.584646 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerName="ceilometer-notification-agent" Dec 01 10:24:27 crc kubenswrapper[4958]: E1201 10:24:27.584708 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerName="proxy-httpd" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.584719 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerName="proxy-httpd" Dec 01 10:24:27 crc kubenswrapper[4958]: E1201 10:24:27.584737 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerName="sg-core" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.584744 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerName="sg-core" Dec 01 10:24:27 crc kubenswrapper[4958]: E1201 10:24:27.584773 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerName="ceilometer-central-agent" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.584780 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerName="ceilometer-central-agent" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.585013 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerName="proxy-httpd" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.585040 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerName="ceilometer-central-agent" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.585049 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerName="sg-core" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.585062 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" containerName="ceilometer-notification-agent" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.587227 4958 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.590184 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.590322 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.593532 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.600532 4958 scope.go:117] "RemoveContainer" containerID="85ebe13881c71a02a5528b814afad3a47c9155fc0cc046e2c557a5988df56704" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.693529 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb4735ed-5c37-442e-8c66-cab5633a4baa-combined-ca-bundle\") pod \"fb4735ed-5c37-442e-8c66-cab5633a4baa\" (UID: \"fb4735ed-5c37-442e-8c66-cab5633a4baa\") " Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.693610 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb4735ed-5c37-442e-8c66-cab5633a4baa-config-data\") pod \"fb4735ed-5c37-442e-8c66-cab5633a4baa\" (UID: \"fb4735ed-5c37-442e-8c66-cab5633a4baa\") " Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.693815 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dfpf\" (UniqueName: \"kubernetes.io/projected/fb4735ed-5c37-442e-8c66-cab5633a4baa-kube-api-access-9dfpf\") pod \"fb4735ed-5c37-442e-8c66-cab5633a4baa\" (UID: \"fb4735ed-5c37-442e-8c66-cab5633a4baa\") " Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.694137 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea6f173-ffb8-4cea-b5bf-575413a3f704-log-httpd\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.694185 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6x99\" (UniqueName: \"kubernetes.io/projected/6ea6f173-ffb8-4cea-b5bf-575413a3f704-kube-api-access-g6x99\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.694237 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.694291 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-scripts\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.694390 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-config-data\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.694480 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea6f173-ffb8-4cea-b5bf-575413a3f704-run-httpd\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.694546 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.698317 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb4735ed-5c37-442e-8c66-cab5633a4baa-config-data" (OuterVolumeSpecName: "config-data") pod "fb4735ed-5c37-442e-8c66-cab5633a4baa" (UID: "fb4735ed-5c37-442e-8c66-cab5633a4baa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.698378 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb4735ed-5c37-442e-8c66-cab5633a4baa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb4735ed-5c37-442e-8c66-cab5633a4baa" (UID: "fb4735ed-5c37-442e-8c66-cab5633a4baa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.699031 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb4735ed-5c37-442e-8c66-cab5633a4baa-kube-api-access-9dfpf" (OuterVolumeSpecName: "kube-api-access-9dfpf") pod "fb4735ed-5c37-442e-8c66-cab5633a4baa" (UID: "fb4735ed-5c37-442e-8c66-cab5633a4baa"). InnerVolumeSpecName "kube-api-access-9dfpf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.796445 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-config-data\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.796571 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea6f173-ffb8-4cea-b5bf-575413a3f704-run-httpd\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.796625 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.796665 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea6f173-ffb8-4cea-b5bf-575413a3f704-log-httpd\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.796704 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6x99\" (UniqueName: \"kubernetes.io/projected/6ea6f173-ffb8-4cea-b5bf-575413a3f704-kube-api-access-g6x99\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.796743 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.796782 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-scripts\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.796901 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb4735ed-5c37-442e-8c66-cab5633a4baa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.796918 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb4735ed-5c37-442e-8c66-cab5633a4baa-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.796931 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dfpf\" (UniqueName: \"kubernetes.io/projected/fb4735ed-5c37-442e-8c66-cab5633a4baa-kube-api-access-9dfpf\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.799924 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6ea6f173-ffb8-4cea-b5bf-575413a3f704-log-httpd\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.800165 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea6f173-ffb8-4cea-b5bf-575413a3f704-run-httpd\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.802397 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.802720 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-scripts\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.803834 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.804555 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-config-data\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.817192 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c25f47c-6fde-4589-86ff-f3bb3452e4f0" path="/var/lib/kubelet/pods/5c25f47c-6fde-4589-86ff-f3bb3452e4f0/volumes" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.819321 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6x99\" (UniqueName: \"kubernetes.io/projected/6ea6f173-ffb8-4cea-b5bf-575413a3f704-kube-api-access-g6x99\") pod \"ceilometer-0\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " pod="openstack/ceilometer-0" Dec 01 10:24:27 crc kubenswrapper[4958]: I1201 10:24:27.933506 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.210912 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.211396 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.474539 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.511185 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ea6f173-ffb8-4cea-b5bf-575413a3f704","Type":"ContainerStarted","Data":"80c0262eea92f102cb1513260ae6470c91cb6616145857e0ad3d53dbf43a328c"} Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.511223 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.582744 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.599639 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.615195 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.625118 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.626955 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.634107 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vshrg" Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.634315 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.635639 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.745728 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113fc18e-8eb6-45a5-9625-1404f4d832c4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"113fc18e-8eb6-45a5-9625-1404f4d832c4\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.745855 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7sdc\" (UniqueName: \"kubernetes.io/projected/113fc18e-8eb6-45a5-9625-1404f4d832c4-kube-api-access-w7sdc\") pod \"nova-cell0-conductor-0\" (UID: \"113fc18e-8eb6-45a5-9625-1404f4d832c4\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.761016 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113fc18e-8eb6-45a5-9625-1404f4d832c4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"113fc18e-8eb6-45a5-9625-1404f4d832c4\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.862994 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113fc18e-8eb6-45a5-9625-1404f4d832c4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"113fc18e-8eb6-45a5-9625-1404f4d832c4\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.863103 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7sdc\" (UniqueName: \"kubernetes.io/projected/113fc18e-8eb6-45a5-9625-1404f4d832c4-kube-api-access-w7sdc\") pod \"nova-cell0-conductor-0\" (UID: \"113fc18e-8eb6-45a5-9625-1404f4d832c4\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.863159 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113fc18e-8eb6-45a5-9625-1404f4d832c4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"113fc18e-8eb6-45a5-9625-1404f4d832c4\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.872975 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113fc18e-8eb6-45a5-9625-1404f4d832c4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"113fc18e-8eb6-45a5-9625-1404f4d832c4\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.884561 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113fc18e-8eb6-45a5-9625-1404f4d832c4-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"113fc18e-8eb6-45a5-9625-1404f4d832c4\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.886778 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7sdc\" (UniqueName: \"kubernetes.io/projected/113fc18e-8eb6-45a5-9625-1404f4d832c4-kube-api-access-w7sdc\") pod \"nova-cell0-conductor-0\" (UID: \"113fc18e-8eb6-45a5-9625-1404f4d832c4\") " pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:28 crc kubenswrapper[4958]: I1201 10:24:28.963217 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:29 crc kubenswrapper[4958]: I1201 10:24:29.597766 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ea6f173-ffb8-4cea-b5bf-575413a3f704","Type":"ContainerStarted","Data":"5eee7d5a8d538d76ea32e931ea9993a8720db525b7b22bd82bc0c2f3944998bc"} Dec 01 10:24:29 crc kubenswrapper[4958]: I1201 10:24:29.612392 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 10:24:29 crc kubenswrapper[4958]: I1201 10:24:29.813511 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb4735ed-5c37-442e-8c66-cab5633a4baa" path="/var/lib/kubelet/pods/fb4735ed-5c37-442e-8c66-cab5633a4baa/volumes" Dec 01 10:24:30 crc kubenswrapper[4958]: I1201 10:24:30.609377 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"113fc18e-8eb6-45a5-9625-1404f4d832c4","Type":"ContainerStarted","Data":"973323edcddc9628e51fba2a4a4b7496b7b376f89fbb0037cf0dafdcf8686936"} Dec 01 10:24:30 crc kubenswrapper[4958]: I1201 10:24:30.609897 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"113fc18e-8eb6-45a5-9625-1404f4d832c4","Type":"ContainerStarted","Data":"54a5314fe1f4d099fa8267176306ea6ab531f192e0c2df680daa0e2e98771027"} Dec 01 10:24:30 crc kubenswrapper[4958]: I1201 10:24:30.609939 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:30 crc kubenswrapper[4958]: I1201 10:24:30.611415 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ea6f173-ffb8-4cea-b5bf-575413a3f704","Type":"ContainerStarted","Data":"22002cf7fea29912ebf6602e72cdc2dab0a7b96f92bca33c04322808a929c7ed"} Dec 01 10:24:30 crc kubenswrapper[4958]: I1201 10:24:30.640375 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.6403430180000003 podStartE2EDuration="2.640343018s" podCreationTimestamp="2025-12-01 10:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:24:30.625769179 +0000 UTC m=+1518.134558236" watchObservedRunningTime="2025-12-01 10:24:30.640343018 +0000 UTC m=+1518.149132055" Dec 01 10:24:31 crc kubenswrapper[4958]: I1201 10:24:31.626520 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ea6f173-ffb8-4cea-b5bf-575413a3f704","Type":"ContainerStarted","Data":"a32682775a8a2647e8441878c8515e0bf30ce608ba7d86ae93501ca183af93c6"} Dec 01 10:24:33 crc kubenswrapper[4958]: I1201 10:24:33.651321 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6ea6f173-ffb8-4cea-b5bf-575413a3f704","Type":"ContainerStarted","Data":"9d7fc47297382cac7f9e705ba786a4aa847451a83aa7544bf376c2064c49b629"} Dec 01 10:24:33 crc kubenswrapper[4958]: I1201 10:24:33.652123 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 10:24:33 crc kubenswrapper[4958]: I1201 10:24:33.651966 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerName="sg-core" containerID="cri-o://a32682775a8a2647e8441878c8515e0bf30ce608ba7d86ae93501ca183af93c6" gracePeriod=30 Dec 01 10:24:33 crc kubenswrapper[4958]: I1201 10:24:33.652046 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerName="proxy-httpd" containerID="cri-o://9d7fc47297382cac7f9e705ba786a4aa847451a83aa7544bf376c2064c49b629" gracePeriod=30 Dec 01 10:24:33 crc kubenswrapper[4958]: I1201 10:24:33.651570 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerName="ceilometer-central-agent" containerID="cri-o://5eee7d5a8d538d76ea32e931ea9993a8720db525b7b22bd82bc0c2f3944998bc" gracePeriod=30 Dec 01 10:24:33 crc kubenswrapper[4958]: I1201 10:24:33.653797 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerName="ceilometer-notification-agent" containerID="cri-o://22002cf7fea29912ebf6602e72cdc2dab0a7b96f92bca33c04322808a929c7ed" gracePeriod=30 Dec 01 10:24:33 crc kubenswrapper[4958]: I1201 10:24:33.683524 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.133145561 podStartE2EDuration="6.683493369s" podCreationTimestamp="2025-12-01 10:24:27 +0000 UTC" firstStartedPulling="2025-12-01 10:24:28.484480647 +0000 UTC m=+1515.993269674" lastFinishedPulling="2025-12-01 10:24:33.034828445 +0000 UTC m=+1520.543617482" observedRunningTime="2025-12-01 10:24:33.678611659 +0000 UTC m=+1521.187400696" watchObservedRunningTime="2025-12-01 10:24:33.683493369 +0000 UTC m=+1521.192282446" Dec 01 10:24:34 crc kubenswrapper[4958]: I1201 10:24:34.666897 4958 generic.go:334] "Generic (PLEG): container finished" podID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerID="9d7fc47297382cac7f9e705ba786a4aa847451a83aa7544bf376c2064c49b629" exitCode=0 Dec 01 10:24:34 crc kubenswrapper[4958]: I1201 10:24:34.667258 4958 generic.go:334] "Generic (PLEG): container finished" podID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerID="a32682775a8a2647e8441878c8515e0bf30ce608ba7d86ae93501ca183af93c6" exitCode=2 Dec 01 10:24:34 crc kubenswrapper[4958]: I1201 10:24:34.667267 4958 generic.go:334] "Generic (PLEG): container finished" podID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerID="22002cf7fea29912ebf6602e72cdc2dab0a7b96f92bca33c04322808a929c7ed" exitCode=0 Dec 01 10:24:34 crc kubenswrapper[4958]: I1201 10:24:34.666985 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ea6f173-ffb8-4cea-b5bf-575413a3f704","Type":"ContainerDied","Data":"9d7fc47297382cac7f9e705ba786a4aa847451a83aa7544bf376c2064c49b629"} Dec 01 10:24:34 crc kubenswrapper[4958]: I1201 10:24:34.667306 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6ea6f173-ffb8-4cea-b5bf-575413a3f704","Type":"ContainerDied","Data":"a32682775a8a2647e8441878c8515e0bf30ce608ba7d86ae93501ca183af93c6"} Dec 01 10:24:34 crc kubenswrapper[4958]: I1201 10:24:34.667316 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ea6f173-ffb8-4cea-b5bf-575413a3f704","Type":"ContainerDied","Data":"22002cf7fea29912ebf6602e72cdc2dab0a7b96f92bca33c04322808a929c7ed"} Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.498774 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.655789 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea6f173-ffb8-4cea-b5bf-575413a3f704-log-httpd\") pod \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.655977 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-combined-ca-bundle\") pod \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.656117 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-scripts\") pod \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.656220 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-sg-core-conf-yaml\") pod \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.656274 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6x99\" (UniqueName: \"kubernetes.io/projected/6ea6f173-ffb8-4cea-b5bf-575413a3f704-kube-api-access-g6x99\") pod \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.656363 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea6f173-ffb8-4cea-b5bf-575413a3f704-run-httpd\") pod \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.656472 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-config-data\") pod \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\" (UID: \"6ea6f173-ffb8-4cea-b5bf-575413a3f704\") " Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.656512 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ea6f173-ffb8-4cea-b5bf-575413a3f704-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6ea6f173-ffb8-4cea-b5bf-575413a3f704" (UID: "6ea6f173-ffb8-4cea-b5bf-575413a3f704"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.657142 4958 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea6f173-ffb8-4cea-b5bf-575413a3f704-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.658057 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ea6f173-ffb8-4cea-b5bf-575413a3f704-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6ea6f173-ffb8-4cea-b5bf-575413a3f704" (UID: "6ea6f173-ffb8-4cea-b5bf-575413a3f704"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.663882 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-scripts" (OuterVolumeSpecName: "scripts") pod "6ea6f173-ffb8-4cea-b5bf-575413a3f704" (UID: "6ea6f173-ffb8-4cea-b5bf-575413a3f704"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.663997 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea6f173-ffb8-4cea-b5bf-575413a3f704-kube-api-access-g6x99" (OuterVolumeSpecName: "kube-api-access-g6x99") pod "6ea6f173-ffb8-4cea-b5bf-575413a3f704" (UID: "6ea6f173-ffb8-4cea-b5bf-575413a3f704"). InnerVolumeSpecName "kube-api-access-g6x99". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.687631 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6ea6f173-ffb8-4cea-b5bf-575413a3f704" (UID: "6ea6f173-ffb8-4cea-b5bf-575413a3f704"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.717485 4958 generic.go:334] "Generic (PLEG): container finished" podID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerID="5eee7d5a8d538d76ea32e931ea9993a8720db525b7b22bd82bc0c2f3944998bc" exitCode=0 Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.717556 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ea6f173-ffb8-4cea-b5bf-575413a3f704","Type":"ContainerDied","Data":"5eee7d5a8d538d76ea32e931ea9993a8720db525b7b22bd82bc0c2f3944998bc"} Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.717601 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ea6f173-ffb8-4cea-b5bf-575413a3f704","Type":"ContainerDied","Data":"80c0262eea92f102cb1513260ae6470c91cb6616145857e0ad3d53dbf43a328c"} Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.717604 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.717624 4958 scope.go:117] "RemoveContainer" containerID="9d7fc47297382cac7f9e705ba786a4aa847451a83aa7544bf376c2064c49b629" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.746117 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ea6f173-ffb8-4cea-b5bf-575413a3f704" (UID: "6ea6f173-ffb8-4cea-b5bf-575413a3f704"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.760596 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.760693 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.760706 4958 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.760757 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6x99\" (UniqueName: \"kubernetes.io/projected/6ea6f173-ffb8-4cea-b5bf-575413a3f704-kube-api-access-g6x99\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.760768 4958 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ea6f173-ffb8-4cea-b5bf-575413a3f704-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.774210 4958 scope.go:117] "RemoveContainer" containerID="a32682775a8a2647e8441878c8515e0bf30ce608ba7d86ae93501ca183af93c6" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.781104 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-config-data" (OuterVolumeSpecName: "config-data") pod "6ea6f173-ffb8-4cea-b5bf-575413a3f704" (UID: "6ea6f173-ffb8-4cea-b5bf-575413a3f704"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.811309 4958 scope.go:117] "RemoveContainer" containerID="22002cf7fea29912ebf6602e72cdc2dab0a7b96f92bca33c04322808a929c7ed" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.838367 4958 scope.go:117] "RemoveContainer" containerID="5eee7d5a8d538d76ea32e931ea9993a8720db525b7b22bd82bc0c2f3944998bc" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.862773 4958 scope.go:117] "RemoveContainer" containerID="9d7fc47297382cac7f9e705ba786a4aa847451a83aa7544bf376c2064c49b629" Dec 01 10:24:38 crc kubenswrapper[4958]: E1201 10:24:38.863488 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d7fc47297382cac7f9e705ba786a4aa847451a83aa7544bf376c2064c49b629\": container with ID starting with 9d7fc47297382cac7f9e705ba786a4aa847451a83aa7544bf376c2064c49b629 not found: ID does not exist" containerID="9d7fc47297382cac7f9e705ba786a4aa847451a83aa7544bf376c2064c49b629" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.863537 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d7fc47297382cac7f9e705ba786a4aa847451a83aa7544bf376c2064c49b629"} err="failed to get container status \"9d7fc47297382cac7f9e705ba786a4aa847451a83aa7544bf376c2064c49b629\": rpc error: code = NotFound desc = could not find container \"9d7fc47297382cac7f9e705ba786a4aa847451a83aa7544bf376c2064c49b629\": container with ID starting with 9d7fc47297382cac7f9e705ba786a4aa847451a83aa7544bf376c2064c49b629 not found: ID does not exist" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.863574 4958 scope.go:117] "RemoveContainer" containerID="a32682775a8a2647e8441878c8515e0bf30ce608ba7d86ae93501ca183af93c6" Dec 01 10:24:38 crc kubenswrapper[4958]: E1201 10:24:38.864006 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a32682775a8a2647e8441878c8515e0bf30ce608ba7d86ae93501ca183af93c6\": container with ID starting with a32682775a8a2647e8441878c8515e0bf30ce608ba7d86ae93501ca183af93c6 not found: ID does not exist" containerID="a32682775a8a2647e8441878c8515e0bf30ce608ba7d86ae93501ca183af93c6" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.864049 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a32682775a8a2647e8441878c8515e0bf30ce608ba7d86ae93501ca183af93c6"} err="failed to get container status \"a32682775a8a2647e8441878c8515e0bf30ce608ba7d86ae93501ca183af93c6\": rpc error: code = NotFound desc = could not find container \"a32682775a8a2647e8441878c8515e0bf30ce608ba7d86ae93501ca183af93c6\": container with ID starting with a32682775a8a2647e8441878c8515e0bf30ce608ba7d86ae93501ca183af93c6 not found: ID does not exist" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.864074 4958 scope.go:117] "RemoveContainer" containerID="22002cf7fea29912ebf6602e72cdc2dab0a7b96f92bca33c04322808a929c7ed" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.864503 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea6f173-ffb8-4cea-b5bf-575413a3f704-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:38 crc kubenswrapper[4958]: E1201 10:24:38.864872 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"22002cf7fea29912ebf6602e72cdc2dab0a7b96f92bca33c04322808a929c7ed\": container with ID starting with 22002cf7fea29912ebf6602e72cdc2dab0a7b96f92bca33c04322808a929c7ed not found: ID does not exist" containerID="22002cf7fea29912ebf6602e72cdc2dab0a7b96f92bca33c04322808a929c7ed" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.864983 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22002cf7fea29912ebf6602e72cdc2dab0a7b96f92bca33c04322808a929c7ed"} err="failed to get container status \"22002cf7fea29912ebf6602e72cdc2dab0a7b96f92bca33c04322808a929c7ed\": rpc error: code = NotFound desc = could not find container \"22002cf7fea29912ebf6602e72cdc2dab0a7b96f92bca33c04322808a929c7ed\": container with ID starting with 22002cf7fea29912ebf6602e72cdc2dab0a7b96f92bca33c04322808a929c7ed not found: ID does not exist" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.865074 4958 scope.go:117] "RemoveContainer" containerID="5eee7d5a8d538d76ea32e931ea9993a8720db525b7b22bd82bc0c2f3944998bc" Dec 01 10:24:38 crc kubenswrapper[4958]: E1201 10:24:38.865727 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eee7d5a8d538d76ea32e931ea9993a8720db525b7b22bd82bc0c2f3944998bc\": container with ID starting with 5eee7d5a8d538d76ea32e931ea9993a8720db525b7b22bd82bc0c2f3944998bc not found: ID does not exist" containerID="5eee7d5a8d538d76ea32e931ea9993a8720db525b7b22bd82bc0c2f3944998bc" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.865833 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eee7d5a8d538d76ea32e931ea9993a8720db525b7b22bd82bc0c2f3944998bc"} err="failed to get container status \"5eee7d5a8d538d76ea32e931ea9993a8720db525b7b22bd82bc0c2f3944998bc\": rpc error: code = NotFound desc = could not find container \"5eee7d5a8d538d76ea32e931ea9993a8720db525b7b22bd82bc0c2f3944998bc\": container with ID starting with 5eee7d5a8d538d76ea32e931ea9993a8720db525b7b22bd82bc0c2f3944998bc not found: ID does not exist" Dec 01 10:24:38 crc kubenswrapper[4958]: I1201 10:24:38.998558 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.064660 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.077074 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.107103 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:24:39 crc kubenswrapper[4958]: E1201 10:24:39.107744 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerName="sg-core" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.107774 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerName="sg-core" Dec 01 10:24:39 crc kubenswrapper[4958]: E1201 10:24:39.107828 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerName="proxy-httpd" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.107837 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerName="proxy-httpd" Dec 01 10:24:39 crc kubenswrapper[4958]: E1201 
10:24:39.107874 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerName="ceilometer-central-agent" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.107885 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerName="ceilometer-central-agent" Dec 01 10:24:39 crc kubenswrapper[4958]: E1201 10:24:39.107909 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerName="ceilometer-notification-agent" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.107918 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerName="ceilometer-notification-agent" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.108165 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerName="proxy-httpd" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.108193 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerName="ceilometer-notification-agent" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.108226 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerName="ceilometer-central-agent" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.108234 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" containerName="sg-core" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.111769 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.114726 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.117432 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.133914 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.275834 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-scripts\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.275954 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32b48e0e-eb10-4c19-a505-e82b2617fd54-log-httpd\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.275999 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32b48e0e-eb10-4c19-a505-e82b2617fd54-run-httpd\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.276041 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr6p8\" (UniqueName: 
\"kubernetes.io/projected/32b48e0e-eb10-4c19-a505-e82b2617fd54-kube-api-access-nr6p8\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.276066 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-config-data\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.276204 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.276554 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.379205 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr6p8\" (UniqueName: \"kubernetes.io/projected/32b48e0e-eb10-4c19-a505-e82b2617fd54-kube-api-access-nr6p8\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.379276 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-config-data\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.379336 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.379436 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.379555 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-scripts\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.379630 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32b48e0e-eb10-4c19-a505-e82b2617fd54-log-httpd\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.379697 4958 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32b48e0e-eb10-4c19-a505-e82b2617fd54-run-httpd\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.381163 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32b48e0e-eb10-4c19-a505-e82b2617fd54-log-httpd\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.381242 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32b48e0e-eb10-4c19-a505-e82b2617fd54-run-httpd\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.388029 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-config-data\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.397783 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-scripts\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.399339 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.400019 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.401560 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr6p8\" (UniqueName: \"kubernetes.io/projected/32b48e0e-eb10-4c19-a505-e82b2617fd54-kube-api-access-nr6p8\") pod \"ceilometer-0\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.434755 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.713503 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-2f5lj"] Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.738105 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-2f5lj"] Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.738276 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2f5lj" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.746059 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.753488 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.852600 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea6f173-ffb8-4cea-b5bf-575413a3f704" path="/var/lib/kubelet/pods/6ea6f173-ffb8-4cea-b5bf-575413a3f704/volumes" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.856309 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.858469 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.864936 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.873011 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.912758 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2433ef78-8e78-439b-a64c-1052adbeed62-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-2f5lj\" (UID: \"2433ef78-8e78-439b-a64c-1052adbeed62\") " pod="openstack/nova-cell0-cell-mapping-2f5lj" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.914497 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2433ef78-8e78-439b-a64c-1052adbeed62-config-data\") pod \"nova-cell0-cell-mapping-2f5lj\" (UID: \"2433ef78-8e78-439b-a64c-1052adbeed62\") " pod="openstack/nova-cell0-cell-mapping-2f5lj" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.914522 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg5rc\" (UniqueName: \"kubernetes.io/projected/2433ef78-8e78-439b-a64c-1052adbeed62-kube-api-access-sg5rc\") pod \"nova-cell0-cell-mapping-2f5lj\" (UID: \"2433ef78-8e78-439b-a64c-1052adbeed62\") " pod="openstack/nova-cell0-cell-mapping-2f5lj" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.914563 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2433ef78-8e78-439b-a64c-1052adbeed62-scripts\") pod \"nova-cell0-cell-mapping-2f5lj\" (UID: \"2433ef78-8e78-439b-a64c-1052adbeed62\") " pod="openstack/nova-cell0-cell-mapping-2f5lj" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.987414 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.989523 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:24:39 crc kubenswrapper[4958]: I1201 10:24:39.996953 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.017116 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2433ef78-8e78-439b-a64c-1052adbeed62-scripts\") pod \"nova-cell0-cell-mapping-2f5lj\" (UID: \"2433ef78-8e78-439b-a64c-1052adbeed62\") " pod="openstack/nova-cell0-cell-mapping-2f5lj" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.017608 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12de1e79-7da7-44be-aa4e-737239095c7e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"12de1e79-7da7-44be-aa4e-737239095c7e\") " pod="openstack/nova-api-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.017640 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12de1e79-7da7-44be-aa4e-737239095c7e-config-data\") pod \"nova-api-0\" (UID: \"12de1e79-7da7-44be-aa4e-737239095c7e\") " pod="openstack/nova-api-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.017669 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12de1e79-7da7-44be-aa4e-737239095c7e-logs\") pod \"nova-api-0\" (UID: \"12de1e79-7da7-44be-aa4e-737239095c7e\") " pod="openstack/nova-api-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.017712 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpsn6\" (UniqueName: \"kubernetes.io/projected/12de1e79-7da7-44be-aa4e-737239095c7e-kube-api-access-zpsn6\") pod \"nova-api-0\" (UID: \"12de1e79-7da7-44be-aa4e-737239095c7e\") " pod="openstack/nova-api-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.017769 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2433ef78-8e78-439b-a64c-1052adbeed62-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-2f5lj\" (UID: \"2433ef78-8e78-439b-a64c-1052adbeed62\") " pod="openstack/nova-cell0-cell-mapping-2f5lj" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.017832 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2433ef78-8e78-439b-a64c-1052adbeed62-config-data\") pod \"nova-cell0-cell-mapping-2f5lj\" (UID: \"2433ef78-8e78-439b-a64c-1052adbeed62\") " pod="openstack/nova-cell0-cell-mapping-2f5lj" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.017868 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg5rc\" (UniqueName: \"kubernetes.io/projected/2433ef78-8e78-439b-a64c-1052adbeed62-kube-api-access-sg5rc\") pod \"nova-cell0-cell-mapping-2f5lj\" (UID: \"2433ef78-8e78-439b-a64c-1052adbeed62\") " pod="openstack/nova-cell0-cell-mapping-2f5lj" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.030947 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.034440 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2433ef78-8e78-439b-a64c-1052adbeed62-config-data\") pod \"nova-cell0-cell-mapping-2f5lj\" (UID: \"2433ef78-8e78-439b-a64c-1052adbeed62\") " pod="openstack/nova-cell0-cell-mapping-2f5lj" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.039681 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2433ef78-8e78-439b-a64c-1052adbeed62-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-2f5lj\" (UID: \"2433ef78-8e78-439b-a64c-1052adbeed62\") " pod="openstack/nova-cell0-cell-mapping-2f5lj" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.061478 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2433ef78-8e78-439b-a64c-1052adbeed62-scripts\") pod \"nova-cell0-cell-mapping-2f5lj\" (UID: \"2433ef78-8e78-439b-a64c-1052adbeed62\") " pod="openstack/nova-cell0-cell-mapping-2f5lj" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.114403 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg5rc\" (UniqueName: \"kubernetes.io/projected/2433ef78-8e78-439b-a64c-1052adbeed62-kube-api-access-sg5rc\") pod \"nova-cell0-cell-mapping-2f5lj\" (UID: \"2433ef78-8e78-439b-a64c-1052adbeed62\") " pod="openstack/nova-cell0-cell-mapping-2f5lj" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.126369 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12de1e79-7da7-44be-aa4e-737239095c7e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"12de1e79-7da7-44be-aa4e-737239095c7e\") " pod="openstack/nova-api-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.126452 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhpdk\" (UniqueName: \"kubernetes.io/projected/1dd642ce-2ddf-4d01-893a-09090c99a729-kube-api-access-fhpdk\") pod \"nova-scheduler-0\" (UID: \"1dd642ce-2ddf-4d01-893a-09090c99a729\") " pod="openstack/nova-scheduler-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.126490 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12de1e79-7da7-44be-aa4e-737239095c7e-config-data\") pod \"nova-api-0\" (UID: \"12de1e79-7da7-44be-aa4e-737239095c7e\") " pod="openstack/nova-api-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.126530 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dd642ce-2ddf-4d01-893a-09090c99a729-config-data\") pod \"nova-scheduler-0\" (UID: \"1dd642ce-2ddf-4d01-893a-09090c99a729\") " pod="openstack/nova-scheduler-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.126563 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12de1e79-7da7-44be-aa4e-737239095c7e-logs\") pod \"nova-api-0\" (UID: \"12de1e79-7da7-44be-aa4e-737239095c7e\") " pod="openstack/nova-api-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.126609 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpsn6\" (UniqueName: \"kubernetes.io/projected/12de1e79-7da7-44be-aa4e-737239095c7e-kube-api-access-zpsn6\") pod \"nova-api-0\" (UID: \"12de1e79-7da7-44be-aa4e-737239095c7e\") " 
pod="openstack/nova-api-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.126674 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dd642ce-2ddf-4d01-893a-09090c99a729-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1dd642ce-2ddf-4d01-893a-09090c99a729\") " pod="openstack/nova-scheduler-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.131952 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12de1e79-7da7-44be-aa4e-737239095c7e-logs\") pod \"nova-api-0\" (UID: \"12de1e79-7da7-44be-aa4e-737239095c7e\") " pod="openstack/nova-api-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.130301 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.134440 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12de1e79-7da7-44be-aa4e-737239095c7e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"12de1e79-7da7-44be-aa4e-737239095c7e\") " pod="openstack/nova-api-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.142821 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.147273 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.148164 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12de1e79-7da7-44be-aa4e-737239095c7e-config-data\") pod \"nova-api-0\" (UID: \"12de1e79-7da7-44be-aa4e-737239095c7e\") " pod="openstack/nova-api-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.174425 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.191004 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpsn6\" (UniqueName: \"kubernetes.io/projected/12de1e79-7da7-44be-aa4e-737239095c7e-kube-api-access-zpsn6\") pod \"nova-api-0\" (UID: \"12de1e79-7da7-44be-aa4e-737239095c7e\") " pod="openstack/nova-api-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.215649 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.217313 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.254552 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dd642ce-2ddf-4d01-893a-09090c99a729-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1dd642ce-2ddf-4d01-893a-09090c99a729\") " pod="openstack/nova-scheduler-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.254932 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e96a15-eefc-47b6-98c0-60c772de185e-config-data\") pod \"nova-metadata-0\" (UID: \"c7e96a15-eefc-47b6-98c0-60c772de185e\") " pod="openstack/nova-metadata-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.255093 4958 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhpdk\" (UniqueName: \"kubernetes.io/projected/1dd642ce-2ddf-4d01-893a-09090c99a729-kube-api-access-fhpdk\") pod \"nova-scheduler-0\" (UID: \"1dd642ce-2ddf-4d01-893a-09090c99a729\") " pod="openstack/nova-scheduler-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.255174 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dd642ce-2ddf-4d01-893a-09090c99a729-config-data\") pod \"nova-scheduler-0\" (UID: \"1dd642ce-2ddf-4d01-893a-09090c99a729\") " pod="openstack/nova-scheduler-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.255207 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e96a15-eefc-47b6-98c0-60c772de185e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c7e96a15-eefc-47b6-98c0-60c772de185e\") " pod="openstack/nova-metadata-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.255247 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e96a15-eefc-47b6-98c0-60c772de185e-logs\") pod \"nova-metadata-0\" (UID: \"c7e96a15-eefc-47b6-98c0-60c772de185e\") " pod="openstack/nova-metadata-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.255401 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nhqf\" (UniqueName: \"kubernetes.io/projected/c7e96a15-eefc-47b6-98c0-60c772de185e-kube-api-access-6nhqf\") pod \"nova-metadata-0\" (UID: \"c7e96a15-eefc-47b6-98c0-60c772de185e\") " pod="openstack/nova-metadata-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.255520 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dd642ce-2ddf-4d01-893a-09090c99a729-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1dd642ce-2ddf-4d01-893a-09090c99a729\") " pod="openstack/nova-scheduler-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.262931 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.266575 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.270752 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dd642ce-2ddf-4d01-893a-09090c99a729-config-data\") pod \"nova-scheduler-0\" (UID: \"1dd642ce-2ddf-4d01-893a-09090c99a729\") " pod="openstack/nova-scheduler-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.274452 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.314427 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhpdk\" (UniqueName: \"kubernetes.io/projected/1dd642ce-2ddf-4d01-893a-09090c99a729-kube-api-access-fhpdk\") pod \"nova-scheduler-0\" (UID: \"1dd642ce-2ddf-4d01-893a-09090c99a729\") " pod="openstack/nova-scheduler-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.320904 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.325424 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.350303 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-8g67m"] Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.361302 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.361987 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e96a15-eefc-47b6-98c0-60c772de185e-config-data\") pod \"nova-metadata-0\" (UID: \"c7e96a15-eefc-47b6-98c0-60c772de185e\") " pod="openstack/nova-metadata-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.362119 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6932e428-bf22-45fe-a500-f8082c039d0b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932e428-bf22-45fe-a500-f8082c039d0b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.362277 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6932e428-bf22-45fe-a500-f8082c039d0b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932e428-bf22-45fe-a500-f8082c039d0b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.362340 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e96a15-eefc-47b6-98c0-60c772de185e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c7e96a15-eefc-47b6-98c0-60c772de185e\") " pod="openstack/nova-metadata-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.362374 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e96a15-eefc-47b6-98c0-60c772de185e-logs\") pod \"nova-metadata-0\" (UID: \"c7e96a15-eefc-47b6-98c0-60c772de185e\") " pod="openstack/nova-metadata-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 
10:24:40.362448 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nhqf\" (UniqueName: \"kubernetes.io/projected/c7e96a15-eefc-47b6-98c0-60c772de185e-kube-api-access-6nhqf\") pod \"nova-metadata-0\" (UID: \"c7e96a15-eefc-47b6-98c0-60c772de185e\") " pod="openstack/nova-metadata-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.362508 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kvsf\" (UniqueName: \"kubernetes.io/projected/6932e428-bf22-45fe-a500-f8082c039d0b-kube-api-access-7kvsf\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932e428-bf22-45fe-a500-f8082c039d0b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.363276 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-8g67m"] Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.364466 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e96a15-eefc-47b6-98c0-60c772de185e-logs\") pod \"nova-metadata-0\" (UID: \"c7e96a15-eefc-47b6-98c0-60c772de185e\") " pod="openstack/nova-metadata-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.377235 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e96a15-eefc-47b6-98c0-60c772de185e-config-data\") pod \"nova-metadata-0\" (UID: \"c7e96a15-eefc-47b6-98c0-60c772de185e\") " pod="openstack/nova-metadata-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.395054 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e96a15-eefc-47b6-98c0-60c772de185e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c7e96a15-eefc-47b6-98c0-60c772de185e\") " pod="openstack/nova-metadata-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.395461 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nhqf\" (UniqueName: \"kubernetes.io/projected/c7e96a15-eefc-47b6-98c0-60c772de185e-kube-api-access-6nhqf\") pod \"nova-metadata-0\" (UID: \"c7e96a15-eefc-47b6-98c0-60c772de185e\") " pod="openstack/nova-metadata-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.396784 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2f5lj" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.467485 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-config\") pod \"dnsmasq-dns-757b4f8459-8g67m\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.467931 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-8g67m\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.467958 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-8g67m\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.468006 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kvsf\" (UniqueName: \"kubernetes.io/projected/6932e428-bf22-45fe-a500-f8082c039d0b-kube-api-access-7kvsf\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932e428-bf22-45fe-a500-f8082c039d0b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.468039 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-dns-svc\") pod \"dnsmasq-dns-757b4f8459-8g67m\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.468097 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-8g67m\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.468145 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p9h8\" (UniqueName: \"kubernetes.io/projected/67eb34e2-8e1c-4adf-a952-59028475c264-kube-api-access-9p9h8\") pod \"dnsmasq-dns-757b4f8459-8g67m\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.468187 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6932e428-bf22-45fe-a500-f8082c039d0b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932e428-bf22-45fe-a500-f8082c039d0b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.468210 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6932e428-bf22-45fe-a500-f8082c039d0b-combined-ca-bundle\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"6932e428-bf22-45fe-a500-f8082c039d0b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.477228 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6932e428-bf22-45fe-a500-f8082c039d0b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932e428-bf22-45fe-a500-f8082c039d0b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.477833 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6932e428-bf22-45fe-a500-f8082c039d0b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932e428-bf22-45fe-a500-f8082c039d0b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.490396 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.491113 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kvsf\" (UniqueName: \"kubernetes.io/projected/6932e428-bf22-45fe-a500-f8082c039d0b-kube-api-access-7kvsf\") pod \"nova-cell1-novncproxy-0\" (UID: \"6932e428-bf22-45fe-a500-f8082c039d0b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.570024 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-8g67m\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.570102 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p9h8\" (UniqueName: \"kubernetes.io/projected/67eb34e2-8e1c-4adf-a952-59028475c264-kube-api-access-9p9h8\") pod \"dnsmasq-dns-757b4f8459-8g67m\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.570337 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-config\") pod \"dnsmasq-dns-757b4f8459-8g67m\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.570383 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-8g67m\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.570413 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-8g67m\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.570478 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-dns-svc\") pod \"dnsmasq-dns-757b4f8459-8g67m\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.571394 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-8g67m\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.571465 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-dns-svc\") pod \"dnsmasq-dns-757b4f8459-8g67m\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.574236 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-8g67m\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.574840 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-8g67m\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.576472 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-config\") pod \"dnsmasq-dns-757b4f8459-8g67m\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.599409 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p9h8\" (UniqueName: \"kubernetes.io/projected/67eb34e2-8e1c-4adf-a952-59028475c264-kube-api-access-9p9h8\") pod \"dnsmasq-dns-757b4f8459-8g67m\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.655156 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.669320 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.710913 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:40 crc kubenswrapper[4958]: I1201 10:24:40.829336 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32b48e0e-eb10-4c19-a505-e82b2617fd54","Type":"ContainerStarted","Data":"6045dab28055a78678c4b155e57051d084d7b372bb440e1ae1736e4a319ccf12"} Dec 01 10:24:41 crc kubenswrapper[4958]: W1201 10:24:41.077210 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dd642ce_2ddf_4d01_893a_09090c99a729.slice/crio-ad4d869d0c851647c0e5993938f36caea022bceba4870a7a43a45d258d6e2957 WatchSource:0}: Error finding container ad4d869d0c851647c0e5993938f36caea022bceba4870a7a43a45d258d6e2957: Status 404 returned error can't find the container with id ad4d869d0c851647c0e5993938f36caea022bceba4870a7a43a45d258d6e2957 Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.077892 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.190145 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7tf7x"] Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.192506 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7tf7x" Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.197333 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.197890 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.246302 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7tf7x"] Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.294794 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d577a45a-466a-4ea0-925f-37f3946eea80-config-data\") pod \"nova-cell1-conductor-db-sync-7tf7x\" (UID: \"d577a45a-466a-4ea0-925f-37f3946eea80\") " pod="openstack/nova-cell1-conductor-db-sync-7tf7x" Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.294901 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d577a45a-466a-4ea0-925f-37f3946eea80-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7tf7x\" (UID: \"d577a45a-466a-4ea0-925f-37f3946eea80\") " pod="openstack/nova-cell1-conductor-db-sync-7tf7x" Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.295024 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d577a45a-466a-4ea0-925f-37f3946eea80-scripts\") pod \"nova-cell1-conductor-db-sync-7tf7x\" (UID: \"d577a45a-466a-4ea0-925f-37f3946eea80\") " pod="openstack/nova-cell1-conductor-db-sync-7tf7x" Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.295073 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjhn7\" (UniqueName: \"kubernetes.io/projected/d577a45a-466a-4ea0-925f-37f3946eea80-kube-api-access-cjhn7\") pod \"nova-cell1-conductor-db-sync-7tf7x\" (UID: 
\"d577a45a-466a-4ea0-925f-37f3946eea80\") " pod="openstack/nova-cell1-conductor-db-sync-7tf7x" Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.352736 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-2f5lj"] Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.383123 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.401686 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d577a45a-466a-4ea0-925f-37f3946eea80-config-data\") pod \"nova-cell1-conductor-db-sync-7tf7x\" (UID: \"d577a45a-466a-4ea0-925f-37f3946eea80\") " pod="openstack/nova-cell1-conductor-db-sync-7tf7x" Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.401775 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d577a45a-466a-4ea0-925f-37f3946eea80-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7tf7x\" (UID: \"d577a45a-466a-4ea0-925f-37f3946eea80\") " pod="openstack/nova-cell1-conductor-db-sync-7tf7x" Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.402074 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d577a45a-466a-4ea0-925f-37f3946eea80-scripts\") pod \"nova-cell1-conductor-db-sync-7tf7x\" (UID: \"d577a45a-466a-4ea0-925f-37f3946eea80\") " pod="openstack/nova-cell1-conductor-db-sync-7tf7x" Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.402169 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjhn7\" (UniqueName: \"kubernetes.io/projected/d577a45a-466a-4ea0-925f-37f3946eea80-kube-api-access-cjhn7\") pod \"nova-cell1-conductor-db-sync-7tf7x\" (UID: \"d577a45a-466a-4ea0-925f-37f3946eea80\") " pod="openstack/nova-cell1-conductor-db-sync-7tf7x" Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.411802 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d577a45a-466a-4ea0-925f-37f3946eea80-scripts\") pod \"nova-cell1-conductor-db-sync-7tf7x\" (UID: \"d577a45a-466a-4ea0-925f-37f3946eea80\") " pod="openstack/nova-cell1-conductor-db-sync-7tf7x" Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.412518 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d577a45a-466a-4ea0-925f-37f3946eea80-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7tf7x\" (UID: \"d577a45a-466a-4ea0-925f-37f3946eea80\") " pod="openstack/nova-cell1-conductor-db-sync-7tf7x" Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.413197 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d577a45a-466a-4ea0-925f-37f3946eea80-config-data\") pod \"nova-cell1-conductor-db-sync-7tf7x\" (UID: \"d577a45a-466a-4ea0-925f-37f3946eea80\") " pod="openstack/nova-cell1-conductor-db-sync-7tf7x" Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.439349 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjhn7\" (UniqueName: \"kubernetes.io/projected/d577a45a-466a-4ea0-925f-37f3946eea80-kube-api-access-cjhn7\") pod \"nova-cell1-conductor-db-sync-7tf7x\" (UID: \"d577a45a-466a-4ea0-925f-37f3946eea80\") " 
pod="openstack/nova-cell1-conductor-db-sync-7tf7x" Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.659302 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7tf7x" Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.762314 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.868830 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.890226 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-8g67m"] Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.894963 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7e96a15-eefc-47b6-98c0-60c772de185e","Type":"ContainerStarted","Data":"b7349f6fee1fd1189b6ff92e78acfb512bb3fa1451213bc33405dcc380b00b6c"} Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.898313 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1dd642ce-2ddf-4d01-893a-09090c99a729","Type":"ContainerStarted","Data":"ad4d869d0c851647c0e5993938f36caea022bceba4870a7a43a45d258d6e2957"} Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.900826 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2f5lj" event={"ID":"2433ef78-8e78-439b-a64c-1052adbeed62","Type":"ContainerStarted","Data":"cb3b28d88806013524616257d71866654def43a83ff6b5fb4703f25b4ddb231c"} Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.902281 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-8g67m" event={"ID":"67eb34e2-8e1c-4adf-a952-59028475c264","Type":"ContainerStarted","Data":"85d850ca45510904c96aeebfb708316f04b09e9434e81a8f98297d9fae7ac68e"} Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.915886 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6932e428-bf22-45fe-a500-f8082c039d0b","Type":"ContainerStarted","Data":"b2d62874d2335e8ded1c5964a3e468128b2f56fa0e2616a9e357974bc715e119"} Dec 01 10:24:41 crc kubenswrapper[4958]: I1201 10:24:41.925181 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12de1e79-7da7-44be-aa4e-737239095c7e","Type":"ContainerStarted","Data":"4bee92437b17189520792b681f21f8cef838a083c940c5d4ded55229f846de64"} Dec 01 10:24:42 crc kubenswrapper[4958]: I1201 10:24:42.285432 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7tf7x"] Dec 01 10:24:42 crc kubenswrapper[4958]: I1201 10:24:42.942393 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7tf7x" event={"ID":"d577a45a-466a-4ea0-925f-37f3946eea80","Type":"ContainerStarted","Data":"63c03bfaba87edf547b131ec62039fd6a3dce67973e3a6dd92756348441874d8"} Dec 01 10:24:42 crc kubenswrapper[4958]: I1201 10:24:42.943121 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7tf7x" event={"ID":"d577a45a-466a-4ea0-925f-37f3946eea80","Type":"ContainerStarted","Data":"065effe85a0fcceea0f88296e6d17c4e36a22614c519574adbeed33e30665f67"} Dec 01 10:24:42 crc kubenswrapper[4958]: I1201 10:24:42.955429 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"32b48e0e-eb10-4c19-a505-e82b2617fd54","Type":"ContainerStarted","Data":"cd451f35975b7fb664d8eb29a07761f86a7c8fca0866c7b22c82a51901e4f21f"} Dec 01 10:24:42 crc kubenswrapper[4958]: I1201 10:24:42.958436 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2f5lj" event={"ID":"2433ef78-8e78-439b-a64c-1052adbeed62","Type":"ContainerStarted","Data":"0c45fd6464c1e425a10c1b58a4752dccdff825de5fe5bc44db608fe475975eb9"} Dec 01 10:24:42 crc kubenswrapper[4958]: I1201 10:24:42.968239 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-7tf7x" podStartSLOduration=1.968213731 podStartE2EDuration="1.968213731s" podCreationTimestamp="2025-12-01 10:24:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:24:42.962273421 +0000 UTC m=+1530.471062458" watchObservedRunningTime="2025-12-01 10:24:42.968213731 +0000 UTC m=+1530.477002768" Dec 01 10:24:42 crc kubenswrapper[4958]: I1201 10:24:42.971037 4958 generic.go:334] "Generic (PLEG): container finished" podID="67eb34e2-8e1c-4adf-a952-59028475c264" containerID="634a27e51b8589382e9f7cf2d5851095850c183e089b20da7ea3ab2f630172e7" exitCode=0 Dec 01 10:24:42 crc kubenswrapper[4958]: I1201 10:24:42.971120 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-8g67m" event={"ID":"67eb34e2-8e1c-4adf-a952-59028475c264","Type":"ContainerDied","Data":"634a27e51b8589382e9f7cf2d5851095850c183e089b20da7ea3ab2f630172e7"} Dec 01 10:24:42 crc kubenswrapper[4958]: I1201 10:24:42.989721 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-2f5lj" podStartSLOduration=3.989694279 podStartE2EDuration="3.989694279s" podCreationTimestamp="2025-12-01 10:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:24:42.978832826 +0000 UTC m=+1530.487621863" watchObservedRunningTime="2025-12-01 10:24:42.989694279 +0000 UTC m=+1530.498483316" Dec 01 10:24:44 crc kubenswrapper[4958]: I1201 10:24:43.999882 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-8g67m" event={"ID":"67eb34e2-8e1c-4adf-a952-59028475c264","Type":"ContainerStarted","Data":"71c9e0138da3d2aec6aa6687165d5da89aba3185c7b7522e5005093c2d2518b5"} Dec 01 10:24:44 crc kubenswrapper[4958]: I1201 10:24:44.002541 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:44 crc kubenswrapper[4958]: I1201 10:24:44.007661 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32b48e0e-eb10-4c19-a505-e82b2617fd54","Type":"ContainerStarted","Data":"c261922366b23228b7baa6538188b1b14bab1b177143b266dd74c203af62785c"} Dec 01 10:24:44 crc kubenswrapper[4958]: I1201 10:24:44.034048 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-8g67m" podStartSLOduration=4.034013597 podStartE2EDuration="4.034013597s" podCreationTimestamp="2025-12-01 10:24:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:24:44.028427907 +0000 UTC m=+1531.537216944" watchObservedRunningTime="2025-12-01 10:24:44.034013597 +0000 UTC m=+1531.542802634" Dec 01 
10:24:44 crc kubenswrapper[4958]: I1201 10:24:44.356949 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:24:44 crc kubenswrapper[4958]: I1201 10:24:44.363939 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:24:47 crc kubenswrapper[4958]: I1201 10:24:47.058673 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6932e428-bf22-45fe-a500-f8082c039d0b","Type":"ContainerStarted","Data":"624fb2ef5f1dd7f36e9506293b10415350cd6f01387cd8037b226b9622d3f7b6"} Dec 01 10:24:47 crc kubenswrapper[4958]: I1201 10:24:47.059327 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="6932e428-bf22-45fe-a500-f8082c039d0b" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://624fb2ef5f1dd7f36e9506293b10415350cd6f01387cd8037b226b9622d3f7b6" gracePeriod=30 Dec 01 10:24:47 crc kubenswrapper[4958]: I1201 10:24:47.064583 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12de1e79-7da7-44be-aa4e-737239095c7e","Type":"ContainerStarted","Data":"76a5bada33f2612dd7a0d8a9941efd1776d1157693eb0da13087e942bb183018"} Dec 01 10:24:47 crc kubenswrapper[4958]: I1201 10:24:47.064631 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12de1e79-7da7-44be-aa4e-737239095c7e","Type":"ContainerStarted","Data":"7aef7889b33bc52c85ec699c5cfbd91dc2ae06a94f442e6911f4f94dc10d8bcb"} Dec 01 10:24:47 crc kubenswrapper[4958]: I1201 10:24:47.070476 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32b48e0e-eb10-4c19-a505-e82b2617fd54","Type":"ContainerStarted","Data":"9b6b6da1cec09efa51d0d84e5ac7405e757de5847aa50bc34a6ac6f39b079740"} Dec 01 10:24:47 crc kubenswrapper[4958]: I1201 10:24:47.084946 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7e96a15-eefc-47b6-98c0-60c772de185e","Type":"ContainerStarted","Data":"7b0120025398535e9d85938c8f26efa0177a02d1abf12d580c9698cb7a91595d"} Dec 01 10:24:47 crc kubenswrapper[4958]: I1201 10:24:47.085228 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c7e96a15-eefc-47b6-98c0-60c772de185e" containerName="nova-metadata-log" containerID="cri-o://7b0120025398535e9d85938c8f26efa0177a02d1abf12d580c9698cb7a91595d" gracePeriod=30 Dec 01 10:24:47 crc kubenswrapper[4958]: I1201 10:24:47.085650 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c7e96a15-eefc-47b6-98c0-60c772de185e" containerName="nova-metadata-metadata" containerID="cri-o://a87b783f86a593bb0b7a5007a42d4041cc9a4e20e81b628c482bddb04f3e0450" gracePeriod=30 Dec 01 10:24:47 crc kubenswrapper[4958]: I1201 10:24:47.095470 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1dd642ce-2ddf-4d01-893a-09090c99a729","Type":"ContainerStarted","Data":"56ca63f965abcc865f39c9ac18e65cf0d0ba29be9846c9cc9ababc85b67336a2"} Dec 01 10:24:47 crc kubenswrapper[4958]: I1201 10:24:47.120788 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.550863552 podStartE2EDuration="7.120753203s" podCreationTimestamp="2025-12-01 10:24:40 +0000 UTC" firstStartedPulling="2025-12-01 10:24:41.777447899 +0000 UTC 
m=+1529.286236936" lastFinishedPulling="2025-12-01 10:24:46.34733756 +0000 UTC m=+1533.856126587" observedRunningTime="2025-12-01 10:24:47.083089809 +0000 UTC m=+1534.591878846" watchObservedRunningTime="2025-12-01 10:24:47.120753203 +0000 UTC m=+1534.629542240" Dec 01 10:24:47 crc kubenswrapper[4958]: I1201 10:24:47.131661 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.106416564 podStartE2EDuration="8.131633366s" podCreationTimestamp="2025-12-01 10:24:39 +0000 UTC" firstStartedPulling="2025-12-01 10:24:41.352320637 +0000 UTC m=+1528.861109674" lastFinishedPulling="2025-12-01 10:24:46.377537439 +0000 UTC m=+1533.886326476" observedRunningTime="2025-12-01 10:24:47.113999079 +0000 UTC m=+1534.622788116" watchObservedRunningTime="2025-12-01 10:24:47.131633366 +0000 UTC m=+1534.640422403" Dec 01 10:24:47 crc kubenswrapper[4958]: I1201 10:24:47.151703 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.559594514 podStartE2EDuration="7.151675913s" podCreationTimestamp="2025-12-01 10:24:40 +0000 UTC" firstStartedPulling="2025-12-01 10:24:41.784174453 +0000 UTC m=+1529.292963490" lastFinishedPulling="2025-12-01 10:24:46.376255852 +0000 UTC m=+1533.885044889" observedRunningTime="2025-12-01 10:24:47.149773608 +0000 UTC m=+1534.658562645" watchObservedRunningTime="2025-12-01 10:24:47.151675913 +0000 UTC m=+1534.660464950" Dec 01 10:24:47 crc kubenswrapper[4958]: I1201 10:24:47.181593 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.890238223 podStartE2EDuration="8.181566443s" podCreationTimestamp="2025-12-01 10:24:39 +0000 UTC" firstStartedPulling="2025-12-01 10:24:41.078838047 +0000 UTC m=+1528.587627084" lastFinishedPulling="2025-12-01 10:24:46.370166267 +0000 UTC m=+1533.878955304" observedRunningTime="2025-12-01 10:24:47.174277623 +0000 UTC m=+1534.683066660" watchObservedRunningTime="2025-12-01 10:24:47.181566443 +0000 UTC m=+1534.690355480" Dec 01 10:24:48 crc kubenswrapper[4958]: I1201 10:24:48.110071 4958 generic.go:334] "Generic (PLEG): container finished" podID="c7e96a15-eefc-47b6-98c0-60c772de185e" containerID="7b0120025398535e9d85938c8f26efa0177a02d1abf12d580c9698cb7a91595d" exitCode=143 Dec 01 10:24:48 crc kubenswrapper[4958]: I1201 10:24:48.110263 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7e96a15-eefc-47b6-98c0-60c772de185e","Type":"ContainerStarted","Data":"a87b783f86a593bb0b7a5007a42d4041cc9a4e20e81b628c482bddb04f3e0450"} Dec 01 10:24:48 crc kubenswrapper[4958]: I1201 10:24:48.110469 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7e96a15-eefc-47b6-98c0-60c772de185e","Type":"ContainerDied","Data":"7b0120025398535e9d85938c8f26efa0177a02d1abf12d580c9698cb7a91595d"} Dec 01 10:24:49 crc kubenswrapper[4958]: I1201 10:24:49.126650 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32b48e0e-eb10-4c19-a505-e82b2617fd54","Type":"ContainerStarted","Data":"e99a3e70e39d99263831941f6cce24905ce5e58ef2d76e0a70b8f07c7dc4b74a"} Dec 01 10:24:49 crc kubenswrapper[4958]: I1201 10:24:49.128026 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 10:24:49 crc kubenswrapper[4958]: I1201 10:24:49.163797 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.5248304409999998 podStartE2EDuration="10.163761138s" podCreationTimestamp="2025-12-01 10:24:39 +0000 UTC" firstStartedPulling="2025-12-01 10:24:40.216991359 +0000 UTC m=+1527.725780396" lastFinishedPulling="2025-12-01 10:24:47.855922056 +0000 UTC m=+1535.364711093" observedRunningTime="2025-12-01 10:24:49.160110023 +0000 UTC m=+1536.668899060" watchObservedRunningTime="2025-12-01 10:24:49.163761138 +0000 UTC m=+1536.672550175" Dec 01 10:24:50 crc kubenswrapper[4958]: I1201 10:24:50.331272 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 10:24:50 crc kubenswrapper[4958]: I1201 10:24:50.334241 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 10:24:50 crc kubenswrapper[4958]: I1201 10:24:50.366878 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 10:24:50 crc kubenswrapper[4958]: I1201 10:24:50.492282 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 10:24:50 crc kubenswrapper[4958]: I1201 10:24:50.492374 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 10:24:50 crc kubenswrapper[4958]: I1201 10:24:50.656821 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 10:24:50 crc kubenswrapper[4958]: I1201 10:24:50.656914 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 10:24:50 crc kubenswrapper[4958]: I1201 10:24:50.671247 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:24:50 crc kubenswrapper[4958]: I1201 10:24:50.714057 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:24:50 crc kubenswrapper[4958]: I1201 10:24:50.786630 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-v4zqt"] Dec 01 10:24:50 crc kubenswrapper[4958]: I1201 10:24:50.786960 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" podUID="7e2e8cef-b53d-4690-8d80-0a05cc2f3905" containerName="dnsmasq-dns" containerID="cri-o://9cd77135c9dae4d1d8605178d1a6e703cb37d33184ff45251860dbf6725f2a91" gracePeriod=10 Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.171763 4958 generic.go:334] "Generic (PLEG): container finished" podID="d577a45a-466a-4ea0-925f-37f3946eea80" containerID="63c03bfaba87edf547b131ec62039fd6a3dce67973e3a6dd92756348441874d8" exitCode=0 Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.172656 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7tf7x" event={"ID":"d577a45a-466a-4ea0-925f-37f3946eea80","Type":"ContainerDied","Data":"63c03bfaba87edf547b131ec62039fd6a3dce67973e3a6dd92756348441874d8"} Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.177440 4958 generic.go:334] "Generic (PLEG): container finished" podID="7e2e8cef-b53d-4690-8d80-0a05cc2f3905" containerID="9cd77135c9dae4d1d8605178d1a6e703cb37d33184ff45251860dbf6725f2a91" exitCode=0 Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.177837 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" 
event={"ID":"7e2e8cef-b53d-4690-8d80-0a05cc2f3905","Type":"ContainerDied","Data":"9cd77135c9dae4d1d8605178d1a6e703cb37d33184ff45251860dbf6725f2a91"} Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.236004 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.397966 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.453637 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-dns-swift-storage-0\") pod \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.453693 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-ovsdbserver-nb\") pod \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.453958 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxjb7\" (UniqueName: \"kubernetes.io/projected/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-kube-api-access-nxjb7\") pod \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.454043 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-config\") pod \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.454220 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-dns-svc\") pod \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.454259 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-ovsdbserver-sb\") pod \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\" (UID: \"7e2e8cef-b53d-4690-8d80-0a05cc2f3905\") " Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.466572 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-kube-api-access-nxjb7" (OuterVolumeSpecName: "kube-api-access-nxjb7") pod "7e2e8cef-b53d-4690-8d80-0a05cc2f3905" (UID: "7e2e8cef-b53d-4690-8d80-0a05cc2f3905"). InnerVolumeSpecName "kube-api-access-nxjb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.556129 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxjb7\" (UniqueName: \"kubernetes.io/projected/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-kube-api-access-nxjb7\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.571312 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e2e8cef-b53d-4690-8d80-0a05cc2f3905" (UID: "7e2e8cef-b53d-4690-8d80-0a05cc2f3905"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.576534 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="12de1e79-7da7-44be-aa4e-737239095c7e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.577104 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="12de1e79-7da7-44be-aa4e-737239095c7e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.582666 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e2e8cef-b53d-4690-8d80-0a05cc2f3905" (UID: "7e2e8cef-b53d-4690-8d80-0a05cc2f3905"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.592394 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-config" (OuterVolumeSpecName: "config") pod "7e2e8cef-b53d-4690-8d80-0a05cc2f3905" (UID: "7e2e8cef-b53d-4690-8d80-0a05cc2f3905"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.595701 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7e2e8cef-b53d-4690-8d80-0a05cc2f3905" (UID: "7e2e8cef-b53d-4690-8d80-0a05cc2f3905"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.607481 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e2e8cef-b53d-4690-8d80-0a05cc2f3905" (UID: "7e2e8cef-b53d-4690-8d80-0a05cc2f3905"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.658920 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.659259 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.659329 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.659395 4958 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:51 crc kubenswrapper[4958]: I1201 10:24:51.659522 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e2e8cef-b53d-4690-8d80-0a05cc2f3905-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.190761 4958 generic.go:334] "Generic (PLEG): container finished" podID="2433ef78-8e78-439b-a64c-1052adbeed62" containerID="0c45fd6464c1e425a10c1b58a4752dccdff825de5fe5bc44db608fe475975eb9" exitCode=0 Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.190875 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2f5lj" event={"ID":"2433ef78-8e78-439b-a64c-1052adbeed62","Type":"ContainerDied","Data":"0c45fd6464c1e425a10c1b58a4752dccdff825de5fe5bc44db608fe475975eb9"} Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.194348 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" event={"ID":"7e2e8cef-b53d-4690-8d80-0a05cc2f3905","Type":"ContainerDied","Data":"0da1c7417112852f1973f939b9d47f642ec869749bae54733c95ea7d761fc752"} Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.194499 4958 scope.go:117] "RemoveContainer" containerID="9cd77135c9dae4d1d8605178d1a6e703cb37d33184ff45251860dbf6725f2a91" Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.194431 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-v4zqt" Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.248156 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-v4zqt"] Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.249143 4958 scope.go:117] "RemoveContainer" containerID="69ca25c6ecb718cd720794af4ec2982911438faa56d74bf694f4173e0735c7c4" Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.260400 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-v4zqt"] Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.610491 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7tf7x" Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.683601 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d577a45a-466a-4ea0-925f-37f3946eea80-scripts\") pod \"d577a45a-466a-4ea0-925f-37f3946eea80\" (UID: \"d577a45a-466a-4ea0-925f-37f3946eea80\") " Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.683758 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d577a45a-466a-4ea0-925f-37f3946eea80-combined-ca-bundle\") pod \"d577a45a-466a-4ea0-925f-37f3946eea80\" (UID: \"d577a45a-466a-4ea0-925f-37f3946eea80\") " Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.683805 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjhn7\" (UniqueName: \"kubernetes.io/projected/d577a45a-466a-4ea0-925f-37f3946eea80-kube-api-access-cjhn7\") pod \"d577a45a-466a-4ea0-925f-37f3946eea80\" (UID: \"d577a45a-466a-4ea0-925f-37f3946eea80\") " Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.683886 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d577a45a-466a-4ea0-925f-37f3946eea80-config-data\") pod \"d577a45a-466a-4ea0-925f-37f3946eea80\" (UID: \"d577a45a-466a-4ea0-925f-37f3946eea80\") " Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.690008 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d577a45a-466a-4ea0-925f-37f3946eea80-kube-api-access-cjhn7" (OuterVolumeSpecName: "kube-api-access-cjhn7") pod "d577a45a-466a-4ea0-925f-37f3946eea80" (UID: "d577a45a-466a-4ea0-925f-37f3946eea80"). InnerVolumeSpecName "kube-api-access-cjhn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.691731 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d577a45a-466a-4ea0-925f-37f3946eea80-scripts" (OuterVolumeSpecName: "scripts") pod "d577a45a-466a-4ea0-925f-37f3946eea80" (UID: "d577a45a-466a-4ea0-925f-37f3946eea80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.718748 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d577a45a-466a-4ea0-925f-37f3946eea80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d577a45a-466a-4ea0-925f-37f3946eea80" (UID: "d577a45a-466a-4ea0-925f-37f3946eea80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.719306 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d577a45a-466a-4ea0-925f-37f3946eea80-config-data" (OuterVolumeSpecName: "config-data") pod "d577a45a-466a-4ea0-925f-37f3946eea80" (UID: "d577a45a-466a-4ea0-925f-37f3946eea80"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.788645 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d577a45a-466a-4ea0-925f-37f3946eea80-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.788697 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d577a45a-466a-4ea0-925f-37f3946eea80-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.788713 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d577a45a-466a-4ea0-925f-37f3946eea80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:52 crc kubenswrapper[4958]: I1201 10:24:52.788730 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjhn7\" (UniqueName: \"kubernetes.io/projected/d577a45a-466a-4ea0-925f-37f3946eea80-kube-api-access-cjhn7\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.233147 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7tf7x" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.236173 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7tf7x" event={"ID":"d577a45a-466a-4ea0-925f-37f3946eea80","Type":"ContainerDied","Data":"065effe85a0fcceea0f88296e6d17c4e36a22614c519574adbeed33e30665f67"} Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.236243 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="065effe85a0fcceea0f88296e6d17c4e36a22614c519574adbeed33e30665f67" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.327079 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 10:24:53 crc kubenswrapper[4958]: E1201 10:24:53.328364 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2e8cef-b53d-4690-8d80-0a05cc2f3905" containerName="init" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.328387 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2e8cef-b53d-4690-8d80-0a05cc2f3905" containerName="init" Dec 01 10:24:53 crc kubenswrapper[4958]: E1201 10:24:53.328402 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d577a45a-466a-4ea0-925f-37f3946eea80" containerName="nova-cell1-conductor-db-sync" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.328410 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d577a45a-466a-4ea0-925f-37f3946eea80" containerName="nova-cell1-conductor-db-sync" Dec 01 10:24:53 crc kubenswrapper[4958]: E1201 10:24:53.328468 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2e8cef-b53d-4690-8d80-0a05cc2f3905" containerName="dnsmasq-dns" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.328479 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2e8cef-b53d-4690-8d80-0a05cc2f3905" containerName="dnsmasq-dns" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.328800 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2e8cef-b53d-4690-8d80-0a05cc2f3905" containerName="dnsmasq-dns" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.328873 4958 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d577a45a-466a-4ea0-925f-37f3946eea80" containerName="nova-cell1-conductor-db-sync" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.330190 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.336430 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.346001 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.400940 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6\") " pod="openstack/nova-cell1-conductor-0" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.401044 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6\") " pod="openstack/nova-cell1-conductor-0" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.401156 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qhrx\" (UniqueName: \"kubernetes.io/projected/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-kube-api-access-2qhrx\") pod \"nova-cell1-conductor-0\" (UID: \"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6\") " pod="openstack/nova-cell1-conductor-0" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.507394 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qhrx\" (UniqueName: \"kubernetes.io/projected/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-kube-api-access-2qhrx\") pod \"nova-cell1-conductor-0\" (UID: \"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6\") " pod="openstack/nova-cell1-conductor-0" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.507487 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6\") " pod="openstack/nova-cell1-conductor-0" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.507550 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6\") " pod="openstack/nova-cell1-conductor-0" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.516936 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6\") " pod="openstack/nova-cell1-conductor-0" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.517502 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6\") " pod="openstack/nova-cell1-conductor-0" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.532910 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qhrx\" (UniqueName: \"kubernetes.io/projected/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-kube-api-access-2qhrx\") pod \"nova-cell1-conductor-0\" (UID: \"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6\") " pod="openstack/nova-cell1-conductor-0" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.664315 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.759656 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2f5lj" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.912629 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg5rc\" (UniqueName: \"kubernetes.io/projected/2433ef78-8e78-439b-a64c-1052adbeed62-kube-api-access-sg5rc\") pod \"2433ef78-8e78-439b-a64c-1052adbeed62\" (UID: \"2433ef78-8e78-439b-a64c-1052adbeed62\") " Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.912683 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2433ef78-8e78-439b-a64c-1052adbeed62-config-data\") pod \"2433ef78-8e78-439b-a64c-1052adbeed62\" (UID: \"2433ef78-8e78-439b-a64c-1052adbeed62\") " Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.912952 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2433ef78-8e78-439b-a64c-1052adbeed62-scripts\") pod \"2433ef78-8e78-439b-a64c-1052adbeed62\" (UID: \"2433ef78-8e78-439b-a64c-1052adbeed62\") " Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.912997 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2433ef78-8e78-439b-a64c-1052adbeed62-combined-ca-bundle\") pod \"2433ef78-8e78-439b-a64c-1052adbeed62\" (UID: \"2433ef78-8e78-439b-a64c-1052adbeed62\") " Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.931422 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e2e8cef-b53d-4690-8d80-0a05cc2f3905" path="/var/lib/kubelet/pods/7e2e8cef-b53d-4690-8d80-0a05cc2f3905/volumes" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.951659 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2433ef78-8e78-439b-a64c-1052adbeed62-scripts" (OuterVolumeSpecName: "scripts") pod "2433ef78-8e78-439b-a64c-1052adbeed62" (UID: "2433ef78-8e78-439b-a64c-1052adbeed62"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.952738 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2433ef78-8e78-439b-a64c-1052adbeed62-kube-api-access-sg5rc" (OuterVolumeSpecName: "kube-api-access-sg5rc") pod "2433ef78-8e78-439b-a64c-1052adbeed62" (UID: "2433ef78-8e78-439b-a64c-1052adbeed62"). InnerVolumeSpecName "kube-api-access-sg5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.971909 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2433ef78-8e78-439b-a64c-1052adbeed62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2433ef78-8e78-439b-a64c-1052adbeed62" (UID: "2433ef78-8e78-439b-a64c-1052adbeed62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:53 crc kubenswrapper[4958]: I1201 10:24:53.979628 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2433ef78-8e78-439b-a64c-1052adbeed62-config-data" (OuterVolumeSpecName: "config-data") pod "2433ef78-8e78-439b-a64c-1052adbeed62" (UID: "2433ef78-8e78-439b-a64c-1052adbeed62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:54 crc kubenswrapper[4958]: I1201 10:24:54.016297 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2433ef78-8e78-439b-a64c-1052adbeed62-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:54 crc kubenswrapper[4958]: I1201 10:24:54.017155 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2433ef78-8e78-439b-a64c-1052adbeed62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:54 crc kubenswrapper[4958]: I1201 10:24:54.017223 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg5rc\" (UniqueName: \"kubernetes.io/projected/2433ef78-8e78-439b-a64c-1052adbeed62-kube-api-access-sg5rc\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:54 crc kubenswrapper[4958]: I1201 10:24:54.017247 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2433ef78-8e78-439b-a64c-1052adbeed62-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:54 crc kubenswrapper[4958]: I1201 10:24:54.255098 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2f5lj" event={"ID":"2433ef78-8e78-439b-a64c-1052adbeed62","Type":"ContainerDied","Data":"cb3b28d88806013524616257d71866654def43a83ff6b5fb4703f25b4ddb231c"} Dec 01 10:24:54 crc kubenswrapper[4958]: I1201 10:24:54.255497 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb3b28d88806013524616257d71866654def43a83ff6b5fb4703f25b4ddb231c" Dec 01 10:24:54 crc kubenswrapper[4958]: I1201 10:24:54.255584 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2f5lj" Dec 01 10:24:54 crc kubenswrapper[4958]: I1201 10:24:54.315823 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 10:24:54 crc kubenswrapper[4958]: I1201 10:24:54.434569 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:24:54 crc kubenswrapper[4958]: I1201 10:24:54.435104 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1dd642ce-2ddf-4d01-893a-09090c99a729" containerName="nova-scheduler-scheduler" containerID="cri-o://56ca63f965abcc865f39c9ac18e65cf0d0ba29be9846c9cc9ababc85b67336a2" gracePeriod=30 Dec 01 10:24:54 crc kubenswrapper[4958]: I1201 10:24:54.509412 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:24:54 crc kubenswrapper[4958]: I1201 10:24:54.509707 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="12de1e79-7da7-44be-aa4e-737239095c7e" containerName="nova-api-log" containerID="cri-o://7aef7889b33bc52c85ec699c5cfbd91dc2ae06a94f442e6911f4f94dc10d8bcb" gracePeriod=30 Dec 01 10:24:54 crc kubenswrapper[4958]: I1201 10:24:54.509837 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="12de1e79-7da7-44be-aa4e-737239095c7e" containerName="nova-api-api" containerID="cri-o://76a5bada33f2612dd7a0d8a9941efd1776d1157693eb0da13087e942bb183018" gracePeriod=30 Dec 01 10:24:55 crc kubenswrapper[4958]: I1201 10:24:55.271341 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6","Type":"ContainerStarted","Data":"b3241a70bcec5ec3a9a78b2a90bc1e75503cd83921db60624e31f3720317ce5b"} Dec 01 10:24:55 crc kubenswrapper[4958]: I1201 10:24:55.271746 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 01 10:24:55 crc kubenswrapper[4958]: I1201 10:24:55.271764 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6","Type":"ContainerStarted","Data":"d882b86fe2faf9de13ad90313437bba40d0c05bb0859e452a8ced38bfb65f447"} Dec 01 10:24:55 crc kubenswrapper[4958]: I1201 10:24:55.274990 4958 generic.go:334] "Generic (PLEG): container finished" podID="12de1e79-7da7-44be-aa4e-737239095c7e" containerID="7aef7889b33bc52c85ec699c5cfbd91dc2ae06a94f442e6911f4f94dc10d8bcb" exitCode=143 Dec 01 10:24:55 crc kubenswrapper[4958]: I1201 10:24:55.275043 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12de1e79-7da7-44be-aa4e-737239095c7e","Type":"ContainerDied","Data":"7aef7889b33bc52c85ec699c5cfbd91dc2ae06a94f442e6911f4f94dc10d8bcb"} Dec 01 10:24:55 crc kubenswrapper[4958]: I1201 10:24:55.303608 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.303589611 podStartE2EDuration="2.303589611s" podCreationTimestamp="2025-12-01 10:24:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:24:55.300510452 +0000 UTC m=+1542.809299489" watchObservedRunningTime="2025-12-01 10:24:55.303589611 +0000 UTC m=+1542.812378648" Dec 01 10:24:55 crc kubenswrapper[4958]: E1201 10:24:55.334920 4958 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="56ca63f965abcc865f39c9ac18e65cf0d0ba29be9846c9cc9ababc85b67336a2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 10:24:55 crc kubenswrapper[4958]: E1201 10:24:55.340970 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="56ca63f965abcc865f39c9ac18e65cf0d0ba29be9846c9cc9ababc85b67336a2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 10:24:55 crc kubenswrapper[4958]: E1201 10:24:55.343131 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="56ca63f965abcc865f39c9ac18e65cf0d0ba29be9846c9cc9ababc85b67336a2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 10:24:55 crc kubenswrapper[4958]: E1201 10:24:55.343236 4958 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="1dd642ce-2ddf-4d01-893a-09090c99a729" containerName="nova-scheduler-scheduler" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.195180 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.210922 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.211010 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.232701 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12de1e79-7da7-44be-aa4e-737239095c7e-combined-ca-bundle\") pod \"12de1e79-7da7-44be-aa4e-737239095c7e\" (UID: \"12de1e79-7da7-44be-aa4e-737239095c7e\") " Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.232969 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12de1e79-7da7-44be-aa4e-737239095c7e-config-data\") pod \"12de1e79-7da7-44be-aa4e-737239095c7e\" (UID: \"12de1e79-7da7-44be-aa4e-737239095c7e\") " Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.233093 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpsn6\" (UniqueName: \"kubernetes.io/projected/12de1e79-7da7-44be-aa4e-737239095c7e-kube-api-access-zpsn6\") pod \"12de1e79-7da7-44be-aa4e-737239095c7e\" (UID: \"12de1e79-7da7-44be-aa4e-737239095c7e\") " Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.233233 4958 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12de1e79-7da7-44be-aa4e-737239095c7e-logs\") pod \"12de1e79-7da7-44be-aa4e-737239095c7e\" (UID: \"12de1e79-7da7-44be-aa4e-737239095c7e\") " Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.234788 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12de1e79-7da7-44be-aa4e-737239095c7e-logs" (OuterVolumeSpecName: "logs") pod "12de1e79-7da7-44be-aa4e-737239095c7e" (UID: "12de1e79-7da7-44be-aa4e-737239095c7e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.244639 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12de1e79-7da7-44be-aa4e-737239095c7e-kube-api-access-zpsn6" (OuterVolumeSpecName: "kube-api-access-zpsn6") pod "12de1e79-7da7-44be-aa4e-737239095c7e" (UID: "12de1e79-7da7-44be-aa4e-737239095c7e"). InnerVolumeSpecName "kube-api-access-zpsn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.267859 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12de1e79-7da7-44be-aa4e-737239095c7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12de1e79-7da7-44be-aa4e-737239095c7e" (UID: "12de1e79-7da7-44be-aa4e-737239095c7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.275079 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12de1e79-7da7-44be-aa4e-737239095c7e-config-data" (OuterVolumeSpecName: "config-data") pod "12de1e79-7da7-44be-aa4e-737239095c7e" (UID: "12de1e79-7da7-44be-aa4e-737239095c7e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.317579 4958 generic.go:334] "Generic (PLEG): container finished" podID="12de1e79-7da7-44be-aa4e-737239095c7e" containerID="76a5bada33f2612dd7a0d8a9941efd1776d1157693eb0da13087e942bb183018" exitCode=0 Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.317644 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12de1e79-7da7-44be-aa4e-737239095c7e","Type":"ContainerDied","Data":"76a5bada33f2612dd7a0d8a9941efd1776d1157693eb0da13087e942bb183018"} Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.317719 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12de1e79-7da7-44be-aa4e-737239095c7e","Type":"ContainerDied","Data":"4bee92437b17189520792b681f21f8cef838a083c940c5d4ded55229f846de64"} Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.317750 4958 scope.go:117] "RemoveContainer" containerID="76a5bada33f2612dd7a0d8a9941efd1776d1157693eb0da13087e942bb183018" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.317746 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.336043 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpsn6\" (UniqueName: \"kubernetes.io/projected/12de1e79-7da7-44be-aa4e-737239095c7e-kube-api-access-zpsn6\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.336127 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12de1e79-7da7-44be-aa4e-737239095c7e-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.336150 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12de1e79-7da7-44be-aa4e-737239095c7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.336165 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12de1e79-7da7-44be-aa4e-737239095c7e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.386948 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.419025 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.419898 4958 scope.go:117] "RemoveContainer" containerID="7aef7889b33bc52c85ec699c5cfbd91dc2ae06a94f442e6911f4f94dc10d8bcb" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.431835 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 10:24:58 crc kubenswrapper[4958]: E1201 10:24:58.432758 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2433ef78-8e78-439b-a64c-1052adbeed62" containerName="nova-manage" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.432783 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2433ef78-8e78-439b-a64c-1052adbeed62" containerName="nova-manage" Dec 01 10:24:58 crc kubenswrapper[4958]: E1201 10:24:58.432810 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12de1e79-7da7-44be-aa4e-737239095c7e" containerName="nova-api-api" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.432821 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="12de1e79-7da7-44be-aa4e-737239095c7e" containerName="nova-api-api" Dec 01 10:24:58 crc kubenswrapper[4958]: E1201 10:24:58.432896 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12de1e79-7da7-44be-aa4e-737239095c7e" containerName="nova-api-log" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.432904 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="12de1e79-7da7-44be-aa4e-737239095c7e" containerName="nova-api-log" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.433128 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="12de1e79-7da7-44be-aa4e-737239095c7e" containerName="nova-api-api" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.433147 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="2433ef78-8e78-439b-a64c-1052adbeed62" containerName="nova-manage" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.433158 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="12de1e79-7da7-44be-aa4e-737239095c7e" containerName="nova-api-log" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 
10:24:58.437050 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v726k\" (UniqueName: \"kubernetes.io/projected/b836e7e1-9cda-4682-8746-c2be7723836a-kube-api-access-v726k\") pod \"nova-api-0\" (UID: \"b836e7e1-9cda-4682-8746-c2be7723836a\") " pod="openstack/nova-api-0" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.437117 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b836e7e1-9cda-4682-8746-c2be7723836a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b836e7e1-9cda-4682-8746-c2be7723836a\") " pod="openstack/nova-api-0" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.437164 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b836e7e1-9cda-4682-8746-c2be7723836a-config-data\") pod \"nova-api-0\" (UID: \"b836e7e1-9cda-4682-8746-c2be7723836a\") " pod="openstack/nova-api-0" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.437209 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b836e7e1-9cda-4682-8746-c2be7723836a-logs\") pod \"nova-api-0\" (UID: \"b836e7e1-9cda-4682-8746-c2be7723836a\") " pod="openstack/nova-api-0" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.438909 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.442740 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.456613 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.477745 4958 scope.go:117] "RemoveContainer" containerID="76a5bada33f2612dd7a0d8a9941efd1776d1157693eb0da13087e942bb183018" Dec 01 10:24:58 crc kubenswrapper[4958]: E1201 10:24:58.478569 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76a5bada33f2612dd7a0d8a9941efd1776d1157693eb0da13087e942bb183018\": container with ID starting with 76a5bada33f2612dd7a0d8a9941efd1776d1157693eb0da13087e942bb183018 not found: ID does not exist" containerID="76a5bada33f2612dd7a0d8a9941efd1776d1157693eb0da13087e942bb183018" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.478640 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76a5bada33f2612dd7a0d8a9941efd1776d1157693eb0da13087e942bb183018"} err="failed to get container status \"76a5bada33f2612dd7a0d8a9941efd1776d1157693eb0da13087e942bb183018\": rpc error: code = NotFound desc = could not find container \"76a5bada33f2612dd7a0d8a9941efd1776d1157693eb0da13087e942bb183018\": container with ID starting with 76a5bada33f2612dd7a0d8a9941efd1776d1157693eb0da13087e942bb183018 not found: ID does not exist" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.478692 4958 scope.go:117] "RemoveContainer" containerID="7aef7889b33bc52c85ec699c5cfbd91dc2ae06a94f442e6911f4f94dc10d8bcb" Dec 01 10:24:58 crc kubenswrapper[4958]: E1201 10:24:58.479161 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7aef7889b33bc52c85ec699c5cfbd91dc2ae06a94f442e6911f4f94dc10d8bcb\": container with ID starting with 7aef7889b33bc52c85ec699c5cfbd91dc2ae06a94f442e6911f4f94dc10d8bcb not found: ID does not exist" containerID="7aef7889b33bc52c85ec699c5cfbd91dc2ae06a94f442e6911f4f94dc10d8bcb" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.479184 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aef7889b33bc52c85ec699c5cfbd91dc2ae06a94f442e6911f4f94dc10d8bcb"} err="failed to get container status \"7aef7889b33bc52c85ec699c5cfbd91dc2ae06a94f442e6911f4f94dc10d8bcb\": rpc error: code = NotFound desc = could not find container \"7aef7889b33bc52c85ec699c5cfbd91dc2ae06a94f442e6911f4f94dc10d8bcb\": container with ID starting with 7aef7889b33bc52c85ec699c5cfbd91dc2ae06a94f442e6911f4f94dc10d8bcb not found: ID does not exist" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.539830 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v726k\" (UniqueName: \"kubernetes.io/projected/b836e7e1-9cda-4682-8746-c2be7723836a-kube-api-access-v726k\") pod \"nova-api-0\" (UID: \"b836e7e1-9cda-4682-8746-c2be7723836a\") " pod="openstack/nova-api-0" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.539928 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b836e7e1-9cda-4682-8746-c2be7723836a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b836e7e1-9cda-4682-8746-c2be7723836a\") " pod="openstack/nova-api-0" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.539963 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b836e7e1-9cda-4682-8746-c2be7723836a-config-data\") pod \"nova-api-0\" (UID: \"b836e7e1-9cda-4682-8746-c2be7723836a\") " pod="openstack/nova-api-0" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.540012 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b836e7e1-9cda-4682-8746-c2be7723836a-logs\") pod \"nova-api-0\" (UID: \"b836e7e1-9cda-4682-8746-c2be7723836a\") " pod="openstack/nova-api-0" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.540490 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b836e7e1-9cda-4682-8746-c2be7723836a-logs\") pod \"nova-api-0\" (UID: \"b836e7e1-9cda-4682-8746-c2be7723836a\") " pod="openstack/nova-api-0" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.544680 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b836e7e1-9cda-4682-8746-c2be7723836a-config-data\") pod \"nova-api-0\" (UID: \"b836e7e1-9cda-4682-8746-c2be7723836a\") " pod="openstack/nova-api-0" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.557553 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b836e7e1-9cda-4682-8746-c2be7723836a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b836e7e1-9cda-4682-8746-c2be7723836a\") " pod="openstack/nova-api-0" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.557580 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v726k\" (UniqueName: \"kubernetes.io/projected/b836e7e1-9cda-4682-8746-c2be7723836a-kube-api-access-v726k\") pod 
\"nova-api-0\" (UID: \"b836e7e1-9cda-4682-8746-c2be7723836a\") " pod="openstack/nova-api-0" Dec 01 10:24:58 crc kubenswrapper[4958]: I1201 10:24:58.772635 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:24:59 crc kubenswrapper[4958]: I1201 10:24:59.265503 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:24:59 crc kubenswrapper[4958]: I1201 10:24:59.432092 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b836e7e1-9cda-4682-8746-c2be7723836a","Type":"ContainerStarted","Data":"6644248a78169d0a24785355571085afe37a27f7771f115efb8a287cebdc6ada"} Dec 01 10:24:59 crc kubenswrapper[4958]: I1201 10:24:59.438013 4958 generic.go:334] "Generic (PLEG): container finished" podID="1dd642ce-2ddf-4d01-893a-09090c99a729" containerID="56ca63f965abcc865f39c9ac18e65cf0d0ba29be9846c9cc9ababc85b67336a2" exitCode=0 Dec 01 10:24:59 crc kubenswrapper[4958]: I1201 10:24:59.438068 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1dd642ce-2ddf-4d01-893a-09090c99a729","Type":"ContainerDied","Data":"56ca63f965abcc865f39c9ac18e65cf0d0ba29be9846c9cc9ababc85b67336a2"} Dec 01 10:24:59 crc kubenswrapper[4958]: I1201 10:24:59.569607 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:24:59 crc kubenswrapper[4958]: I1201 10:24:59.671353 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhpdk\" (UniqueName: \"kubernetes.io/projected/1dd642ce-2ddf-4d01-893a-09090c99a729-kube-api-access-fhpdk\") pod \"1dd642ce-2ddf-4d01-893a-09090c99a729\" (UID: \"1dd642ce-2ddf-4d01-893a-09090c99a729\") " Dec 01 10:24:59 crc kubenswrapper[4958]: I1201 10:24:59.671713 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dd642ce-2ddf-4d01-893a-09090c99a729-combined-ca-bundle\") pod \"1dd642ce-2ddf-4d01-893a-09090c99a729\" (UID: \"1dd642ce-2ddf-4d01-893a-09090c99a729\") " Dec 01 10:24:59 crc kubenswrapper[4958]: I1201 10:24:59.671782 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dd642ce-2ddf-4d01-893a-09090c99a729-config-data\") pod \"1dd642ce-2ddf-4d01-893a-09090c99a729\" (UID: \"1dd642ce-2ddf-4d01-893a-09090c99a729\") " Dec 01 10:24:59 crc kubenswrapper[4958]: I1201 10:24:59.676669 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dd642ce-2ddf-4d01-893a-09090c99a729-kube-api-access-fhpdk" (OuterVolumeSpecName: "kube-api-access-fhpdk") pod "1dd642ce-2ddf-4d01-893a-09090c99a729" (UID: "1dd642ce-2ddf-4d01-893a-09090c99a729"). InnerVolumeSpecName "kube-api-access-fhpdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:24:59 crc kubenswrapper[4958]: I1201 10:24:59.705793 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dd642ce-2ddf-4d01-893a-09090c99a729-config-data" (OuterVolumeSpecName: "config-data") pod "1dd642ce-2ddf-4d01-893a-09090c99a729" (UID: "1dd642ce-2ddf-4d01-893a-09090c99a729"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:59 crc kubenswrapper[4958]: I1201 10:24:59.710006 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dd642ce-2ddf-4d01-893a-09090c99a729-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1dd642ce-2ddf-4d01-893a-09090c99a729" (UID: "1dd642ce-2ddf-4d01-893a-09090c99a729"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:24:59 crc kubenswrapper[4958]: I1201 10:24:59.774350 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhpdk\" (UniqueName: \"kubernetes.io/projected/1dd642ce-2ddf-4d01-893a-09090c99a729-kube-api-access-fhpdk\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:59 crc kubenswrapper[4958]: I1201 10:24:59.774974 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dd642ce-2ddf-4d01-893a-09090c99a729-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:59 crc kubenswrapper[4958]: I1201 10:24:59.775074 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dd642ce-2ddf-4d01-893a-09090c99a729-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:59 crc kubenswrapper[4958]: I1201 10:24:59.810026 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12de1e79-7da7-44be-aa4e-737239095c7e" path="/var/lib/kubelet/pods/12de1e79-7da7-44be-aa4e-737239095c7e/volumes" Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.451653 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b836e7e1-9cda-4682-8746-c2be7723836a","Type":"ContainerStarted","Data":"d241872bc1a9989cd6277bda8c72582927f92217d2d1c1887c4716f71c00b3e3"} Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.451735 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b836e7e1-9cda-4682-8746-c2be7723836a","Type":"ContainerStarted","Data":"0c3790e0327baf2cb85785f48abfba6f747e92832f1f916c995ea32cee44160a"} Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.455978 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1dd642ce-2ddf-4d01-893a-09090c99a729","Type":"ContainerDied","Data":"ad4d869d0c851647c0e5993938f36caea022bceba4870a7a43a45d258d6e2957"} Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.456046 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.456088 4958 scope.go:117] "RemoveContainer" containerID="56ca63f965abcc865f39c9ac18e65cf0d0ba29be9846c9cc9ababc85b67336a2" Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.488359 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.488332393 podStartE2EDuration="2.488332393s" podCreationTimestamp="2025-12-01 10:24:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:25:00.477943964 +0000 UTC m=+1547.986733011" watchObservedRunningTime="2025-12-01 10:25:00.488332393 +0000 UTC m=+1547.997121450" Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.520280 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.531280 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.544019 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:25:00 crc kubenswrapper[4958]: E1201 10:25:00.545091 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd642ce-2ddf-4d01-893a-09090c99a729" containerName="nova-scheduler-scheduler" Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.545194 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd642ce-2ddf-4d01-893a-09090c99a729" containerName="nova-scheduler-scheduler" Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.545488 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dd642ce-2ddf-4d01-893a-09090c99a729" containerName="nova-scheduler-scheduler" Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.547216 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.552137 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.562607 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.721041 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g42gh\" (UniqueName: \"kubernetes.io/projected/b7ffc690-747f-4161-8e59-2e4fa48a866f-kube-api-access-g42gh\") pod \"nova-scheduler-0\" (UID: \"b7ffc690-747f-4161-8e59-2e4fa48a866f\") " pod="openstack/nova-scheduler-0" Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.721469 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ffc690-747f-4161-8e59-2e4fa48a866f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b7ffc690-747f-4161-8e59-2e4fa48a866f\") " pod="openstack/nova-scheduler-0" Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.721665 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7ffc690-747f-4161-8e59-2e4fa48a866f-config-data\") pod \"nova-scheduler-0\" (UID: \"b7ffc690-747f-4161-8e59-2e4fa48a866f\") " pod="openstack/nova-scheduler-0" Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.825129 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ffc690-747f-4161-8e59-2e4fa48a866f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b7ffc690-747f-4161-8e59-2e4fa48a866f\") " pod="openstack/nova-scheduler-0" Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.825284 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7ffc690-747f-4161-8e59-2e4fa48a866f-config-data\") pod \"nova-scheduler-0\" (UID: \"b7ffc690-747f-4161-8e59-2e4fa48a866f\") " pod="openstack/nova-scheduler-0" Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.825860 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g42gh\" (UniqueName: \"kubernetes.io/projected/b7ffc690-747f-4161-8e59-2e4fa48a866f-kube-api-access-g42gh\") pod \"nova-scheduler-0\" (UID: \"b7ffc690-747f-4161-8e59-2e4fa48a866f\") " pod="openstack/nova-scheduler-0" Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.833519 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ffc690-747f-4161-8e59-2e4fa48a866f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b7ffc690-747f-4161-8e59-2e4fa48a866f\") " pod="openstack/nova-scheduler-0" Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.834492 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7ffc690-747f-4161-8e59-2e4fa48a866f-config-data\") pod \"nova-scheduler-0\" (UID: \"b7ffc690-747f-4161-8e59-2e4fa48a866f\") " pod="openstack/nova-scheduler-0" Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.846034 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g42gh\" (UniqueName: 
\"kubernetes.io/projected/b7ffc690-747f-4161-8e59-2e4fa48a866f-kube-api-access-g42gh\") pod \"nova-scheduler-0\" (UID: \"b7ffc690-747f-4161-8e59-2e4fa48a866f\") " pod="openstack/nova-scheduler-0" Dec 01 10:25:00 crc kubenswrapper[4958]: I1201 10:25:00.881070 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:25:01 crc kubenswrapper[4958]: I1201 10:25:01.363549 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:25:01 crc kubenswrapper[4958]: I1201 10:25:01.473886 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b7ffc690-747f-4161-8e59-2e4fa48a866f","Type":"ContainerStarted","Data":"732a48a212c44442b77b6a960aa8735ffdb86b5790d5410c1e0fde9f6eb136d8"} Dec 01 10:25:01 crc kubenswrapper[4958]: I1201 10:25:01.811089 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dd642ce-2ddf-4d01-893a-09090c99a729" path="/var/lib/kubelet/pods/1dd642ce-2ddf-4d01-893a-09090c99a729/volumes" Dec 01 10:25:02 crc kubenswrapper[4958]: I1201 10:25:02.485053 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b7ffc690-747f-4161-8e59-2e4fa48a866f","Type":"ContainerStarted","Data":"a1c7d40252b4c748efeb3517e33e61714f9426c12bfd50801ce28ebe6680bc56"} Dec 01 10:25:02 crc kubenswrapper[4958]: I1201 10:25:02.507208 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.507173482 podStartE2EDuration="2.507173482s" podCreationTimestamp="2025-12-01 10:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:25:02.502780175 +0000 UTC m=+1550.011569212" watchObservedRunningTime="2025-12-01 10:25:02.507173482 +0000 UTC m=+1550.015962539" Dec 01 10:25:03 crc kubenswrapper[4958]: I1201 10:25:03.702821 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 01 10:25:05 crc kubenswrapper[4958]: I1201 10:25:05.881296 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 10:25:08 crc kubenswrapper[4958]: I1201 10:25:08.773886 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 10:25:08 crc kubenswrapper[4958]: I1201 10:25:08.774291 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 10:25:09 crc kubenswrapper[4958]: I1201 10:25:09.442091 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 10:25:09 crc kubenswrapper[4958]: I1201 10:25:09.856171 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b836e7e1-9cda-4682-8746-c2be7723836a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 10:25:09 crc kubenswrapper[4958]: I1201 10:25:09.856170 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b836e7e1-9cda-4682-8746-c2be7723836a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 10:25:10 crc kubenswrapper[4958]: I1201 
10:25:10.882275 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 10:25:10 crc kubenswrapper[4958]: I1201 10:25:10.914603 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 10:25:11 crc kubenswrapper[4958]: I1201 10:25:11.622832 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 10:25:13 crc kubenswrapper[4958]: I1201 10:25:13.411466 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:25:13 crc kubenswrapper[4958]: I1201 10:25:13.412806 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="e0248403-94d5-4bec-9ef1-af83490d3a0e" containerName="kube-state-metrics" containerID="cri-o://dfc9bb1038666dfe92ff786ea1156854bbc06c6602a0a174a585d2b6f8fc8d23" gracePeriod=30 Dec 01 10:25:13 crc kubenswrapper[4958]: I1201 10:25:13.615154 4958 generic.go:334] "Generic (PLEG): container finished" podID="e0248403-94d5-4bec-9ef1-af83490d3a0e" containerID="dfc9bb1038666dfe92ff786ea1156854bbc06c6602a0a174a585d2b6f8fc8d23" exitCode=2 Dec 01 10:25:13 crc kubenswrapper[4958]: I1201 10:25:13.615215 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e0248403-94d5-4bec-9ef1-af83490d3a0e","Type":"ContainerDied","Data":"dfc9bb1038666dfe92ff786ea1156854bbc06c6602a0a174a585d2b6f8fc8d23"} Dec 01 10:25:13 crc kubenswrapper[4958]: I1201 10:25:13.953647 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 10:25:13 crc kubenswrapper[4958]: I1201 10:25:13.985534 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdz7t\" (UniqueName: \"kubernetes.io/projected/e0248403-94d5-4bec-9ef1-af83490d3a0e-kube-api-access-sdz7t\") pod \"e0248403-94d5-4bec-9ef1-af83490d3a0e\" (UID: \"e0248403-94d5-4bec-9ef1-af83490d3a0e\") " Dec 01 10:25:13 crc kubenswrapper[4958]: I1201 10:25:13.995362 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0248403-94d5-4bec-9ef1-af83490d3a0e-kube-api-access-sdz7t" (OuterVolumeSpecName: "kube-api-access-sdz7t") pod "e0248403-94d5-4bec-9ef1-af83490d3a0e" (UID: "e0248403-94d5-4bec-9ef1-af83490d3a0e"). InnerVolumeSpecName "kube-api-access-sdz7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:25:14 crc kubenswrapper[4958]: I1201 10:25:14.093397 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdz7t\" (UniqueName: \"kubernetes.io/projected/e0248403-94d5-4bec-9ef1-af83490d3a0e-kube-api-access-sdz7t\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:14 crc kubenswrapper[4958]: I1201 10:25:14.634690 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e0248403-94d5-4bec-9ef1-af83490d3a0e","Type":"ContainerDied","Data":"49b377a4f9220eb6e34747df1280f6e8b08b3191f1540a45f23595ae4961d698"} Dec 01 10:25:14 crc kubenswrapper[4958]: I1201 10:25:14.634801 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 10:25:14 crc kubenswrapper[4958]: I1201 10:25:14.635089 4958 scope.go:117] "RemoveContainer" containerID="dfc9bb1038666dfe92ff786ea1156854bbc06c6602a0a174a585d2b6f8fc8d23" Dec 01 10:25:14 crc kubenswrapper[4958]: I1201 10:25:14.734958 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:25:14 crc kubenswrapper[4958]: I1201 10:25:14.750949 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:25:14 crc kubenswrapper[4958]: I1201 10:25:14.768350 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:25:14 crc kubenswrapper[4958]: E1201 10:25:14.769143 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0248403-94d5-4bec-9ef1-af83490d3a0e" containerName="kube-state-metrics" Dec 01 10:25:14 crc kubenswrapper[4958]: I1201 10:25:14.769176 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0248403-94d5-4bec-9ef1-af83490d3a0e" containerName="kube-state-metrics" Dec 01 10:25:14 crc kubenswrapper[4958]: I1201 10:25:14.769537 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0248403-94d5-4bec-9ef1-af83490d3a0e" containerName="kube-state-metrics" Dec 01 10:25:14 crc kubenswrapper[4958]: I1201 10:25:14.770720 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 10:25:14 crc kubenswrapper[4958]: I1201 10:25:14.780069 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 01 10:25:14 crc kubenswrapper[4958]: I1201 10:25:14.786698 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:25:14 crc kubenswrapper[4958]: I1201 10:25:14.791481 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 01 10:25:14 crc kubenswrapper[4958]: I1201 10:25:14.922320 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ce268c-ffb9-4eae-93e5-23a10ba96185-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\") " pod="openstack/kube-state-metrics-0" Dec 01 10:25:14 crc kubenswrapper[4958]: I1201 10:25:14.924404 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a4ce268c-ffb9-4eae-93e5-23a10ba96185-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\") " pod="openstack/kube-state-metrics-0" Dec 01 10:25:14 crc kubenswrapper[4958]: I1201 10:25:14.924634 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ce268c-ffb9-4eae-93e5-23a10ba96185-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\") " pod="openstack/kube-state-metrics-0" Dec 01 10:25:14 crc kubenswrapper[4958]: I1201 10:25:14.924745 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2jpl\" (UniqueName: \"kubernetes.io/projected/a4ce268c-ffb9-4eae-93e5-23a10ba96185-kube-api-access-k2jpl\") pod \"kube-state-metrics-0\" (UID: 
\"a4ce268c-ffb9-4eae-93e5-23a10ba96185\") " pod="openstack/kube-state-metrics-0" Dec 01 10:25:15 crc kubenswrapper[4958]: I1201 10:25:15.027425 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ce268c-ffb9-4eae-93e5-23a10ba96185-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\") " pod="openstack/kube-state-metrics-0" Dec 01 10:25:15 crc kubenswrapper[4958]: I1201 10:25:15.027644 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a4ce268c-ffb9-4eae-93e5-23a10ba96185-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\") " pod="openstack/kube-state-metrics-0" Dec 01 10:25:15 crc kubenswrapper[4958]: I1201 10:25:15.027692 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ce268c-ffb9-4eae-93e5-23a10ba96185-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\") " pod="openstack/kube-state-metrics-0" Dec 01 10:25:15 crc kubenswrapper[4958]: I1201 10:25:15.027738 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2jpl\" (UniqueName: \"kubernetes.io/projected/a4ce268c-ffb9-4eae-93e5-23a10ba96185-kube-api-access-k2jpl\") pod \"kube-state-metrics-0\" (UID: \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\") " pod="openstack/kube-state-metrics-0" Dec 01 10:25:15 crc kubenswrapper[4958]: I1201 10:25:15.035088 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ce268c-ffb9-4eae-93e5-23a10ba96185-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\") " pod="openstack/kube-state-metrics-0" Dec 01 10:25:15 crc kubenswrapper[4958]: I1201 10:25:15.035115 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ce268c-ffb9-4eae-93e5-23a10ba96185-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\") " pod="openstack/kube-state-metrics-0" Dec 01 10:25:15 crc kubenswrapper[4958]: I1201 10:25:15.035458 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a4ce268c-ffb9-4eae-93e5-23a10ba96185-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\") " pod="openstack/kube-state-metrics-0" Dec 01 10:25:15 crc kubenswrapper[4958]: I1201 10:25:15.052545 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2jpl\" (UniqueName: \"kubernetes.io/projected/a4ce268c-ffb9-4eae-93e5-23a10ba96185-kube-api-access-k2jpl\") pod \"kube-state-metrics-0\" (UID: \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\") " pod="openstack/kube-state-metrics-0" Dec 01 10:25:15 crc kubenswrapper[4958]: I1201 10:25:15.117386 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 10:25:15 crc kubenswrapper[4958]: I1201 10:25:15.573651 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:25:15 crc kubenswrapper[4958]: I1201 10:25:15.574899 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerName="ceilometer-central-agent" containerID="cri-o://cd451f35975b7fb664d8eb29a07761f86a7c8fca0866c7b22c82a51901e4f21f" gracePeriod=30 Dec 01 10:25:15 crc kubenswrapper[4958]: I1201 10:25:15.575129 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerName="proxy-httpd" containerID="cri-o://e99a3e70e39d99263831941f6cce24905ce5e58ef2d76e0a70b8f07c7dc4b74a" gracePeriod=30 Dec 01 10:25:15 crc kubenswrapper[4958]: I1201 10:25:15.575297 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerName="sg-core" containerID="cri-o://9b6b6da1cec09efa51d0d84e5ac7405e757de5847aa50bc34a6ac6f39b079740" gracePeriod=30 Dec 01 10:25:15 crc kubenswrapper[4958]: I1201 10:25:15.575414 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerName="ceilometer-notification-agent" containerID="cri-o://c261922366b23228b7baa6538188b1b14bab1b177143b266dd74c203af62785c" gracePeriod=30 Dec 01 10:25:15 crc kubenswrapper[4958]: I1201 10:25:15.657687 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:25:15 crc kubenswrapper[4958]: I1201 10:25:15.674623 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a4ce268c-ffb9-4eae-93e5-23a10ba96185","Type":"ContainerStarted","Data":"f9534d967249eaf5c794a26a6460d4a7182d9f383f0186adf6136ec7ddc6f49a"} Dec 01 10:25:15 crc kubenswrapper[4958]: I1201 10:25:15.814430 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0248403-94d5-4bec-9ef1-af83490d3a0e" path="/var/lib/kubelet/pods/e0248403-94d5-4bec-9ef1-af83490d3a0e/volumes" Dec 01 10:25:16 crc kubenswrapper[4958]: I1201 10:25:16.691122 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a4ce268c-ffb9-4eae-93e5-23a10ba96185","Type":"ContainerStarted","Data":"cfa8bb4cfda0cf43fd73a76f724816ad6f496fee99ffeab64a684c098a360ffa"} Dec 01 10:25:16 crc kubenswrapper[4958]: I1201 10:25:16.692040 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 01 10:25:16 crc kubenswrapper[4958]: I1201 10:25:16.707401 4958 generic.go:334] "Generic (PLEG): container finished" podID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerID="e99a3e70e39d99263831941f6cce24905ce5e58ef2d76e0a70b8f07c7dc4b74a" exitCode=0 Dec 01 10:25:16 crc kubenswrapper[4958]: I1201 10:25:16.707446 4958 generic.go:334] "Generic (PLEG): container finished" podID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerID="9b6b6da1cec09efa51d0d84e5ac7405e757de5847aa50bc34a6ac6f39b079740" exitCode=2 Dec 01 10:25:16 crc kubenswrapper[4958]: I1201 10:25:16.707454 4958 generic.go:334] "Generic (PLEG): container finished" podID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerID="c261922366b23228b7baa6538188b1b14bab1b177143b266dd74c203af62785c" 
exitCode=0 Dec 01 10:25:16 crc kubenswrapper[4958]: I1201 10:25:16.707462 4958 generic.go:334] "Generic (PLEG): container finished" podID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerID="cd451f35975b7fb664d8eb29a07761f86a7c8fca0866c7b22c82a51901e4f21f" exitCode=0 Dec 01 10:25:16 crc kubenswrapper[4958]: I1201 10:25:16.707488 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32b48e0e-eb10-4c19-a505-e82b2617fd54","Type":"ContainerDied","Data":"e99a3e70e39d99263831941f6cce24905ce5e58ef2d76e0a70b8f07c7dc4b74a"} Dec 01 10:25:16 crc kubenswrapper[4958]: I1201 10:25:16.707522 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32b48e0e-eb10-4c19-a505-e82b2617fd54","Type":"ContainerDied","Data":"9b6b6da1cec09efa51d0d84e5ac7405e757de5847aa50bc34a6ac6f39b079740"} Dec 01 10:25:16 crc kubenswrapper[4958]: I1201 10:25:16.707533 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32b48e0e-eb10-4c19-a505-e82b2617fd54","Type":"ContainerDied","Data":"c261922366b23228b7baa6538188b1b14bab1b177143b266dd74c203af62785c"} Dec 01 10:25:16 crc kubenswrapper[4958]: I1201 10:25:16.707545 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32b48e0e-eb10-4c19-a505-e82b2617fd54","Type":"ContainerDied","Data":"cd451f35975b7fb664d8eb29a07761f86a7c8fca0866c7b22c82a51901e4f21f"} Dec 01 10:25:16 crc kubenswrapper[4958]: I1201 10:25:16.741948 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.297154525 podStartE2EDuration="2.741916063s" podCreationTimestamp="2025-12-01 10:25:14 +0000 UTC" firstStartedPulling="2025-12-01 10:25:15.658834288 +0000 UTC m=+1563.167623325" lastFinishedPulling="2025-12-01 10:25:16.103595826 +0000 UTC m=+1563.612384863" observedRunningTime="2025-12-01 10:25:16.728908438 +0000 UTC m=+1564.237697475" watchObservedRunningTime="2025-12-01 10:25:16.741916063 +0000 UTC m=+1564.250705110" Dec 01 10:25:16 crc kubenswrapper[4958]: I1201 10:25:16.888705 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:16.999529 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-sg-core-conf-yaml\") pod \"32b48e0e-eb10-4c19-a505-e82b2617fd54\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.000361 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32b48e0e-eb10-4c19-a505-e82b2617fd54-log-httpd\") pod \"32b48e0e-eb10-4c19-a505-e82b2617fd54\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.000416 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-config-data\") pod \"32b48e0e-eb10-4c19-a505-e82b2617fd54\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.000455 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-scripts\") pod \"32b48e0e-eb10-4c19-a505-e82b2617fd54\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.000643 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-combined-ca-bundle\") pod \"32b48e0e-eb10-4c19-a505-e82b2617fd54\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.000695 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr6p8\" (UniqueName: \"kubernetes.io/projected/32b48e0e-eb10-4c19-a505-e82b2617fd54-kube-api-access-nr6p8\") pod \"32b48e0e-eb10-4c19-a505-e82b2617fd54\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.000827 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32b48e0e-eb10-4c19-a505-e82b2617fd54-run-httpd\") pod \"32b48e0e-eb10-4c19-a505-e82b2617fd54\" (UID: \"32b48e0e-eb10-4c19-a505-e82b2617fd54\") " Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.003359 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32b48e0e-eb10-4c19-a505-e82b2617fd54-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "32b48e0e-eb10-4c19-a505-e82b2617fd54" (UID: "32b48e0e-eb10-4c19-a505-e82b2617fd54"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.004223 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32b48e0e-eb10-4c19-a505-e82b2617fd54-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "32b48e0e-eb10-4c19-a505-e82b2617fd54" (UID: "32b48e0e-eb10-4c19-a505-e82b2617fd54"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.016226 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-scripts" (OuterVolumeSpecName: "scripts") pod "32b48e0e-eb10-4c19-a505-e82b2617fd54" (UID: "32b48e0e-eb10-4c19-a505-e82b2617fd54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.017297 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b48e0e-eb10-4c19-a505-e82b2617fd54-kube-api-access-nr6p8" (OuterVolumeSpecName: "kube-api-access-nr6p8") pod "32b48e0e-eb10-4c19-a505-e82b2617fd54" (UID: "32b48e0e-eb10-4c19-a505-e82b2617fd54"). InnerVolumeSpecName "kube-api-access-nr6p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.064091 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "32b48e0e-eb10-4c19-a505-e82b2617fd54" (UID: "32b48e0e-eb10-4c19-a505-e82b2617fd54"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.110048 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr6p8\" (UniqueName: \"kubernetes.io/projected/32b48e0e-eb10-4c19-a505-e82b2617fd54-kube-api-access-nr6p8\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.110131 4958 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32b48e0e-eb10-4c19-a505-e82b2617fd54-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.110148 4958 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.110159 4958 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/32b48e0e-eb10-4c19-a505-e82b2617fd54-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.110170 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.151076 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-config-data" (OuterVolumeSpecName: "config-data") pod "32b48e0e-eb10-4c19-a505-e82b2617fd54" (UID: "32b48e0e-eb10-4c19-a505-e82b2617fd54"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.166582 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32b48e0e-eb10-4c19-a505-e82b2617fd54" (UID: "32b48e0e-eb10-4c19-a505-e82b2617fd54"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.212160 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.212719 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b48e0e-eb10-4c19-a505-e82b2617fd54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.539158 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.626404 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.635268 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6932e428-bf22-45fe-a500-f8082c039d0b-config-data\") pod \"6932e428-bf22-45fe-a500-f8082c039d0b\" (UID: \"6932e428-bf22-45fe-a500-f8082c039d0b\") " Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.635455 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kvsf\" (UniqueName: \"kubernetes.io/projected/6932e428-bf22-45fe-a500-f8082c039d0b-kube-api-access-7kvsf\") pod \"6932e428-bf22-45fe-a500-f8082c039d0b\" (UID: \"6932e428-bf22-45fe-a500-f8082c039d0b\") " Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.635519 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6932e428-bf22-45fe-a500-f8082c039d0b-combined-ca-bundle\") pod \"6932e428-bf22-45fe-a500-f8082c039d0b\" (UID: \"6932e428-bf22-45fe-a500-f8082c039d0b\") " Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.650041 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6932e428-bf22-45fe-a500-f8082c039d0b-kube-api-access-7kvsf" (OuterVolumeSpecName: "kube-api-access-7kvsf") pod "6932e428-bf22-45fe-a500-f8082c039d0b" (UID: "6932e428-bf22-45fe-a500-f8082c039d0b"). InnerVolumeSpecName "kube-api-access-7kvsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.678879 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6932e428-bf22-45fe-a500-f8082c039d0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6932e428-bf22-45fe-a500-f8082c039d0b" (UID: "6932e428-bf22-45fe-a500-f8082c039d0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.689387 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6932e428-bf22-45fe-a500-f8082c039d0b-config-data" (OuterVolumeSpecName: "config-data") pod "6932e428-bf22-45fe-a500-f8082c039d0b" (UID: "6932e428-bf22-45fe-a500-f8082c039d0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.722041 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.722003 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"32b48e0e-eb10-4c19-a505-e82b2617fd54","Type":"ContainerDied","Data":"6045dab28055a78678c4b155e57051d084d7b372bb440e1ae1736e4a319ccf12"} Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.722198 4958 scope.go:117] "RemoveContainer" containerID="e99a3e70e39d99263831941f6cce24905ce5e58ef2d76e0a70b8f07c7dc4b74a" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.724909 4958 generic.go:334] "Generic (PLEG): container finished" podID="6932e428-bf22-45fe-a500-f8082c039d0b" containerID="624fb2ef5f1dd7f36e9506293b10415350cd6f01387cd8037b226b9622d3f7b6" exitCode=137 Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.725096 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6932e428-bf22-45fe-a500-f8082c039d0b","Type":"ContainerDied","Data":"624fb2ef5f1dd7f36e9506293b10415350cd6f01387cd8037b226b9622d3f7b6"} Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.725218 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6932e428-bf22-45fe-a500-f8082c039d0b","Type":"ContainerDied","Data":"b2d62874d2335e8ded1c5964a3e468128b2f56fa0e2616a9e357974bc715e119"} Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.726623 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.730121 4958 generic.go:334] "Generic (PLEG): container finished" podID="c7e96a15-eefc-47b6-98c0-60c772de185e" containerID="a87b783f86a593bb0b7a5007a42d4041cc9a4e20e81b628c482bddb04f3e0450" exitCode=137 Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.731408 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.731564 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7e96a15-eefc-47b6-98c0-60c772de185e","Type":"ContainerDied","Data":"a87b783f86a593bb0b7a5007a42d4041cc9a4e20e81b628c482bddb04f3e0450"} Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.732823 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7e96a15-eefc-47b6-98c0-60c772de185e","Type":"ContainerDied","Data":"b7349f6fee1fd1189b6ff92e78acfb512bb3fa1451213bc33405dcc380b00b6c"} Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.739024 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e96a15-eefc-47b6-98c0-60c772de185e-config-data\") pod \"c7e96a15-eefc-47b6-98c0-60c772de185e\" (UID: \"c7e96a15-eefc-47b6-98c0-60c772de185e\") " Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.739120 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e96a15-eefc-47b6-98c0-60c772de185e-logs\") pod \"c7e96a15-eefc-47b6-98c0-60c772de185e\" (UID: \"c7e96a15-eefc-47b6-98c0-60c772de185e\") " Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.739262 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nhqf\" (UniqueName: \"kubernetes.io/projected/c7e96a15-eefc-47b6-98c0-60c772de185e-kube-api-access-6nhqf\") pod \"c7e96a15-eefc-47b6-98c0-60c772de185e\" (UID: \"c7e96a15-eefc-47b6-98c0-60c772de185e\") " Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.739472 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e96a15-eefc-47b6-98c0-60c772de185e-combined-ca-bundle\") pod \"c7e96a15-eefc-47b6-98c0-60c772de185e\" (UID: \"c7e96a15-eefc-47b6-98c0-60c772de185e\") " Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.739678 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7e96a15-eefc-47b6-98c0-60c772de185e-logs" (OuterVolumeSpecName: "logs") pod "c7e96a15-eefc-47b6-98c0-60c772de185e" (UID: "c7e96a15-eefc-47b6-98c0-60c772de185e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.740380 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6932e428-bf22-45fe-a500-f8082c039d0b-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.740426 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e96a15-eefc-47b6-98c0-60c772de185e-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.740442 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kvsf\" (UniqueName: \"kubernetes.io/projected/6932e428-bf22-45fe-a500-f8082c039d0b-kube-api-access-7kvsf\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.740461 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6932e428-bf22-45fe-a500-f8082c039d0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.745497 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e96a15-eefc-47b6-98c0-60c772de185e-kube-api-access-6nhqf" (OuterVolumeSpecName: "kube-api-access-6nhqf") pod "c7e96a15-eefc-47b6-98c0-60c772de185e" (UID: "c7e96a15-eefc-47b6-98c0-60c772de185e"). InnerVolumeSpecName "kube-api-access-6nhqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.756488 4958 scope.go:117] "RemoveContainer" containerID="9b6b6da1cec09efa51d0d84e5ac7405e757de5847aa50bc34a6ac6f39b079740" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.786006 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e96a15-eefc-47b6-98c0-60c772de185e-config-data" (OuterVolumeSpecName: "config-data") pod "c7e96a15-eefc-47b6-98c0-60c772de185e" (UID: "c7e96a15-eefc-47b6-98c0-60c772de185e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.786120 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.794865 4958 scope.go:117] "RemoveContainer" containerID="c261922366b23228b7baa6538188b1b14bab1b177143b266dd74c203af62785c" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.823505 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e96a15-eefc-47b6-98c0-60c772de185e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7e96a15-eefc-47b6-98c0-60c772de185e" (UID: "c7e96a15-eefc-47b6-98c0-60c772de185e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.837103 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.837162 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:25:17 crc kubenswrapper[4958]: E1201 10:25:17.837566 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e96a15-eefc-47b6-98c0-60c772de185e" containerName="nova-metadata-log" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.837585 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e96a15-eefc-47b6-98c0-60c772de185e" containerName="nova-metadata-log" Dec 01 10:25:17 crc kubenswrapper[4958]: E1201 10:25:17.837613 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerName="proxy-httpd" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.837620 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerName="proxy-httpd" Dec 01 10:25:17 crc kubenswrapper[4958]: E1201 10:25:17.837632 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6932e428-bf22-45fe-a500-f8082c039d0b" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.837638 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6932e428-bf22-45fe-a500-f8082c039d0b" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 10:25:17 crc kubenswrapper[4958]: E1201 10:25:17.837649 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerName="sg-core" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.837655 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerName="sg-core" Dec 01 10:25:17 crc kubenswrapper[4958]: E1201 10:25:17.837667 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerName="ceilometer-central-agent" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.837674 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerName="ceilometer-central-agent" Dec 01 10:25:17 crc kubenswrapper[4958]: E1201 10:25:17.837684 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e96a15-eefc-47b6-98c0-60c772de185e" containerName="nova-metadata-metadata" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.837690 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e96a15-eefc-47b6-98c0-60c772de185e" containerName="nova-metadata-metadata" Dec 01 10:25:17 crc kubenswrapper[4958]: E1201 10:25:17.837717 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerName="ceilometer-notification-agent" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.837724 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerName="ceilometer-notification-agent" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.837994 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e96a15-eefc-47b6-98c0-60c772de185e" containerName="nova-metadata-log" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.838013 4958 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6932e428-bf22-45fe-a500-f8082c039d0b" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.838034 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerName="ceilometer-central-agent" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.838048 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerName="ceilometer-notification-agent" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.838057 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerName="proxy-httpd" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.838069 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e96a15-eefc-47b6-98c0-60c772de185e" containerName="nova-metadata-metadata" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.838077 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b48e0e-eb10-4c19-a505-e82b2617fd54" containerName="sg-core" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.841324 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.841517 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.842725 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e96a15-eefc-47b6-98c0-60c772de185e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.842779 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nhqf\" (UniqueName: \"kubernetes.io/projected/c7e96a15-eefc-47b6-98c0-60c772de185e-kube-api-access-6nhqf\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.842794 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e96a15-eefc-47b6-98c0-60c772de185e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.846512 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.846831 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.847415 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.855921 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.877352 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.878060 4958 scope.go:117] "RemoveContainer" containerID="cd451f35975b7fb664d8eb29a07761f86a7c8fca0866c7b22c82a51901e4f21f" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.905505 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.908515 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.911767 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.912139 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.912476 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.919945 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.933473 4958 scope.go:117] "RemoveContainer" containerID="624fb2ef5f1dd7f36e9506293b10415350cd6f01387cd8037b226b9622d3f7b6" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.945501 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.945613 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.945654 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.945684 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1176586f-1551-451b-8630-82c16eb35a4b-log-httpd\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.945712 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1176586f-1551-451b-8630-82c16eb35a4b-run-httpd\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.945797 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.945911 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-config-data\") pod 
\"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.945962 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.946258 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-scripts\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.946447 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrqpx\" (UniqueName: \"kubernetes.io/projected/8e18a1be-f832-499c-a518-50dad90cafc9-kube-api-access-vrqpx\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.946658 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.946829 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.947291 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g88q\" (UniqueName: \"kubernetes.io/projected/1176586f-1551-451b-8630-82c16eb35a4b-kube-api-access-4g88q\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.958452 4958 scope.go:117] "RemoveContainer" containerID="624fb2ef5f1dd7f36e9506293b10415350cd6f01387cd8037b226b9622d3f7b6" Dec 01 10:25:17 crc kubenswrapper[4958]: E1201 10:25:17.958989 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"624fb2ef5f1dd7f36e9506293b10415350cd6f01387cd8037b226b9622d3f7b6\": container with ID starting with 624fb2ef5f1dd7f36e9506293b10415350cd6f01387cd8037b226b9622d3f7b6 not found: ID does not exist" containerID="624fb2ef5f1dd7f36e9506293b10415350cd6f01387cd8037b226b9622d3f7b6" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.959060 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"624fb2ef5f1dd7f36e9506293b10415350cd6f01387cd8037b226b9622d3f7b6"} err="failed to get container status \"624fb2ef5f1dd7f36e9506293b10415350cd6f01387cd8037b226b9622d3f7b6\": rpc error: code = NotFound desc = could not find container \"624fb2ef5f1dd7f36e9506293b10415350cd6f01387cd8037b226b9622d3f7b6\": container with ID 
starting with 624fb2ef5f1dd7f36e9506293b10415350cd6f01387cd8037b226b9622d3f7b6 not found: ID does not exist" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.959170 4958 scope.go:117] "RemoveContainer" containerID="a87b783f86a593bb0b7a5007a42d4041cc9a4e20e81b628c482bddb04f3e0450" Dec 01 10:25:17 crc kubenswrapper[4958]: I1201 10:25:17.983582 4958 scope.go:117] "RemoveContainer" containerID="7b0120025398535e9d85938c8f26efa0177a02d1abf12d580c9698cb7a91595d" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.008164 4958 scope.go:117] "RemoveContainer" containerID="a87b783f86a593bb0b7a5007a42d4041cc9a4e20e81b628c482bddb04f3e0450" Dec 01 10:25:18 crc kubenswrapper[4958]: E1201 10:25:18.013696 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a87b783f86a593bb0b7a5007a42d4041cc9a4e20e81b628c482bddb04f3e0450\": container with ID starting with a87b783f86a593bb0b7a5007a42d4041cc9a4e20e81b628c482bddb04f3e0450 not found: ID does not exist" containerID="a87b783f86a593bb0b7a5007a42d4041cc9a4e20e81b628c482bddb04f3e0450" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.013756 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a87b783f86a593bb0b7a5007a42d4041cc9a4e20e81b628c482bddb04f3e0450"} err="failed to get container status \"a87b783f86a593bb0b7a5007a42d4041cc9a4e20e81b628c482bddb04f3e0450\": rpc error: code = NotFound desc = could not find container \"a87b783f86a593bb0b7a5007a42d4041cc9a4e20e81b628c482bddb04f3e0450\": container with ID starting with a87b783f86a593bb0b7a5007a42d4041cc9a4e20e81b628c482bddb04f3e0450 not found: ID does not exist" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.013806 4958 scope.go:117] "RemoveContainer" containerID="7b0120025398535e9d85938c8f26efa0177a02d1abf12d580c9698cb7a91595d" Dec 01 10:25:18 crc kubenswrapper[4958]: E1201 10:25:18.014281 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b0120025398535e9d85938c8f26efa0177a02d1abf12d580c9698cb7a91595d\": container with ID starting with 7b0120025398535e9d85938c8f26efa0177a02d1abf12d580c9698cb7a91595d not found: ID does not exist" containerID="7b0120025398535e9d85938c8f26efa0177a02d1abf12d580c9698cb7a91595d" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.014308 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b0120025398535e9d85938c8f26efa0177a02d1abf12d580c9698cb7a91595d"} err="failed to get container status \"7b0120025398535e9d85938c8f26efa0177a02d1abf12d580c9698cb7a91595d\": rpc error: code = NotFound desc = could not find container \"7b0120025398535e9d85938c8f26efa0177a02d1abf12d580c9698cb7a91595d\": container with ID starting with 7b0120025398535e9d85938c8f26efa0177a02d1abf12d580c9698cb7a91595d not found: ID does not exist" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.049481 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g88q\" (UniqueName: \"kubernetes.io/projected/1176586f-1551-451b-8630-82c16eb35a4b-kube-api-access-4g88q\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.050023 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.050083 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.050144 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.050195 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1176586f-1551-451b-8630-82c16eb35a4b-log-httpd\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.050221 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1176586f-1551-451b-8630-82c16eb35a4b-run-httpd\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.050240 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.050263 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-config-data\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.050282 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.050303 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-scripts\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.050346 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrqpx\" (UniqueName: \"kubernetes.io/projected/8e18a1be-f832-499c-a518-50dad90cafc9-kube-api-access-vrqpx\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.050392 4958 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.050435 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.051705 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1176586f-1551-451b-8630-82c16eb35a4b-run-httpd\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.052254 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1176586f-1551-451b-8630-82c16eb35a4b-log-httpd\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.057102 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.057129 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.058653 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.059355 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.059418 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.059929 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " 
pod="openstack/ceilometer-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.060445 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-config-data\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.064776 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-scripts\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.068905 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.078343 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrqpx\" (UniqueName: \"kubernetes.io/projected/8e18a1be-f832-499c-a518-50dad90cafc9-kube-api-access-vrqpx\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.078804 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g88q\" (UniqueName: \"kubernetes.io/projected/1176586f-1551-451b-8630-82c16eb35a4b-kube-api-access-4g88q\") pod \"ceilometer-0\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " pod="openstack/ceilometer-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.189133 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.192281 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.206298 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.228193 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.231349 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.237862 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.239092 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.239331 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.243111 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.255078 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " pod="openstack/nova-metadata-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.255149 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/533e79a5-c45e-42a3-828c-ef2b0b0ce569-logs\") pod \"nova-metadata-0\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " pod="openstack/nova-metadata-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.255221 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-config-data\") pod \"nova-metadata-0\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " pod="openstack/nova-metadata-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.255254 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s22p2\" (UniqueName: \"kubernetes.io/projected/533e79a5-c45e-42a3-828c-ef2b0b0ce569-kube-api-access-s22p2\") pod \"nova-metadata-0\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " pod="openstack/nova-metadata-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.255286 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " pod="openstack/nova-metadata-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.361268 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " pod="openstack/nova-metadata-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.361340 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/533e79a5-c45e-42a3-828c-ef2b0b0ce569-logs\") pod \"nova-metadata-0\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " pod="openstack/nova-metadata-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.361456 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-config-data\") pod \"nova-metadata-0\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " pod="openstack/nova-metadata-0" Dec 01 
10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.361491 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s22p2\" (UniqueName: \"kubernetes.io/projected/533e79a5-c45e-42a3-828c-ef2b0b0ce569-kube-api-access-s22p2\") pod \"nova-metadata-0\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " pod="openstack/nova-metadata-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.361517 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " pod="openstack/nova-metadata-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.363636 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/533e79a5-c45e-42a3-828c-ef2b0b0ce569-logs\") pod \"nova-metadata-0\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " pod="openstack/nova-metadata-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.368323 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " pod="openstack/nova-metadata-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.491034 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " pod="openstack/nova-metadata-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.506001 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-config-data\") pod \"nova-metadata-0\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " pod="openstack/nova-metadata-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.521241 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s22p2\" (UniqueName: \"kubernetes.io/projected/533e79a5-c45e-42a3-828c-ef2b0b0ce569-kube-api-access-s22p2\") pod \"nova-metadata-0\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " pod="openstack/nova-metadata-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.545923 4958 util.go:30] "No sandbox for pod can be found. 
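
Every volume in the mount and unmount entries above carries a UniqueName of the form <plugin>/<podUID>-<volumeName>, e.g. kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-config-data, which is why two pods can each have their own "config-data" without colliding. Since a pod UID is always 36 characters, the string splits at fixed offsets; a parsing sketch only, not a kubelet API:

package main

import (
	"fmt"
	"strings"
)

// Split a UniqueName into plugin, pod UID, and volume name.
func main() {
	u := "kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-config-data"
	i := strings.LastIndex(u, "/")
	plugin, rest := u[:i], u[i+1:]
	uid, vol := rest[:36], rest[37:]
	fmt.Println(plugin) // kubernetes.io/secret
	fmt.Println(uid)    // 533e79a5-c45e-42a3-828c-ef2b0b0ce569
	fmt.Println(vol)    // config-data
}
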
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.779007 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.779677 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.781488 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.798401 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.908998 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:25:18 crc kubenswrapper[4958]: W1201 10:25:18.911436 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod533e79a5_c45e_42a3_828c_ef2b0b0ce569.slice/crio-5464af2f40c3697e399f3d6cbb37a51f8444f54b58102b257569150be4e2cd3b WatchSource:0}: Error finding container 5464af2f40c3697e399f3d6cbb37a51f8444f54b58102b257569150be4e2cd3b: Status 404 returned error can't find the container with id 5464af2f40c3697e399f3d6cbb37a51f8444f54b58102b257569150be4e2cd3b Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.932029 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:25:18 crc kubenswrapper[4958]: I1201 10:25:18.979718 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:25:18 crc kubenswrapper[4958]: W1201 10:25:18.986834 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e18a1be_f832_499c_a518_50dad90cafc9.slice/crio-d2fadc4de066f6fb651416ad9dd57ef198ecee3855c4c69a982a18945645906c WatchSource:0}: Error finding container d2fadc4de066f6fb651416ad9dd57ef198ecee3855c4c69a982a18945645906c: Status 404 returned error can't find the container with id d2fadc4de066f6fb651416ad9dd57ef198ecee3855c4c69a982a18945645906c Dec 01 10:25:19 crc kubenswrapper[4958]: I1201 10:25:19.763307 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"533e79a5-c45e-42a3-828c-ef2b0b0ce569","Type":"ContainerStarted","Data":"e94400eccb46cd34d7f89b975380d542a64bdc8039758852759fa41a552d36f1"} Dec 01 10:25:19 crc kubenswrapper[4958]: I1201 10:25:19.763692 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"533e79a5-c45e-42a3-828c-ef2b0b0ce569","Type":"ContainerStarted","Data":"a9f653232c0a42e82d33efa999f358ee28d59b2f0bbe0e228667bebb3a1a338f"} Dec 01 10:25:19 crc kubenswrapper[4958]: I1201 10:25:19.763710 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"533e79a5-c45e-42a3-828c-ef2b0b0ce569","Type":"ContainerStarted","Data":"5464af2f40c3697e399f3d6cbb37a51f8444f54b58102b257569150be4e2cd3b"} Dec 01 10:25:19 crc kubenswrapper[4958]: I1201 10:25:19.767306 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8e18a1be-f832-499c-a518-50dad90cafc9","Type":"ContainerStarted","Data":"322b91b947af356def1df34f9b0bd833e4b9e520e2557a5215e4e51916c5373c"} Dec 01 10:25:19 crc kubenswrapper[4958]: I1201 10:25:19.767343 4958 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8e18a1be-f832-499c-a518-50dad90cafc9","Type":"ContainerStarted","Data":"d2fadc4de066f6fb651416ad9dd57ef198ecee3855c4c69a982a18945645906c"} Dec 01 10:25:19 crc kubenswrapper[4958]: I1201 10:25:19.770342 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1176586f-1551-451b-8630-82c16eb35a4b","Type":"ContainerStarted","Data":"382b818d9df0a25b912b07d33275b0aa758cd129d25c2ec4c9aac388f728430d"} Dec 01 10:25:19 crc kubenswrapper[4958]: I1201 10:25:19.770468 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1176586f-1551-451b-8630-82c16eb35a4b","Type":"ContainerStarted","Data":"62ac801151a1b7014a67a2c55dcd9391e8dc01df5e32db70281d49542a39ae35"} Dec 01 10:25:19 crc kubenswrapper[4958]: I1201 10:25:19.770510 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 10:25:19 crc kubenswrapper[4958]: I1201 10:25:19.775760 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 10:25:19 crc kubenswrapper[4958]: I1201 10:25:19.795097 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.795073442 podStartE2EDuration="1.795073442s" podCreationTimestamp="2025-12-01 10:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:25:19.785154856 +0000 UTC m=+1567.293943893" watchObservedRunningTime="2025-12-01 10:25:19.795073442 +0000 UTC m=+1567.303862479" Dec 01 10:25:19 crc kubenswrapper[4958]: I1201 10:25:19.816586 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32b48e0e-eb10-4c19-a505-e82b2617fd54" path="/var/lib/kubelet/pods/32b48e0e-eb10-4c19-a505-e82b2617fd54/volumes" Dec 01 10:25:19 crc kubenswrapper[4958]: I1201 10:25:19.820932 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6932e428-bf22-45fe-a500-f8082c039d0b" path="/var/lib/kubelet/pods/6932e428-bf22-45fe-a500-f8082c039d0b/volumes" Dec 01 10:25:19 crc kubenswrapper[4958]: I1201 10:25:19.822603 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7e96a15-eefc-47b6-98c0-60c772de185e" path="/var/lib/kubelet/pods/c7e96a15-eefc-47b6-98c0-60c772de185e/volumes" Dec 01 10:25:19 crc kubenswrapper[4958]: I1201 10:25:19.833436 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.833407745 podStartE2EDuration="2.833407745s" podCreationTimestamp="2025-12-01 10:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:25:19.81169871 +0000 UTC m=+1567.320487767" watchObservedRunningTime="2025-12-01 10:25:19.833407745 +0000 UTC m=+1567.342196782" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.289365 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-4p4kj"] Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.300567 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.316491 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-config\") pod \"dnsmasq-dns-89c5cd4d5-4p4kj\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.316550 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-4p4kj\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.316616 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r5nn\" (UniqueName: \"kubernetes.io/projected/7e1110df-28ad-4b93-ad3b-54d771229959-kube-api-access-6r5nn\") pod \"dnsmasq-dns-89c5cd4d5-4p4kj\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.316665 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-4p4kj\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.316769 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-4p4kj\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.316821 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-4p4kj\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.319554 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-4p4kj"] Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.419413 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r5nn\" (UniqueName: \"kubernetes.io/projected/7e1110df-28ad-4b93-ad3b-54d771229959-kube-api-access-6r5nn\") pod \"dnsmasq-dns-89c5cd4d5-4p4kj\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.419497 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-4p4kj\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.419595 4958 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-4p4kj\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.419630 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-4p4kj\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.419666 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-config\") pod \"dnsmasq-dns-89c5cd4d5-4p4kj\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.419689 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-4p4kj\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.429732 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-4p4kj\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.437033 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-4p4kj\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.437048 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-4p4kj\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.442167 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-4p4kj\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.445520 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-config\") pod \"dnsmasq-dns-89c5cd4d5-4p4kj\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.477702 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r5nn\" (UniqueName: 
\"kubernetes.io/projected/7e1110df-28ad-4b93-ad3b-54d771229959-kube-api-access-6r5nn\") pod \"dnsmasq-dns-89c5cd4d5-4p4kj\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.540114 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:20 crc kubenswrapper[4958]: I1201 10:25:20.810401 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1176586f-1551-451b-8630-82c16eb35a4b","Type":"ContainerStarted","Data":"303be2cee97b3b3ab44ad625910cb79d316f8796c9493c7363bfc267dbaa9fc6"} Dec 01 10:25:21 crc kubenswrapper[4958]: I1201 10:25:21.106278 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-4p4kj"] Dec 01 10:25:21 crc kubenswrapper[4958]: W1201 10:25:21.123178 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e1110df_28ad_4b93_ad3b_54d771229959.slice/crio-01d0431fa6b569a7e97c7666632005762608c83ec16ae1df82e372bfd9881aad WatchSource:0}: Error finding container 01d0431fa6b569a7e97c7666632005762608c83ec16ae1df82e372bfd9881aad: Status 404 returned error can't find the container with id 01d0431fa6b569a7e97c7666632005762608c83ec16ae1df82e372bfd9881aad Dec 01 10:25:21 crc kubenswrapper[4958]: I1201 10:25:21.823858 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1176586f-1551-451b-8630-82c16eb35a4b","Type":"ContainerStarted","Data":"f26952a500aa4af6a5ea51695f4afb5a755ae5a628bfd623788f1822cc97596d"} Dec 01 10:25:21 crc kubenswrapper[4958]: I1201 10:25:21.826987 4958 generic.go:334] "Generic (PLEG): container finished" podID="7e1110df-28ad-4b93-ad3b-54d771229959" containerID="06d8019ce9173b9ad8867a99fdfaebe7781104806d5ab186f185a02eb0a3a524" exitCode=0 Dec 01 10:25:21 crc kubenswrapper[4958]: I1201 10:25:21.827114 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" event={"ID":"7e1110df-28ad-4b93-ad3b-54d771229959","Type":"ContainerDied","Data":"06d8019ce9173b9ad8867a99fdfaebe7781104806d5ab186f185a02eb0a3a524"} Dec 01 10:25:21 crc kubenswrapper[4958]: I1201 10:25:21.827213 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" event={"ID":"7e1110df-28ad-4b93-ad3b-54d771229959","Type":"ContainerStarted","Data":"01d0431fa6b569a7e97c7666632005762608c83ec16ae1df82e372bfd9881aad"} Dec 01 10:25:22 crc kubenswrapper[4958]: I1201 10:25:22.842079 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" event={"ID":"7e1110df-28ad-4b93-ad3b-54d771229959","Type":"ContainerStarted","Data":"7331ebc18b17e0727114d123f2215ce8c358366e2d09e664d2eac5884cbc980a"} Dec 01 10:25:22 crc kubenswrapper[4958]: I1201 10:25:22.843159 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:22 crc kubenswrapper[4958]: I1201 10:25:22.875659 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" podStartSLOduration=2.875634319 podStartE2EDuration="2.875634319s" podCreationTimestamp="2025-12-01 10:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:25:22.863351026 +0000 UTC 
m=+1570.372140063" watchObservedRunningTime="2025-12-01 10:25:22.875634319 +0000 UTC m=+1570.384423356" Dec 01 10:25:23 crc kubenswrapper[4958]: I1201 10:25:23.082081 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:25:23 crc kubenswrapper[4958]: I1201 10:25:23.082378 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b836e7e1-9cda-4682-8746-c2be7723836a" containerName="nova-api-log" containerID="cri-o://0c3790e0327baf2cb85785f48abfba6f747e92832f1f916c995ea32cee44160a" gracePeriod=30 Dec 01 10:25:23 crc kubenswrapper[4958]: I1201 10:25:23.082977 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b836e7e1-9cda-4682-8746-c2be7723836a" containerName="nova-api-api" containerID="cri-o://d241872bc1a9989cd6277bda8c72582927f92217d2d1c1887c4716f71c00b3e3" gracePeriod=30 Dec 01 10:25:23 crc kubenswrapper[4958]: I1201 10:25:23.240640 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:23 crc kubenswrapper[4958]: I1201 10:25:23.547755 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 10:25:23 crc kubenswrapper[4958]: I1201 10:25:23.548203 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 10:25:23 crc kubenswrapper[4958]: I1201 10:25:23.855797 4958 generic.go:334] "Generic (PLEG): container finished" podID="b836e7e1-9cda-4682-8746-c2be7723836a" containerID="0c3790e0327baf2cb85785f48abfba6f747e92832f1f916c995ea32cee44160a" exitCode=143 Dec 01 10:25:23 crc kubenswrapper[4958]: I1201 10:25:23.855921 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b836e7e1-9cda-4682-8746-c2be7723836a","Type":"ContainerDied","Data":"0c3790e0327baf2cb85785f48abfba6f747e92832f1f916c995ea32cee44160a"} Dec 01 10:25:23 crc kubenswrapper[4958]: I1201 10:25:23.862205 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1176586f-1551-451b-8630-82c16eb35a4b","Type":"ContainerStarted","Data":"44515626bb844e0b917140eea59bbbf049111ed442b2783ee5f56d6d7ba9a745"} Dec 01 10:25:23 crc kubenswrapper[4958]: I1201 10:25:23.896734 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.585835531 podStartE2EDuration="6.896702819s" podCreationTimestamp="2025-12-01 10:25:17 +0000 UTC" firstStartedPulling="2025-12-01 10:25:18.923080741 +0000 UTC m=+1566.431869778" lastFinishedPulling="2025-12-01 10:25:23.233948029 +0000 UTC m=+1570.742737066" observedRunningTime="2025-12-01 10:25:23.889505892 +0000 UTC m=+1571.398294929" watchObservedRunningTime="2025-12-01 10:25:23.896702819 +0000 UTC m=+1571.405491856" Dec 01 10:25:24 crc kubenswrapper[4958]: I1201 10:25:24.302758 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:25:24 crc kubenswrapper[4958]: I1201 10:25:24.873497 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 10:25:25 crc kubenswrapper[4958]: I1201 10:25:25.134295 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 01 10:25:25 crc kubenswrapper[4958]: I1201 10:25:25.882930 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="1176586f-1551-451b-8630-82c16eb35a4b" containerName="ceilometer-central-agent" containerID="cri-o://382b818d9df0a25b912b07d33275b0aa758cd129d25c2ec4c9aac388f728430d" gracePeriod=30 Dec 01 10:25:25 crc kubenswrapper[4958]: I1201 10:25:25.883163 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1176586f-1551-451b-8630-82c16eb35a4b" containerName="ceilometer-notification-agent" containerID="cri-o://303be2cee97b3b3ab44ad625910cb79d316f8796c9493c7363bfc267dbaa9fc6" gracePeriod=30 Dec 01 10:25:25 crc kubenswrapper[4958]: I1201 10:25:25.883203 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1176586f-1551-451b-8630-82c16eb35a4b" containerName="sg-core" containerID="cri-o://f26952a500aa4af6a5ea51695f4afb5a755ae5a628bfd623788f1822cc97596d" gracePeriod=30 Dec 01 10:25:25 crc kubenswrapper[4958]: I1201 10:25:25.883375 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1176586f-1551-451b-8630-82c16eb35a4b" containerName="proxy-httpd" containerID="cri-o://44515626bb844e0b917140eea59bbbf049111ed442b2783ee5f56d6d7ba9a745" gracePeriod=30 Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.798913 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.896862 4958 generic.go:334] "Generic (PLEG): container finished" podID="b836e7e1-9cda-4682-8746-c2be7723836a" containerID="d241872bc1a9989cd6277bda8c72582927f92217d2d1c1887c4716f71c00b3e3" exitCode=0 Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.897148 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b836e7e1-9cda-4682-8746-c2be7723836a","Type":"ContainerDied","Data":"d241872bc1a9989cd6277bda8c72582927f92217d2d1c1887c4716f71c00b3e3"} Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.897511 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b836e7e1-9cda-4682-8746-c2be7723836a","Type":"ContainerDied","Data":"6644248a78169d0a24785355571085afe37a27f7771f115efb8a287cebdc6ada"} Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.897548 4958 scope.go:117] "RemoveContainer" containerID="d241872bc1a9989cd6277bda8c72582927f92217d2d1c1887c4716f71c00b3e3" Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.897213 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.905246 4958 generic.go:334] "Generic (PLEG): container finished" podID="1176586f-1551-451b-8630-82c16eb35a4b" containerID="44515626bb844e0b917140eea59bbbf049111ed442b2783ee5f56d6d7ba9a745" exitCode=0 Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.905295 4958 generic.go:334] "Generic (PLEG): container finished" podID="1176586f-1551-451b-8630-82c16eb35a4b" containerID="f26952a500aa4af6a5ea51695f4afb5a755ae5a628bfd623788f1822cc97596d" exitCode=2 Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.905305 4958 generic.go:334] "Generic (PLEG): container finished" podID="1176586f-1551-451b-8630-82c16eb35a4b" containerID="303be2cee97b3b3ab44ad625910cb79d316f8796c9493c7363bfc267dbaa9fc6" exitCode=0 Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.905345 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1176586f-1551-451b-8630-82c16eb35a4b","Type":"ContainerDied","Data":"44515626bb844e0b917140eea59bbbf049111ed442b2783ee5f56d6d7ba9a745"} Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.905381 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1176586f-1551-451b-8630-82c16eb35a4b","Type":"ContainerDied","Data":"f26952a500aa4af6a5ea51695f4afb5a755ae5a628bfd623788f1822cc97596d"} Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.905393 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1176586f-1551-451b-8630-82c16eb35a4b","Type":"ContainerDied","Data":"303be2cee97b3b3ab44ad625910cb79d316f8796c9493c7363bfc267dbaa9fc6"} Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.928719 4958 scope.go:117] "RemoveContainer" containerID="0c3790e0327baf2cb85785f48abfba6f747e92832f1f916c995ea32cee44160a" Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.954466 4958 scope.go:117] "RemoveContainer" containerID="d241872bc1a9989cd6277bda8c72582927f92217d2d1c1887c4716f71c00b3e3" Dec 01 10:25:26 crc kubenswrapper[4958]: E1201 10:25:26.955371 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d241872bc1a9989cd6277bda8c72582927f92217d2d1c1887c4716f71c00b3e3\": container with ID starting with d241872bc1a9989cd6277bda8c72582927f92217d2d1c1887c4716f71c00b3e3 not found: ID does not exist" containerID="d241872bc1a9989cd6277bda8c72582927f92217d2d1c1887c4716f71c00b3e3" Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.955464 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d241872bc1a9989cd6277bda8c72582927f92217d2d1c1887c4716f71c00b3e3"} err="failed to get container status \"d241872bc1a9989cd6277bda8c72582927f92217d2d1c1887c4716f71c00b3e3\": rpc error: code = NotFound desc = could not find container \"d241872bc1a9989cd6277bda8c72582927f92217d2d1c1887c4716f71c00b3e3\": container with ID starting with d241872bc1a9989cd6277bda8c72582927f92217d2d1c1887c4716f71c00b3e3 not found: ID does not exist" Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.955520 4958 scope.go:117] "RemoveContainer" containerID="0c3790e0327baf2cb85785f48abfba6f747e92832f1f916c995ea32cee44160a" Dec 01 10:25:26 crc kubenswrapper[4958]: E1201 10:25:26.956148 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0c3790e0327baf2cb85785f48abfba6f747e92832f1f916c995ea32cee44160a\": container with ID starting with 0c3790e0327baf2cb85785f48abfba6f747e92832f1f916c995ea32cee44160a not found: ID does not exist" containerID="0c3790e0327baf2cb85785f48abfba6f747e92832f1f916c995ea32cee44160a" Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.956174 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c3790e0327baf2cb85785f48abfba6f747e92832f1f916c995ea32cee44160a"} err="failed to get container status \"0c3790e0327baf2cb85785f48abfba6f747e92832f1f916c995ea32cee44160a\": rpc error: code = NotFound desc = could not find container \"0c3790e0327baf2cb85785f48abfba6f747e92832f1f916c995ea32cee44160a\": container with ID starting with 0c3790e0327baf2cb85785f48abfba6f747e92832f1f916c995ea32cee44160a not found: ID does not exist" Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.983929 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b836e7e1-9cda-4682-8746-c2be7723836a-config-data\") pod \"b836e7e1-9cda-4682-8746-c2be7723836a\" (UID: \"b836e7e1-9cda-4682-8746-c2be7723836a\") " Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.984098 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b836e7e1-9cda-4682-8746-c2be7723836a-logs\") pod \"b836e7e1-9cda-4682-8746-c2be7723836a\" (UID: \"b836e7e1-9cda-4682-8746-c2be7723836a\") " Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.984166 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v726k\" (UniqueName: \"kubernetes.io/projected/b836e7e1-9cda-4682-8746-c2be7723836a-kube-api-access-v726k\") pod \"b836e7e1-9cda-4682-8746-c2be7723836a\" (UID: \"b836e7e1-9cda-4682-8746-c2be7723836a\") " Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.984196 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b836e7e1-9cda-4682-8746-c2be7723836a-combined-ca-bundle\") pod \"b836e7e1-9cda-4682-8746-c2be7723836a\" (UID: \"b836e7e1-9cda-4682-8746-c2be7723836a\") " Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.985102 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b836e7e1-9cda-4682-8746-c2be7723836a-logs" (OuterVolumeSpecName: "logs") pod "b836e7e1-9cda-4682-8746-c2be7723836a" (UID: "b836e7e1-9cda-4682-8746-c2be7723836a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.986958 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b836e7e1-9cda-4682-8746-c2be7723836a-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:26 crc kubenswrapper[4958]: I1201 10:25:26.992218 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b836e7e1-9cda-4682-8746-c2be7723836a-kube-api-access-v726k" (OuterVolumeSpecName: "kube-api-access-v726k") pod "b836e7e1-9cda-4682-8746-c2be7723836a" (UID: "b836e7e1-9cda-4682-8746-c2be7723836a"). InnerVolumeSpecName "kube-api-access-v726k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.035941 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b836e7e1-9cda-4682-8746-c2be7723836a-config-data" (OuterVolumeSpecName: "config-data") pod "b836e7e1-9cda-4682-8746-c2be7723836a" (UID: "b836e7e1-9cda-4682-8746-c2be7723836a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.046026 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b836e7e1-9cda-4682-8746-c2be7723836a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b836e7e1-9cda-4682-8746-c2be7723836a" (UID: "b836e7e1-9cda-4682-8746-c2be7723836a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.090272 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v726k\" (UniqueName: \"kubernetes.io/projected/b836e7e1-9cda-4682-8746-c2be7723836a-kube-api-access-v726k\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.090318 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b836e7e1-9cda-4682-8746-c2be7723836a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.090329 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b836e7e1-9cda-4682-8746-c2be7723836a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.239757 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.251498 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.270727 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 10:25:27 crc kubenswrapper[4958]: E1201 10:25:27.271307 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b836e7e1-9cda-4682-8746-c2be7723836a" containerName="nova-api-log" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.271326 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b836e7e1-9cda-4682-8746-c2be7723836a" containerName="nova-api-log" Dec 01 10:25:27 crc kubenswrapper[4958]: E1201 10:25:27.271361 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b836e7e1-9cda-4682-8746-c2be7723836a" containerName="nova-api-api" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.271368 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b836e7e1-9cda-4682-8746-c2be7723836a" containerName="nova-api-api" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.271648 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="b836e7e1-9cda-4682-8746-c2be7723836a" containerName="nova-api-api" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.271677 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="b836e7e1-9cda-4682-8746-c2be7723836a" containerName="nova-api-log" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.273153 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.276349 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.276507 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.277321 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.289693 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.295343 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-public-tls-certs\") pod \"nova-api-0\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.295430 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-logs\") pod \"nova-api-0\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.295541 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.295580 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.295618 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-config-data\") pod \"nova-api-0\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.295652 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj778\" (UniqueName: \"kubernetes.io/projected/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-kube-api-access-qj778\") pod \"nova-api-0\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.397645 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-logs\") pod \"nova-api-0\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.397714 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.397765 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.397806 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-config-data\") pod \"nova-api-0\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.397876 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj778\" (UniqueName: \"kubernetes.io/projected/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-kube-api-access-qj778\") pod \"nova-api-0\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.397971 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-public-tls-certs\") pod \"nova-api-0\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.398461 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-logs\") pod \"nova-api-0\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.403342 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-public-tls-certs\") pod \"nova-api-0\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.404067 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-config-data\") pod \"nova-api-0\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.405292 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.416224 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.418495 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj778\" (UniqueName: \"kubernetes.io/projected/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-kube-api-access-qj778\") pod \"nova-api-0\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " 
pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.596241 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:25:27 crc kubenswrapper[4958]: I1201 10:25:27.821262 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b836e7e1-9cda-4682-8746-c2be7723836a" path="/var/lib/kubelet/pods/b836e7e1-9cda-4682-8746-c2be7723836a/volumes" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.094540 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.210920 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.211007 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.211073 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.212221 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.212318 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" gracePeriod=600 Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.239009 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.274384 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:28 crc kubenswrapper[4958]: E1201 10:25:28.385557 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.546991 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.547383 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.710612 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.855393 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1176586f-1551-451b-8630-82c16eb35a4b-run-httpd\") pod \"1176586f-1551-451b-8630-82c16eb35a4b\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.855555 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-combined-ca-bundle\") pod \"1176586f-1551-451b-8630-82c16eb35a4b\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.855614 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-sg-core-conf-yaml\") pod \"1176586f-1551-451b-8630-82c16eb35a4b\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.855699 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-scripts\") pod \"1176586f-1551-451b-8630-82c16eb35a4b\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.855812 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-config-data\") pod \"1176586f-1551-451b-8630-82c16eb35a4b\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.855918 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1176586f-1551-451b-8630-82c16eb35a4b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1176586f-1551-451b-8630-82c16eb35a4b" (UID: "1176586f-1551-451b-8630-82c16eb35a4b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.855928 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g88q\" (UniqueName: \"kubernetes.io/projected/1176586f-1551-451b-8630-82c16eb35a4b-kube-api-access-4g88q\") pod \"1176586f-1551-451b-8630-82c16eb35a4b\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.856062 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-ceilometer-tls-certs\") pod \"1176586f-1551-451b-8630-82c16eb35a4b\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.856111 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1176586f-1551-451b-8630-82c16eb35a4b-log-httpd\") pod \"1176586f-1551-451b-8630-82c16eb35a4b\" (UID: \"1176586f-1551-451b-8630-82c16eb35a4b\") " Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.856721 4958 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1176586f-1551-451b-8630-82c16eb35a4b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.857074 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1176586f-1551-451b-8630-82c16eb35a4b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1176586f-1551-451b-8630-82c16eb35a4b" (UID: "1176586f-1551-451b-8630-82c16eb35a4b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.862386 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-scripts" (OuterVolumeSpecName: "scripts") pod "1176586f-1551-451b-8630-82c16eb35a4b" (UID: "1176586f-1551-451b-8630-82c16eb35a4b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.864497 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1176586f-1551-451b-8630-82c16eb35a4b-kube-api-access-4g88q" (OuterVolumeSpecName: "kube-api-access-4g88q") pod "1176586f-1551-451b-8630-82c16eb35a4b" (UID: "1176586f-1551-451b-8630-82c16eb35a4b"). InnerVolumeSpecName "kube-api-access-4g88q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.896596 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1176586f-1551-451b-8630-82c16eb35a4b" (UID: "1176586f-1551-451b-8630-82c16eb35a4b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.935136 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1176586f-1551-451b-8630-82c16eb35a4b" (UID: "1176586f-1551-451b-8630-82c16eb35a4b"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.961801 4958 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.961868 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.961938 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g88q\" (UniqueName: \"kubernetes.io/projected/1176586f-1551-451b-8630-82c16eb35a4b-kube-api-access-4g88q\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.961956 4958 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.961969 4958 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1176586f-1551-451b-8630-82c16eb35a4b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.984706 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" exitCode=0 Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.984809 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4"} Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.984875 4958 scope.go:117] "RemoveContainer" containerID="f09314f73af3b199ee4f78ab6cf71768969c699a4967f9d91c7d9dc73162183f" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.986748 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:25:28 crc kubenswrapper[4958]: E1201 10:25:28.987809 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.990881 4958 generic.go:334] "Generic (PLEG): container finished" podID="1176586f-1551-451b-8630-82c16eb35a4b" containerID="382b818d9df0a25b912b07d33275b0aa758cd129d25c2ec4c9aac388f728430d" exitCode=0 Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.990991 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1176586f-1551-451b-8630-82c16eb35a4b","Type":"ContainerDied","Data":"382b818d9df0a25b912b07d33275b0aa758cd129d25c2ec4c9aac388f728430d"} Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.991034 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1176586f-1551-451b-8630-82c16eb35a4b","Type":"ContainerDied","Data":"62ac801151a1b7014a67a2c55dcd9391e8dc01df5e32db70281d49542a39ae35"} Dec 01 10:25:28 crc kubenswrapper[4958]: I1201 10:25:28.991317 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.008004 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e","Type":"ContainerStarted","Data":"19010a84f7a9568ef1901b6e9182956061c06e9494330293000da7664685ff8b"} Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.008074 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e","Type":"ContainerStarted","Data":"18da8c88d92e54dbbd624b41111a4e6c2b2cc7e1208368257aa006815485f972"} Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.008086 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e","Type":"ContainerStarted","Data":"540f8736596daebad6fd99b1f46ad61b14134d0a28b81b6b0f1efc39fef23de1"} Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.012137 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1176586f-1551-451b-8630-82c16eb35a4b" (UID: "1176586f-1551-451b-8630-82c16eb35a4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.049102 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.077813 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.077788957 podStartE2EDuration="2.077788957s" podCreationTimestamp="2025-12-01 10:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:25:29.064066462 +0000 UTC m=+1576.572855499" watchObservedRunningTime="2025-12-01 10:25:29.077788957 +0000 UTC m=+1576.586577994" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.080237 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.146287 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-config-data" (OuterVolumeSpecName: "config-data") pod "1176586f-1551-451b-8630-82c16eb35a4b" (UID: "1176586f-1551-451b-8630-82c16eb35a4b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.195341 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1176586f-1551-451b-8630-82c16eb35a4b-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.253053 4958 scope.go:117] "RemoveContainer" containerID="44515626bb844e0b917140eea59bbbf049111ed442b2783ee5f56d6d7ba9a745" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.295027 4958 scope.go:117] "RemoveContainer" containerID="f26952a500aa4af6a5ea51695f4afb5a755ae5a628bfd623788f1822cc97596d" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.298912 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-nnnl2"] Dec 01 10:25:29 crc kubenswrapper[4958]: E1201 10:25:29.299531 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1176586f-1551-451b-8630-82c16eb35a4b" containerName="proxy-httpd" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.299626 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1176586f-1551-451b-8630-82c16eb35a4b" containerName="proxy-httpd" Dec 01 10:25:29 crc kubenswrapper[4958]: E1201 10:25:29.299973 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1176586f-1551-451b-8630-82c16eb35a4b" containerName="sg-core" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.300043 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1176586f-1551-451b-8630-82c16eb35a4b" containerName="sg-core" Dec 01 10:25:29 crc kubenswrapper[4958]: E1201 10:25:29.300116 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1176586f-1551-451b-8630-82c16eb35a4b" containerName="ceilometer-notification-agent" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.300173 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1176586f-1551-451b-8630-82c16eb35a4b" containerName="ceilometer-notification-agent" Dec 01 10:25:29 crc kubenswrapper[4958]: E1201 10:25:29.300249 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1176586f-1551-451b-8630-82c16eb35a4b" containerName="ceilometer-central-agent" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.300303 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1176586f-1551-451b-8630-82c16eb35a4b" containerName="ceilometer-central-agent" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.300577 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="1176586f-1551-451b-8630-82c16eb35a4b" containerName="proxy-httpd" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.300652 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="1176586f-1551-451b-8630-82c16eb35a4b" containerName="ceilometer-central-agent" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.300714 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="1176586f-1551-451b-8630-82c16eb35a4b" containerName="sg-core" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.300784 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="1176586f-1551-451b-8630-82c16eb35a4b" containerName="ceilometer-notification-agent" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.301862 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nnnl2" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.305416 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.306830 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.328083 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-nnnl2"] Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.339374 4958 scope.go:117] "RemoveContainer" containerID="303be2cee97b3b3ab44ad625910cb79d316f8796c9493c7363bfc267dbaa9fc6" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.374711 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.401293 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.403884 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a947fbcc-a378-4d94-a37b-75164b9e2746-scripts\") pod \"nova-cell1-cell-mapping-nnnl2\" (UID: \"a947fbcc-a378-4d94-a37b-75164b9e2746\") " pod="openstack/nova-cell1-cell-mapping-nnnl2" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.403990 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a947fbcc-a378-4d94-a37b-75164b9e2746-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-nnnl2\" (UID: \"a947fbcc-a378-4d94-a37b-75164b9e2746\") " pod="openstack/nova-cell1-cell-mapping-nnnl2" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.404030 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x57dz\" (UniqueName: \"kubernetes.io/projected/a947fbcc-a378-4d94-a37b-75164b9e2746-kube-api-access-x57dz\") pod \"nova-cell1-cell-mapping-nnnl2\" (UID: \"a947fbcc-a378-4d94-a37b-75164b9e2746\") " pod="openstack/nova-cell1-cell-mapping-nnnl2" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.404078 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a947fbcc-a378-4d94-a37b-75164b9e2746-config-data\") pod \"nova-cell1-cell-mapping-nnnl2\" (UID: \"a947fbcc-a378-4d94-a37b-75164b9e2746\") " pod="openstack/nova-cell1-cell-mapping-nnnl2" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.418721 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.421257 4958 scope.go:117] "RemoveContainer" containerID="382b818d9df0a25b912b07d33275b0aa758cd129d25c2ec4c9aac388f728430d" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.421997 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.430560 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.430954 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.431141 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.442694 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.465143 4958 scope.go:117] "RemoveContainer" containerID="44515626bb844e0b917140eea59bbbf049111ed442b2783ee5f56d6d7ba9a745" Dec 01 10:25:29 crc kubenswrapper[4958]: E1201 10:25:29.465885 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44515626bb844e0b917140eea59bbbf049111ed442b2783ee5f56d6d7ba9a745\": container with ID starting with 44515626bb844e0b917140eea59bbbf049111ed442b2783ee5f56d6d7ba9a745 not found: ID does not exist" containerID="44515626bb844e0b917140eea59bbbf049111ed442b2783ee5f56d6d7ba9a745" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.465955 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44515626bb844e0b917140eea59bbbf049111ed442b2783ee5f56d6d7ba9a745"} err="failed to get container status \"44515626bb844e0b917140eea59bbbf049111ed442b2783ee5f56d6d7ba9a745\": rpc error: code = NotFound desc = could not find container \"44515626bb844e0b917140eea59bbbf049111ed442b2783ee5f56d6d7ba9a745\": container with ID starting with 44515626bb844e0b917140eea59bbbf049111ed442b2783ee5f56d6d7ba9a745 not found: ID does not exist" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.465992 4958 scope.go:117] "RemoveContainer" containerID="f26952a500aa4af6a5ea51695f4afb5a755ae5a628bfd623788f1822cc97596d" Dec 01 10:25:29 crc kubenswrapper[4958]: E1201 10:25:29.466531 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f26952a500aa4af6a5ea51695f4afb5a755ae5a628bfd623788f1822cc97596d\": container with ID starting with f26952a500aa4af6a5ea51695f4afb5a755ae5a628bfd623788f1822cc97596d not found: ID does not exist" containerID="f26952a500aa4af6a5ea51695f4afb5a755ae5a628bfd623788f1822cc97596d" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.466574 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f26952a500aa4af6a5ea51695f4afb5a755ae5a628bfd623788f1822cc97596d"} err="failed to get container status \"f26952a500aa4af6a5ea51695f4afb5a755ae5a628bfd623788f1822cc97596d\": rpc error: code = NotFound desc = could not find container \"f26952a500aa4af6a5ea51695f4afb5a755ae5a628bfd623788f1822cc97596d\": container with ID starting with f26952a500aa4af6a5ea51695f4afb5a755ae5a628bfd623788f1822cc97596d not found: ID does not exist" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.466601 4958 scope.go:117] "RemoveContainer" containerID="303be2cee97b3b3ab44ad625910cb79d316f8796c9493c7363bfc267dbaa9fc6" Dec 01 10:25:29 crc kubenswrapper[4958]: E1201 10:25:29.466995 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"303be2cee97b3b3ab44ad625910cb79d316f8796c9493c7363bfc267dbaa9fc6\": container with ID starting with 303be2cee97b3b3ab44ad625910cb79d316f8796c9493c7363bfc267dbaa9fc6 not found: ID does not exist" containerID="303be2cee97b3b3ab44ad625910cb79d316f8796c9493c7363bfc267dbaa9fc6" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.467068 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"303be2cee97b3b3ab44ad625910cb79d316f8796c9493c7363bfc267dbaa9fc6"} err="failed to get container status \"303be2cee97b3b3ab44ad625910cb79d316f8796c9493c7363bfc267dbaa9fc6\": rpc error: code = NotFound desc = could not find container \"303be2cee97b3b3ab44ad625910cb79d316f8796c9493c7363bfc267dbaa9fc6\": container with ID starting with 303be2cee97b3b3ab44ad625910cb79d316f8796c9493c7363bfc267dbaa9fc6 not found: ID does not exist" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.467114 4958 scope.go:117] "RemoveContainer" containerID="382b818d9df0a25b912b07d33275b0aa758cd129d25c2ec4c9aac388f728430d" Dec 01 10:25:29 crc kubenswrapper[4958]: E1201 10:25:29.467594 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"382b818d9df0a25b912b07d33275b0aa758cd129d25c2ec4c9aac388f728430d\": container with ID starting with 382b818d9df0a25b912b07d33275b0aa758cd129d25c2ec4c9aac388f728430d not found: ID does not exist" containerID="382b818d9df0a25b912b07d33275b0aa758cd129d25c2ec4c9aac388f728430d" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.467651 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"382b818d9df0a25b912b07d33275b0aa758cd129d25c2ec4c9aac388f728430d"} err="failed to get container status \"382b818d9df0a25b912b07d33275b0aa758cd129d25c2ec4c9aac388f728430d\": rpc error: code = NotFound desc = could not find container \"382b818d9df0a25b912b07d33275b0aa758cd129d25c2ec4c9aac388f728430d\": container with ID starting with 382b818d9df0a25b912b07d33275b0aa758cd129d25c2ec4c9aac388f728430d not found: ID does not exist" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.506452 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.506582 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.506617 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4bbf\" (UniqueName: \"kubernetes.io/projected/89868f5c-dfd8-4619-9b7e-02a5b75916db-kube-api-access-c4bbf\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.506650 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a947fbcc-a378-4d94-a37b-75164b9e2746-scripts\") pod \"nova-cell1-cell-mapping-nnnl2\" (UID: 
\"a947fbcc-a378-4d94-a37b-75164b9e2746\") " pod="openstack/nova-cell1-cell-mapping-nnnl2" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.506741 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-config-data\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.506799 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a947fbcc-a378-4d94-a37b-75164b9e2746-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-nnnl2\" (UID: \"a947fbcc-a378-4d94-a37b-75164b9e2746\") " pod="openstack/nova-cell1-cell-mapping-nnnl2" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.506821 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.506888 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x57dz\" (UniqueName: \"kubernetes.io/projected/a947fbcc-a378-4d94-a37b-75164b9e2746-kube-api-access-x57dz\") pod \"nova-cell1-cell-mapping-nnnl2\" (UID: \"a947fbcc-a378-4d94-a37b-75164b9e2746\") " pod="openstack/nova-cell1-cell-mapping-nnnl2" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.506909 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89868f5c-dfd8-4619-9b7e-02a5b75916db-log-httpd\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.506948 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-scripts\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.506976 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89868f5c-dfd8-4619-9b7e-02a5b75916db-run-httpd\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.507022 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a947fbcc-a378-4d94-a37b-75164b9e2746-config-data\") pod \"nova-cell1-cell-mapping-nnnl2\" (UID: \"a947fbcc-a378-4d94-a37b-75164b9e2746\") " pod="openstack/nova-cell1-cell-mapping-nnnl2" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.514139 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a947fbcc-a378-4d94-a37b-75164b9e2746-config-data\") pod \"nova-cell1-cell-mapping-nnnl2\" (UID: \"a947fbcc-a378-4d94-a37b-75164b9e2746\") " pod="openstack/nova-cell1-cell-mapping-nnnl2" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.516144 4958 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a947fbcc-a378-4d94-a37b-75164b9e2746-scripts\") pod \"nova-cell1-cell-mapping-nnnl2\" (UID: \"a947fbcc-a378-4d94-a37b-75164b9e2746\") " pod="openstack/nova-cell1-cell-mapping-nnnl2" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.517137 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a947fbcc-a378-4d94-a37b-75164b9e2746-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-nnnl2\" (UID: \"a947fbcc-a378-4d94-a37b-75164b9e2746\") " pod="openstack/nova-cell1-cell-mapping-nnnl2" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.545078 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x57dz\" (UniqueName: \"kubernetes.io/projected/a947fbcc-a378-4d94-a37b-75164b9e2746-kube-api-access-x57dz\") pod \"nova-cell1-cell-mapping-nnnl2\" (UID: \"a947fbcc-a378-4d94-a37b-75164b9e2746\") " pod="openstack/nova-cell1-cell-mapping-nnnl2" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.563328 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="533e79a5-c45e-42a3-828c-ef2b0b0ce569" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.563416 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="533e79a5-c45e-42a3-828c-ef2b0b0ce569" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.608877 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-scripts\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.609360 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89868f5c-dfd8-4619-9b7e-02a5b75916db-run-httpd\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.609521 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.609688 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.609790 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4bbf\" (UniqueName: \"kubernetes.io/projected/89868f5c-dfd8-4619-9b7e-02a5b75916db-kube-api-access-c4bbf\") pod \"ceilometer-0\" (UID: 
\"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.609989 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-config-data\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.610121 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.610242 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89868f5c-dfd8-4619-9b7e-02a5b75916db-log-httpd\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.610300 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89868f5c-dfd8-4619-9b7e-02a5b75916db-run-httpd\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.611660 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89868f5c-dfd8-4619-9b7e-02a5b75916db-log-httpd\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.613259 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-scripts\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.614102 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-config-data\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.617098 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.624341 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.626131 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 
10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.632638 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4bbf\" (UniqueName: \"kubernetes.io/projected/89868f5c-dfd8-4619-9b7e-02a5b75916db-kube-api-access-c4bbf\") pod \"ceilometer-0\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.646008 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nnnl2" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.755069 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:25:29 crc kubenswrapper[4958]: I1201 10:25:29.871192 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1176586f-1551-451b-8630-82c16eb35a4b" path="/var/lib/kubelet/pods/1176586f-1551-451b-8630-82c16eb35a4b/volumes" Dec 01 10:25:30 crc kubenswrapper[4958]: I1201 10:25:30.215986 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-nnnl2"] Dec 01 10:25:30 crc kubenswrapper[4958]: I1201 10:25:30.426166 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:25:30 crc kubenswrapper[4958]: W1201 10:25:30.442594 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89868f5c_dfd8_4619_9b7e_02a5b75916db.slice/crio-4b04c97099c975e03c6bba9c7249d424fb6a89d33d51c4af29cee645cc680b1e WatchSource:0}: Error finding container 4b04c97099c975e03c6bba9c7249d424fb6a89d33d51c4af29cee645cc680b1e: Status 404 returned error can't find the container with id 4b04c97099c975e03c6bba9c7249d424fb6a89d33d51c4af29cee645cc680b1e Dec 01 10:25:30 crc kubenswrapper[4958]: I1201 10:25:30.542731 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:25:30 crc kubenswrapper[4958]: I1201 10:25:30.649140 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-8g67m"] Dec 01 10:25:30 crc kubenswrapper[4958]: I1201 10:25:30.649408 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-8g67m" podUID="67eb34e2-8e1c-4adf-a952-59028475c264" containerName="dnsmasq-dns" containerID="cri-o://71c9e0138da3d2aec6aa6687165d5da89aba3185c7b7522e5005093c2d2518b5" gracePeriod=10 Dec 01 10:25:30 crc kubenswrapper[4958]: I1201 10:25:30.713143 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-757b4f8459-8g67m" podUID="67eb34e2-8e1c-4adf-a952-59028475c264" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.185:5353: connect: connection refused" Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.056712 4958 generic.go:334] "Generic (PLEG): container finished" podID="67eb34e2-8e1c-4adf-a952-59028475c264" containerID="71c9e0138da3d2aec6aa6687165d5da89aba3185c7b7522e5005093c2d2518b5" exitCode=0 Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.057510 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-8g67m" event={"ID":"67eb34e2-8e1c-4adf-a952-59028475c264","Type":"ContainerDied","Data":"71c9e0138da3d2aec6aa6687165d5da89aba3185c7b7522e5005093c2d2518b5"} Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.066498 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"89868f5c-dfd8-4619-9b7e-02a5b75916db","Type":"ContainerStarted","Data":"4b04c97099c975e03c6bba9c7249d424fb6a89d33d51c4af29cee645cc680b1e"} Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.069991 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nnnl2" event={"ID":"a947fbcc-a378-4d94-a37b-75164b9e2746","Type":"ContainerStarted","Data":"05071c092989d02335035ebd45342eedf9166ca955a2f85c2d8550907c62b4ea"} Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.070032 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nnnl2" event={"ID":"a947fbcc-a378-4d94-a37b-75164b9e2746","Type":"ContainerStarted","Data":"68680a85b8561d7ce4af895465a37b952749ce76899bb229523574ae987ab1d8"} Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.124013 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-nnnl2" podStartSLOduration=2.123987232 podStartE2EDuration="2.123987232s" podCreationTimestamp="2025-12-01 10:25:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:25:31.09435294 +0000 UTC m=+1578.603141977" watchObservedRunningTime="2025-12-01 10:25:31.123987232 +0000 UTC m=+1578.632776269" Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.389943 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.508614 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-dns-swift-storage-0\") pod \"67eb34e2-8e1c-4adf-a952-59028475c264\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.509137 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-dns-svc\") pod \"67eb34e2-8e1c-4adf-a952-59028475c264\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.509183 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p9h8\" (UniqueName: \"kubernetes.io/projected/67eb34e2-8e1c-4adf-a952-59028475c264-kube-api-access-9p9h8\") pod \"67eb34e2-8e1c-4adf-a952-59028475c264\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.509216 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-ovsdbserver-sb\") pod \"67eb34e2-8e1c-4adf-a952-59028475c264\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.509242 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-ovsdbserver-nb\") pod \"67eb34e2-8e1c-4adf-a952-59028475c264\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.509412 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-config\") pod \"67eb34e2-8e1c-4adf-a952-59028475c264\" (UID: \"67eb34e2-8e1c-4adf-a952-59028475c264\") " Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.517632 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67eb34e2-8e1c-4adf-a952-59028475c264-kube-api-access-9p9h8" (OuterVolumeSpecName: "kube-api-access-9p9h8") pod "67eb34e2-8e1c-4adf-a952-59028475c264" (UID: "67eb34e2-8e1c-4adf-a952-59028475c264"). InnerVolumeSpecName "kube-api-access-9p9h8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.582048 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "67eb34e2-8e1c-4adf-a952-59028475c264" (UID: "67eb34e2-8e1c-4adf-a952-59028475c264"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.584242 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "67eb34e2-8e1c-4adf-a952-59028475c264" (UID: "67eb34e2-8e1c-4adf-a952-59028475c264"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.584722 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-config" (OuterVolumeSpecName: "config") pod "67eb34e2-8e1c-4adf-a952-59028475c264" (UID: "67eb34e2-8e1c-4adf-a952-59028475c264"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.590460 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "67eb34e2-8e1c-4adf-a952-59028475c264" (UID: "67eb34e2-8e1c-4adf-a952-59028475c264"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.603114 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "67eb34e2-8e1c-4adf-a952-59028475c264" (UID: "67eb34e2-8e1c-4adf-a952-59028475c264"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.612117 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.612172 4958 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.612187 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.612198 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p9h8\" (UniqueName: \"kubernetes.io/projected/67eb34e2-8e1c-4adf-a952-59028475c264-kube-api-access-9p9h8\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.612207 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:31 crc kubenswrapper[4958]: I1201 10:25:31.612217 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67eb34e2-8e1c-4adf-a952-59028475c264-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:32 crc kubenswrapper[4958]: I1201 10:25:32.083325 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89868f5c-dfd8-4619-9b7e-02a5b75916db","Type":"ContainerStarted","Data":"3fbf7d32df1fc1b787081dc2dd704d72d22c756d3674cb93e42c1d2472d4bef9"} Dec 01 10:25:32 crc kubenswrapper[4958]: I1201 10:25:32.087626 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-8g67m" event={"ID":"67eb34e2-8e1c-4adf-a952-59028475c264","Type":"ContainerDied","Data":"85d850ca45510904c96aeebfb708316f04b09e9434e81a8f98297d9fae7ac68e"} Dec 01 10:25:32 crc kubenswrapper[4958]: I1201 10:25:32.087682 4958 scope.go:117] "RemoveContainer" containerID="71c9e0138da3d2aec6aa6687165d5da89aba3185c7b7522e5005093c2d2518b5" Dec 01 10:25:32 crc kubenswrapper[4958]: I1201 10:25:32.087700 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-8g67m" Dec 01 10:25:32 crc kubenswrapper[4958]: I1201 10:25:32.119579 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-8g67m"] Dec 01 10:25:32 crc kubenswrapper[4958]: I1201 10:25:32.138197 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-8g67m"] Dec 01 10:25:32 crc kubenswrapper[4958]: I1201 10:25:32.164654 4958 scope.go:117] "RemoveContainer" containerID="634a27e51b8589382e9f7cf2d5851095850c183e089b20da7ea3ab2f630172e7" Dec 01 10:25:33 crc kubenswrapper[4958]: I1201 10:25:33.103563 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89868f5c-dfd8-4619-9b7e-02a5b75916db","Type":"ContainerStarted","Data":"a6020adfcd7117df2c055f1f046c4057d506ced6a9066aeab5c10fdf00691823"} Dec 01 10:25:33 crc kubenswrapper[4958]: I1201 10:25:33.813927 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67eb34e2-8e1c-4adf-a952-59028475c264" path="/var/lib/kubelet/pods/67eb34e2-8e1c-4adf-a952-59028475c264/volumes" Dec 01 10:25:34 crc kubenswrapper[4958]: I1201 10:25:34.119084 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89868f5c-dfd8-4619-9b7e-02a5b75916db","Type":"ContainerStarted","Data":"f54faead50baed98cc76dab5ac5bc05271f5999149758dd0bd4a4895989a1b1b"} Dec 01 10:25:36 crc kubenswrapper[4958]: I1201 10:25:36.145634 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89868f5c-dfd8-4619-9b7e-02a5b75916db","Type":"ContainerStarted","Data":"d2b247b18d958afcd723731638919280c2072270eb4c5c41496ef47a22a4f23a"} Dec 01 10:25:36 crc kubenswrapper[4958]: I1201 10:25:36.146244 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 10:25:36 crc kubenswrapper[4958]: I1201 10:25:36.175038 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.281306339 podStartE2EDuration="7.175010567s" podCreationTimestamp="2025-12-01 10:25:29 +0000 UTC" firstStartedPulling="2025-12-01 10:25:30.446084947 +0000 UTC m=+1577.954873984" lastFinishedPulling="2025-12-01 10:25:35.339789175 +0000 UTC m=+1582.848578212" observedRunningTime="2025-12-01 10:25:36.171240339 +0000 UTC m=+1583.680029376" watchObservedRunningTime="2025-12-01 10:25:36.175010567 +0000 UTC m=+1583.683799604" Dec 01 10:25:37 crc kubenswrapper[4958]: I1201 10:25:37.158706 4958 generic.go:334] "Generic (PLEG): container finished" podID="a947fbcc-a378-4d94-a37b-75164b9e2746" containerID="05071c092989d02335035ebd45342eedf9166ca955a2f85c2d8550907c62b4ea" exitCode=0 Dec 01 10:25:37 crc kubenswrapper[4958]: I1201 10:25:37.158801 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nnnl2" event={"ID":"a947fbcc-a378-4d94-a37b-75164b9e2746","Type":"ContainerDied","Data":"05071c092989d02335035ebd45342eedf9166ca955a2f85c2d8550907c62b4ea"} Dec 01 10:25:37 crc kubenswrapper[4958]: I1201 10:25:37.597380 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 10:25:37 crc kubenswrapper[4958]: I1201 10:25:37.597452 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 10:25:38 crc kubenswrapper[4958]: I1201 10:25:38.560471 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-metadata-0" Dec 01 10:25:38 crc kubenswrapper[4958]: I1201 10:25:38.564342 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 10:25:38 crc kubenswrapper[4958]: I1201 10:25:38.568136 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 10:25:38 crc kubenswrapper[4958]: I1201 10:25:38.611052 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.195:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 10:25:38 crc kubenswrapper[4958]: I1201 10:25:38.611449 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.195:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 10:25:38 crc kubenswrapper[4958]: I1201 10:25:38.619173 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nnnl2" Dec 01 10:25:38 crc kubenswrapper[4958]: I1201 10:25:38.819404 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a947fbcc-a378-4d94-a37b-75164b9e2746-config-data\") pod \"a947fbcc-a378-4d94-a37b-75164b9e2746\" (UID: \"a947fbcc-a378-4d94-a37b-75164b9e2746\") " Dec 01 10:25:38 crc kubenswrapper[4958]: I1201 10:25:38.819737 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a947fbcc-a378-4d94-a37b-75164b9e2746-combined-ca-bundle\") pod \"a947fbcc-a378-4d94-a37b-75164b9e2746\" (UID: \"a947fbcc-a378-4d94-a37b-75164b9e2746\") " Dec 01 10:25:38 crc kubenswrapper[4958]: I1201 10:25:38.819832 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x57dz\" (UniqueName: \"kubernetes.io/projected/a947fbcc-a378-4d94-a37b-75164b9e2746-kube-api-access-x57dz\") pod \"a947fbcc-a378-4d94-a37b-75164b9e2746\" (UID: \"a947fbcc-a378-4d94-a37b-75164b9e2746\") " Dec 01 10:25:38 crc kubenswrapper[4958]: I1201 10:25:38.819884 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a947fbcc-a378-4d94-a37b-75164b9e2746-scripts\") pod \"a947fbcc-a378-4d94-a37b-75164b9e2746\" (UID: \"a947fbcc-a378-4d94-a37b-75164b9e2746\") " Dec 01 10:25:38 crc kubenswrapper[4958]: I1201 10:25:38.827984 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a947fbcc-a378-4d94-a37b-75164b9e2746-scripts" (OuterVolumeSpecName: "scripts") pod "a947fbcc-a378-4d94-a37b-75164b9e2746" (UID: "a947fbcc-a378-4d94-a37b-75164b9e2746"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:38 crc kubenswrapper[4958]: I1201 10:25:38.829285 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a947fbcc-a378-4d94-a37b-75164b9e2746-kube-api-access-x57dz" (OuterVolumeSpecName: "kube-api-access-x57dz") pod "a947fbcc-a378-4d94-a37b-75164b9e2746" (UID: "a947fbcc-a378-4d94-a37b-75164b9e2746"). InnerVolumeSpecName "kube-api-access-x57dz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:25:38 crc kubenswrapper[4958]: I1201 10:25:38.856228 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a947fbcc-a378-4d94-a37b-75164b9e2746-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a947fbcc-a378-4d94-a37b-75164b9e2746" (UID: "a947fbcc-a378-4d94-a37b-75164b9e2746"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:38 crc kubenswrapper[4958]: I1201 10:25:38.867597 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a947fbcc-a378-4d94-a37b-75164b9e2746-config-data" (OuterVolumeSpecName: "config-data") pod "a947fbcc-a378-4d94-a37b-75164b9e2746" (UID: "a947fbcc-a378-4d94-a37b-75164b9e2746"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:38 crc kubenswrapper[4958]: I1201 10:25:38.922984 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a947fbcc-a378-4d94-a37b-75164b9e2746-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:38 crc kubenswrapper[4958]: I1201 10:25:38.923026 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x57dz\" (UniqueName: \"kubernetes.io/projected/a947fbcc-a378-4d94-a37b-75164b9e2746-kube-api-access-x57dz\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:38 crc kubenswrapper[4958]: I1201 10:25:38.923040 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a947fbcc-a378-4d94-a37b-75164b9e2746-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:38 crc kubenswrapper[4958]: I1201 10:25:38.923051 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a947fbcc-a378-4d94-a37b-75164b9e2746-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:39 crc kubenswrapper[4958]: I1201 10:25:39.182769 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nnnl2" event={"ID":"a947fbcc-a378-4d94-a37b-75164b9e2746","Type":"ContainerDied","Data":"68680a85b8561d7ce4af895465a37b952749ce76899bb229523574ae987ab1d8"} Dec 01 10:25:39 crc kubenswrapper[4958]: I1201 10:25:39.182904 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68680a85b8561d7ce4af895465a37b952749ce76899bb229523574ae987ab1d8" Dec 01 10:25:39 crc kubenswrapper[4958]: I1201 10:25:39.183052 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nnnl2" Dec 01 10:25:39 crc kubenswrapper[4958]: I1201 10:25:39.194138 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 10:25:39 crc kubenswrapper[4958]: I1201 10:25:39.419954 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:25:39 crc kubenswrapper[4958]: I1201 10:25:39.420450 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" containerName="nova-api-log" containerID="cri-o://18da8c88d92e54dbbd624b41111a4e6c2b2cc7e1208368257aa006815485f972" gracePeriod=30 Dec 01 10:25:39 crc kubenswrapper[4958]: I1201 10:25:39.421340 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" containerName="nova-api-api" containerID="cri-o://19010a84f7a9568ef1901b6e9182956061c06e9494330293000da7664685ff8b" gracePeriod=30 Dec 01 10:25:39 crc kubenswrapper[4958]: I1201 10:25:39.445257 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:25:39 crc kubenswrapper[4958]: I1201 10:25:39.446050 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b7ffc690-747f-4161-8e59-2e4fa48a866f" containerName="nova-scheduler-scheduler" containerID="cri-o://a1c7d40252b4c748efeb3517e33e61714f9426c12bfd50801ce28ebe6680bc56" gracePeriod=30 Dec 01 10:25:39 crc kubenswrapper[4958]: I1201 10:25:39.467181 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:25:40 crc kubenswrapper[4958]: I1201 10:25:40.217252 4958 generic.go:334] "Generic (PLEG): container finished" podID="b7ffc690-747f-4161-8e59-2e4fa48a866f" containerID="a1c7d40252b4c748efeb3517e33e61714f9426c12bfd50801ce28ebe6680bc56" exitCode=0 Dec 01 10:25:40 crc kubenswrapper[4958]: I1201 10:25:40.217354 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b7ffc690-747f-4161-8e59-2e4fa48a866f","Type":"ContainerDied","Data":"a1c7d40252b4c748efeb3517e33e61714f9426c12bfd50801ce28ebe6680bc56"} Dec 01 10:25:40 crc kubenswrapper[4958]: I1201 10:25:40.224303 4958 generic.go:334] "Generic (PLEG): container finished" podID="ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" containerID="18da8c88d92e54dbbd624b41111a4e6c2b2cc7e1208368257aa006815485f972" exitCode=143 Dec 01 10:25:40 crc kubenswrapper[4958]: I1201 10:25:40.225561 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e","Type":"ContainerDied","Data":"18da8c88d92e54dbbd624b41111a4e6c2b2cc7e1208368257aa006815485f972"} Dec 01 10:25:40 crc kubenswrapper[4958]: I1201 10:25:40.505971 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:25:40 crc kubenswrapper[4958]: I1201 10:25:40.671469 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ffc690-747f-4161-8e59-2e4fa48a866f-combined-ca-bundle\") pod \"b7ffc690-747f-4161-8e59-2e4fa48a866f\" (UID: \"b7ffc690-747f-4161-8e59-2e4fa48a866f\") " Dec 01 10:25:40 crc kubenswrapper[4958]: I1201 10:25:40.671694 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g42gh\" (UniqueName: \"kubernetes.io/projected/b7ffc690-747f-4161-8e59-2e4fa48a866f-kube-api-access-g42gh\") pod \"b7ffc690-747f-4161-8e59-2e4fa48a866f\" (UID: \"b7ffc690-747f-4161-8e59-2e4fa48a866f\") " Dec 01 10:25:40 crc kubenswrapper[4958]: I1201 10:25:40.671731 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7ffc690-747f-4161-8e59-2e4fa48a866f-config-data\") pod \"b7ffc690-747f-4161-8e59-2e4fa48a866f\" (UID: \"b7ffc690-747f-4161-8e59-2e4fa48a866f\") " Dec 01 10:25:40 crc kubenswrapper[4958]: I1201 10:25:40.687608 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7ffc690-747f-4161-8e59-2e4fa48a866f-kube-api-access-g42gh" (OuterVolumeSpecName: "kube-api-access-g42gh") pod "b7ffc690-747f-4161-8e59-2e4fa48a866f" (UID: "b7ffc690-747f-4161-8e59-2e4fa48a866f"). InnerVolumeSpecName "kube-api-access-g42gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:25:40 crc kubenswrapper[4958]: I1201 10:25:40.711320 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7ffc690-747f-4161-8e59-2e4fa48a866f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7ffc690-747f-4161-8e59-2e4fa48a866f" (UID: "b7ffc690-747f-4161-8e59-2e4fa48a866f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:40 crc kubenswrapper[4958]: I1201 10:25:40.728984 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7ffc690-747f-4161-8e59-2e4fa48a866f-config-data" (OuterVolumeSpecName: "config-data") pod "b7ffc690-747f-4161-8e59-2e4fa48a866f" (UID: "b7ffc690-747f-4161-8e59-2e4fa48a866f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:40 crc kubenswrapper[4958]: I1201 10:25:40.774587 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ffc690-747f-4161-8e59-2e4fa48a866f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:40 crc kubenswrapper[4958]: I1201 10:25:40.774636 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g42gh\" (UniqueName: \"kubernetes.io/projected/b7ffc690-747f-4161-8e59-2e4fa48a866f-kube-api-access-g42gh\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:40 crc kubenswrapper[4958]: I1201 10:25:40.774650 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7ffc690-747f-4161-8e59-2e4fa48a866f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.237804 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b7ffc690-747f-4161-8e59-2e4fa48a866f","Type":"ContainerDied","Data":"732a48a212c44442b77b6a960aa8735ffdb86b5790d5410c1e0fde9f6eb136d8"} Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.237922 4958 scope.go:117] "RemoveContainer" containerID="a1c7d40252b4c748efeb3517e33e61714f9426c12bfd50801ce28ebe6680bc56" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.237950 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.238023 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="533e79a5-c45e-42a3-828c-ef2b0b0ce569" containerName="nova-metadata-log" containerID="cri-o://a9f653232c0a42e82d33efa999f358ee28d59b2f0bbe0e228667bebb3a1a338f" gracePeriod=30 Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.238163 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="533e79a5-c45e-42a3-828c-ef2b0b0ce569" containerName="nova-metadata-metadata" containerID="cri-o://e94400eccb46cd34d7f89b975380d542a64bdc8039758852759fa41a552d36f1" gracePeriod=30 Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.310999 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.320424 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.335312 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:25:41 crc kubenswrapper[4958]: E1201 10:25:41.336170 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7ffc690-747f-4161-8e59-2e4fa48a866f" containerName="nova-scheduler-scheduler" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.336195 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7ffc690-747f-4161-8e59-2e4fa48a866f" containerName="nova-scheduler-scheduler" Dec 01 10:25:41 crc kubenswrapper[4958]: E1201 10:25:41.336221 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67eb34e2-8e1c-4adf-a952-59028475c264" containerName="init" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.336231 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="67eb34e2-8e1c-4adf-a952-59028475c264" containerName="init" Dec 01 10:25:41 crc kubenswrapper[4958]: E1201 10:25:41.336241 4958 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67eb34e2-8e1c-4adf-a952-59028475c264" containerName="dnsmasq-dns" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.336249 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="67eb34e2-8e1c-4adf-a952-59028475c264" containerName="dnsmasq-dns" Dec 01 10:25:41 crc kubenswrapper[4958]: E1201 10:25:41.336262 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a947fbcc-a378-4d94-a37b-75164b9e2746" containerName="nova-manage" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.336268 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a947fbcc-a378-4d94-a37b-75164b9e2746" containerName="nova-manage" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.336479 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="67eb34e2-8e1c-4adf-a952-59028475c264" containerName="dnsmasq-dns" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.336494 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7ffc690-747f-4161-8e59-2e4fa48a866f" containerName="nova-scheduler-scheduler" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.336507 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a947fbcc-a378-4d94-a37b-75164b9e2746" containerName="nova-manage" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.337393 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.341088 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.352734 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.491452 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de0d4422-f9bb-4abf-8727-066813a9182e-config-data\") pod \"nova-scheduler-0\" (UID: \"de0d4422-f9bb-4abf-8727-066813a9182e\") " pod="openstack/nova-scheduler-0" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.491959 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qw8h\" (UniqueName: \"kubernetes.io/projected/de0d4422-f9bb-4abf-8727-066813a9182e-kube-api-access-7qw8h\") pod \"nova-scheduler-0\" (UID: \"de0d4422-f9bb-4abf-8727-066813a9182e\") " pod="openstack/nova-scheduler-0" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.492050 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de0d4422-f9bb-4abf-8727-066813a9182e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"de0d4422-f9bb-4abf-8727-066813a9182e\") " pod="openstack/nova-scheduler-0" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.594667 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de0d4422-f9bb-4abf-8727-066813a9182e-config-data\") pod \"nova-scheduler-0\" (UID: \"de0d4422-f9bb-4abf-8727-066813a9182e\") " pod="openstack/nova-scheduler-0" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.594877 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qw8h\" (UniqueName: 
\"kubernetes.io/projected/de0d4422-f9bb-4abf-8727-066813a9182e-kube-api-access-7qw8h\") pod \"nova-scheduler-0\" (UID: \"de0d4422-f9bb-4abf-8727-066813a9182e\") " pod="openstack/nova-scheduler-0" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.594916 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de0d4422-f9bb-4abf-8727-066813a9182e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"de0d4422-f9bb-4abf-8727-066813a9182e\") " pod="openstack/nova-scheduler-0" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.603085 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de0d4422-f9bb-4abf-8727-066813a9182e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"de0d4422-f9bb-4abf-8727-066813a9182e\") " pod="openstack/nova-scheduler-0" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.606822 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de0d4422-f9bb-4abf-8727-066813a9182e-config-data\") pod \"nova-scheduler-0\" (UID: \"de0d4422-f9bb-4abf-8727-066813a9182e\") " pod="openstack/nova-scheduler-0" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.617817 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qw8h\" (UniqueName: \"kubernetes.io/projected/de0d4422-f9bb-4abf-8727-066813a9182e-kube-api-access-7qw8h\") pod \"nova-scheduler-0\" (UID: \"de0d4422-f9bb-4abf-8727-066813a9182e\") " pod="openstack/nova-scheduler-0" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.708466 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.800011 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:25:41 crc kubenswrapper[4958]: E1201 10:25:41.800832 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:25:41 crc kubenswrapper[4958]: I1201 10:25:41.811598 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7ffc690-747f-4161-8e59-2e4fa48a866f" path="/var/lib/kubelet/pods/b7ffc690-747f-4161-8e59-2e4fa48a866f/volumes" Dec 01 10:25:42 crc kubenswrapper[4958]: I1201 10:25:42.215871 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:25:42 crc kubenswrapper[4958]: W1201 10:25:42.225419 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde0d4422_f9bb_4abf_8727_066813a9182e.slice/crio-aacf6c67146e3ac1dc84d9564a6bca05321603499f5b30393501e9347b843b03 WatchSource:0}: Error finding container aacf6c67146e3ac1dc84d9564a6bca05321603499f5b30393501e9347b843b03: Status 404 returned error can't find the container with id aacf6c67146e3ac1dc84d9564a6bca05321603499f5b30393501e9347b843b03 Dec 01 10:25:42 crc kubenswrapper[4958]: I1201 10:25:42.261384 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"de0d4422-f9bb-4abf-8727-066813a9182e","Type":"ContainerStarted","Data":"aacf6c67146e3ac1dc84d9564a6bca05321603499f5b30393501e9347b843b03"} Dec 01 10:25:42 crc kubenswrapper[4958]: I1201 10:25:42.268821 4958 generic.go:334] "Generic (PLEG): container finished" podID="533e79a5-c45e-42a3-828c-ef2b0b0ce569" containerID="a9f653232c0a42e82d33efa999f358ee28d59b2f0bbe0e228667bebb3a1a338f" exitCode=143 Dec 01 10:25:42 crc kubenswrapper[4958]: I1201 10:25:42.268925 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"533e79a5-c45e-42a3-828c-ef2b0b0ce569","Type":"ContainerDied","Data":"a9f653232c0a42e82d33efa999f358ee28d59b2f0bbe0e228667bebb3a1a338f"} Dec 01 10:25:43 crc kubenswrapper[4958]: I1201 10:25:43.282071 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"de0d4422-f9bb-4abf-8727-066813a9182e","Type":"ContainerStarted","Data":"058deee68ae35cac74169c2f8d75def62e5447aee109c42652d23307d321925a"} Dec 01 10:25:44 crc kubenswrapper[4958]: I1201 10:25:44.422413 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="533e79a5-c45e-42a3-828c-ef2b0b0ce569" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:53042->10.217.0.193:8775: read: connection reset by peer" Dec 01 10:25:44 crc kubenswrapper[4958]: I1201 10:25:44.423244 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="533e79a5-c45e-42a3-828c-ef2b0b0ce569" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:53046->10.217.0.193:8775: read: connection reset by peer" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.304362 4958 generic.go:334] "Generic (PLEG): container finished" podID="ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" containerID="19010a84f7a9568ef1901b6e9182956061c06e9494330293000da7664685ff8b" exitCode=0 Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.304984 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e","Type":"ContainerDied","Data":"19010a84f7a9568ef1901b6e9182956061c06e9494330293000da7664685ff8b"} Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.342692 4958 generic.go:334] "Generic (PLEG): container finished" podID="533e79a5-c45e-42a3-828c-ef2b0b0ce569" containerID="e94400eccb46cd34d7f89b975380d542a64bdc8039758852759fa41a552d36f1" exitCode=0 Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.342761 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"533e79a5-c45e-42a3-828c-ef2b0b0ce569","Type":"ContainerDied","Data":"e94400eccb46cd34d7f89b975380d542a64bdc8039758852759fa41a552d36f1"} Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.632099 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.651112 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.685309 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=4.685261958 podStartE2EDuration="4.685261958s" podCreationTimestamp="2025-12-01 10:25:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:25:43.309462949 +0000 UTC m=+1590.818251986" watchObservedRunningTime="2025-12-01 10:25:45.685261958 +0000 UTC m=+1593.194050985" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.771880 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-config-data\") pod \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.771974 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s22p2\" (UniqueName: \"kubernetes.io/projected/533e79a5-c45e-42a3-828c-ef2b0b0ce569-kube-api-access-s22p2\") pod \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.772006 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-combined-ca-bundle\") pod \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.772163 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-public-tls-certs\") pod \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.772207 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-internal-tls-certs\") pod \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.772242 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-combined-ca-bundle\") pod \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.772288 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj778\" (UniqueName: \"kubernetes.io/projected/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-kube-api-access-qj778\") pod \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.772340 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-logs\") pod \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\" (UID: \"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e\") " Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.772400 4958 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-config-data\") pod \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.772467 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-nova-metadata-tls-certs\") pod \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.772507 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/533e79a5-c45e-42a3-828c-ef2b0b0ce569-logs\") pod \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\" (UID: \"533e79a5-c45e-42a3-828c-ef2b0b0ce569\") " Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.774098 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/533e79a5-c45e-42a3-828c-ef2b0b0ce569-logs" (OuterVolumeSpecName: "logs") pod "533e79a5-c45e-42a3-828c-ef2b0b0ce569" (UID: "533e79a5-c45e-42a3-828c-ef2b0b0ce569"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.774764 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-logs" (OuterVolumeSpecName: "logs") pod "ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" (UID: "ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.780566 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-kube-api-access-qj778" (OuterVolumeSpecName: "kube-api-access-qj778") pod "ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" (UID: "ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e"). InnerVolumeSpecName "kube-api-access-qj778". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.794169 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/533e79a5-c45e-42a3-828c-ef2b0b0ce569-kube-api-access-s22p2" (OuterVolumeSpecName: "kube-api-access-s22p2") pod "533e79a5-c45e-42a3-828c-ef2b0b0ce569" (UID: "533e79a5-c45e-42a3-828c-ef2b0b0ce569"). InnerVolumeSpecName "kube-api-access-s22p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.806024 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "533e79a5-c45e-42a3-828c-ef2b0b0ce569" (UID: "533e79a5-c45e-42a3-828c-ef2b0b0ce569"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.812788 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-config-data" (OuterVolumeSpecName: "config-data") pod "533e79a5-c45e-42a3-828c-ef2b0b0ce569" (UID: "533e79a5-c45e-42a3-828c-ef2b0b0ce569"). InnerVolumeSpecName "config-data". 
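
The burst of "operationExecutor.UnmountVolume started" entries is the kubelet volume manager's reconciler at work: it diffs the desired world (volumes needed by pods that should be running) against the actual world (volumes currently mounted) and unmounts whatever is left over once nova-api-0 and nova-metadata-0 are deleted. A toy version of that diff, with hypothetical volume keys:

    package main

    import (
        "fmt"
        "sort"
    )

    // unmountCandidates returns every mounted volume that is no longer
    // desired, the same decision behind each "UnmountVolume started" line.
    func unmountCandidates(desired, mounted map[string]bool) []string {
        var out []string
        for vol := range mounted {
            if !desired[vol] {
                out = append(out, vol)
            }
        }
        sort.Strings(out) // deterministic output for the example
        return out
    }

    func main() {
        desired := map[string]bool{
            "nova-scheduler-0/config-data": true, // pod still running
        }
        mounted := map[string]bool{
            "nova-scheduler-0/config-data": true,
            "nova-api-0/config-data":       true, // pod deleted
            "nova-metadata-0/logs":         true, // pod deleted
        }
        for _, vol := range unmountCandidates(desired, mounted) {
            fmt.Println("UnmountVolume started for", vol)
        }
    }
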
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.815950 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-config-data" (OuterVolumeSpecName: "config-data") pod "ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" (UID: "ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.817483 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" (UID: "ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.842814 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" (UID: "ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.852727 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "533e79a5-c45e-42a3-828c-ef2b0b0ce569" (UID: "533e79a5-c45e-42a3-828c-ef2b0b0ce569"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.874799 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.874878 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s22p2\" (UniqueName: \"kubernetes.io/projected/533e79a5-c45e-42a3-828c-ef2b0b0ce569-kube-api-access-s22p2\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.874891 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.874900 4958 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.874910 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.874920 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj778\" (UniqueName: \"kubernetes.io/projected/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-kube-api-access-qj778\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.874932 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.874941 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.874954 4958 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/533e79a5-c45e-42a3-828c-ef2b0b0ce569-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.874965 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/533e79a5-c45e-42a3-828c-ef2b0b0ce569-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.877721 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" (UID: "ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:25:45 crc kubenswrapper[4958]: I1201 10:25:45.976771 4958 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.357782 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e","Type":"ContainerDied","Data":"540f8736596daebad6fd99b1f46ad61b14134d0a28b81b6b0f1efc39fef23de1"} Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.358206 4958 scope.go:117] "RemoveContainer" containerID="19010a84f7a9568ef1901b6e9182956061c06e9494330293000da7664685ff8b" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.357831 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.364053 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"533e79a5-c45e-42a3-828c-ef2b0b0ce569","Type":"ContainerDied","Data":"5464af2f40c3697e399f3d6cbb37a51f8444f54b58102b257569150be4e2cd3b"} Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.364176 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.396962 4958 scope.go:117] "RemoveContainer" containerID="18da8c88d92e54dbbd624b41111a4e6c2b2cc7e1208368257aa006815485f972" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.402823 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.423822 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.446258 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.464136 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.470727 4958 scope.go:117] "RemoveContainer" containerID="e94400eccb46cd34d7f89b975380d542a64bdc8039758852759fa41a552d36f1" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.471651 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 10:25:46 crc kubenswrapper[4958]: E1201 10:25:46.472420 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="533e79a5-c45e-42a3-828c-ef2b0b0ce569" containerName="nova-metadata-log" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.472457 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="533e79a5-c45e-42a3-828c-ef2b0b0ce569" containerName="nova-metadata-log" Dec 01 10:25:46 crc kubenswrapper[4958]: E1201 10:25:46.472479 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" containerName="nova-api-log" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.472490 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" containerName="nova-api-log" Dec 01 10:25:46 crc kubenswrapper[4958]: E1201 10:25:46.472524 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="533e79a5-c45e-42a3-828c-ef2b0b0ce569" containerName="nova-metadata-metadata" Dec 01 
10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.472532 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="533e79a5-c45e-42a3-828c-ef2b0b0ce569" containerName="nova-metadata-metadata" Dec 01 10:25:46 crc kubenswrapper[4958]: E1201 10:25:46.472568 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" containerName="nova-api-api" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.472578 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" containerName="nova-api-api" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.472877 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" containerName="nova-api-log" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.472905 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="533e79a5-c45e-42a3-828c-ef2b0b0ce569" containerName="nova-metadata-log" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.472923 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" containerName="nova-api-api" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.472943 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="533e79a5-c45e-42a3-828c-ef2b0b0ce569" containerName="nova-metadata-metadata" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.474467 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.479631 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.480038 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.480233 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.485021 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.487814 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.497983 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.501127 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.506102 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.506412 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.549964 4958 scope.go:117] "RemoveContainer" containerID="a9f653232c0a42e82d33efa999f358ee28d59b2f0bbe0e228667bebb3a1a338f" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.589905 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-config-data\") pod \"nova-metadata-0\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") " pod="openstack/nova-metadata-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.589958 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") " pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.589983 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-config-data\") pod \"nova-api-0\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") " pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.590087 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") " pod="openstack/nova-metadata-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.590157 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-logs\") pod \"nova-metadata-0\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") " pod="openstack/nova-metadata-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.590309 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2gbt\" (UniqueName: \"kubernetes.io/projected/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-kube-api-access-q2gbt\") pod \"nova-metadata-0\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") " pod="openstack/nova-metadata-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.590497 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgzsx\" (UniqueName: \"kubernetes.io/projected/79797aa2-db7b-429a-91d9-3e181de3976c-kube-api-access-pgzsx\") pod \"nova-api-0\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") " pod="openstack/nova-api-0" Dec 01 10:25:46 crc 
kubenswrapper[4958]: I1201 10:25:46.590793 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") " pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.591010 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79797aa2-db7b-429a-91d9-3e181de3976c-logs\") pod \"nova-api-0\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") " pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.591176 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") " pod="openstack/nova-metadata-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.591246 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-public-tls-certs\") pod \"nova-api-0\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") " pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.693462 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") " pod="openstack/nova-metadata-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.693563 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-public-tls-certs\") pod \"nova-api-0\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") " pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.693655 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-config-data\") pod \"nova-metadata-0\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") " pod="openstack/nova-metadata-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.693687 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") " pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.693711 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-config-data\") pod \"nova-api-0\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") " pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.693734 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-combined-ca-bundle\") pod \"nova-metadata-0\" 
(UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") " pod="openstack/nova-metadata-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.693761 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-logs\") pod \"nova-metadata-0\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") " pod="openstack/nova-metadata-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.693798 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2gbt\" (UniqueName: \"kubernetes.io/projected/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-kube-api-access-q2gbt\") pod \"nova-metadata-0\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") " pod="openstack/nova-metadata-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.693830 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgzsx\" (UniqueName: \"kubernetes.io/projected/79797aa2-db7b-429a-91d9-3e181de3976c-kube-api-access-pgzsx\") pod \"nova-api-0\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") " pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.693909 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") " pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.693939 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79797aa2-db7b-429a-91d9-3e181de3976c-logs\") pod \"nova-api-0\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") " pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.694598 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79797aa2-db7b-429a-91d9-3e181de3976c-logs\") pod \"nova-api-0\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") " pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.696116 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-logs\") pod \"nova-metadata-0\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") " pod="openstack/nova-metadata-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.702656 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-public-tls-certs\") pod \"nova-api-0\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") " pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.702750 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") " pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.702819 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") " 
pod="openstack/nova-metadata-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.704398 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-config-data\") pod \"nova-api-0\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") " pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.708829 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.709735 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-config-data\") pod \"nova-metadata-0\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") " pod="openstack/nova-metadata-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.709827 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") " pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.712176 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") " pod="openstack/nova-metadata-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.716181 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgzsx\" (UniqueName: \"kubernetes.io/projected/79797aa2-db7b-429a-91d9-3e181de3976c-kube-api-access-pgzsx\") pod \"nova-api-0\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") " pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.720030 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2gbt\" (UniqueName: \"kubernetes.io/projected/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-kube-api-access-q2gbt\") pod \"nova-metadata-0\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") " pod="openstack/nova-metadata-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.848835 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:25:46 crc kubenswrapper[4958]: I1201 10:25:46.855936 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 10:25:47 crc kubenswrapper[4958]: I1201 10:25:47.337837 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:25:47 crc kubenswrapper[4958]: I1201 10:25:47.381170 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79797aa2-db7b-429a-91d9-3e181de3976c","Type":"ContainerStarted","Data":"638a7d1084fba5123fe154909b77078a77a1032d920622d2ef416d3afc8eb68d"} Dec 01 10:25:47 crc kubenswrapper[4958]: I1201 10:25:47.413036 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:25:47 crc kubenswrapper[4958]: W1201 10:25:47.413940 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37ea0b7c_fb6b_4802_9fff_4e3995c0ed14.slice/crio-5df8b89b368ce2bb3715c01ac36fd1cc19440e1747577a5d2b98f0687ec1ba5c WatchSource:0}: Error finding container 5df8b89b368ce2bb3715c01ac36fd1cc19440e1747577a5d2b98f0687ec1ba5c: Status 404 returned error can't find the container with id 5df8b89b368ce2bb3715c01ac36fd1cc19440e1747577a5d2b98f0687ec1ba5c Dec 01 10:25:47 crc kubenswrapper[4958]: I1201 10:25:47.834859 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="533e79a5-c45e-42a3-828c-ef2b0b0ce569" path="/var/lib/kubelet/pods/533e79a5-c45e-42a3-828c-ef2b0b0ce569/volumes" Dec 01 10:25:47 crc kubenswrapper[4958]: I1201 10:25:47.837016 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e" path="/var/lib/kubelet/pods/ff4de0b7-aee4-4d10-8d80-ac89aa7ad16e/volumes" Dec 01 10:25:48 crc kubenswrapper[4958]: I1201 10:25:48.401663 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14","Type":"ContainerStarted","Data":"be4dcfae53ebf0f5679008f3ab2b2488e6eec364d8b4f3e7506d7a1e58ea2248"} Dec 01 10:25:48 crc kubenswrapper[4958]: I1201 10:25:48.401728 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14","Type":"ContainerStarted","Data":"ca5f316669c8cd32825cef769b0e3e1c6eab56cc41d4a4cff5edd8cd75e0ad40"} Dec 01 10:25:48 crc kubenswrapper[4958]: I1201 10:25:48.401740 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14","Type":"ContainerStarted","Data":"5df8b89b368ce2bb3715c01ac36fd1cc19440e1747577a5d2b98f0687ec1ba5c"} Dec 01 10:25:48 crc kubenswrapper[4958]: I1201 10:25:48.404949 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79797aa2-db7b-429a-91d9-3e181de3976c","Type":"ContainerStarted","Data":"a6094764dd390e4786a93ea3889e9a8d68d2c1d8357ff0e022f148ebe28f5c52"} Dec 01 10:25:48 crc kubenswrapper[4958]: I1201 10:25:48.405117 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79797aa2-db7b-429a-91d9-3e181de3976c","Type":"ContainerStarted","Data":"c5d92e349f45e4a0af02c0d2ddcebd7a3e9e9b1433ebbd079fd3f0a98c18c1b3"} Dec 01 10:25:48 crc kubenswrapper[4958]: I1201 10:25:48.433104 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.433076922 podStartE2EDuration="2.433076922s" podCreationTimestamp="2025-12-01 10:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:25:48.421292833 +0000 UTC m=+1595.930081870" watchObservedRunningTime="2025-12-01 10:25:48.433076922 +0000 UTC m=+1595.941865959" Dec 01 10:25:48 crc kubenswrapper[4958]: I1201 10:25:48.454192 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.454148848 podStartE2EDuration="2.454148848s" podCreationTimestamp="2025-12-01 10:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:25:48.445746286 +0000 UTC m=+1595.954535343" watchObservedRunningTime="2025-12-01 10:25:48.454148848 +0000 UTC m=+1595.962937885" Dec 01 10:25:51 crc kubenswrapper[4958]: I1201 10:25:51.709657 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 10:25:51 crc kubenswrapper[4958]: I1201 10:25:51.752033 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 10:25:51 crc kubenswrapper[4958]: I1201 10:25:51.856492 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 10:25:51 crc kubenswrapper[4958]: I1201 10:25:51.856556 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 10:25:52 crc kubenswrapper[4958]: I1201 10:25:52.485923 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 10:25:54 crc kubenswrapper[4958]: I1201 10:25:54.839879 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:25:54 crc kubenswrapper[4958]: E1201 10:25:54.841481 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:25:56 crc kubenswrapper[4958]: I1201 10:25:56.850021 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 10:25:56 crc kubenswrapper[4958]: I1201 10:25:56.850427 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 10:25:56 crc kubenswrapper[4958]: I1201 10:25:56.856362 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 10:25:56 crc kubenswrapper[4958]: I1201 10:25:56.856427 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 10:25:57 crc kubenswrapper[4958]: I1201 10:25:57.917127 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="79797aa2-db7b-429a-91d9-3e181de3976c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 10:25:57 crc kubenswrapper[4958]: I1201 10:25:57.924181 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" containerName="nova-metadata-metadata" 
probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 10:25:57 crc kubenswrapper[4958]: I1201 10:25:57.924307 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="79797aa2-db7b-429a-91d9-3e181de3976c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 10:25:57 crc kubenswrapper[4958]: I1201 10:25:57.931238 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 10:25:59 crc kubenswrapper[4958]: I1201 10:25:59.767091 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 10:26:04 crc kubenswrapper[4958]: I1201 10:26:04.692271 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ww5c5"] Dec 01 10:26:04 crc kubenswrapper[4958]: I1201 10:26:04.695736 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ww5c5" Dec 01 10:26:04 crc kubenswrapper[4958]: I1201 10:26:04.704580 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ww5c5"] Dec 01 10:26:04 crc kubenswrapper[4958]: I1201 10:26:04.820721 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/054057a9-764f-494d-9ae7-387cb051c0dd-catalog-content\") pod \"certified-operators-ww5c5\" (UID: \"054057a9-764f-494d-9ae7-387cb051c0dd\") " pod="openshift-marketplace/certified-operators-ww5c5" Dec 01 10:26:04 crc kubenswrapper[4958]: I1201 10:26:04.820809 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kgs2\" (UniqueName: \"kubernetes.io/projected/054057a9-764f-494d-9ae7-387cb051c0dd-kube-api-access-4kgs2\") pod \"certified-operators-ww5c5\" (UID: \"054057a9-764f-494d-9ae7-387cb051c0dd\") " pod="openshift-marketplace/certified-operators-ww5c5" Dec 01 10:26:04 crc kubenswrapper[4958]: I1201 10:26:04.820984 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/054057a9-764f-494d-9ae7-387cb051c0dd-utilities\") pod \"certified-operators-ww5c5\" (UID: \"054057a9-764f-494d-9ae7-387cb051c0dd\") " pod="openshift-marketplace/certified-operators-ww5c5" Dec 01 10:26:04 crc kubenswrapper[4958]: I1201 10:26:04.923462 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/054057a9-764f-494d-9ae7-387cb051c0dd-utilities\") pod \"certified-operators-ww5c5\" (UID: \"054057a9-764f-494d-9ae7-387cb051c0dd\") " pod="openshift-marketplace/certified-operators-ww5c5" Dec 01 10:26:04 crc kubenswrapper[4958]: I1201 10:26:04.924191 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/054057a9-764f-494d-9ae7-387cb051c0dd-utilities\") pod \"certified-operators-ww5c5\" (UID: \"054057a9-764f-494d-9ae7-387cb051c0dd\") " 
pod="openshift-marketplace/certified-operators-ww5c5" Dec 01 10:26:04 crc kubenswrapper[4958]: I1201 10:26:04.924336 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/054057a9-764f-494d-9ae7-387cb051c0dd-catalog-content\") pod \"certified-operators-ww5c5\" (UID: \"054057a9-764f-494d-9ae7-387cb051c0dd\") " pod="openshift-marketplace/certified-operators-ww5c5" Dec 01 10:26:04 crc kubenswrapper[4958]: I1201 10:26:04.924466 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kgs2\" (UniqueName: \"kubernetes.io/projected/054057a9-764f-494d-9ae7-387cb051c0dd-kube-api-access-4kgs2\") pod \"certified-operators-ww5c5\" (UID: \"054057a9-764f-494d-9ae7-387cb051c0dd\") " pod="openshift-marketplace/certified-operators-ww5c5" Dec 01 10:26:04 crc kubenswrapper[4958]: I1201 10:26:04.924974 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/054057a9-764f-494d-9ae7-387cb051c0dd-catalog-content\") pod \"certified-operators-ww5c5\" (UID: \"054057a9-764f-494d-9ae7-387cb051c0dd\") " pod="openshift-marketplace/certified-operators-ww5c5" Dec 01 10:26:04 crc kubenswrapper[4958]: I1201 10:26:04.949823 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kgs2\" (UniqueName: \"kubernetes.io/projected/054057a9-764f-494d-9ae7-387cb051c0dd-kube-api-access-4kgs2\") pod \"certified-operators-ww5c5\" (UID: \"054057a9-764f-494d-9ae7-387cb051c0dd\") " pod="openshift-marketplace/certified-operators-ww5c5" Dec 01 10:26:05 crc kubenswrapper[4958]: I1201 10:26:05.080407 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ww5c5" Dec 01 10:26:05 crc kubenswrapper[4958]: I1201 10:26:05.631364 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ww5c5"] Dec 01 10:26:05 crc kubenswrapper[4958]: W1201 10:26:05.635653 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod054057a9_764f_494d_9ae7_387cb051c0dd.slice/crio-2a0f917a2a6bc95fad35941235c648d9f6791ed7bdb12f38bb5903e31409a143 WatchSource:0}: Error finding container 2a0f917a2a6bc95fad35941235c648d9f6791ed7bdb12f38bb5903e31409a143: Status 404 returned error can't find the container with id 2a0f917a2a6bc95fad35941235c648d9f6791ed7bdb12f38bb5903e31409a143 Dec 01 10:26:05 crc kubenswrapper[4958]: I1201 10:26:05.799467 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:26:05 crc kubenswrapper[4958]: E1201 10:26:05.800304 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:26:06 crc kubenswrapper[4958]: I1201 10:26:06.619165 4958 generic.go:334] "Generic (PLEG): container finished" podID="054057a9-764f-494d-9ae7-387cb051c0dd" containerID="1553e680bb7f089f18237b95b59d31b8d8a18fcb505b7f31299ab0b585dcb2f6" exitCode=0 Dec 01 10:26:06 crc kubenswrapper[4958]: I1201 10:26:06.619280 4958 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ww5c5" event={"ID":"054057a9-764f-494d-9ae7-387cb051c0dd","Type":"ContainerDied","Data":"1553e680bb7f089f18237b95b59d31b8d8a18fcb505b7f31299ab0b585dcb2f6"} Dec 01 10:26:06 crc kubenswrapper[4958]: I1201 10:26:06.619550 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ww5c5" event={"ID":"054057a9-764f-494d-9ae7-387cb051c0dd","Type":"ContainerStarted","Data":"2a0f917a2a6bc95fad35941235c648d9f6791ed7bdb12f38bb5903e31409a143"} Dec 01 10:26:06 crc kubenswrapper[4958]: I1201 10:26:06.861036 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 10:26:06 crc kubenswrapper[4958]: I1201 10:26:06.861167 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 10:26:06 crc kubenswrapper[4958]: I1201 10:26:06.861884 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 10:26:06 crc kubenswrapper[4958]: I1201 10:26:06.861917 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 10:26:06 crc kubenswrapper[4958]: I1201 10:26:06.867130 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 10:26:06 crc kubenswrapper[4958]: I1201 10:26:06.867813 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 10:26:06 crc kubenswrapper[4958]: I1201 10:26:06.868905 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 10:26:06 crc kubenswrapper[4958]: I1201 10:26:06.882667 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 10:26:06 crc kubenswrapper[4958]: I1201 10:26:06.884512 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 10:26:07 crc kubenswrapper[4958]: I1201 10:26:07.636422 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ww5c5" event={"ID":"054057a9-764f-494d-9ae7-387cb051c0dd","Type":"ContainerStarted","Data":"1f9166e121704757363fd230d49cf7543e10db848846c366315e5fc6452e4f30"} Dec 01 10:26:07 crc kubenswrapper[4958]: I1201 10:26:07.663005 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 10:26:08 crc kubenswrapper[4958]: I1201 10:26:08.652297 4958 generic.go:334] "Generic (PLEG): container finished" podID="054057a9-764f-494d-9ae7-387cb051c0dd" containerID="1f9166e121704757363fd230d49cf7543e10db848846c366315e5fc6452e4f30" exitCode=0 Dec 01 10:26:08 crc kubenswrapper[4958]: I1201 10:26:08.652404 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ww5c5" event={"ID":"054057a9-764f-494d-9ae7-387cb051c0dd","Type":"ContainerDied","Data":"1f9166e121704757363fd230d49cf7543e10db848846c366315e5fc6452e4f30"} Dec 01 10:26:10 crc kubenswrapper[4958]: I1201 10:26:10.677998 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ww5c5" event={"ID":"054057a9-764f-494d-9ae7-387cb051c0dd","Type":"ContainerStarted","Data":"88f6b0ebef26670210101db17abf42d66a78356c6fabb0e2ec8d10f3b05635eb"} Dec 01 10:26:10 crc kubenswrapper[4958]: I1201 10:26:10.705521 4958 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ww5c5" podStartSLOduration=3.70921709 podStartE2EDuration="6.705488232s" podCreationTimestamp="2025-12-01 10:26:04 +0000 UTC" firstStartedPulling="2025-12-01 10:26:06.624154769 +0000 UTC m=+1614.132943806" lastFinishedPulling="2025-12-01 10:26:09.620425911 +0000 UTC m=+1617.129214948" observedRunningTime="2025-12-01 10:26:10.700301513 +0000 UTC m=+1618.209090560" watchObservedRunningTime="2025-12-01 10:26:10.705488232 +0000 UTC m=+1618.214277279" Dec 01 10:26:15 crc kubenswrapper[4958]: I1201 10:26:15.080916 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ww5c5" Dec 01 10:26:15 crc kubenswrapper[4958]: I1201 10:26:15.082006 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ww5c5" Dec 01 10:26:15 crc kubenswrapper[4958]: I1201 10:26:15.146225 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ww5c5" Dec 01 10:26:15 crc kubenswrapper[4958]: I1201 10:26:15.810530 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ww5c5" Dec 01 10:26:15 crc kubenswrapper[4958]: I1201 10:26:15.871878 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ww5c5"] Dec 01 10:26:16 crc kubenswrapper[4958]: I1201 10:26:16.798898 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:26:16 crc kubenswrapper[4958]: E1201 10:26:16.799335 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:26:17 crc kubenswrapper[4958]: I1201 10:26:17.768306 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ww5c5" podUID="054057a9-764f-494d-9ae7-387cb051c0dd" containerName="registry-server" containerID="cri-o://88f6b0ebef26670210101db17abf42d66a78356c6fabb0e2ec8d10f3b05635eb" gracePeriod=2 Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.252912 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ww5c5" Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.374939 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/054057a9-764f-494d-9ae7-387cb051c0dd-catalog-content\") pod \"054057a9-764f-494d-9ae7-387cb051c0dd\" (UID: \"054057a9-764f-494d-9ae7-387cb051c0dd\") " Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.375094 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kgs2\" (UniqueName: \"kubernetes.io/projected/054057a9-764f-494d-9ae7-387cb051c0dd-kube-api-access-4kgs2\") pod \"054057a9-764f-494d-9ae7-387cb051c0dd\" (UID: \"054057a9-764f-494d-9ae7-387cb051c0dd\") " Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.375406 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/054057a9-764f-494d-9ae7-387cb051c0dd-utilities\") pod \"054057a9-764f-494d-9ae7-387cb051c0dd\" (UID: \"054057a9-764f-494d-9ae7-387cb051c0dd\") " Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.376798 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/054057a9-764f-494d-9ae7-387cb051c0dd-utilities" (OuterVolumeSpecName: "utilities") pod "054057a9-764f-494d-9ae7-387cb051c0dd" (UID: "054057a9-764f-494d-9ae7-387cb051c0dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.386098 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/054057a9-764f-494d-9ae7-387cb051c0dd-kube-api-access-4kgs2" (OuterVolumeSpecName: "kube-api-access-4kgs2") pod "054057a9-764f-494d-9ae7-387cb051c0dd" (UID: "054057a9-764f-494d-9ae7-387cb051c0dd"). InnerVolumeSpecName "kube-api-access-4kgs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.441491 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/054057a9-764f-494d-9ae7-387cb051c0dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "054057a9-764f-494d-9ae7-387cb051c0dd" (UID: "054057a9-764f-494d-9ae7-387cb051c0dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.477988 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kgs2\" (UniqueName: \"kubernetes.io/projected/054057a9-764f-494d-9ae7-387cb051c0dd-kube-api-access-4kgs2\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.478022 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/054057a9-764f-494d-9ae7-387cb051c0dd-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.478032 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/054057a9-764f-494d-9ae7-387cb051c0dd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.783621 4958 generic.go:334] "Generic (PLEG): container finished" podID="054057a9-764f-494d-9ae7-387cb051c0dd" containerID="88f6b0ebef26670210101db17abf42d66a78356c6fabb0e2ec8d10f3b05635eb" exitCode=0 Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.783726 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ww5c5" Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.783745 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ww5c5" event={"ID":"054057a9-764f-494d-9ae7-387cb051c0dd","Type":"ContainerDied","Data":"88f6b0ebef26670210101db17abf42d66a78356c6fabb0e2ec8d10f3b05635eb"} Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.784407 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ww5c5" event={"ID":"054057a9-764f-494d-9ae7-387cb051c0dd","Type":"ContainerDied","Data":"2a0f917a2a6bc95fad35941235c648d9f6791ed7bdb12f38bb5903e31409a143"} Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.784456 4958 scope.go:117] "RemoveContainer" containerID="88f6b0ebef26670210101db17abf42d66a78356c6fabb0e2ec8d10f3b05635eb" Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.828506 4958 scope.go:117] "RemoveContainer" containerID="1f9166e121704757363fd230d49cf7543e10db848846c366315e5fc6452e4f30" Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.833643 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ww5c5"] Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.847187 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ww5c5"] Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.864711 4958 scope.go:117] "RemoveContainer" containerID="1553e680bb7f089f18237b95b59d31b8d8a18fcb505b7f31299ab0b585dcb2f6" Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.908778 4958 scope.go:117] "RemoveContainer" containerID="88f6b0ebef26670210101db17abf42d66a78356c6fabb0e2ec8d10f3b05635eb" Dec 01 10:26:18 crc kubenswrapper[4958]: E1201 10:26:18.909939 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88f6b0ebef26670210101db17abf42d66a78356c6fabb0e2ec8d10f3b05635eb\": container with ID starting with 88f6b0ebef26670210101db17abf42d66a78356c6fabb0e2ec8d10f3b05635eb not found: ID does not exist" containerID="88f6b0ebef26670210101db17abf42d66a78356c6fabb0e2ec8d10f3b05635eb" Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.909991 
4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f6b0ebef26670210101db17abf42d66a78356c6fabb0e2ec8d10f3b05635eb"} err="failed to get container status \"88f6b0ebef26670210101db17abf42d66a78356c6fabb0e2ec8d10f3b05635eb\": rpc error: code = NotFound desc = could not find container \"88f6b0ebef26670210101db17abf42d66a78356c6fabb0e2ec8d10f3b05635eb\": container with ID starting with 88f6b0ebef26670210101db17abf42d66a78356c6fabb0e2ec8d10f3b05635eb not found: ID does not exist" Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.910024 4958 scope.go:117] "RemoveContainer" containerID="1f9166e121704757363fd230d49cf7543e10db848846c366315e5fc6452e4f30" Dec 01 10:26:18 crc kubenswrapper[4958]: E1201 10:26:18.910744 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f9166e121704757363fd230d49cf7543e10db848846c366315e5fc6452e4f30\": container with ID starting with 1f9166e121704757363fd230d49cf7543e10db848846c366315e5fc6452e4f30 not found: ID does not exist" containerID="1f9166e121704757363fd230d49cf7543e10db848846c366315e5fc6452e4f30" Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.910810 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f9166e121704757363fd230d49cf7543e10db848846c366315e5fc6452e4f30"} err="failed to get container status \"1f9166e121704757363fd230d49cf7543e10db848846c366315e5fc6452e4f30\": rpc error: code = NotFound desc = could not find container \"1f9166e121704757363fd230d49cf7543e10db848846c366315e5fc6452e4f30\": container with ID starting with 1f9166e121704757363fd230d49cf7543e10db848846c366315e5fc6452e4f30 not found: ID does not exist" Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.910869 4958 scope.go:117] "RemoveContainer" containerID="1553e680bb7f089f18237b95b59d31b8d8a18fcb505b7f31299ab0b585dcb2f6" Dec 01 10:26:18 crc kubenswrapper[4958]: E1201 10:26:18.911254 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1553e680bb7f089f18237b95b59d31b8d8a18fcb505b7f31299ab0b585dcb2f6\": container with ID starting with 1553e680bb7f089f18237b95b59d31b8d8a18fcb505b7f31299ab0b585dcb2f6 not found: ID does not exist" containerID="1553e680bb7f089f18237b95b59d31b8d8a18fcb505b7f31299ab0b585dcb2f6" Dec 01 10:26:18 crc kubenswrapper[4958]: I1201 10:26:18.911300 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1553e680bb7f089f18237b95b59d31b8d8a18fcb505b7f31299ab0b585dcb2f6"} err="failed to get container status \"1553e680bb7f089f18237b95b59d31b8d8a18fcb505b7f31299ab0b585dcb2f6\": rpc error: code = NotFound desc = could not find container \"1553e680bb7f089f18237b95b59d31b8d8a18fcb505b7f31299ab0b585dcb2f6\": container with ID starting with 1553e680bb7f089f18237b95b59d31b8d8a18fcb505b7f31299ab0b585dcb2f6 not found: ID does not exist" Dec 01 10:26:19 crc kubenswrapper[4958]: I1201 10:26:19.833086 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="054057a9-764f-494d-9ae7-387cb051c0dd" path="/var/lib/kubelet/pods/054057a9-764f-494d-9ae7-387cb051c0dd/volumes" Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.247927 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.248853 4958 kuberuntime_container.go:808] "Killing container with a grace period" 
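The RemoveContainer / "DeleteContainer returned error" pairs above show the kubelet re-deleting containers it has already removed: the first RemoveContainer succeeds, and the repeat attempt gets NotFound from the runtime, which is logged and then ignored. A minimal Go sketch of that idempotent-delete pattern (the map-backed runtime and names here are illustrative assumptions, not kubelet internals):

```go
package main

import (
	"errors"
	"fmt"
)

// errNotFound stands in for a CRI "rpc error: code = NotFound" status.
var errNotFound = errors.New("rpc error: code = NotFound")

// removeContainer deletes a container from a toy runtime and reports
// NotFound when it is already gone, mirroring the log: the second
// RemoveContainer for the same ID cannot fetch a status anymore.
func removeContainer(runtime map[string]bool, id string) error {
	if !runtime[id] {
		return fmt.Errorf("failed to get container status %q: %w", id, errNotFound)
	}
	delete(runtime, id)
	return nil
}

func main() {
	runtime := map[string]bool{"88f6b0eb": true}
	for i := 0; i < 2; i++ { // the second pass mirrors the duplicate RemoveContainer entries
		if err := removeContainer(runtime, "88f6b0eb"); errors.Is(err, errNotFound) {
			fmt.Println("already removed, ignoring:", err) // logged, not treated as a failure
		}
	}
}
```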
pod="openstack/openstackclient" podUID="a2bd3b82-a2cb-40ac-8a52-6016d62a4040" containerName="openstackclient" containerID="cri-o://1118e86476db84520b916ab9171d00c9ca74a0fda0b9bbab6d99166ab5c31c33" gracePeriod=2 Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.303918 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.662455 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.663617 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="debf8fae-f15c-4b09-b185-3f47c7e0491b" containerName="openstack-network-exporter" containerID="cri-o://d6ab967c227e3d5e9e63f9b31bc7367d580da34af75c1d75d3010bf7ca6e1148" gracePeriod=300 Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.696167 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.696469 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" containerName="ovn-northd" containerID="cri-o://4ae78c274c3294e9773e57bf2eec9477b3e77f5156d4259b5f42c1b6a1c6dd6a" gracePeriod=30 Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.696919 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" containerName="openstack-network-exporter" containerID="cri-o://14dbc21dd417bde9c33ac37159b79364136213cf4be75108690ff918465fc7cc" gracePeriod=30 Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.749165 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 10:26:26 crc kubenswrapper[4958]: E1201 10:26:26.875679 4958 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 01 10:26:26 crc kubenswrapper[4958]: E1201 10:26:26.876252 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-config-data podName:e19ffea8-2e96-4cff-a2ec-40646aaa4cc0 nodeName:}" failed. No retries permitted until 2025-12-01 10:26:27.37622538 +0000 UTC m=+1634.885014417 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-config-data") pod "rabbitmq-cell1-server-0" (UID: "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0") : configmap "rabbitmq-cell1-config-data" not found Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.880265 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placementea59-account-delete-jmj7f"] Dec 01 10:26:26 crc kubenswrapper[4958]: E1201 10:26:26.881032 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="054057a9-764f-494d-9ae7-387cb051c0dd" containerName="extract-content" Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.881055 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="054057a9-764f-494d-9ae7-387cb051c0dd" containerName="extract-content" Dec 01 10:26:26 crc kubenswrapper[4958]: E1201 10:26:26.881072 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2bd3b82-a2cb-40ac-8a52-6016d62a4040" containerName="openstackclient" Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.881080 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2bd3b82-a2cb-40ac-8a52-6016d62a4040" containerName="openstackclient" Dec 01 10:26:26 crc kubenswrapper[4958]: E1201 10:26:26.881108 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="054057a9-764f-494d-9ae7-387cb051c0dd" containerName="registry-server" Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.881117 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="054057a9-764f-494d-9ae7-387cb051c0dd" containerName="registry-server" Dec 01 10:26:26 crc kubenswrapper[4958]: E1201 10:26:26.881170 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="054057a9-764f-494d-9ae7-387cb051c0dd" containerName="extract-utilities" Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.881180 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="054057a9-764f-494d-9ae7-387cb051c0dd" containerName="extract-utilities" Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.881445 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="054057a9-764f-494d-9ae7-387cb051c0dd" containerName="registry-server" Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.881476 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2bd3b82-a2cb-40ac-8a52-6016d62a4040" containerName="openstackclient" Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.882638 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placementea59-account-delete-jmj7f" Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.976414 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv924\" (UniqueName: \"kubernetes.io/projected/3f7f1749-5fab-4187-8675-01747669c1b7-kube-api-access-nv924\") pod \"placementea59-account-delete-jmj7f\" (UID: \"3f7f1749-5fab-4187-8675-01747669c1b7\") " pod="openstack/placementea59-account-delete-jmj7f" Dec 01 10:26:26 crc kubenswrapper[4958]: I1201 10:26:26.981549 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-6864p"] Dec 01 10:26:27 crc kubenswrapper[4958]: I1201 10:26:27.052779 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-6864p"] Dec 01 10:26:27 crc kubenswrapper[4958]: I1201 10:26:27.078477 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv924\" (UniqueName: \"kubernetes.io/projected/3f7f1749-5fab-4187-8675-01747669c1b7-kube-api-access-nv924\") pod \"placementea59-account-delete-jmj7f\" (UID: \"3f7f1749-5fab-4187-8675-01747669c1b7\") " pod="openstack/placementea59-account-delete-jmj7f" Dec 01 10:26:27 crc kubenswrapper[4958]: I1201 10:26:27.145981 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementea59-account-delete-jmj7f"] Dec 01 10:26:27 crc kubenswrapper[4958]: I1201 10:26:27.234818 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv924\" (UniqueName: \"kubernetes.io/projected/3f7f1749-5fab-4187-8675-01747669c1b7-kube-api-access-nv924\") pod \"placementea59-account-delete-jmj7f\" (UID: \"3f7f1749-5fab-4187-8675-01747669c1b7\") " pod="openstack/placementea59-account-delete-jmj7f" Dec 01 10:26:27 crc kubenswrapper[4958]: I1201 10:26:27.248152 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="debf8fae-f15c-4b09-b185-3f47c7e0491b" containerName="ovsdbserver-sb" containerID="cri-o://892484689c2af5ec876095062b833d60c2fdb08de4c3c04e7eb3ecefcffbab27" gracePeriod=300 Dec 01 10:26:27 crc kubenswrapper[4958]: I1201 10:26:27.275029 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican5432-account-delete-z4wgz"] Dec 01 10:26:27 crc kubenswrapper[4958]: I1201 10:26:27.276810 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican5432-account-delete-z4wgz" Dec 01 10:26:27 crc kubenswrapper[4958]: I1201 10:26:27.340393 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican5432-account-delete-z4wgz"] Dec 01 10:26:27 crc kubenswrapper[4958]: I1201 10:26:27.441483 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placementea59-account-delete-jmj7f" Dec 01 10:26:27 crc kubenswrapper[4958]: E1201 10:26:27.471695 4958 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 01 10:26:27 crc kubenswrapper[4958]: E1201 10:26:27.471862 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-config-data podName:e19ffea8-2e96-4cff-a2ec-40646aaa4cc0 nodeName:}" failed. No retries permitted until 2025-12-01 10:26:28.471805896 +0000 UTC m=+1635.980594933 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-config-data") pod "rabbitmq-cell1-server-0" (UID: "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0") : configmap "rabbitmq-cell1-config-data" not found Dec 01 10:26:27 crc kubenswrapper[4958]: E1201 10:26:27.763773 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ae78c274c3294e9773e57bf2eec9477b3e77f5156d4259b5f42c1b6a1c6dd6a" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 01 10:26:27 crc kubenswrapper[4958]: E1201 10:26:27.789338 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ae78c274c3294e9773e57bf2eec9477b3e77f5156d4259b5f42c1b6a1c6dd6a" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 01 10:26:27 crc kubenswrapper[4958]: E1201 10:26:27.793301 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ae78c274c3294e9773e57bf2eec9477b3e77f5156d4259b5f42c1b6a1c6dd6a" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 01 10:26:27 crc kubenswrapper[4958]: E1201 10:26:27.793408 4958 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" containerName="ovn-northd" Dec 01 10:26:27 crc kubenswrapper[4958]: E1201 10:26:27.832381 4958 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69ae2c66_9d6d_4bc0_b3fe_ee729225e85f.slice/crio-conmon-14dbc21dd417bde9c33ac37159b79364136213cf4be75108690ff918465fc7cc.scope\": RecentStats: unable to find data in memory cache]" Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.086021 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpnwt\" (UniqueName: \"kubernetes.io/projected/ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc-kube-api-access-gpnwt\") pod \"barbican5432-account-delete-z4wgz\" (UID: \"ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc\") " pod="openstack/barbican5432-account-delete-z4wgz" Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.094275 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88a7373f-c8a8-4721-bf49-1ffc1887309e" path="/var/lib/kubelet/pods/88a7373f-c8a8-4721-bf49-1ffc1887309e/volumes" Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.096225 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-6fscs"] Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.096297 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-d9qck"] Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.097365 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-6fscs" podUID="7296e8da-30a1-4c69-978f-3411bda327f7" containerName="openstack-network-exporter" 
containerID="cri-o://d39d333c260ccb07db8a021dde9b4a9a746995835c4970968df5c849398cd2e8" gracePeriod=30 Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.140215 4958 generic.go:334] "Generic (PLEG): container finished" podID="69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" containerID="14dbc21dd417bde9c33ac37159b79364136213cf4be75108690ff918465fc7cc" exitCode=2 Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.140299 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f","Type":"ContainerDied","Data":"14dbc21dd417bde9c33ac37159b79364136213cf4be75108690ff918465fc7cc"} Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.164281 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_debf8fae-f15c-4b09-b185-3f47c7e0491b/ovsdbserver-sb/0.log" Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.164342 4958 generic.go:334] "Generic (PLEG): container finished" podID="debf8fae-f15c-4b09-b185-3f47c7e0491b" containerID="d6ab967c227e3d5e9e63f9b31bc7367d580da34af75c1d75d3010bf7ca6e1148" exitCode=2 Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.164363 4958 generic.go:334] "Generic (PLEG): container finished" podID="debf8fae-f15c-4b09-b185-3f47c7e0491b" containerID="892484689c2af5ec876095062b833d60c2fdb08de4c3c04e7eb3ecefcffbab27" exitCode=143 Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.164389 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"debf8fae-f15c-4b09-b185-3f47c7e0491b","Type":"ContainerDied","Data":"d6ab967c227e3d5e9e63f9b31bc7367d580da34af75c1d75d3010bf7ca6e1148"} Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.164420 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"debf8fae-f15c-4b09-b185-3f47c7e0491b","Type":"ContainerDied","Data":"892484689c2af5ec876095062b833d60c2fdb08de4c3c04e7eb3ecefcffbab27"} Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.172495 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-xr8kd"] Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.203467 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpnwt\" (UniqueName: \"kubernetes.io/projected/ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc-kube-api-access-gpnwt\") pod \"barbican5432-account-delete-z4wgz\" (UID: \"ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc\") " pod="openstack/barbican5432-account-delete-z4wgz" Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.261142 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-8zf5v"] Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.337818 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpnwt\" (UniqueName: \"kubernetes.io/projected/ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc-kube-api-access-gpnwt\") pod \"barbican5432-account-delete-z4wgz\" (UID: \"ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc\") " pod="openstack/barbican5432-account-delete-z4wgz" Dec 01 10:26:28 crc kubenswrapper[4958]: E1201 10:26:28.367108 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 892484689c2af5ec876095062b833d60c2fdb08de4c3c04e7eb3ecefcffbab27 is running failed: container process not found" containerID="892484689c2af5ec876095062b833d60c2fdb08de4c3c04e7eb3ecefcffbab27" 
cmd=["/usr/bin/pidof","ovsdb-server"] Dec 01 10:26:28 crc kubenswrapper[4958]: E1201 10:26:28.370398 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 892484689c2af5ec876095062b833d60c2fdb08de4c3c04e7eb3ecefcffbab27 is running failed: container process not found" containerID="892484689c2af5ec876095062b833d60c2fdb08de4c3c04e7eb3ecefcffbab27" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.372502 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-4p4kj"] Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.372796 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" podUID="7e1110df-28ad-4b93-ad3b-54d771229959" containerName="dnsmasq-dns" containerID="cri-o://7331ebc18b17e0727114d123f2215ce8c358366e2d09e664d2eac5884cbc980a" gracePeriod=10 Dec 01 10:26:28 crc kubenswrapper[4958]: E1201 10:26:28.378310 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 892484689c2af5ec876095062b833d60c2fdb08de4c3c04e7eb3ecefcffbab27 is running failed: container process not found" containerID="892484689c2af5ec876095062b833d60c2fdb08de4c3c04e7eb3ecefcffbab27" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 01 10:26:28 crc kubenswrapper[4958]: E1201 10:26:28.378417 4958 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 892484689c2af5ec876095062b833d60c2fdb08de4c3c04e7eb3ecefcffbab27 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="debf8fae-f15c-4b09-b185-3f47c7e0491b" containerName="ovsdbserver-sb" Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.458938 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-8zf5v"] Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.524396 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-qk422"] Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.544586 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican5432-account-delete-z4wgz" Dec 01 10:26:28 crc kubenswrapper[4958]: E1201 10:26:28.547918 4958 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 01 10:26:28 crc kubenswrapper[4958]: E1201 10:26:28.548009 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-config-data podName:e19ffea8-2e96-4cff-a2ec-40646aaa4cc0 nodeName:}" failed. No retries permitted until 2025-12-01 10:26:30.547980703 +0000 UTC m=+1638.056769740 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-config-data") pod "rabbitmq-cell1-server-0" (UID: "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0") : configmap "rabbitmq-cell1-config-data" not found Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.675980 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-qk422"] Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.732331 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder259f-account-delete-zpd7t"] Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.734134 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder259f-account-delete-zpd7t" Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.751074 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8d6888456-hv67t"] Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.751520 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-8d6888456-hv67t" podUID="207148af-4b76-49b6-80cc-883ec14bb268" containerName="placement-log" containerID="cri-o://a5061e1758cadcbddcf9d7fd1235664eaefd5fcc99c2a0707ae4e34496238238" gracePeriod=30 Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.751742 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-8d6888456-hv67t" podUID="207148af-4b76-49b6-80cc-883ec14bb268" containerName="placement-api" containerID="cri-o://944f885a93b277fd5b446c26c7b9c7db505641ce99d66fc0eee7275265da1587" gracePeriod=30 Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.763057 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7zch\" (UniqueName: \"kubernetes.io/projected/66469344-8c32-45d9-afc4-91dcb9dbe807-kube-api-access-q7zch\") pod \"cinder259f-account-delete-zpd7t\" (UID: \"66469344-8c32-45d9-afc4-91dcb9dbe807\") " pod="openstack/cinder259f-account-delete-zpd7t" Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.782924 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder259f-account-delete-zpd7t"] Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.799433 4958 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/glance-default-external-api-0" secret="" err="secret \"glance-glance-dockercfg-smrjw\" not found" Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.801743 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:26:28 crc kubenswrapper[4958]: E1201 10:26:28.801998 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.815949 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.899103 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7zch\" (UniqueName: \"kubernetes.io/projected/66469344-8c32-45d9-afc4-91dcb9dbe807-kube-api-access-q7zch\") pod \"cinder259f-account-delete-zpd7t\" (UID: \"66469344-8c32-45d9-afc4-91dcb9dbe807\") " pod="openstack/cinder259f-account-delete-zpd7t" Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.901210 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-cktqn"] Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.963047 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutrona095-account-delete-wwt74"] Dec 01 10:26:28 crc kubenswrapper[4958]: I1201 10:26:28.964761 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutrona095-account-delete-wwt74" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:28.999019 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7zch\" (UniqueName: \"kubernetes.io/projected/66469344-8c32-45d9-afc4-91dcb9dbe807-kube-api-access-q7zch\") pod \"cinder259f-account-delete-zpd7t\" (UID: \"66469344-8c32-45d9-afc4-91dcb9dbe807\") " pod="openstack/cinder259f-account-delete-zpd7t" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.033110 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-cktqn"] Dec 01 10:26:29 crc kubenswrapper[4958]: E1201 10:26:29.033262 4958 secret.go:188] Couldn't get secret openstack/glance-scripts: secret "glance-scripts" not found Dec 01 10:26:29 crc kubenswrapper[4958]: E1201 10:26:29.033318 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-scripts podName:30e8c723-a7a0-4697-8369-bd224fcfdf3f nodeName:}" failed. No retries permitted until 2025-12-01 10:26:29.533300178 +0000 UTC m=+1637.042089215 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-scripts") pod "glance-default-external-api-0" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f") : secret "glance-scripts" not found Dec 01 10:26:29 crc kubenswrapper[4958]: E1201 10:26:29.024586 4958 secret.go:188] Couldn't get secret openstack/glance-default-external-config-data: secret "glance-default-external-config-data" not found Dec 01 10:26:29 crc kubenswrapper[4958]: E1201 10:26:29.051126 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-config-data podName:30e8c723-a7a0-4697-8369-bd224fcfdf3f nodeName:}" failed. No retries permitted until 2025-12-01 10:26:29.55108005 +0000 UTC m=+1637.059869087 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-config-data") pod "glance-default-external-api-0" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f") : secret "glance-default-external-config-data" not found Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.069672 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder259f-account-delete-zpd7t" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.096346 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutrona095-account-delete-wwt74"] Dec 01 10:26:29 crc kubenswrapper[4958]: E1201 10:26:29.129092 4958 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 01 10:26:29 crc kubenswrapper[4958]: E1201 10:26:29.129624 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-config-data podName:4fccd607-3bfb-4593-a6de-6a0fc52b34ea nodeName:}" failed. No retries permitted until 2025-12-01 10:26:29.629604868 +0000 UTC m=+1637.138393905 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-config-data") pod "rabbitmq-server-0" (UID: "4fccd607-3bfb-4593-a6de-6a0fc52b34ea") : configmap "rabbitmq-config-data" not found Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.157708 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glancea8f6-account-delete-cjlz4"] Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.170444 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glancea8f6-account-delete-cjlz4" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.189941 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glancea8f6-account-delete-cjlz4"] Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.212387 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.213364 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="032af861-fde8-4b7a-929b-2ec7f5871474" containerName="openstack-network-exporter" containerID="cri-o://7785733222ee379e13b2e08924fc0b23055d37143a5248373e74dc7a5b530c74" gracePeriod=300 Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.227738 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6xkm\" (UniqueName: \"kubernetes.io/projected/34129d60-4706-41fe-aa19-f2a13f38713a-kube-api-access-w6xkm\") pod \"neutrona095-account-delete-wwt74\" (UID: \"34129d60-4706-41fe-aa19-f2a13f38713a\") " pod="openstack/neutrona095-account-delete-wwt74" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.250862 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6fscs_7296e8da-30a1-4c69-978f-3411bda327f7/openstack-network-exporter/0.log" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.250949 4958 generic.go:334] "Generic (PLEG): container finished" podID="7296e8da-30a1-4c69-978f-3411bda327f7" containerID="d39d333c260ccb07db8a021dde9b4a9a746995835c4970968df5c849398cd2e8" exitCode=2 Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.251093 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6fscs" event={"ID":"7296e8da-30a1-4c69-978f-3411bda327f7","Type":"ContainerDied","Data":"d39d333c260ccb07db8a021dde9b4a9a746995835c4970968df5c849398cd2e8"} Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.257182 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-ksr7n"] Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.259287 4958 generic.go:334] "Generic (PLEG): container finished" podID="207148af-4b76-49b6-80cc-883ec14bb268" containerID="a5061e1758cadcbddcf9d7fd1235664eaefd5fcc99c2a0707ae4e34496238238" exitCode=143 Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.259348 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d6888456-hv67t" event={"ID":"207148af-4b76-49b6-80cc-883ec14bb268","Type":"ContainerDied","Data":"a5061e1758cadcbddcf9d7fd1235664eaefd5fcc99c2a0707ae4e34496238238"} Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.260492 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementea59-account-delete-jmj7f" event={"ID":"3f7f1749-5fab-4187-8675-01747669c1b7","Type":"ContainerStarted","Data":"b0be9331d12d466ad9a8f3a50d26755777ca7dcfceaefdf2e3635069dd4f2cf5"} Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.261623 4958 generic.go:334] "Generic (PLEG): container finished" podID="a2bd3b82-a2cb-40ac-8a52-6016d62a4040" containerID="1118e86476db84520b916ab9171d00c9ca74a0fda0b9bbab6d99166ab5c31c33" exitCode=137 Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.262980 4958 generic.go:334] "Generic (PLEG): container finished" podID="7e1110df-28ad-4b93-ad3b-54d771229959" containerID="7331ebc18b17e0727114d123f2215ce8c358366e2d09e664d2eac5884cbc980a" exitCode=0 Dec 01 
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.285457 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-ksr7n"]
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.296555 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-jkdpw"]
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.315027 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="032af861-fde8-4b7a-929b-2ec7f5871474" containerName="ovsdbserver-nb" containerID="cri-o://9f86a4fe20ec5f769cd7eb9953f1018672b10ce2765e2dc5eb42d2f7a1374f18" gracePeriod=300
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.330602 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqmxf\" (UniqueName: \"kubernetes.io/projected/adbb9949-a0ed-4f1a-805c-fedba7ec3f1b-kube-api-access-mqmxf\") pod \"glancea8f6-account-delete-cjlz4\" (UID: \"adbb9949-a0ed-4f1a-805c-fedba7ec3f1b\") " pod="openstack/glancea8f6-account-delete-cjlz4"
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.330737 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6xkm\" (UniqueName: \"kubernetes.io/projected/34129d60-4706-41fe-aa19-f2a13f38713a-kube-api-access-w6xkm\") pod \"neutrona095-account-delete-wwt74\" (UID: \"34129d60-4706-41fe-aa19-f2a13f38713a\") " pod="openstack/neutrona095-account-delete-wwt74"
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.352310 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-jkdpw"]
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.400345 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"]
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.401257 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="account-server" containerID="cri-o://e259bff5027b46b6780434588aaaf37284c0af3fc24553b6ef8f973ea6ddfcef" gracePeriod=30
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.401957 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="swift-recon-cron" containerID="cri-o://2b1bd8bfd2b7718248c0a4322bcf007f21088eaf6f456a3a5abdc4aa9dd4a573" gracePeriod=30
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.402014 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="rsync" containerID="cri-o://f29a065b0ddf26169aabe7ae6ad8b6d3377b8f3389795449dc71e2742353d609" gracePeriod=30
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.402061 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-expirer" containerID="cri-o://95afb3abf4bac2c438ad62074b66882bdaafff7f586ba8a6d0a76121f279399b" gracePeriod=30
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.402313 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-updater" containerID="cri-o://4200bda3ba54fd990388524d97e1f6fdcba56a926272e8d50c30f8f1b903d07a" gracePeriod=30
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.402359 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-auditor" containerID="cri-o://e6b9b28943742935972d97a1f62f5f52e2fde01492530eaa77b9b898fd2c9d84" gracePeriod=30
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.402397 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-replicator" containerID="cri-o://c3c204443ad6b610f9de1fb0f8c5342451fbd72eb746d9e8e03f81b85991f68b" gracePeriod=30
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.402442 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-server" containerID="cri-o://45a6973b5ad0742944d3a43778b05e2378e4234af599c52ee4c8711abdb4a457" gracePeriod=30
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.402488 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="container-updater" containerID="cri-o://52f003c4452815db13106faf9941e71f61b166933018883cfa708153b71da78e" gracePeriod=30
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.402552 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="container-auditor" containerID="cri-o://c0de0fa89c374e451fd344254fa736a0131c4473c5d9730472766afd7c8bfd5d" gracePeriod=30
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.402602 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="container-replicator" containerID="cri-o://f33975248a4564b556aea0c21d6dc5782d263058390056a1d274154337979789" gracePeriod=30
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.402644 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="container-server" containerID="cri-o://865cc8ca1f68b1fb3bd9007f63e5fd084c77e4b6a22f2df02a8a5b900bbf54d6" gracePeriod=30
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.402756 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="account-reaper" containerID="cri-o://52603107e98f8df2078258e1d982b451d8d71b96f8736d24999132d033bea9e0" gracePeriod=30
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.402805 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="account-auditor" containerID="cri-o://01c79ab63cc80d37a6df7c9d7d52b71938ac0e9677a2f6651e5410a4bf18f8d3" gracePeriod=30
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.402924 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="account-replicator" containerID="cri-o://57bd52c9ca652b19ed4e33a202a232c60bfa9768c47969bd45f0322238f2ed9b" gracePeriod=30
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.471262 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6xkm\" (UniqueName: \"kubernetes.io/projected/34129d60-4706-41fe-aa19-f2a13f38713a-kube-api-access-w6xkm\") pod \"neutrona095-account-delete-wwt74\" (UID: \"34129d60-4706-41fe-aa19-f2a13f38713a\") " pod="openstack/neutrona095-account-delete-wwt74"
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.488795 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqmxf\" (UniqueName: \"kubernetes.io/projected/adbb9949-a0ed-4f1a-805c-fedba7ec3f1b-kube-api-access-mqmxf\") pod \"glancea8f6-account-delete-cjlz4\" (UID: \"adbb9949-a0ed-4f1a-805c-fedba7ec3f1b\") " pod="openstack/glancea8f6-account-delete-cjlz4"
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.525564 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqmxf\" (UniqueName: \"kubernetes.io/projected/adbb9949-a0ed-4f1a-805c-fedba7ec3f1b-kube-api-access-mqmxf\") pod \"glancea8f6-account-delete-cjlz4\" (UID: \"adbb9949-a0ed-4f1a-805c-fedba7ec3f1b\") " pod="openstack/glancea8f6-account-delete-cjlz4"
Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.534665 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glancea8f6-account-delete-cjlz4"
Dec 01 10:26:29 crc kubenswrapper[4958]: E1201 10:26:29.597327 4958 secret.go:188] Couldn't get secret openstack/glance-default-external-config-data: secret "glance-default-external-config-data" not found
Dec 01 10:26:29 crc kubenswrapper[4958]: E1201 10:26:29.597430 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-config-data podName:30e8c723-a7a0-4697-8369-bd224fcfdf3f nodeName:}" failed. No retries permitted until 2025-12-01 10:26:30.59740839 +0000 UTC m=+1638.106197427 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-config-data") pod "glance-default-external-api-0" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f") : secret "glance-default-external-config-data" not found
Dec 01 10:26:29 crc kubenswrapper[4958]: E1201 10:26:29.597913 4958 secret.go:188] Couldn't get secret openstack/glance-scripts: secret "glance-scripts" not found
Dec 01 10:26:29 crc kubenswrapper[4958]: E1201 10:26:29.597949 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-scripts podName:30e8c723-a7a0-4697-8369-bd224fcfdf3f nodeName:}" failed. No retries permitted until 2025-12-01 10:26:30.597940775 +0000 UTC m=+1638.106729812 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-scripts") pod "glance-default-external-api-0" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f") : secret "glance-scripts" not found
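Every swift-storage-0 container above is killed with gracePeriod=30: the runtime delivers SIGTERM first and escalates to SIGKILL only if the container is still running when the budget expires (ovs-vswitchd is later killed with gracePeriod=28, presumably because part of its budget had already elapsed). A sketch of that term-then-kill contract; sendSignal and the exited channel are stand-ins for real runtime calls, not a CRI API:

```go
package main

import (
	"fmt"
	"time"
)

// stopContainer sketches the grace-period contract: SIGTERM first,
// SIGKILL only when the budget runs out.
func stopContainer(name string, grace time.Duration, exited <-chan struct{}, sendSignal func(string)) {
	sendSignal("SIGTERM")
	select {
	case <-exited:
		fmt.Println(name, "exited within the grace period") // exitCode 143 or the app's own status
	case <-time.After(grace):
		sendSignal("SIGKILL") // surfaces as exitCode=137
		fmt.Println(name, "force-killed after", grace)
	}
}

func main() {
	exited := make(chan struct{})
	close(exited) // pretend the container shut down immediately
	stopContainer("account-server", 30*time.Second, exited, func(sig string) {
		fmt.Println("sending", sig)
	})
}
```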
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-scripts") pod "glance-default-external-api-0" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f") : secret "glance-scripts" not found Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.598413 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-nnnl2"] Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.622754 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.623196 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="aa361886-e7eb-413c-a4b4-cedaf1c86983" containerName="cinder-scheduler" containerID="cri-o://9edf6c1c15c8c61a73a71e42e83b06f5d03b0b267c8f2a39955fe70dd308453e" gracePeriod=30 Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.624020 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="aa361886-e7eb-413c-a4b4-cedaf1c86983" containerName="probe" containerID="cri-o://41cd43dae598fc0fb9dde4e0161ce7d640d805585cd73ef9e8521ddf5b485506" gracePeriod=30 Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.654122 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-2f5lj"] Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.679834 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-nnnl2"] Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.694199 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutrona095-account-delete-wwt74" Dec 01 10:26:29 crc kubenswrapper[4958]: E1201 10:26:29.702159 4958 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 01 10:26:29 crc kubenswrapper[4958]: E1201 10:26:29.702310 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-config-data podName:4fccd607-3bfb-4593-a6de-6a0fc52b34ea nodeName:}" failed. No retries permitted until 2025-12-01 10:26:30.702285866 +0000 UTC m=+1638.211074893 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-config-data") pod "rabbitmq-server-0" (UID: "4fccd607-3bfb-4593-a6de-6a0fc52b34ea") : configmap "rabbitmq-config-data" not found Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.712074 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-2f5lj"] Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.782340 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.782751 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="137d864e-34d9-452c-91b0-179a93198b0f" containerName="cinder-api-log" containerID="cri-o://0384d0b99f1562b0d40019ea6259c6fc06fa88bdab2bd09c91cd469a8040914c" gracePeriod=30 Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.782995 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="137d864e-34d9-452c-91b0-179a93198b0f" containerName="cinder-api" containerID="cri-o://65331563d699edd62e898ec079b7208d63c4336ed333b4a2cad00c932ee16ea5" gracePeriod=30 Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.841707 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.856368 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2433ef78-8e78-439b-a64c-1052adbeed62" path="/var/lib/kubelet/pods/2433ef78-8e78-439b-a64c-1052adbeed62/volumes" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.857237 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6404e65b-16c6-4bd0-a4fe-26ad44d722c6" path="/var/lib/kubelet/pods/6404e65b-16c6-4bd0-a4fe-26ad44d722c6/volumes" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.875310 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81ffec42-04be-4d06-b6d8-44c5b1eb1d53" path="/var/lib/kubelet/pods/81ffec42-04be-4d06-b6d8-44c5b1eb1d53/volumes" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.876449 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3dc24bc-1938-490b-a88a-286e6af1d269" path="/var/lib/kubelet/pods/a3dc24bc-1938-490b-a88a-286e6af1d269/volumes" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.882319 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a947fbcc-a378-4d94-a37b-75164b9e2746" path="/var/lib/kubelet/pods/a947fbcc-a378-4d94-a37b-75164b9e2746/volumes" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.883058 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b69312c4-bb8d-4272-bd60-77f355f83f25" path="/var/lib/kubelet/pods/b69312c4-bb8d-4272-bd60-77f355f83f25/volumes" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.883919 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f960833c-3f07-4613-8d69-b7c563b3dd5d" path="/var/lib/kubelet/pods/f960833c-3f07-4613-8d69-b7c563b3dd5d/volumes" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.885312 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapi1865-account-delete-hmcpq"] Dec 01 10:26:29 crc kubenswrapper[4958]: E1201 10:26:29.885718 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e1110df-28ad-4b93-ad3b-54d771229959" 
containerName="dnsmasq-dns" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.885737 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1110df-28ad-4b93-ad3b-54d771229959" containerName="dnsmasq-dns" Dec 01 10:26:29 crc kubenswrapper[4958]: E1201 10:26:29.885769 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e1110df-28ad-4b93-ad3b-54d771229959" containerName="init" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.885776 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1110df-28ad-4b93-ad3b-54d771229959" containerName="init" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.886020 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e1110df-28ad-4b93-ad3b-54d771229959" containerName="dnsmasq-dns" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.886753 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-768454b56f-84xc8"] Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.887157 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-768454b56f-84xc8" podUID="305e96f8-0597-4c64-9026-6d6f2aa454d4" containerName="neutron-api" containerID="cri-o://7e4c8d5d756ae1f1f79baf6ab39a0885d96760214ce8bd8adfe3bf8d0265ce4e" gracePeriod=30 Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.887824 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi1865-account-delete-hmcpq" Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.888348 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-768454b56f-84xc8" podUID="305e96f8-0597-4c64-9026-6d6f2aa454d4" containerName="neutron-httpd" containerID="cri-o://b1011c74e73144076e25dd1c4df809bab366b279b9f967a7d28bfcadc81ae769" gracePeriod=30 Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.912400 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi1865-account-delete-hmcpq"] Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.987111 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.988237 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="30e8c723-a7a0-4697-8369-bd224fcfdf3f" containerName="glance-log" containerID="cri-o://5df0d7f8f330d79dac217b44ff6f358c9415547f0327ca2889719598f61e6c85" gracePeriod=30 Dec 01 10:26:29 crc kubenswrapper[4958]: I1201 10:26:29.989082 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="30e8c723-a7a0-4697-8369-bd224fcfdf3f" containerName="glance-httpd" containerID="cri-o://9144cb56a423613b84457fe40b192f906f9bb58de4d5fb7b219e438f9744439a" gracePeriod=30 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.008861 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-d7bmj"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.016422 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-ovsdbserver-sb\") pod \"7e1110df-28ad-4b93-ad3b-54d771229959\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.018632 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-dns-svc\") pod \"7e1110df-28ad-4b93-ad3b-54d771229959\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.018684 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-config\") pod \"7e1110df-28ad-4b93-ad3b-54d771229959\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.018746 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-dns-swift-storage-0\") pod \"7e1110df-28ad-4b93-ad3b-54d771229959\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.020054 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r5nn\" (UniqueName: \"kubernetes.io/projected/7e1110df-28ad-4b93-ad3b-54d771229959-kube-api-access-6r5nn\") pod \"7e1110df-28ad-4b93-ad3b-54d771229959\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.020156 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-ovsdbserver-nb\") pod \"7e1110df-28ad-4b93-ad3b-54d771229959\" (UID: \"7e1110df-28ad-4b93-ad3b-54d771229959\") " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.020585 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pszxx\" (UniqueName: \"kubernetes.io/projected/f20641f2-49c0-4492-8f1f-20b14a1f3bd3-kube-api-access-pszxx\") pod \"novaapi1865-account-delete-hmcpq\" (UID: \"f20641f2-49c0-4492-8f1f-20b14a1f3bd3\") " pod="openstack/novaapi1865-account-delete-hmcpq" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.028188 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-d7bmj"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.036627 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.037199 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0353bee2-4033-4493-9217-b5c4600d3d90" containerName="glance-log" containerID="cri-o://7f2380f1568f2f9c175fd6ffac5bc9a4006ac5eca2dd568b473ca89bfac956f6" gracePeriod=30 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.037450 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0353bee2-4033-4493-9217-b5c4600d3d90" containerName="glance-httpd" containerID="cri-o://2f99a0981f6f8c18ad3b38420c231b904981b9a55694320467486987165f628d" gracePeriod=30 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.045782 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e1110df-28ad-4b93-ad3b-54d771229959-kube-api-access-6r5nn" (OuterVolumeSpecName: "kube-api-access-6r5nn") pod "7e1110df-28ad-4b93-ad3b-54d771229959" (UID: "7e1110df-28ad-4b93-ad3b-54d771229959"). InnerVolumeSpecName "kube-api-access-6r5nn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.078292 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-5432-account-create-2x7ph"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.122307 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_debf8fae-f15c-4b09-b185-3f47c7e0491b/ovsdbserver-sb/0.log" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.122466 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.123142 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pszxx\" (UniqueName: \"kubernetes.io/projected/f20641f2-49c0-4492-8f1f-20b14a1f3bd3-kube-api-access-pszxx\") pod \"novaapi1865-account-delete-hmcpq\" (UID: \"f20641f2-49c0-4492-8f1f-20b14a1f3bd3\") " pod="openstack/novaapi1865-account-delete-hmcpq" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.123234 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r5nn\" (UniqueName: \"kubernetes.io/projected/7e1110df-28ad-4b93-ad3b-54d771229959-kube-api-access-6r5nn\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.143439 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-5432-account-create-2x7ph"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.154481 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-6xc7n"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.173158 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.227165 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican5432-account-delete-z4wgz"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.260069 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pszxx\" (UniqueName: \"kubernetes.io/projected/f20641f2-49c0-4492-8f1f-20b14a1f3bd3-kube-api-access-pszxx\") pod \"novaapi1865-account-delete-hmcpq\" (UID: \"f20641f2-49c0-4492-8f1f-20b14a1f3bd3\") " pod="openstack/novaapi1865-account-delete-hmcpq" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.274597 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi1865-account-delete-hmcpq" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.304028 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-xr8kd" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovs-vswitchd" containerID="cri-o://dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" gracePeriod=28 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.322125 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-6xc7n"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.327143 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/debf8fae-f15c-4b09-b185-3f47c7e0491b-scripts\") pod \"debf8fae-f15c-4b09-b185-3f47c7e0491b\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.327229 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-combined-ca-bundle\") pod \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\" (UID: \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\") " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.327284 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlks4\" (UniqueName: \"kubernetes.io/projected/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-kube-api-access-wlks4\") pod \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\" (UID: \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\") " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.327375 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/debf8fae-f15c-4b09-b185-3f47c7e0491b-config\") pod \"debf8fae-f15c-4b09-b185-3f47c7e0491b\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.327605 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-openstack-config-secret\") pod \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\" (UID: \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\") " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.327671 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/debf8fae-f15c-4b09-b185-3f47c7e0491b-ovsdb-rundir\") pod \"debf8fae-f15c-4b09-b185-3f47c7e0491b\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.327709 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debf8fae-f15c-4b09-b185-3f47c7e0491b-combined-ca-bundle\") pod \"debf8fae-f15c-4b09-b185-3f47c7e0491b\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.327745 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zfjp\" (UniqueName: \"kubernetes.io/projected/debf8fae-f15c-4b09-b185-3f47c7e0491b-kube-api-access-7zfjp\") pod \"debf8fae-f15c-4b09-b185-3f47c7e0491b\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.327795 4958 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"debf8fae-f15c-4b09-b185-3f47c7e0491b\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.327891 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-openstack-config\") pod \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\" (UID: \"a2bd3b82-a2cb-40ac-8a52-6016d62a4040\") " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.328077 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/debf8fae-f15c-4b09-b185-3f47c7e0491b-metrics-certs-tls-certs\") pod \"debf8fae-f15c-4b09-b185-3f47c7e0491b\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.328242 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/debf8fae-f15c-4b09-b185-3f47c7e0491b-ovsdbserver-sb-tls-certs\") pod \"debf8fae-f15c-4b09-b185-3f47c7e0491b\" (UID: \"debf8fae-f15c-4b09-b185-3f47c7e0491b\") " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.330040 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/debf8fae-f15c-4b09-b185-3f47c7e0491b-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "debf8fae-f15c-4b09-b185-3f47c7e0491b" (UID: "debf8fae-f15c-4b09-b185-3f47c7e0491b"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.331066 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/debf8fae-f15c-4b09-b185-3f47c7e0491b-scripts" (OuterVolumeSpecName: "scripts") pod "debf8fae-f15c-4b09-b185-3f47c7e0491b" (UID: "debf8fae-f15c-4b09-b185-3f47c7e0491b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.331716 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/debf8fae-f15c-4b09-b185-3f47c7e0491b-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.331737 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/debf8fae-f15c-4b09-b185-3f47c7e0491b-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.349927 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.350103 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "debf8fae-f15c-4b09-b185-3f47c7e0491b" (UID: "debf8fae-f15c-4b09-b185-3f47c7e0491b"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.350325 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" podUID="ceca985e-bec2-42fe-9758-edad828586c2" containerName="barbican-keystone-listener-log" containerID="cri-o://d6bf9c119f4304f0d9ea18db7018ff4e63d2085fdfbdf9fc47deea0cf585ac96" gracePeriod=30 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.350479 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" podUID="ceca985e-bec2-42fe-9758-edad828586c2" containerName="barbican-keystone-listener" containerID="cri-o://beb72f83d33471d431dae1e5fec0a30b928583fb28f59468f8008cfe43fce3bb" gracePeriod=30 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.359904 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-kube-api-access-wlks4" (OuterVolumeSpecName: "kube-api-access-wlks4") pod "a2bd3b82-a2cb-40ac-8a52-6016d62a4040" (UID: "a2bd3b82-a2cb-40ac-8a52-6016d62a4040"). InnerVolumeSpecName "kube-api-access-wlks4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:30 crc kubenswrapper[4958]: E1201 10:26:30.361037 4958 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 01 10:26:30 crc kubenswrapper[4958]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 01 10:26:30 crc kubenswrapper[4958]: + source /usr/local/bin/container-scripts/functions Dec 01 10:26:30 crc kubenswrapper[4958]: ++ OVNBridge=br-int Dec 01 10:26:30 crc kubenswrapper[4958]: ++ OVNRemote=tcp:localhost:6642 Dec 01 10:26:30 crc kubenswrapper[4958]: ++ OVNEncapType=geneve Dec 01 10:26:30 crc kubenswrapper[4958]: ++ OVNAvailabilityZones= Dec 01 10:26:30 crc kubenswrapper[4958]: ++ EnableChassisAsGateway=true Dec 01 10:26:30 crc kubenswrapper[4958]: ++ PhysicalNetworks= Dec 01 10:26:30 crc kubenswrapper[4958]: ++ OVNHostName= Dec 01 10:26:30 crc kubenswrapper[4958]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 01 10:26:30 crc kubenswrapper[4958]: ++ ovs_dir=/var/lib/openvswitch Dec 01 10:26:30 crc kubenswrapper[4958]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 01 10:26:30 crc kubenswrapper[4958]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 01 10:26:30 crc kubenswrapper[4958]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 01 10:26:30 crc kubenswrapper[4958]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 01 10:26:30 crc kubenswrapper[4958]: + sleep 0.5 Dec 01 10:26:30 crc kubenswrapper[4958]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 01 10:26:30 crc kubenswrapper[4958]: + sleep 0.5 Dec 01 10:26:30 crc kubenswrapper[4958]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 01 10:26:30 crc kubenswrapper[4958]: + sleep 0.5 Dec 01 10:26:30 crc kubenswrapper[4958]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 01 10:26:30 crc kubenswrapper[4958]: + cleanup_ovsdb_server_semaphore Dec 01 10:26:30 crc kubenswrapper[4958]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 01 10:26:30 crc kubenswrapper[4958]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 01 10:26:30 crc kubenswrapper[4958]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-xr8kd" message=< Dec 01 10:26:30 crc kubenswrapper[4958]: Exiting ovsdb-server (5) [ OK ] Dec 01 10:26:30 crc kubenswrapper[4958]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 01 10:26:30 crc kubenswrapper[4958]: + source /usr/local/bin/container-scripts/functions Dec 01 10:26:30 crc kubenswrapper[4958]: ++ OVNBridge=br-int Dec 01 10:26:30 crc kubenswrapper[4958]: ++ OVNRemote=tcp:localhost:6642 Dec 01 10:26:30 crc kubenswrapper[4958]: ++ OVNEncapType=geneve Dec 01 10:26:30 crc kubenswrapper[4958]: ++ OVNAvailabilityZones= Dec 01 10:26:30 crc kubenswrapper[4958]: ++ EnableChassisAsGateway=true Dec 01 10:26:30 crc kubenswrapper[4958]: ++ PhysicalNetworks= Dec 01 10:26:30 crc kubenswrapper[4958]: ++ OVNHostName= Dec 01 10:26:30 crc kubenswrapper[4958]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 01 10:26:30 crc kubenswrapper[4958]: ++ ovs_dir=/var/lib/openvswitch Dec 01 10:26:30 crc kubenswrapper[4958]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 01 10:26:30 crc kubenswrapper[4958]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 01 10:26:30 crc kubenswrapper[4958]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 01 10:26:30 crc kubenswrapper[4958]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 01 10:26:30 crc kubenswrapper[4958]: + sleep 0.5 Dec 01 10:26:30 crc kubenswrapper[4958]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 01 10:26:30 crc kubenswrapper[4958]: + sleep 0.5 Dec 01 10:26:30 crc kubenswrapper[4958]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 01 10:26:30 crc kubenswrapper[4958]: + sleep 0.5 Dec 01 10:26:30 crc kubenswrapper[4958]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 01 10:26:30 crc kubenswrapper[4958]: + cleanup_ovsdb_server_semaphore Dec 01 10:26:30 crc kubenswrapper[4958]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 01 10:26:30 crc kubenswrapper[4958]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 01 10:26:30 crc kubenswrapper[4958]: > Dec 01 10:26:30 crc kubenswrapper[4958]: E1201 10:26:30.361080 4958 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 01 10:26:30 crc kubenswrapper[4958]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 01 10:26:30 crc kubenswrapper[4958]: + source /usr/local/bin/container-scripts/functions Dec 01 10:26:30 crc kubenswrapper[4958]: ++ OVNBridge=br-int Dec 01 10:26:30 crc kubenswrapper[4958]: ++ OVNRemote=tcp:localhost:6642 Dec 01 10:26:30 crc kubenswrapper[4958]: ++ OVNEncapType=geneve Dec 01 10:26:30 crc kubenswrapper[4958]: ++ OVNAvailabilityZones= Dec 01 10:26:30 crc kubenswrapper[4958]: ++ EnableChassisAsGateway=true Dec 01 10:26:30 crc kubenswrapper[4958]: ++ PhysicalNetworks= Dec 01 10:26:30 crc kubenswrapper[4958]: ++ OVNHostName= Dec 01 10:26:30 crc kubenswrapper[4958]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 01 10:26:30 crc kubenswrapper[4958]: ++ ovs_dir=/var/lib/openvswitch Dec 01 10:26:30 crc kubenswrapper[4958]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 01 10:26:30 crc kubenswrapper[4958]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 01 10:26:30 crc kubenswrapper[4958]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 01 10:26:30 crc kubenswrapper[4958]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 01 10:26:30 crc kubenswrapper[4958]: + sleep 0.5 Dec 01 10:26:30 crc kubenswrapper[4958]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 01 10:26:30 crc kubenswrapper[4958]: + sleep 0.5 Dec 01 10:26:30 crc kubenswrapper[4958]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 01 10:26:30 crc kubenswrapper[4958]: + sleep 0.5 Dec 01 10:26:30 crc kubenswrapper[4958]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 01 10:26:30 crc kubenswrapper[4958]: + cleanup_ovsdb_server_semaphore Dec 01 10:26:30 crc kubenswrapper[4958]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 01 10:26:30 crc kubenswrapper[4958]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 01 10:26:30 crc kubenswrapper[4958]: > pod="openstack/ovn-controller-ovs-xr8kd" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovsdb-server" containerID="cri-o://d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.361124 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-xr8kd" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovsdb-server" containerID="cri-o://d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" gracePeriod=28 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.361941 4958 generic.go:334] "Generic (PLEG): container finished" podID="30e8c723-a7a0-4697-8369-bd224fcfdf3f" containerID="5df0d7f8f330d79dac217b44ff6f358c9415547f0327ca2889719598f61e6c85" exitCode=143 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.362010 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30e8c723-a7a0-4697-8369-bd224fcfdf3f","Type":"ContainerDied","Data":"5df0d7f8f330d79dac217b44ff6f358c9415547f0327ca2889719598f61e6c85"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.362937 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/debf8fae-f15c-4b09-b185-3f47c7e0491b-kube-api-access-7zfjp" (OuterVolumeSpecName: "kube-api-access-7zfjp") pod "debf8fae-f15c-4b09-b185-3f47c7e0491b" (UID: "debf8fae-f15c-4b09-b185-3f47c7e0491b"). InnerVolumeSpecName "kube-api-access-7zfjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.363714 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/debf8fae-f15c-4b09-b185-3f47c7e0491b-config" (OuterVolumeSpecName: "config") pod "debf8fae-f15c-4b09-b185-3f47c7e0491b" (UID: "debf8fae-f15c-4b09-b185-3f47c7e0491b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.363930 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e1110df-28ad-4b93-ad3b-54d771229959" (UID: "7e1110df-28ad-4b93-ad3b-54d771229959"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.366536 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.375161 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-config" (OuterVolumeSpecName: "config") pod "7e1110df-28ad-4b93-ad3b-54d771229959" (UID: "7e1110df-28ad-4b93-ad3b-54d771229959"). InnerVolumeSpecName "config". 
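The two error entries above ("Exec lifecycle hook for Container in Pod failed" and "PreStop hook failed") carry the bash xtrace of /usr/local/bin/container-scripts/stop-ovsdb-server.sh. Exit status 137 is 128+9, i.e. the hook process was SIGKILLed rather than exiting on its own — consistent with the grace-period countdown (note the already-reduced gracePeriod=28 on the subsequent kill). Reconstructed from the trace (a sketch, not the shipped script; cleanup_ovsdb_server_semaphore is defined in the sourced functions file), the polling logic is:

  #!/bin/bash
  # Wait until ovn-controller has saved its flows and marked it safe to stop
  # the local ovsdb-server, then stop it without touching ovs-vswitchd.
  SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
  while [ ! -f "$SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE" ]; do
      sleep 0.5
  done
  rm -f "$SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE"   # cleanup_ovsdb_server_semaphore
  /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd

The message=< ... > copy of the trace ends with "Exiting ovsdb-server (5) [ OK ]", so ovs-ctl did run; the 137 describes how the hook process ended, not whether the server stopped.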
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.377867 4958 scope.go:117] "RemoveContainer" containerID="1118e86476db84520b916ab9171d00c9ca74a0fda0b9bbab6d99166ab5c31c33" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.378076 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.378255 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e1110df-28ad-4b93-ad3b-54d771229959" (UID: "7e1110df-28ad-4b93-ad3b-54d771229959"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.383433 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-ea59-account-create-tkkkr"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.389438 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e1110df-28ad-4b93-ad3b-54d771229959" (UID: "7e1110df-28ad-4b93-ad3b-54d771229959"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.411715 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" event={"ID":"7e1110df-28ad-4b93-ad3b-54d771229959","Type":"ContainerDied","Data":"01d0431fa6b569a7e97c7666632005762608c83ec16ae1df82e372bfd9881aad"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.412337 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-4p4kj" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.420000 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-ea59-account-create-tkkkr"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.442632 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.442667 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zfjp\" (UniqueName: \"kubernetes.io/projected/debf8fae-f15c-4b09-b185-3f47c7e0491b-kube-api-access-7zfjp\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.442679 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.442709 4958 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.442721 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.442733 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlks4\" (UniqueName: \"kubernetes.io/projected/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-kube-api-access-wlks4\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.442766 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/debf8fae-f15c-4b09-b185-3f47c7e0491b-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.442775 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.463972 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_032af861-fde8-4b7a-929b-2ec7f5871474/ovsdbserver-nb/0.log" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.464047 4958 generic.go:334] "Generic (PLEG): container finished" podID="032af861-fde8-4b7a-929b-2ec7f5871474" containerID="7785733222ee379e13b2e08924fc0b23055d37143a5248373e74dc7a5b530c74" exitCode=2 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.464077 4958 generic.go:334] "Generic (PLEG): container finished" podID="032af861-fde8-4b7a-929b-2ec7f5871474" containerID="9f86a4fe20ec5f769cd7eb9953f1018672b10ce2765e2dc5eb42d2f7a1374f18" exitCode=143 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.464254 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"032af861-fde8-4b7a-929b-2ec7f5871474","Type":"ContainerDied","Data":"7785733222ee379e13b2e08924fc0b23055d37143a5248373e74dc7a5b530c74"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.464324 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"032af861-fde8-4b7a-929b-2ec7f5871474","Type":"ContainerDied","Data":"9f86a4fe20ec5f769cd7eb9953f1018672b10ce2765e2dc5eb42d2f7a1374f18"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.473033 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2bd3b82-a2cb-40ac-8a52-6016d62a4040" (UID: "a2bd3b82-a2cb-40ac-8a52-6016d62a4040"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.488284 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementea59-account-delete-jmj7f"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.564708 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.575866 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-sj694"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.576177 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_debf8fae-f15c-4b09-b185-3f47c7e0491b/ovsdbserver-sb/0.log" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.576341 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"debf8fae-f15c-4b09-b185-3f47c7e0491b","Type":"ContainerDied","Data":"ecbba6144a3f24846eefb26247c695838d6f0606ba3501a3acdab236ef9721e9"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.576468 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.576477 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" containerName="rabbitmq" containerID="cri-o://f28e0a79ecd1735ef6ab0d156677914d027c13581eb5f43efb38a35773f7c70d" gracePeriod=604800 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.631282 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7e1110df-28ad-4b93-ad3b-54d771229959" (UID: "7e1110df-28ad-4b93-ad3b-54d771229959"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.635645 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a2bd3b82-a2cb-40ac-8a52-6016d62a4040" (UID: "a2bd3b82-a2cb-40ac-8a52-6016d62a4040"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.648004 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/debf8fae-f15c-4b09-b185-3f47c7e0491b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "debf8fae-f15c-4b09-b185-3f47c7e0491b" (UID: "debf8fae-f15c-4b09-b185-3f47c7e0491b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.649106 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementea59-account-delete-jmj7f" event={"ID":"3f7f1749-5fab-4187-8675-01747669c1b7","Type":"ContainerStarted","Data":"b8d3ef2ad4ede695d83613970dd3bc8a0788b09e8f0a4653b81852375759006f"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.673574 4958 generic.go:334] "Generic (PLEG): container finished" podID="137d864e-34d9-452c-91b0-179a93198b0f" containerID="0384d0b99f1562b0d40019ea6259c6fc06fa88bdab2bd09c91cd469a8040914c" exitCode=143 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.673779 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"137d864e-34d9-452c-91b0-179a93198b0f","Type":"ContainerDied","Data":"0384d0b99f1562b0d40019ea6259c6fc06fa88bdab2bd09c91cd469a8040914c"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.701778 4958 generic.go:334] "Generic (PLEG): container finished" podID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerID="f29a065b0ddf26169aabe7ae6ad8b6d3377b8f3389795449dc71e2742353d609" exitCode=0 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.711113 4958 generic.go:334] "Generic (PLEG): container finished" podID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerID="95afb3abf4bac2c438ad62074b66882bdaafff7f586ba8a6d0a76121f279399b" exitCode=0 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.711263 4958 generic.go:334] "Generic (PLEG): container finished" podID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerID="4200bda3ba54fd990388524d97e1f6fdcba56a926272e8d50c30f8f1b903d07a" exitCode=0 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.711419 4958 generic.go:334] "Generic (PLEG): container finished" podID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerID="e6b9b28943742935972d97a1f62f5f52e2fde01492530eaa77b9b898fd2c9d84" exitCode=0 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.711508 4958 generic.go:334] "Generic (PLEG): container finished" podID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerID="c3c204443ad6b610f9de1fb0f8c5342451fbd72eb746d9e8e03f81b85991f68b" exitCode=0 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.711611 4958 generic.go:334] "Generic (PLEG): container finished" podID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerID="45a6973b5ad0742944d3a43778b05e2378e4234af599c52ee4c8711abdb4a457" exitCode=0 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.712834 4958 generic.go:334] "Generic (PLEG): container finished" podID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerID="52f003c4452815db13106faf9941e71f61b166933018883cfa708153b71da78e" exitCode=0 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.712917 4958 generic.go:334] "Generic (PLEG): container finished" podID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerID="c0de0fa89c374e451fd344254fa736a0131c4473c5d9730472766afd7c8bfd5d" exitCode=0 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.712983 4958 generic.go:334] "Generic (PLEG): container finished" podID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerID="f33975248a4564b556aea0c21d6dc5782d263058390056a1d274154337979789" exitCode=0 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.713057 4958 generic.go:334] "Generic (PLEG): container finished" podID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerID="865cc8ca1f68b1fb3bd9007f63e5fd084c77e4b6a22f2df02a8a5b900bbf54d6" exitCode=0 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.713113 
4958 generic.go:334] "Generic (PLEG): container finished" podID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerID="52603107e98f8df2078258e1d982b451d8d71b96f8736d24999132d033bea9e0" exitCode=0 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.713162 4958 generic.go:334] "Generic (PLEG): container finished" podID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerID="01c79ab63cc80d37a6df7c9d7d52b71938ac0e9677a2f6651e5410a4bf18f8d3" exitCode=0 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.713227 4958 generic.go:334] "Generic (PLEG): container finished" podID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerID="57bd52c9ca652b19ed4e33a202a232c60bfa9768c47969bd45f0322238f2ed9b" exitCode=0 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.703938 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-744fbbd578-7c5pc"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.713522 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerDied","Data":"f29a065b0ddf26169aabe7ae6ad8b6d3377b8f3389795449dc71e2742353d609"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.713601 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerDied","Data":"95afb3abf4bac2c438ad62074b66882bdaafff7f586ba8a6d0a76121f279399b"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.713660 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerDied","Data":"4200bda3ba54fd990388524d97e1f6fdcba56a926272e8d50c30f8f1b903d07a"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.713768 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerDied","Data":"e6b9b28943742935972d97a1f62f5f52e2fde01492530eaa77b9b898fd2c9d84"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.713829 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerDied","Data":"c3c204443ad6b610f9de1fb0f8c5342451fbd72eb746d9e8e03f81b85991f68b"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.713902 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerDied","Data":"45a6973b5ad0742944d3a43778b05e2378e4234af599c52ee4c8711abdb4a457"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.713963 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerDied","Data":"52f003c4452815db13106faf9941e71f61b166933018883cfa708153b71da78e"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.714122 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.714192 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerDied","Data":"c0de0fa89c374e451fd344254fa736a0131c4473c5d9730472766afd7c8bfd5d"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.714255 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerDied","Data":"f33975248a4564b556aea0c21d6dc5782d263058390056a1d274154337979789"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.714311 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerDied","Data":"865cc8ca1f68b1fb3bd9007f63e5fd084c77e4b6a22f2df02a8a5b900bbf54d6"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.714366 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerDied","Data":"52603107e98f8df2078258e1d982b451d8d71b96f8736d24999132d033bea9e0"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.714426 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerDied","Data":"01c79ab63cc80d37a6df7c9d7d52b71938ac0e9677a2f6651e5410a4bf18f8d3"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.714484 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerDied","Data":"57bd52c9ca652b19ed4e33a202a232c60bfa9768c47969bd45f0322238f2ed9b"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.706009 4958 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.715112 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-744fbbd578-7c5pc" podUID="b727a711-6b0b-44c6-917a-602f10dd0d6c" containerName="barbican-api-log" containerID="cri-o://2c69fba39bcc23fd51b0e44fe893c7a695b06edee1de7171ac930180b17f1ebe" gracePeriod=30 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.715208 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-744fbbd578-7c5pc" podUID="b727a711-6b0b-44c6-917a-602f10dd0d6c" containerName="barbican-api" containerID="cri-o://8d7386e6d905cc7e682cbc9fcbfef1b78be385a0409c687cf999a6495aba6952" gracePeriod=30 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.719057 4958 generic.go:334] "Generic (PLEG): container finished" podID="305e96f8-0597-4c64-9026-6d6f2aa454d4" containerID="b1011c74e73144076e25dd1c4df809bab366b279b9f967a7d28bfcadc81ae769" exitCode=0 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.720233 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-768454b56f-84xc8" event={"ID":"305e96f8-0597-4c64-9026-6d6f2aa454d4","Type":"ContainerDied","Data":"b1011c74e73144076e25dd1c4df809bab366b279b9f967a7d28bfcadc81ae769"} Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.787091 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a2bd3b82-a2cb-40ac-8a52-6016d62a4040" (UID: "a2bd3b82-a2cb-40ac-8a52-6016d62a4040"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.787777 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/debf8fae-f15c-4b09-b185-3f47c7e0491b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "debf8fae-f15c-4b09-b185-3f47c7e0491b" (UID: "debf8fae-f15c-4b09-b185-3f47c7e0491b"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:30 crc kubenswrapper[4958]: E1201 10:26:30.803090 4958 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 01 10:26:30 crc kubenswrapper[4958]: E1201 10:26:30.803398 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-config-data podName:e19ffea8-2e96-4cff-a2ec-40646aaa4cc0 nodeName:}" failed. No retries permitted until 2025-12-01 10:26:34.803367247 +0000 UTC m=+1642.312156284 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-config-data") pod "rabbitmq-cell1-server-0" (UID: "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0") : configmap "rabbitmq-cell1-config-data" not found Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.803898 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-85fc6f9f59-gdn47"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.804607 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-85fc6f9f59-gdn47" podUID="81c0931c-8919-4128-97bb-21c5872d5cf0" containerName="barbican-worker-log" containerID="cri-o://0e6afa27cac7b16ea18ffb082028d7d4a5d1af380112b0080ebda5f523c8d154" gracePeriod=30 Dec 01 10:26:30 crc kubenswrapper[4958]: E1201 10:26:30.804626 4958 secret.go:188] Couldn't get secret openstack/glance-default-external-config-data: secret "glance-default-external-config-data" not found Dec 01 10:26:30 crc kubenswrapper[4958]: E1201 10:26:30.806149 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-config-data podName:30e8c723-a7a0-4697-8369-bd224fcfdf3f nodeName:}" failed. No retries permitted until 2025-12-01 10:26:32.806137337 +0000 UTC m=+1640.314926374 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-config-data") pod "glance-default-external-api-0" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f") : secret "glance-default-external-config-data" not found Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.805393 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-85fc6f9f59-gdn47" podUID="81c0931c-8919-4128-97bb-21c5872d5cf0" containerName="barbican-worker" containerID="cri-o://846a70f76927873aca913fa654d135f5d9f7402003f4d45aa4fb853953fe65db" gracePeriod=30 Dec 01 10:26:30 crc kubenswrapper[4958]: E1201 10:26:30.806467 4958 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 01 10:26:30 crc kubenswrapper[4958]: E1201 10:26:30.806618 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-config-data podName:4fccd607-3bfb-4593-a6de-6a0fc52b34ea nodeName:}" failed. No retries permitted until 2025-12-01 10:26:32.8066082 +0000 UTC m=+1640.315397237 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-config-data") pod "rabbitmq-server-0" (UID: "4fccd607-3bfb-4593-a6de-6a0fc52b34ea") : configmap "rabbitmq-config-data" not found Dec 01 10:26:30 crc kubenswrapper[4958]: E1201 10:26:30.806696 4958 secret.go:188] Couldn't get secret openstack/glance-scripts: secret "glance-scripts" not found Dec 01 10:26:30 crc kubenswrapper[4958]: E1201 10:26:30.807710 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-scripts podName:30e8c723-a7a0-4697-8369-bd224fcfdf3f nodeName:}" failed. No retries permitted until 2025-12-01 10:26:32.807700102 +0000 UTC m=+1640.316489139 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-scripts") pod "glance-default-external-api-0" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f") : secret "glance-scripts" not found Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.809955 4958 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.810122 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debf8fae-f15c-4b09-b185-3f47c7e0491b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.810141 4958 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.810158 4958 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/debf8fae-f15c-4b09-b185-3f47c7e0491b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.810208 4958 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a2bd3b82-a2cb-40ac-8a52-6016d62a4040-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.810225 4958 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e1110df-28ad-4b93-ad3b-54d771229959-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.831898 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/debf8fae-f15c-4b09-b185-3f47c7e0491b-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "debf8fae-f15c-4b09-b185-3f47c7e0491b" (UID: "debf8fae-f15c-4b09-b185-3f47c7e0491b"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
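The E-level MountVolume.SetUp failures above are the expected tail of tearing down the control plane: the rabbitmq and glance ConfigMaps/Secrets were deleted before their pods finished terminating, so the volume reconciler keeps retrying with exponential backoff (durationBeforeRetry 2 s for the glance volumes, already 4 s for rabbitmq-cell1, doubling up to a cap) until the pods themselves are gone. A quick check that the named objects are really absent (assuming kubectl access; NotFound is the expected outcome here):

  kubectl -n openstack get configmap rabbitmq-cell1-config-data rabbitmq-config-data
  kubectl -n openstack get secret glance-default-external-config-data glance-scripts
  # expect one "Error from server (NotFound): ..." per missing object; once
  # pod deletion completes, the pending MountVolume operations are dropped
  # rather than retried further.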
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.851344 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-259f-account-create-fmfb6"] Dec 01 10:26:30 crc kubenswrapper[4958]: W1201 10:26:30.867419 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66469344_8c32_45d9_afc4_91dcb9dbe807.slice/crio-371761547cfda20e7086a76636693a3a4a8aaa766c985fa2cc253c8911e2c55f WatchSource:0}: Error finding container 371761547cfda20e7086a76636693a3a4a8aaa766c985fa2cc253c8911e2c55f: Status 404 returned error can't find the container with id 371761547cfda20e7086a76636693a3a4a8aaa766c985fa2cc253c8911e2c55f Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.873654 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-sj694"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.874545 4958 scope.go:117] "RemoveContainer" containerID="7331ebc18b17e0727114d123f2215ce8c358366e2d09e664d2eac5884cbc980a" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.878337 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6fscs_7296e8da-30a1-4c69-978f-3411bda327f7/openstack-network-exporter/0.log" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.882309 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.890280 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-259f-account-create-fmfb6"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.897243 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_032af861-fde8-4b7a-929b-2ec7f5871474/ovsdbserver-nb/0.log" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.897429 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.906644 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder259f-account-delete-zpd7t"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.915606 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/debf8fae-f15c-4b09-b185-3f47c7e0491b-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.949640 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-w85js"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.989469 4958 scope.go:117] "RemoveContainer" containerID="06d8019ce9173b9ad8867a99fdfaebe7781104806d5ab186f185a02eb0a3a524" Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.992509 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.993044 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" containerName="nova-metadata-log" containerID="cri-o://ca5f316669c8cd32825cef769b0e3e1c6eab56cc41d4a4cff5edd8cd75e0ad40" gracePeriod=30 Dec 01 10:26:30 crc kubenswrapper[4958]: I1201 10:26:30.993310 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" containerName="nova-metadata-metadata" containerID="cri-o://be4dcfae53ebf0f5679008f3ab2b2488e6eec364d8b4f3e7506d7a1e58ea2248" gracePeriod=30 Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.008811 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.010039 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="79797aa2-db7b-429a-91d9-3e181de3976c" containerName="nova-api-log" containerID="cri-o://c5d92e349f45e4a0af02c0d2ddcebd7a3e9e9b1433ebbd079fd3f0a98c18c1b3" gracePeriod=30 Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.010660 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="79797aa2-db7b-429a-91d9-3e181de3976c" containerName="nova-api-api" containerID="cri-o://a6094764dd390e4786a93ea3889e9a8d68d2c1d8357ff0e022f148ebe28f5c52" gracePeriod=30 Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.019992 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7296e8da-30a1-4c69-978f-3411bda327f7-ovn-rundir\") pod \"7296e8da-30a1-4c69-978f-3411bda327f7\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.020090 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-846sn\" (UniqueName: \"kubernetes.io/projected/7296e8da-30a1-4c69-978f-3411bda327f7-kube-api-access-846sn\") pod \"7296e8da-30a1-4c69-978f-3411bda327f7\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.020115 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7296e8da-30a1-4c69-978f-3411bda327f7-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "7296e8da-30a1-4c69-978f-3411bda327f7" (UID: 
"7296e8da-30a1-4c69-978f-3411bda327f7"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.020196 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7296e8da-30a1-4c69-978f-3411bda327f7-config\") pod \"7296e8da-30a1-4c69-978f-3411bda327f7\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.020229 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7296e8da-30a1-4c69-978f-3411bda327f7-combined-ca-bundle\") pod \"7296e8da-30a1-4c69-978f-3411bda327f7\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.020264 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkpr5\" (UniqueName: \"kubernetes.io/projected/032af861-fde8-4b7a-929b-2ec7f5871474-kube-api-access-bkpr5\") pod \"032af861-fde8-4b7a-929b-2ec7f5871474\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.020389 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7296e8da-30a1-4c69-978f-3411bda327f7-metrics-certs-tls-certs\") pod \"7296e8da-30a1-4c69-978f-3411bda327f7\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.020423 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/032af861-fde8-4b7a-929b-2ec7f5871474-metrics-certs-tls-certs\") pod \"032af861-fde8-4b7a-929b-2ec7f5871474\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.020450 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/032af861-fde8-4b7a-929b-2ec7f5871474-ovsdb-rundir\") pod \"032af861-fde8-4b7a-929b-2ec7f5871474\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.020466 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"032af861-fde8-4b7a-929b-2ec7f5871474\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.020498 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7296e8da-30a1-4c69-978f-3411bda327f7-ovs-rundir\") pod \"7296e8da-30a1-4c69-978f-3411bda327f7\" (UID: \"7296e8da-30a1-4c69-978f-3411bda327f7\") " Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.020542 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/032af861-fde8-4b7a-929b-2ec7f5871474-config\") pod \"032af861-fde8-4b7a-929b-2ec7f5871474\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.021654 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/032af861-fde8-4b7a-929b-2ec7f5871474-ovsdbserver-nb-tls-certs\") pod \"032af861-fde8-4b7a-929b-2ec7f5871474\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.021739 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032af861-fde8-4b7a-929b-2ec7f5871474-combined-ca-bundle\") pod \"032af861-fde8-4b7a-929b-2ec7f5871474\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.021776 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/032af861-fde8-4b7a-929b-2ec7f5871474-scripts\") pod \"032af861-fde8-4b7a-929b-2ec7f5871474\" (UID: \"032af861-fde8-4b7a-929b-2ec7f5871474\") " Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.029429 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7296e8da-30a1-4c69-978f-3411bda327f7-config" (OuterVolumeSpecName: "config") pod "7296e8da-30a1-4c69-978f-3411bda327f7" (UID: "7296e8da-30a1-4c69-978f-3411bda327f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.031833 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/032af861-fde8-4b7a-929b-2ec7f5871474-scripts" (OuterVolumeSpecName: "scripts") pod "032af861-fde8-4b7a-929b-2ec7f5871474" (UID: "032af861-fde8-4b7a-929b-2ec7f5871474"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.034903 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7296e8da-30a1-4c69-978f-3411bda327f7-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "7296e8da-30a1-4c69-978f-3411bda327f7" (UID: "7296e8da-30a1-4c69-978f-3411bda327f7"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.035591 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/032af861-fde8-4b7a-929b-2ec7f5871474-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "032af861-fde8-4b7a-929b-2ec7f5871474" (UID: "032af861-fde8-4b7a-929b-2ec7f5871474"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.039455 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/032af861-fde8-4b7a-929b-2ec7f5871474-config" (OuterVolumeSpecName: "config") pod "032af861-fde8-4b7a-929b-2ec7f5871474" (UID: "032af861-fde8-4b7a-929b-2ec7f5871474"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:31 crc kubenswrapper[4958]: W1201 10:26:31.041765 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadbb9949_a0ed_4f1a_805c_fedba7ec3f1b.slice/crio-48656496f6aa110cb2782f7efdedf6564ef983b71ec40162f3d15ff58a701633 WatchSource:0}: Error finding container 48656496f6aa110cb2782f7efdedf6564ef983b71ec40162f3d15ff58a701633: Status 404 returned error can't find the container with id 48656496f6aa110cb2782f7efdedf6564ef983b71ec40162f3d15ff58a701633 Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.030291 4958 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7296e8da-30a1-4c69-978f-3411bda327f7-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.049869 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "032af861-fde8-4b7a-929b-2ec7f5871474" (UID: "032af861-fde8-4b7a-929b-2ec7f5871474"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.059801 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-9sz8p"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.089031 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-6600-account-create-48jzj"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.093597 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7296e8da-30a1-4c69-978f-3411bda327f7-kube-api-access-846sn" (OuterVolumeSpecName: "kube-api-access-846sn") pod "7296e8da-30a1-4c69-978f-3411bda327f7" (UID: "7296e8da-30a1-4c69-978f-3411bda327f7"). InnerVolumeSpecName "kube-api-access-846sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.102504 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/032af861-fde8-4b7a-929b-2ec7f5871474-kube-api-access-bkpr5" (OuterVolumeSpecName: "kube-api-access-bkpr5") pod "032af861-fde8-4b7a-929b-2ec7f5871474" (UID: "032af861-fde8-4b7a-929b-2ec7f5871474"). InnerVolumeSpecName "kube-api-access-bkpr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.102604 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-w85js"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.152074 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7296e8da-30a1-4c69-978f-3411bda327f7-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.152112 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkpr5\" (UniqueName: \"kubernetes.io/projected/032af861-fde8-4b7a-929b-2ec7f5871474-kube-api-access-bkpr5\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.152122 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/032af861-fde8-4b7a-929b-2ec7f5871474-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.152149 4958 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.152158 4958 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7296e8da-30a1-4c69-978f-3411bda327f7-ovs-rundir\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.152167 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/032af861-fde8-4b7a-929b-2ec7f5871474-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.152175 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/032af861-fde8-4b7a-929b-2ec7f5871474-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.152184 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-846sn\" (UniqueName: \"kubernetes.io/projected/7296e8da-30a1-4c69-978f-3411bda327f7-kube-api-access-846sn\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.161156 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-9sz8p"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.168290 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="bed7f6be-9254-406b-9ed4-3fff3b2eb531" containerName="galera" containerID="cri-o://9558003a0f0dbb9592dcb7cf64ce6596694eda3a98ea3abc65fe71be651100a5" gracePeriod=30 Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.183914 4958 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.186417 4958 scope.go:117] "RemoveContainer" containerID="d6ab967c227e3d5e9e63f9b31bc7367d580da34af75c1d75d3010bf7ca6e1148" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.191194 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-6600-account-create-48jzj"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.213121 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.213509 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="8e18a1be-f832-499c-a518-50dad90cafc9" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://322b91b947af356def1df34f9b0bd833e4b9e520e2557a5215e4e51916c5373c" gracePeriod=30 Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.246252 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancea8f6-account-delete-cjlz4"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.254260 4958 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.272705 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7296e8da-30a1-4c69-978f-3411bda327f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7296e8da-30a1-4c69-978f-3411bda327f7" (UID: "7296e8da-30a1-4c69-978f-3411bda327f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.301655 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutrona095-account-delete-wwt74"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.314195 4958 scope.go:117] "RemoveContainer" containerID="892484689c2af5ec876095062b833d60c2fdb08de4c3c04e7eb3ecefcffbab27" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.326861 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032af861-fde8-4b7a-929b-2ec7f5871474-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "032af861-fde8-4b7a-929b-2ec7f5871474" (UID: "032af861-fde8-4b7a-929b-2ec7f5871474"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.340284 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a8f6-account-create-5r7c8"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.357475 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032af861-fde8-4b7a-929b-2ec7f5871474-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.357529 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7296e8da-30a1-4c69-978f-3411bda327f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.360600 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032af861-fde8-4b7a-929b-2ec7f5871474-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "032af861-fde8-4b7a-929b-2ec7f5871474" (UID: "032af861-fde8-4b7a-929b-2ec7f5871474"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.415353 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032af861-fde8-4b7a-929b-2ec7f5871474-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "032af861-fde8-4b7a-929b-2ec7f5871474" (UID: "032af861-fde8-4b7a-929b-2ec7f5871474"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.420673 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-9ccsx"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.459804 4958 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/032af861-fde8-4b7a-929b-2ec7f5871474-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.464014 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/032af861-fde8-4b7a-929b-2ec7f5871474-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.484328 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7296e8da-30a1-4c69-978f-3411bda327f7-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "7296e8da-30a1-4c69-978f-3411bda327f7" (UID: "7296e8da-30a1-4c69-978f-3411bda327f7"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.494092 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-xxjrm"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.539909 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-9ccsx"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.572792 4958 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7296e8da-30a1-4c69-978f-3411bda327f7-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.584190 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-a095-account-create-pkl5w"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.596712 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1865-account-create-gmczv"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.613825 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-a8f6-account-create-5r7c8"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.630322 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-a095-account-create-pkl5w"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.639749 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-bnhb6"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.686769 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-1865-account-create-gmczv"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.727836 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-xxjrm"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.774356 4958 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi1865-account-delete-hmcpq"] Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.809270 4958 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/nova-cell1-conductor-0" secret="" err="secret \"nova-nova-dockercfg-vshrg\" not found" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.835779 4958 generic.go:334] "Generic (PLEG): container finished" podID="0353bee2-4033-4493-9217-b5c4600d3d90" containerID="7f2380f1568f2f9c175fd6ffac5bc9a4006ac5eca2dd568b473ca89bfac956f6" exitCode=143 Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.844893 4958 generic.go:334] "Generic (PLEG): container finished" podID="81c0931c-8919-4128-97bb-21c5872d5cf0" containerID="0e6afa27cac7b16ea18ffb082028d7d4a5d1af380112b0080ebda5f523c8d154" exitCode=143 Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.854520 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13f40860-6a7c-4bb0-83a3-6f61c6da561c" path="/var/lib/kubelet/pods/13f40860-6a7c-4bb0-83a3-6f61c6da561c/volumes" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.854870 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6fscs_7296e8da-30a1-4c69-978f-3411bda327f7/openstack-network-exporter/0.log" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.855164 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6fscs" Dec 01 10:26:31 crc kubenswrapper[4958]: I1201 10:26:31.877406 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ee9c21e-3f59-4544-b5b9-f91bfdb74aee" path="/var/lib/kubelet/pods/3ee9c21e-3f59-4544-b5b9-f91bfdb74aee/volumes" Dec 01 10:26:32 crc kubenswrapper[4958]: E1201 10:26:31.998759 4958 secret.go:188] Couldn't get secret openstack/nova-cell1-conductor-config-data: secret "nova-cell1-conductor-config-data" not found Dec 01 10:26:32 crc kubenswrapper[4958]: E1201 10:26:31.998828 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-config-data podName:8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6 nodeName:}" failed. No retries permitted until 2025-12-01 10:26:32.498809183 +0000 UTC m=+1640.007598220 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-config-data") pod "nova-cell1-conductor-0" (UID: "8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6") : secret "nova-cell1-conductor-config-data" not found Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.004922 4958 generic.go:334] "Generic (PLEG): container finished" podID="37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" containerID="ca5f316669c8cd32825cef769b0e3e1c6eab56cc41d4a4cff5edd8cd75e0ad40" exitCode=143 Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.010923 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="814bf37e-29e7-410e-b300-4e1e14ad1d31" path="/var/lib/kubelet/pods/814bf37e-29e7-410e-b300-4e1e14ad1d31/volumes" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.011015 4958 generic.go:334] "Generic (PLEG): container finished" podID="3f7f1749-5fab-4187-8675-01747669c1b7" containerID="b8d3ef2ad4ede695d83613970dd3bc8a0788b09e8f0a4653b81852375759006f" exitCode=0 Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.017739 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9324a903-818e-4d56-a4cc-2d8d68994c39" path="/var/lib/kubelet/pods/9324a903-818e-4d56-a4cc-2d8d68994c39/volumes" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.022868 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2bd3b82-a2cb-40ac-8a52-6016d62a4040" path="/var/lib/kubelet/pods/a2bd3b82-a2cb-40ac-8a52-6016d62a4040/volumes" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.024602 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3c5a32d-3e78-4ea7-9cb2-ca200f02e692" path="/var/lib/kubelet/pods/d3c5a32d-3e78-4ea7-9cb2-ca200f02e692/volumes" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.026197 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3fb3017-de2a-4213-b8ae-46dbc306cbe7" path="/var/lib/kubelet/pods/d3fb3017-de2a-4213-b8ae-46dbc306cbe7/volumes" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.026915 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db5bc0e8-47e9-4229-87df-b605ccc638b1" path="/var/lib/kubelet/pods/db5bc0e8-47e9-4229-87df-b605ccc638b1/volumes" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.028147 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc87cf0c-982e-44d5-adc8-2e9827faa501" path="/var/lib/kubelet/pods/dc87cf0c-982e-44d5-adc8-2e9827faa501/volumes" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.030139 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de2699dc-0b5f-40c4-a163-d73e30013069" path="/var/lib/kubelet/pods/de2699dc-0b5f-40c4-a163-d73e30013069/volumes" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.030743 4958 generic.go:334] "Generic (PLEG): container finished" podID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerID="e259bff5027b46b6780434588aaaf37284c0af3fc24553b6ef8f973ea6ddfcef" exitCode=0 Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.030796 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de9d3333-6e5c-4663-9b79-d8fd9a9974a8" path="/var/lib/kubelet/pods/de9d3333-6e5c-4663-9b79-d8fd9a9974a8/volumes" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.031488 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3fa6e71-f3b7-4b98-a760-10a4ae7ed049" path="/var/lib/kubelet/pods/e3fa6e71-f3b7-4b98-a760-10a4ae7ed049/volumes" Dec 01 
10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.033163 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed6ba24c-b48f-4fac-b7e1-8acd44d44c82" path="/var/lib/kubelet/pods/ed6ba24c-b48f-4fac-b7e1-8acd44d44c82/volumes" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.034017 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f31601d4-2e94-4471-9216-e0769fddfb89" path="/var/lib/kubelet/pods/f31601d4-2e94-4471-9216-e0769fddfb89/volumes" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.039375 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4231276-2ab4-4cfd-bb47-850bf407d477" path="/var/lib/kubelet/pods/f4231276-2ab4-4cfd-bb47-850bf407d477/volumes" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.043965 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-bnhb6"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.044025 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutrona095-account-delete-wwt74" event={"ID":"34129d60-4706-41fe-aa19-f2a13f38713a","Type":"ContainerStarted","Data":"b00a8d940cff610a1acd971604c1e00ecd082786945ae6a350a4ee8b6a4d3d34"} Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.044064 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0353bee2-4033-4493-9217-b5c4600d3d90","Type":"ContainerDied","Data":"7f2380f1568f2f9c175fd6ffac5bc9a4006ac5eca2dd568b473ca89bfac956f6"} Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.044100 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementea59-account-delete-jmj7f"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.045376 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-59ce-account-create-2skt7"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.045431 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-85fc6f9f59-gdn47" event={"ID":"81c0931c-8919-4128-97bb-21c5872d5cf0","Type":"ContainerDied","Data":"0e6afa27cac7b16ea18ffb082028d7d4a5d1af380112b0080ebda5f523c8d154"} Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.058189 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-59ce-account-create-2skt7"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.060119 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.060168 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6fscs" event={"ID":"7296e8da-30a1-4c69-978f-3411bda327f7","Type":"ContainerDied","Data":"5cc5f608f30fbd2b227f47bf9a0a321b3c93c5fafd8963393a306b10caf4383b"} Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.060205 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14","Type":"ContainerDied","Data":"ca5f316669c8cd32825cef769b0e3e1c6eab56cc41d4a4cff5edd8cd75e0ad40"} Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.060225 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementea59-account-delete-jmj7f" event={"ID":"3f7f1749-5fab-4187-8675-01747669c1b7","Type":"ContainerDied","Data":"b8d3ef2ad4ede695d83613970dd3bc8a0788b09e8f0a4653b81852375759006f"} Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.060243 4958 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerDied","Data":"e259bff5027b46b6780434588aaaf37284c0af3fc24553b6ef8f973ea6ddfcef"} Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.060277 4958 scope.go:117] "RemoveContainer" containerID="d39d333c260ccb07db8a021dde9b4a9a746995835c4970968df5c849398cd2e8" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.076329 4958 generic.go:334] "Generic (PLEG): container finished" podID="ceca985e-bec2-42fe-9758-edad828586c2" containerID="d6bf9c119f4304f0d9ea18db7018ff4e63d2085fdfbdf9fc47deea0cf585ac96" exitCode=143 Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.076530 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" event={"ID":"ceca985e-bec2-42fe-9758-edad828586c2","Type":"ContainerDied","Data":"d6bf9c119f4304f0d9ea18db7018ff4e63d2085fdfbdf9fc47deea0cf585ac96"} Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.092255 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi1865-account-delete-hmcpq" event={"ID":"f20641f2-49c0-4492-8f1f-20b14a1f3bd3","Type":"ContainerStarted","Data":"007505ad5173055ec3b1c817f5aa6fa1d1dfd59efcea02f00bc475f796397b1b"} Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.099328 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder259f-account-delete-zpd7t" event={"ID":"66469344-8c32-45d9-afc4-91dcb9dbe807","Type":"ContainerStarted","Data":"371761547cfda20e7086a76636693a3a4a8aaa766c985fa2cc253c8911e2c55f"} Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.099571 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder259f-account-delete-zpd7t" podUID="66469344-8c32-45d9-afc4-91dcb9dbe807" containerName="mariadb-account-delete" containerID="cri-o://e70c3af15c6bada5b3bce03e58f4a32bd4b1f859a087d9df3b61fede6d64a281" gracePeriod=30 Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.107597 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_032af861-fde8-4b7a-929b-2ec7f5871474/ovsdbserver-nb/0.log" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.107708 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"032af861-fde8-4b7a-929b-2ec7f5871474","Type":"ContainerDied","Data":"9342c765205105f9f18fc905ac3548f8db04b2aeda86e2163143ee2f096f04cd"} Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.107879 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.122294 4958 generic.go:334] "Generic (PLEG): container finished" podID="b727a711-6b0b-44c6-917a-602f10dd0d6c" containerID="2c69fba39bcc23fd51b0e44fe893c7a695b06edee1de7171ac930180b17f1ebe" exitCode=143 Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.123203 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-744fbbd578-7c5pc" event={"ID":"b727a711-6b0b-44c6-917a-602f10dd0d6c","Type":"ContainerDied","Data":"2c69fba39bcc23fd51b0e44fe893c7a695b06edee1de7171ac930180b17f1ebe"} Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.123391 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7tf7x"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.133586 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.152328 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7tf7x"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.152389 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7zxqp"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.158027 4958 generic.go:334] "Generic (PLEG): container finished" podID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" exitCode=0 Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.158106 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xr8kd" event={"ID":"c01e3885-db48-42db-aa00-ca08c6839dbd","Type":"ContainerDied","Data":"d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c"} Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.159423 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.159654 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="113fc18e-8eb6-45a5-9625-1404f4d832c4" containerName="nova-cell0-conductor-conductor" containerID="cri-o://973323edcddc9628e51fba2a4a4b7496b7b376f89fbb0037cf0dafdcf8686936" gracePeriod=30 Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.164377 4958 generic.go:334] "Generic (PLEG): container finished" podID="ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc" containerID="764cc2fdc960a743eeeb7feffbf676914a12612fce9cefcab0492d1b7b5f48dc" exitCode=0 Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.164459 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican5432-account-delete-z4wgz" event={"ID":"ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc","Type":"ContainerDied","Data":"764cc2fdc960a743eeeb7feffbf676914a12612fce9cefcab0492d1b7b5f48dc"} Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.164486 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican5432-account-delete-z4wgz" event={"ID":"ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc","Type":"ContainerStarted","Data":"749a75f64dc8abbf50ed5eb4c4c2ade80d5327745172a326412b88a451d49e2e"} Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.174395 4958 generic.go:334] "Generic (PLEG): container finished" podID="79797aa2-db7b-429a-91d9-3e181de3976c" containerID="c5d92e349f45e4a0af02c0d2ddcebd7a3e9e9b1433ebbd079fd3f0a98c18c1b3" exitCode=143 Dec 01 
10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.174500 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79797aa2-db7b-429a-91d9-3e181de3976c","Type":"ContainerDied","Data":"c5d92e349f45e4a0af02c0d2ddcebd7a3e9e9b1433ebbd079fd3f0a98c18c1b3"} Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.176134 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7zxqp"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.183142 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancea8f6-account-delete-cjlz4" event={"ID":"adbb9949-a0ed-4f1a-805c-fedba7ec3f1b","Type":"ContainerStarted","Data":"48656496f6aa110cb2782f7efdedf6564ef983b71ec40162f3d15ff58a701633"} Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.196586 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican5432-account-delete-z4wgz"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.197513 4958 generic.go:334] "Generic (PLEG): container finished" podID="aa361886-e7eb-413c-a4b4-cedaf1c86983" containerID="41cd43dae598fc0fb9dde4e0161ce7d640d805585cd73ef9e8521ddf5b485506" exitCode=0 Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.197587 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"aa361886-e7eb-413c-a4b4-cedaf1c86983","Type":"ContainerDied","Data":"41cd43dae598fc0fb9dde4e0161ce7d640d805585cd73ef9e8521ddf5b485506"} Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.209567 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.209932 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="de0d4422-f9bb-4abf-8727-066813a9182e" containerName="nova-scheduler-scheduler" containerID="cri-o://058deee68ae35cac74169c2f8d75def62e5447aee109c42652d23307d321925a" gracePeriod=30 Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.246071 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder259f-account-delete-zpd7t"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.251946 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="4fccd607-3bfb-4593-a6de-6a0fc52b34ea" containerName="rabbitmq" containerID="cri-o://79456c806f5f410ac0967e765960f7c80fe55e590cd1656e2879adefde292303" gracePeriod=604800 Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.263066 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-4p4kj"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.273474 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-4p4kj"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.303095 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancea8f6-account-delete-cjlz4"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.335948 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-8d6888456-hv67t" podUID="207148af-4b76-49b6-80cc-883ec14bb268" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.152:8778/\": read tcp 10.217.0.2:46610->10.217.0.152:8778: read: connection reset by peer" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.336048 4958 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/placement-8d6888456-hv67t" podUID="207148af-4b76-49b6-80cc-883ec14bb268" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.152:8778/\": read tcp 10.217.0.2:46622->10.217.0.152:8778: read: connection reset by peer" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.341110 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.385427 4958 scope.go:117] "RemoveContainer" containerID="7785733222ee379e13b2e08924fc0b23055d37143a5248373e74dc7a5b530c74" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.447945 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.451584 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutrona095-account-delete-wwt74"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.459130 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi1865-account-delete-hmcpq"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.467677 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5ffd4b498c-jxtxd"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.468074 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5ffd4b498c-jxtxd" podUID="d93b36ff-312b-40e1-9e3a-b1981800da66" containerName="proxy-httpd" containerID="cri-o://912676adc0639820ea8a098452e8e4b85618ac2b2c6eda07e38c9a1e111eefc8" gracePeriod=30 Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.468709 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5ffd4b498c-jxtxd" podUID="d93b36ff-312b-40e1-9e3a-b1981800da66" containerName="proxy-server" containerID="cri-o://cfeb39b686f44466325e4103c9715785337ae56a4d6010ba306c05d7d1433916" gracePeriod=30 Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.480137 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-6fscs"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.480282 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder259f-account-delete-zpd7t" podStartSLOduration=5.480261948 podStartE2EDuration="5.480261948s" podCreationTimestamp="2025-12-01 10:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:26:32.121523272 +0000 UTC m=+1639.630312319" watchObservedRunningTime="2025-12-01 10:26:32.480261948 +0000 UTC m=+1639.989050985" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.483015 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-6fscs"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.512030 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 10:26:32 crc kubenswrapper[4958]: E1201 10:26:32.536236 4958 secret.go:188] Couldn't get secret openstack/nova-cell1-conductor-config-data: secret "nova-cell1-conductor-config-data" not found Dec 01 10:26:32 crc kubenswrapper[4958]: E1201 10:26:32.536329 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-config-data podName:8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6 nodeName:}" failed. 
No retries permitted until 2025-12-01 10:26:33.536307119 +0000 UTC m=+1641.045096146 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-config-data") pod "nova-cell1-conductor-0" (UID: "8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6") : secret "nova-cell1-conductor-config-data" not found Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.568914 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.670650 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placementea59-account-delete-jmj7f" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.717954 4958 scope.go:117] "RemoveContainer" containerID="9f86a4fe20ec5f769cd7eb9953f1018672b10ce2765e2dc5eb42d2f7a1374f18" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.740013 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv924\" (UniqueName: \"kubernetes.io/projected/3f7f1749-5fab-4187-8675-01747669c1b7-kube-api-access-nv924\") pod \"3f7f1749-5fab-4187-8675-01747669c1b7\" (UID: \"3f7f1749-5fab-4187-8675-01747669c1b7\") " Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.757528 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f7f1749-5fab-4187-8675-01747669c1b7-kube-api-access-nv924" (OuterVolumeSpecName: "kube-api-access-nv924") pod "3f7f1749-5fab-4187-8675-01747669c1b7" (UID: "3f7f1749-5fab-4187-8675-01747669c1b7"). InnerVolumeSpecName "kube-api-access-nv924". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:32 crc kubenswrapper[4958]: E1201 10:26:32.779669 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ae78c274c3294e9773e57bf2eec9477b3e77f5156d4259b5f42c1b6a1c6dd6a" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 01 10:26:32 crc kubenswrapper[4958]: E1201 10:26:32.787169 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ae78c274c3294e9773e57bf2eec9477b3e77f5156d4259b5f42c1b6a1c6dd6a" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 01 10:26:32 crc kubenswrapper[4958]: E1201 10:26:32.789815 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ae78c274c3294e9773e57bf2eec9477b3e77f5156d4259b5f42c1b6a1c6dd6a" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 01 10:26:32 crc kubenswrapper[4958]: E1201 10:26:32.789916 4958 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" containerName="ovn-northd" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.806302 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican5432-account-delete-z4wgz" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.842553 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpnwt\" (UniqueName: \"kubernetes.io/projected/ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc-kube-api-access-gpnwt\") pod \"ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc\" (UID: \"ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc\") " Dec 01 10:26:32 crc kubenswrapper[4958]: E1201 10:26:32.843514 4958 secret.go:188] Couldn't get secret openstack/glance-scripts: secret "glance-scripts" not found Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.843575 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv924\" (UniqueName: \"kubernetes.io/projected/3f7f1749-5fab-4187-8675-01747669c1b7-kube-api-access-nv924\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:32 crc kubenswrapper[4958]: E1201 10:26:32.843633 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-scripts podName:30e8c723-a7a0-4697-8369-bd224fcfdf3f nodeName:}" failed. No retries permitted until 2025-12-01 10:26:36.843606906 +0000 UTC m=+1644.352395943 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-scripts") pod "glance-default-external-api-0" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f") : secret "glance-scripts" not found Dec 01 10:26:32 crc kubenswrapper[4958]: E1201 10:26:32.843645 4958 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 01 10:26:32 crc kubenswrapper[4958]: E1201 10:26:32.843680 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-config-data podName:4fccd607-3bfb-4593-a6de-6a0fc52b34ea nodeName:}" failed. No retries permitted until 2025-12-01 10:26:36.843670898 +0000 UTC m=+1644.352459935 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-config-data") pod "rabbitmq-server-0" (UID: "4fccd607-3bfb-4593-a6de-6a0fc52b34ea") : configmap "rabbitmq-config-data" not found Dec 01 10:26:32 crc kubenswrapper[4958]: E1201 10:26:32.844765 4958 secret.go:188] Couldn't get secret openstack/glance-default-external-config-data: secret "glance-default-external-config-data" not found Dec 01 10:26:32 crc kubenswrapper[4958]: E1201 10:26:32.844953 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-config-data podName:30e8c723-a7a0-4697-8369-bd224fcfdf3f nodeName:}" failed. No retries permitted until 2025-12-01 10:26:36.844836061 +0000 UTC m=+1644.353625098 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-config-data") pod "glance-default-external-api-0" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f") : secret "glance-default-external-config-data" not found Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.863679 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc-kube-api-access-gpnwt" (OuterVolumeSpecName: "kube-api-access-gpnwt") pod "ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc" (UID: "ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc"). 
InnerVolumeSpecName "kube-api-access-gpnwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:32 crc kubenswrapper[4958]: I1201 10:26:32.951103 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpnwt\" (UniqueName: \"kubernetes.io/projected/ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc-kube-api-access-gpnwt\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.133993 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder259f-account-delete-zpd7t" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.181989 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.199529 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.224998 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7zch\" (UniqueName: \"kubernetes.io/projected/66469344-8c32-45d9-afc4-91dcb9dbe807-kube-api-access-q7zch\") pod \"66469344-8c32-45d9-afc4-91dcb9dbe807\" (UID: \"66469344-8c32-45d9-afc4-91dcb9dbe807\") " Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.240738 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glancea8f6-account-delete-cjlz4" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.243184 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66469344-8c32-45d9-afc4-91dcb9dbe807-kube-api-access-q7zch" (OuterVolumeSpecName: "kube-api-access-q7zch") pod "66469344-8c32-45d9-afc4-91dcb9dbe807" (UID: "66469344-8c32-45d9-afc4-91dcb9dbe807"). InnerVolumeSpecName "kube-api-access-q7zch". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.243434 4958 generic.go:334] "Generic (PLEG): container finished" podID="8e18a1be-f832-499c-a518-50dad90cafc9" containerID="322b91b947af356def1df34f9b0bd833e4b9e520e2557a5215e4e51916c5373c" exitCode=0 Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.243516 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8e18a1be-f832-499c-a518-50dad90cafc9","Type":"ContainerDied","Data":"322b91b947af356def1df34f9b0bd833e4b9e520e2557a5215e4e51916c5373c"} Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.243557 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8e18a1be-f832-499c-a518-50dad90cafc9","Type":"ContainerDied","Data":"d2fadc4de066f6fb651416ad9dd57ef198ecee3855c4c69a982a18945645906c"} Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.243583 4958 scope.go:117] "RemoveContainer" containerID="322b91b947af356def1df34f9b0bd833e4b9e520e2557a5215e4e51916c5373c" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.246152 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.259891 4958 generic.go:334] "Generic (PLEG): container finished" podID="f20641f2-49c0-4492-8f1f-20b14a1f3bd3" containerID="8168f6cbe0bc798da68a2e2d2fe7de44eece7fb512c444ab5a26c173069687de" exitCode=0 Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.260018 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi1865-account-delete-hmcpq" event={"ID":"f20641f2-49c0-4492-8f1f-20b14a1f3bd3","Type":"ContainerDied","Data":"8168f6cbe0bc798da68a2e2d2fe7de44eece7fb512c444ab5a26c173069687de"} Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.298374 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="137d864e-34d9-452c-91b0-179a93198b0f" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.162:8776/healthcheck\": read tcp 10.217.0.2:50104->10.217.0.162:8776: read: connection reset by peer" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.328328 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrqpx\" (UniqueName: \"kubernetes.io/projected/8e18a1be-f832-499c-a518-50dad90cafc9-kube-api-access-vrqpx\") pod \"8e18a1be-f832-499c-a518-50dad90cafc9\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.328394 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.328450 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bed7f6be-9254-406b-9ed4-3fff3b2eb531-kolla-config\") pod \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.328478 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bed7f6be-9254-406b-9ed4-3fff3b2eb531-combined-ca-bundle\") pod \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.328516 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-config-data\") pod \"8e18a1be-f832-499c-a518-50dad90cafc9\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.328566 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bed7f6be-9254-406b-9ed4-3fff3b2eb531-galera-tls-certs\") pod \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.328633 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqmxf\" (UniqueName: \"kubernetes.io/projected/adbb9949-a0ed-4f1a-805c-fedba7ec3f1b-kube-api-access-mqmxf\") pod \"adbb9949-a0ed-4f1a-805c-fedba7ec3f1b\" (UID: \"adbb9949-a0ed-4f1a-805c-fedba7ec3f1b\") " Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.328704 4958 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bed7f6be-9254-406b-9ed4-3fff3b2eb531-config-data-generated\") pod \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.328741 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-combined-ca-bundle\") pod \"8e18a1be-f832-499c-a518-50dad90cafc9\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.328767 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bed7f6be-9254-406b-9ed4-3fff3b2eb531-config-data-default\") pod \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.328795 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bed7f6be-9254-406b-9ed4-3fff3b2eb531-operator-scripts\") pod \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.329053 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-vencrypt-tls-certs\") pod \"8e18a1be-f832-499c-a518-50dad90cafc9\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.329109 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-nova-novncproxy-tls-certs\") pod \"8e18a1be-f832-499c-a518-50dad90cafc9\" (UID: \"8e18a1be-f832-499c-a518-50dad90cafc9\") " Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.329225 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/bed7f6be-9254-406b-9ed4-3fff3b2eb531-secrets\") pod \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.329327 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvf6z\" (UniqueName: \"kubernetes.io/projected/bed7f6be-9254-406b-9ed4-3fff3b2eb531-kube-api-access-cvf6z\") pod \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.330442 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7zch\" (UniqueName: \"kubernetes.io/projected/66469344-8c32-45d9-afc4-91dcb9dbe807-kube-api-access-q7zch\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.332638 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bed7f6be-9254-406b-9ed4-3fff3b2eb531-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "bed7f6be-9254-406b-9ed4-3fff3b2eb531" (UID: "bed7f6be-9254-406b-9ed4-3fff3b2eb531"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.333156 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bed7f6be-9254-406b-9ed4-3fff3b2eb531-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "bed7f6be-9254-406b-9ed4-3fff3b2eb531" (UID: "bed7f6be-9254-406b-9ed4-3fff3b2eb531"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.334824 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bed7f6be-9254-406b-9ed4-3fff3b2eb531-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bed7f6be-9254-406b-9ed4-3fff3b2eb531" (UID: "bed7f6be-9254-406b-9ed4-3fff3b2eb531"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.335872 4958 generic.go:334] "Generic (PLEG): container finished" podID="207148af-4b76-49b6-80cc-883ec14bb268" containerID="944f885a93b277fd5b446c26c7b9c7db505641ce99d66fc0eee7275265da1587" exitCode=0 Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.336017 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d6888456-hv67t" event={"ID":"207148af-4b76-49b6-80cc-883ec14bb268","Type":"ContainerDied","Data":"944f885a93b277fd5b446c26c7b9c7db505641ce99d66fc0eee7275265da1587"} Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.339155 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bed7f6be-9254-406b-9ed4-3fff3b2eb531-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "bed7f6be-9254-406b-9ed4-3fff3b2eb531" (UID: "bed7f6be-9254-406b-9ed4-3fff3b2eb531"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.341178 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bed7f6be-9254-406b-9ed4-3fff3b2eb531-kube-api-access-cvf6z" (OuterVolumeSpecName: "kube-api-access-cvf6z") pod "bed7f6be-9254-406b-9ed4-3fff3b2eb531" (UID: "bed7f6be-9254-406b-9ed4-3fff3b2eb531"). InnerVolumeSpecName "kube-api-access-cvf6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.341247 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e18a1be-f832-499c-a518-50dad90cafc9-kube-api-access-vrqpx" (OuterVolumeSpecName: "kube-api-access-vrqpx") pod "8e18a1be-f832-499c-a518-50dad90cafc9" (UID: "8e18a1be-f832-499c-a518-50dad90cafc9"). InnerVolumeSpecName "kube-api-access-vrqpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.345701 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adbb9949-a0ed-4f1a-805c-fedba7ec3f1b-kube-api-access-mqmxf" (OuterVolumeSpecName: "kube-api-access-mqmxf") pod "adbb9949-a0ed-4f1a-805c-fedba7ec3f1b" (UID: "adbb9949-a0ed-4f1a-805c-fedba7ec3f1b"). InnerVolumeSpecName "kube-api-access-mqmxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.356801 4958 scope.go:117] "RemoveContainer" containerID="322b91b947af356def1df34f9b0bd833e4b9e520e2557a5215e4e51916c5373c" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.365550 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bed7f6be-9254-406b-9ed4-3fff3b2eb531-secrets" (OuterVolumeSpecName: "secrets") pod "bed7f6be-9254-406b-9ed4-3fff3b2eb531" (UID: "bed7f6be-9254-406b-9ed4-3fff3b2eb531"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:33 crc kubenswrapper[4958]: E1201 10:26:33.365610 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"322b91b947af356def1df34f9b0bd833e4b9e520e2557a5215e4e51916c5373c\": container with ID starting with 322b91b947af356def1df34f9b0bd833e4b9e520e2557a5215e4e51916c5373c not found: ID does not exist" containerID="322b91b947af356def1df34f9b0bd833e4b9e520e2557a5215e4e51916c5373c" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.366802 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placementea59-account-delete-jmj7f" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.366904 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322b91b947af356def1df34f9b0bd833e4b9e520e2557a5215e4e51916c5373c"} err="failed to get container status \"322b91b947af356def1df34f9b0bd833e4b9e520e2557a5215e4e51916c5373c\": rpc error: code = NotFound desc = could not find container \"322b91b947af356def1df34f9b0bd833e4b9e520e2557a5215e4e51916c5373c\": container with ID starting with 322b91b947af356def1df34f9b0bd833e4b9e520e2557a5215e4e51916c5373c not found: ID does not exist" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.366710 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementea59-account-delete-jmj7f" event={"ID":"3f7f1749-5fab-4187-8675-01747669c1b7","Type":"ContainerDied","Data":"b0be9331d12d466ad9a8f3a50d26755777ca7dcfceaefdf2e3635069dd4f2cf5"} Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.367114 4958 scope.go:117] "RemoveContainer" containerID="b8d3ef2ad4ede695d83613970dd3bc8a0788b09e8f0a4653b81852375759006f" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.402588 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican5432-account-delete-z4wgz" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.403173 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican5432-account-delete-z4wgz" event={"ID":"ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc","Type":"ContainerDied","Data":"749a75f64dc8abbf50ed5eb4c4c2ade80d5327745172a326412b88a451d49e2e"} Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.409985 4958 generic.go:334] "Generic (PLEG): container finished" podID="adbb9949-a0ed-4f1a-805c-fedba7ec3f1b" containerID="4dad046e14cd1c8a85d6a005a368c845f89f2c1843d9cad3a83a3db85fea2c3d" exitCode=0 Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.410117 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancea8f6-account-delete-cjlz4" event={"ID":"adbb9949-a0ed-4f1a-805c-fedba7ec3f1b","Type":"ContainerDied","Data":"48656496f6aa110cb2782f7efdedf6564ef983b71ec40162f3d15ff58a701633"} Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.410157 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancea8f6-account-delete-cjlz4" event={"ID":"adbb9949-a0ed-4f1a-805c-fedba7ec3f1b","Type":"ContainerDied","Data":"4dad046e14cd1c8a85d6a005a368c845f89f2c1843d9cad3a83a3db85fea2c3d"} Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.410238 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glancea8f6-account-delete-cjlz4" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.428623 4958 generic.go:334] "Generic (PLEG): container finished" podID="66469344-8c32-45d9-afc4-91dcb9dbe807" containerID="e70c3af15c6bada5b3bce03e58f4a32bd4b1f859a087d9df3b61fede6d64a281" exitCode=0 Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.429059 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder259f-account-delete-zpd7t" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.429077 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder259f-account-delete-zpd7t" event={"ID":"66469344-8c32-45d9-afc4-91dcb9dbe807","Type":"ContainerDied","Data":"371761547cfda20e7086a76636693a3a4a8aaa766c985fa2cc253c8911e2c55f"} Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.429326 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder259f-account-delete-zpd7t" event={"ID":"66469344-8c32-45d9-afc4-91dcb9dbe807","Type":"ContainerDied","Data":"e70c3af15c6bada5b3bce03e58f4a32bd4b1f859a087d9df3b61fede6d64a281"} Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.431225 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "bed7f6be-9254-406b-9ed4-3fff3b2eb531" (UID: "bed7f6be-9254-406b-9ed4-3fff3b2eb531"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.432266 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\" (UID: \"bed7f6be-9254-406b-9ed4-3fff3b2eb531\") " Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.433403 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrqpx\" (UniqueName: \"kubernetes.io/projected/8e18a1be-f832-499c-a518-50dad90cafc9-kube-api-access-vrqpx\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.433444 4958 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bed7f6be-9254-406b-9ed4-3fff3b2eb531-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.433459 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqmxf\" (UniqueName: \"kubernetes.io/projected/adbb9949-a0ed-4f1a-805c-fedba7ec3f1b-kube-api-access-mqmxf\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.433471 4958 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bed7f6be-9254-406b-9ed4-3fff3b2eb531-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.433486 4958 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bed7f6be-9254-406b-9ed4-3fff3b2eb531-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.433497 4958 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bed7f6be-9254-406b-9ed4-3fff3b2eb531-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.433508 4958 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/bed7f6be-9254-406b-9ed4-3fff3b2eb531-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.433520 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvf6z\" (UniqueName: \"kubernetes.io/projected/bed7f6be-9254-406b-9ed4-3fff3b2eb531-kube-api-access-cvf6z\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:33 crc kubenswrapper[4958]: W1201 10:26:33.433601 4958 mount_helper_common.go:34] Warning: mount cleanup skipped because path does not exist: /var/lib/kubelet/pods/bed7f6be-9254-406b-9ed4-3fff3b2eb531/volumes/kubernetes.io~local-volume/local-storage01-crc Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.433616 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "bed7f6be-9254-406b-9ed4-3fff3b2eb531" (UID: "bed7f6be-9254-406b-9ed4-3fff3b2eb531"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.442725 4958 generic.go:334] "Generic (PLEG): container finished" podID="d93b36ff-312b-40e1-9e3a-b1981800da66" containerID="cfeb39b686f44466325e4103c9715785337ae56a4d6010ba306c05d7d1433916" exitCode=0 Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.443009 4958 generic.go:334] "Generic (PLEG): container finished" podID="d93b36ff-312b-40e1-9e3a-b1981800da66" containerID="912676adc0639820ea8a098452e8e4b85618ac2b2c6eda07e38c9a1e111eefc8" exitCode=0 Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.442799 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5ffd4b498c-jxtxd" event={"ID":"d93b36ff-312b-40e1-9e3a-b1981800da66","Type":"ContainerDied","Data":"cfeb39b686f44466325e4103c9715785337ae56a4d6010ba306c05d7d1433916"} Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.443201 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5ffd4b498c-jxtxd" event={"ID":"d93b36ff-312b-40e1-9e3a-b1981800da66","Type":"ContainerDied","Data":"912676adc0639820ea8a098452e8e4b85618ac2b2c6eda07e38c9a1e111eefc8"} Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.445654 4958 generic.go:334] "Generic (PLEG): container finished" podID="34129d60-4706-41fe-aa19-f2a13f38713a" containerID="b2c2ce181a7e61ce78834ea943cac16253a105ecdd868af50896db054a33e2cb" exitCode=0 Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.445709 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutrona095-account-delete-wwt74" event={"ID":"34129d60-4706-41fe-aa19-f2a13f38713a","Type":"ContainerDied","Data":"b2c2ce181a7e61ce78834ea943cac16253a105ecdd868af50896db054a33e2cb"} Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.501016 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-config-data" (OuterVolumeSpecName: "config-data") pod "8e18a1be-f832-499c-a518-50dad90cafc9" (UID: "8e18a1be-f832-499c-a518-50dad90cafc9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.515596 4958 generic.go:334] "Generic (PLEG): container finished" podID="bed7f6be-9254-406b-9ed4-3fff3b2eb531" containerID="9558003a0f0dbb9592dcb7cf64ce6596694eda3a98ea3abc65fe71be651100a5" exitCode=0 Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.515953 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6" containerName="nova-cell1-conductor-conductor" containerID="cri-o://b3241a70bcec5ec3a9a78b2a90bc1e75503cd83921db60624e31f3720317ce5b" gracePeriod=30 Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.515978 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bed7f6be-9254-406b-9ed4-3fff3b2eb531","Type":"ContainerDied","Data":"9558003a0f0dbb9592dcb7cf64ce6596694eda3a98ea3abc65fe71be651100a5"} Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.516088 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bed7f6be-9254-406b-9ed4-3fff3b2eb531","Type":"ContainerDied","Data":"7ec911488c4a87ce3d94ef54cc404b05a46dcde3559bf6c90f6dd81d8a7b1ee2"} Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.516234 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.535795 4958 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.535837 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.546774 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bed7f6be-9254-406b-9ed4-3fff3b2eb531-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "bed7f6be-9254-406b-9ed4-3fff3b2eb531" (UID: "bed7f6be-9254-406b-9ed4-3fff3b2eb531"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.575331 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "8e18a1be-f832-499c-a518-50dad90cafc9" (UID: "8e18a1be-f832-499c-a518-50dad90cafc9"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.575463 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e18a1be-f832-499c-a518-50dad90cafc9" (UID: "8e18a1be-f832-499c-a518-50dad90cafc9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.575526 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bed7f6be-9254-406b-9ed4-3fff3b2eb531-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bed7f6be-9254-406b-9ed4-3fff3b2eb531" (UID: "bed7f6be-9254-406b-9ed4-3fff3b2eb531"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.577296 4958 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.679483 4958 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.679530 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bed7f6be-9254-406b-9ed4-3fff3b2eb531-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.679561 4958 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bed7f6be-9254-406b-9ed4-3fff3b2eb531-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.679574 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.679587 4958 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:33 crc kubenswrapper[4958]: E1201 10:26:33.680150 4958 secret.go:188] Couldn't get secret openstack/nova-cell1-conductor-config-data: secret "nova-cell1-conductor-config-data" not found Dec 01 10:26:33 crc kubenswrapper[4958]: E1201 10:26:33.680220 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-config-data podName:8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6 nodeName:}" failed. No retries permitted until 2025-12-01 10:26:35.680202203 +0000 UTC m=+1643.188991240 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-config-data") pod "nova-cell1-conductor-0" (UID: "8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6") : secret "nova-cell1-conductor-config-data" not found Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.692818 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "8e18a1be-f832-499c-a518-50dad90cafc9" (UID: "8e18a1be-f832-499c-a518-50dad90cafc9"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.722170 4958 scope.go:117] "RemoveContainer" containerID="764cc2fdc960a743eeeb7feffbf676914a12612fce9cefcab0492d1b7b5f48dc" Dec 01 10:26:33 crc kubenswrapper[4958]: E1201 10:26:33.753514 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3241a70bcec5ec3a9a78b2a90bc1e75503cd83921db60624e31f3720317ce5b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 10:26:33 crc kubenswrapper[4958]: E1201 10:26:33.757315 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3241a70bcec5ec3a9a78b2a90bc1e75503cd83921db60624e31f3720317ce5b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 10:26:33 crc kubenswrapper[4958]: E1201 10:26:33.759567 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3241a70bcec5ec3a9a78b2a90bc1e75503cd83921db60624e31f3720317ce5b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 10:26:33 crc kubenswrapper[4958]: E1201 10:26:33.759797 4958 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6" containerName="nova-cell1-conductor-conductor" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.782947 4958 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e18a1be-f832-499c-a518-50dad90cafc9-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.867599 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="032af861-fde8-4b7a-929b-2ec7f5871474" path="/var/lib/kubelet/pods/032af861-fde8-4b7a-929b-2ec7f5871474/volumes" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.868737 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58c5b092-3de8-439d-a3aa-8aabc962d87a" path="/var/lib/kubelet/pods/58c5b092-3de8-439d-a3aa-8aabc962d87a/volumes" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.869571 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9" path="/var/lib/kubelet/pods/6b00e0f3-3bdd-41db-9f80-a0680d1c2dd9/volumes" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.871194 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7296e8da-30a1-4c69-978f-3411bda327f7" path="/var/lib/kubelet/pods/7296e8da-30a1-4c69-978f-3411bda327f7/volumes" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.872347 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e1110df-28ad-4b93-ad3b-54d771229959" path="/var/lib/kubelet/pods/7e1110df-28ad-4b93-ad3b-54d771229959/volumes" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.873083 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5588e33-3bf3-4ade-99d6-f0f5c26c62b5" 
path="/var/lib/kubelet/pods/a5588e33-3bf3-4ade-99d6-f0f5c26c62b5/volumes" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.876598 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d577a45a-466a-4ea0-925f-37f3946eea80" path="/var/lib/kubelet/pods/d577a45a-466a-4ea0-925f-37f3946eea80/volumes" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.877623 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="debf8fae-f15c-4b09-b185-3f47c7e0491b" path="/var/lib/kubelet/pods/debf8fae-f15c-4b09-b185-3f47c7e0491b/volumes" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.931681 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-744fbbd578-7c5pc" podUID="b727a711-6b0b-44c6-917a-602f10dd0d6c" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:37464->10.217.0.157:9311: read: connection reset by peer" Dec 01 10:26:33 crc kubenswrapper[4958]: I1201 10:26:33.931773 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-744fbbd578-7c5pc" podUID="b727a711-6b0b-44c6-917a-602f10dd0d6c" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:37474->10.217.0.157:9311: read: connection reset by peer" Dec 01 10:26:33 crc kubenswrapper[4958]: E1201 10:26:33.966911 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="973323edcddc9628e51fba2a4a4b7496b7b376f89fbb0037cf0dafdcf8686936" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 10:26:33 crc kubenswrapper[4958]: E1201 10:26:33.982305 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="973323edcddc9628e51fba2a4a4b7496b7b376f89fbb0037cf0dafdcf8686936" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 10:26:33 crc kubenswrapper[4958]: E1201 10:26:33.984559 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="973323edcddc9628e51fba2a4a4b7496b7b376f89fbb0037cf0dafdcf8686936" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 10:26:33 crc kubenswrapper[4958]: E1201 10:26:33.984736 4958 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="113fc18e-8eb6-45a5-9625-1404f4d832c4" containerName="nova-cell0-conductor-conductor" Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.010391 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.011684 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerName="ceilometer-central-agent" containerID="cri-o://3fbf7d32df1fc1b787081dc2dd704d72d22c756d3674cb93e42c1d2472d4bef9" gracePeriod=30 Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.012456 4958 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerName="sg-core" containerID="cri-o://f54faead50baed98cc76dab5ac5bc05271f5999149758dd0bd4a4895989a1b1b" gracePeriod=30 Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.012513 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerName="ceilometer-notification-agent" containerID="cri-o://a6020adfcd7117df2c055f1f046c4057d506ced6a9066aeab5c10fdf00691823" gracePeriod=30 Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.012470 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerName="proxy-httpd" containerID="cri-o://d2b247b18d958afcd723731638919280c2072270eb4c5c41496ef47a22a4f23a" gracePeriod=30 Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.061662 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.061989 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="a4ce268c-ffb9-4eae-93e5-23a10ba96185" containerName="kube-state-metrics" containerID="cri-o://cfa8bb4cfda0cf43fd73a76f724816ad6f496fee99ffeab64a684c098a360ffa" gracePeriod=30 Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.222162 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.229389 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="03af271e-0af4-4681-a7b6-31b207d21143" containerName="memcached" containerID="cri-o://4703ea3b8368f81ab5256cdde59bdad7b14ea5d5b61cedec5360686c336d5bb4" gracePeriod=30 Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.530421 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ms7jr"] Dec 01 10:26:34 crc kubenswrapper[4958]: E1201 10:26:34.540143 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.540295 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-q4ccp"] Dec 01 10:26:34 crc kubenswrapper[4958]: E1201 10:26:34.541412 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 01 10:26:34 crc kubenswrapper[4958]: E1201 10:26:34.575698 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 01 10:26:34 crc kubenswrapper[4958]: E1201 10:26:34.576193 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 01 10:26:34 crc kubenswrapper[4958]: E1201 10:26:34.576317 4958 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-xr8kd" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovsdb-server" Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.581898 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5f6877488-9nzb2"] Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.582285 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-5f6877488-9nzb2" podUID="caf6f35f-9e82-4899-801f-3b5e94189f6d" containerName="keystone-api" containerID="cri-o://f2be5b060fb4b1a90fb8d93662e78d565cae6c9dc122f1a9fe9e8cac346ac6b9" gracePeriod=30 Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.592090 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ms7jr"] Dec 01 10:26:34 crc kubenswrapper[4958]: E1201 10:26:34.596799 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.601947 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi1865-account-delete-hmcpq" event={"ID":"f20641f2-49c0-4492-8f1f-20b14a1f3bd3","Type":"ContainerDied","Data":"007505ad5173055ec3b1c817f5aa6fa1d1dfd59efcea02f00bc475f796397b1b"} Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.602006 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="007505ad5173055ec3b1c817f5aa6fa1d1dfd59efcea02f00bc475f796397b1b" Dec 01 10:26:34 crc kubenswrapper[4958]: E1201 10:26:34.602052 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 01 10:26:34 crc kubenswrapper[4958]: E1201 10:26:34.602099 4958 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-xr8kd" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovs-vswitchd" Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.606402 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-q4ccp"] Dec 01 10:26:34 crc 
kubenswrapper[4958]: I1201 10:26:34.607927 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-d9qck" podUID="d13e880d-3817-4df9-8477-82349d7979b9" containerName="ovn-controller" probeResult="failure" output="command timed out" Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.609705 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d6888456-hv67t" event={"ID":"207148af-4b76-49b6-80cc-883ec14bb268","Type":"ContainerDied","Data":"e8b4102e8e9184f8886030f4534753e5545e5fb8d5a2bf64b552ec540db373a3"} Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.609931 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8b4102e8e9184f8886030f4534753e5545e5fb8d5a2bf64b552ec540db373a3" Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.613139 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5ffd4b498c-jxtxd" event={"ID":"d93b36ff-312b-40e1-9e3a-b1981800da66","Type":"ContainerDied","Data":"ffd94f974b28f659374220632a6bea30500f988fba858f8cb32a3bb61aefd56b"} Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.613257 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffd94f974b28f659374220632a6bea30500f988fba858f8cb32a3bb61aefd56b" Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.615482 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.684655 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-sx6jg"] Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.686494 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": read tcp 10.217.0.2:37818->10.217.0.200:8775: read: connection reset by peer" Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.686621 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": read tcp 10.217.0.2:37810->10.217.0.200:8775: read: connection reset by peer" Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.707981 4958 generic.go:334] "Generic (PLEG): container finished" podID="137d864e-34d9-452c-91b0-179a93198b0f" containerID="65331563d699edd62e898ec079b7208d63c4336ed333b4a2cad00c932ee16ea5" exitCode=0 Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.708167 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-sx6jg"] Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.709297 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"137d864e-34d9-452c-91b0-179a93198b0f","Type":"ContainerDied","Data":"65331563d699edd62e898ec079b7208d63c4336ed333b4a2cad00c932ee16ea5"} Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.716475 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-d9qck" podUID="d13e880d-3817-4df9-8477-82349d7979b9" containerName="ovn-controller" probeResult="failure" output=< Dec 01 10:26:34 crc kubenswrapper[4958]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Dec 01 10:26:34 crc kubenswrapper[4958]: > Dec 01 10:26:34 
crc kubenswrapper[4958]: I1201 10:26:34.729699 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-99a7-account-create-c9sr8"] Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.738337 4958 generic.go:334] "Generic (PLEG): container finished" podID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerID="d2b247b18d958afcd723731638919280c2072270eb4c5c41496ef47a22a4f23a" exitCode=0 Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.738393 4958 generic.go:334] "Generic (PLEG): container finished" podID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerID="f54faead50baed98cc76dab5ac5bc05271f5999149758dd0bd4a4895989a1b1b" exitCode=2 Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.738516 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89868f5c-dfd8-4619-9b7e-02a5b75916db","Type":"ContainerDied","Data":"d2b247b18d958afcd723731638919280c2072270eb4c5c41496ef47a22a4f23a"} Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.738559 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89868f5c-dfd8-4619-9b7e-02a5b75916db","Type":"ContainerDied","Data":"f54faead50baed98cc76dab5ac5bc05271f5999149758dd0bd4a4895989a1b1b"} Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.747139 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-99a7-account-create-c9sr8"] Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.798504 4958 generic.go:334] "Generic (PLEG): container finished" podID="30e8c723-a7a0-4697-8369-bd224fcfdf3f" containerID="9144cb56a423613b84457fe40b192f906f9bb58de4d5fb7b219e438f9744439a" exitCode=0 Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.798661 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30e8c723-a7a0-4697-8369-bd224fcfdf3f","Type":"ContainerDied","Data":"9144cb56a423613b84457fe40b192f906f9bb58de4d5fb7b219e438f9744439a"} Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.814118 4958 generic.go:334] "Generic (PLEG): container finished" podID="0353bee2-4033-4493-9217-b5c4600d3d90" containerID="2f99a0981f6f8c18ad3b38420c231b904981b9a55694320467486987165f628d" exitCode=0 Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.814254 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0353bee2-4033-4493-9217-b5c4600d3d90","Type":"ContainerDied","Data":"2f99a0981f6f8c18ad3b38420c231b904981b9a55694320467486987165f628d"} Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.836950 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-744fbbd578-7c5pc" event={"ID":"b727a711-6b0b-44c6-917a-602f10dd0d6c","Type":"ContainerDied","Data":"8d7386e6d905cc7e682cbc9fcbfef1b78be385a0409c687cf999a6495aba6952"} Dec 01 10:26:34 crc kubenswrapper[4958]: I1201 10:26:34.818407 4958 generic.go:334] "Generic (PLEG): container finished" podID="b727a711-6b0b-44c6-917a-602f10dd0d6c" containerID="8d7386e6d905cc7e682cbc9fcbfef1b78be385a0409c687cf999a6495aba6952" exitCode=0 Dec 01 10:26:34 crc kubenswrapper[4958]: E1201 10:26:34.905798 4958 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 01 10:26:34 crc kubenswrapper[4958]: E1201 10:26:34.906013 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-config-data 
podName:e19ffea8-2e96-4cff-a2ec-40646aaa4cc0 nodeName:}" failed. No retries permitted until 2025-12-01 10:26:42.90598769 +0000 UTC m=+1650.414776727 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-config-data") pod "rabbitmq-cell1-server-0" (UID: "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0") : configmap "rabbitmq-cell1-config-data" not found Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.004908 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="46e6b589-937c-42c8-8004-49e39813d622" containerName="galera" containerID="cri-o://2ad714bd2234df6b38c1a86549393ee27d29b1dd419f67e7614dbc60f77966dc" gracePeriod=30 Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.437357 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d6888456-hv67t" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.448722 4958 scope.go:117] "RemoveContainer" containerID="4dad046e14cd1c8a85d6a005a368c845f89f2c1843d9cad3a83a3db85fea2c3d" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.452978 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementea59-account-delete-jmj7f"] Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.471662 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5ffd4b498c-jxtxd" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.480800 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placementea59-account-delete-jmj7f"] Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.506326 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi1865-account-delete-hmcpq" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.518956 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-public-tls-certs\") pod \"207148af-4b76-49b6-80cc-883ec14bb268\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.519053 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-internal-tls-certs\") pod \"207148af-4b76-49b6-80cc-883ec14bb268\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.519249 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-config-data\") pod \"207148af-4b76-49b6-80cc-883ec14bb268\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.519340 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n8gv\" (UniqueName: \"kubernetes.io/projected/207148af-4b76-49b6-80cc-883ec14bb268-kube-api-access-5n8gv\") pod \"207148af-4b76-49b6-80cc-883ec14bb268\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.526895 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/207148af-4b76-49b6-80cc-883ec14bb268-logs\") pod \"207148af-4b76-49b6-80cc-883ec14bb268\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.527235 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-combined-ca-bundle\") pod \"207148af-4b76-49b6-80cc-883ec14bb268\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.527275 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-scripts\") pod \"207148af-4b76-49b6-80cc-883ec14bb268\" (UID: \"207148af-4b76-49b6-80cc-883ec14bb268\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.528198 4958 scope.go:117] "RemoveContainer" containerID="4dad046e14cd1c8a85d6a005a368c845f89f2c1843d9cad3a83a3db85fea2c3d" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.529425 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/207148af-4b76-49b6-80cc-883ec14bb268-logs" (OuterVolumeSpecName: "logs") pod "207148af-4b76-49b6-80cc-883ec14bb268" (UID: "207148af-4b76-49b6-80cc-883ec14bb268"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.539761 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/207148af-4b76-49b6-80cc-883ec14bb268-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.539831 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican5432-account-delete-z4wgz"] Dec 01 10:26:35 crc kubenswrapper[4958]: E1201 10:26:35.559988 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dad046e14cd1c8a85d6a005a368c845f89f2c1843d9cad3a83a3db85fea2c3d\": container with ID starting with 4dad046e14cd1c8a85d6a005a368c845f89f2c1843d9cad3a83a3db85fea2c3d not found: ID does not exist" containerID="4dad046e14cd1c8a85d6a005a368c845f89f2c1843d9cad3a83a3db85fea2c3d" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.560089 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dad046e14cd1c8a85d6a005a368c845f89f2c1843d9cad3a83a3db85fea2c3d"} err="failed to get container status \"4dad046e14cd1c8a85d6a005a368c845f89f2c1843d9cad3a83a3db85fea2c3d\": rpc error: code = NotFound desc = could not find container \"4dad046e14cd1c8a85d6a005a368c845f89f2c1843d9cad3a83a3db85fea2c3d\": container with ID starting with 4dad046e14cd1c8a85d6a005a368c845f89f2c1843d9cad3a83a3db85fea2c3d not found: ID does not exist" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.560158 4958 scope.go:117] "RemoveContainer" containerID="e70c3af15c6bada5b3bce03e58f4a32bd4b1f859a087d9df3b61fede6d64a281" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.579486 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.584529 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutrona095-account-delete-wwt74" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.594173 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican5432-account-delete-z4wgz"] Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.597142 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/207148af-4b76-49b6-80cc-883ec14bb268-kube-api-access-5n8gv" (OuterVolumeSpecName: "kube-api-access-5n8gv") pod "207148af-4b76-49b6-80cc-883ec14bb268" (UID: "207148af-4b76-49b6-80cc-883ec14bb268"). InnerVolumeSpecName "kube-api-access-5n8gv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.648677 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-scripts" (OuterVolumeSpecName: "scripts") pod "207148af-4b76-49b6-80cc-883ec14bb268" (UID: "207148af-4b76-49b6-80cc-883ec14bb268"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.649754 4958 scope.go:117] "RemoveContainer" containerID="e70c3af15c6bada5b3bce03e58f4a32bd4b1f859a087d9df3b61fede6d64a281" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.651458 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-public-tls-certs\") pod \"d93b36ff-312b-40e1-9e3a-b1981800da66\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.651491 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d93b36ff-312b-40e1-9e3a-b1981800da66-etc-swift\") pod \"d93b36ff-312b-40e1-9e3a-b1981800da66\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.651522 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85nf7\" (UniqueName: \"kubernetes.io/projected/d93b36ff-312b-40e1-9e3a-b1981800da66-kube-api-access-85nf7\") pod \"d93b36ff-312b-40e1-9e3a-b1981800da66\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.651576 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d93b36ff-312b-40e1-9e3a-b1981800da66-run-httpd\") pod \"d93b36ff-312b-40e1-9e3a-b1981800da66\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.651593 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pszxx\" (UniqueName: \"kubernetes.io/projected/f20641f2-49c0-4492-8f1f-20b14a1f3bd3-kube-api-access-pszxx\") pod \"f20641f2-49c0-4492-8f1f-20b14a1f3bd3\" (UID: \"f20641f2-49c0-4492-8f1f-20b14a1f3bd3\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.651638 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d93b36ff-312b-40e1-9e3a-b1981800da66-log-httpd\") pod \"d93b36ff-312b-40e1-9e3a-b1981800da66\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.651657 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-internal-tls-certs\") pod \"d93b36ff-312b-40e1-9e3a-b1981800da66\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.651695 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-combined-ca-bundle\") pod \"d93b36ff-312b-40e1-9e3a-b1981800da66\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.651750 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-config-data\") pod \"d93b36ff-312b-40e1-9e3a-b1981800da66\" (UID: \"d93b36ff-312b-40e1-9e3a-b1981800da66\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.652236 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.652251 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n8gv\" (UniqueName: \"kubernetes.io/projected/207148af-4b76-49b6-80cc-883ec14bb268-kube-api-access-5n8gv\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.652754 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d93b36ff-312b-40e1-9e3a-b1981800da66-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d93b36ff-312b-40e1-9e3a-b1981800da66" (UID: "d93b36ff-312b-40e1-9e3a-b1981800da66"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:26:35 crc kubenswrapper[4958]: E1201 10:26:35.666770 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e70c3af15c6bada5b3bce03e58f4a32bd4b1f859a087d9df3b61fede6d64a281\": container with ID starting with e70c3af15c6bada5b3bce03e58f4a32bd4b1f859a087d9df3b61fede6d64a281 not found: ID does not exist" containerID="e70c3af15c6bada5b3bce03e58f4a32bd4b1f859a087d9df3b61fede6d64a281" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.667802 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.668205 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d93b36ff-312b-40e1-9e3a-b1981800da66-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d93b36ff-312b-40e1-9e3a-b1981800da66" (UID: "d93b36ff-312b-40e1-9e3a-b1981800da66"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.670137 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70c3af15c6bada5b3bce03e58f4a32bd4b1f859a087d9df3b61fede6d64a281"} err="failed to get container status \"e70c3af15c6bada5b3bce03e58f4a32bd4b1f859a087d9df3b61fede6d64a281\": rpc error: code = NotFound desc = could not find container \"e70c3af15c6bada5b3bce03e58f4a32bd4b1f859a087d9df3b61fede6d64a281\": container with ID starting with e70c3af15c6bada5b3bce03e58f4a32bd4b1f859a087d9df3b61fede6d64a281 not found: ID does not exist" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.670187 4958 scope.go:117] "RemoveContainer" containerID="9558003a0f0dbb9592dcb7cf64ce6596694eda3a98ea3abc65fe71be651100a5" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.676045 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d93b36ff-312b-40e1-9e3a-b1981800da66-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d93b36ff-312b-40e1-9e3a-b1981800da66" (UID: "d93b36ff-312b-40e1-9e3a-b1981800da66"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.680727 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.691055 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.709115 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancea8f6-account-delete-cjlz4"] Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.719578 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.727990 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glancea8f6-account-delete-cjlz4"] Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.729024 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f20641f2-49c0-4492-8f1f-20b14a1f3bd3-kube-api-access-pszxx" (OuterVolumeSpecName: "kube-api-access-pszxx") pod "f20641f2-49c0-4492-8f1f-20b14a1f3bd3" (UID: "f20641f2-49c0-4492-8f1f-20b14a1f3bd3"). InnerVolumeSpecName "kube-api-access-pszxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.763826 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-combined-ca-bundle\") pod \"137d864e-34d9-452c-91b0-179a93198b0f\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.763988 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-config-data-custom\") pod \"137d864e-34d9-452c-91b0-179a93198b0f\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.764062 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-config-data\") pod \"137d864e-34d9-452c-91b0-179a93198b0f\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.764336 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-public-tls-certs\") pod \"137d864e-34d9-452c-91b0-179a93198b0f\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.764488 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-scripts\") pod \"137d864e-34d9-452c-91b0-179a93198b0f\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.764586 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-internal-tls-certs\") pod \"137d864e-34d9-452c-91b0-179a93198b0f\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") " Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.764676 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6xkm\" (UniqueName: \"kubernetes.io/projected/34129d60-4706-41fe-aa19-f2a13f38713a-kube-api-access-w6xkm\") pod \"34129d60-4706-41fe-aa19-f2a13f38713a\" (UID: \"34129d60-4706-41fe-aa19-f2a13f38713a\") " Dec 01 10:26:35 crc 
kubenswrapper[4958]: I1201 10:26:35.765266 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/137d864e-34d9-452c-91b0-179a93198b0f-etc-machine-id\") pod \"137d864e-34d9-452c-91b0-179a93198b0f\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") "
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.765396 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tktm\" (UniqueName: \"kubernetes.io/projected/137d864e-34d9-452c-91b0-179a93198b0f-kube-api-access-4tktm\") pod \"137d864e-34d9-452c-91b0-179a93198b0f\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") "
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.765494 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/137d864e-34d9-452c-91b0-179a93198b0f-logs\") pod \"137d864e-34d9-452c-91b0-179a93198b0f\" (UID: \"137d864e-34d9-452c-91b0-179a93198b0f\") "
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.767604 4958 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d93b36ff-312b-40e1-9e3a-b1981800da66-etc-swift\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.767657 4958 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d93b36ff-312b-40e1-9e3a-b1981800da66-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.767674 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pszxx\" (UniqueName: \"kubernetes.io/projected/f20641f2-49c0-4492-8f1f-20b14a1f3bd3-kube-api-access-pszxx\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.767689 4958 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d93b36ff-312b-40e1-9e3a-b1981800da66-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:35 crc kubenswrapper[4958]: E1201 10:26:35.767907 4958 secret.go:188] Couldn't get secret openstack/nova-cell1-conductor-config-data: secret "nova-cell1-conductor-config-data" not found
Dec 01 10:26:35 crc kubenswrapper[4958]: E1201 10:26:35.768005 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-config-data podName:8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6 nodeName:}" failed. No retries permitted until 2025-12-01 10:26:39.767975877 +0000 UTC m=+1647.276764914 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-config-data") pod "nova-cell1-conductor-0" (UID: "8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6") : secret "nova-cell1-conductor-config-data" not found
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.769300 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d93b36ff-312b-40e1-9e3a-b1981800da66-kube-api-access-85nf7" (OuterVolumeSpecName: "kube-api-access-85nf7") pod "d93b36ff-312b-40e1-9e3a-b1981800da66" (UID: "d93b36ff-312b-40e1-9e3a-b1981800da66"). InnerVolumeSpecName "kube-api-access-85nf7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:26:35 crc kubenswrapper[4958]: E1201 10:26:35.797515 4958 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc"
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.806528 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34129d60-4706-41fe-aa19-f2a13f38713a-kube-api-access-w6xkm" (OuterVolumeSpecName: "kube-api-access-w6xkm") pod "34129d60-4706-41fe-aa19-f2a13f38713a" (UID: "34129d60-4706-41fe-aa19-f2a13f38713a"). InnerVolumeSpecName "kube-api-access-w6xkm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.812555 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/137d864e-34d9-452c-91b0-179a93198b0f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "137d864e-34d9-452c-91b0-179a93198b0f" (UID: "137d864e-34d9-452c-91b0-179a93198b0f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.825930 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/137d864e-34d9-452c-91b0-179a93198b0f-logs" (OuterVolumeSpecName: "logs") pod "137d864e-34d9-452c-91b0-179a93198b0f" (UID: "137d864e-34d9-452c-91b0-179a93198b0f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.860333 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "137d864e-34d9-452c-91b0-179a93198b0f" (UID: "137d864e-34d9-452c-91b0-179a93198b0f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.870506 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"0353bee2-4033-4493-9217-b5c4600d3d90\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") "
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.870587 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ce268c-ffb9-4eae-93e5-23a10ba96185-kube-state-metrics-tls-certs\") pod \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\" (UID: \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\") "
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.870643 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ce268c-ffb9-4eae-93e5-23a10ba96185-combined-ca-bundle\") pod \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\" (UID: \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\") "
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.870674 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-internal-tls-certs\") pod \"0353bee2-4033-4493-9217-b5c4600d3d90\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") "
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.870730 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0353bee2-4033-4493-9217-b5c4600d3d90-httpd-run\") pod \"0353bee2-4033-4493-9217-b5c4600d3d90\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") "
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.871984 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz5rb\" (UniqueName: \"kubernetes.io/projected/0353bee2-4033-4493-9217-b5c4600d3d90-kube-api-access-cz5rb\") pod \"0353bee2-4033-4493-9217-b5c4600d3d90\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") "
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.872034 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-config-data\") pod \"0353bee2-4033-4493-9217-b5c4600d3d90\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") "
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.872083 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-combined-ca-bundle\") pod \"0353bee2-4033-4493-9217-b5c4600d3d90\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") "
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.872110 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-scripts\") pod \"0353bee2-4033-4493-9217-b5c4600d3d90\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") "
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.872222 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0353bee2-4033-4493-9217-b5c4600d3d90-logs\") pod \"0353bee2-4033-4493-9217-b5c4600d3d90\" (UID: \"0353bee2-4033-4493-9217-b5c4600d3d90\") "
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.872247 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a4ce268c-ffb9-4eae-93e5-23a10ba96185-kube-state-metrics-tls-config\") pod \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\" (UID: \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\") "
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.872285 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2jpl\" (UniqueName: \"kubernetes.io/projected/a4ce268c-ffb9-4eae-93e5-23a10ba96185-kube-api-access-k2jpl\") pod \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\" (UID: \"a4ce268c-ffb9-4eae-93e5-23a10ba96185\") "
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.873533 4958 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/137d864e-34d9-452c-91b0-179a93198b0f-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.873563 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/137d864e-34d9-452c-91b0-179a93198b0f-logs\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.873578 4958 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.873589 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85nf7\" (UniqueName: \"kubernetes.io/projected/d93b36ff-312b-40e1-9e3a-b1981800da66-kube-api-access-85nf7\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.873603 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6xkm\" (UniqueName: \"kubernetes.io/projected/34129d60-4706-41fe-aa19-f2a13f38713a-kube-api-access-w6xkm\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.878703 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="311e4fde-9406-4820-bb28-3988a095f565" path="/var/lib/kubelet/pods/311e4fde-9406-4820-bb28-3988a095f565/volumes"
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.879474 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f7f1749-5fab-4187-8675-01747669c1b7" path="/var/lib/kubelet/pods/3f7f1749-5fab-4187-8675-01747669c1b7/volumes"
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.880161 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c3f553a-d19b-494d-a009-18e2ba2aa110" path="/var/lib/kubelet/pods/6c3f553a-d19b-494d-a009-18e2ba2aa110/volumes"
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.880905 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d2854e0-7961-48f8-adb0-2118d1685a38" path="/var/lib/kubelet/pods/6d2854e0-7961-48f8-adb0-2118d1685a38/volumes"
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.882388 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e18a1be-f832-499c-a518-50dad90cafc9" path="/var/lib/kubelet/pods/8e18a1be-f832-499c-a518-50dad90cafc9/volumes"
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.883163 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adbb9949-a0ed-4f1a-805c-fedba7ec3f1b" path="/var/lib/kubelet/pods/adbb9949-a0ed-4f1a-805c-fedba7ec3f1b/volumes"
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.884013 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0353bee2-4033-4493-9217-b5c4600d3d90-logs" (OuterVolumeSpecName: "logs") pod "0353bee2-4033-4493-9217-b5c4600d3d90" (UID: "0353bee2-4033-4493-9217-b5c4600d3d90"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.884692 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc" path="/var/lib/kubelet/pods/ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc/volumes"
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.885901 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5e0c8a9-ee98-44c8-95b2-9595ea834b9f" path="/var/lib/kubelet/pods/f5e0c8a9-ee98-44c8-95b2-9595ea834b9f/volumes"
Dec 01 10:26:35 crc kubenswrapper[4958]: I1201 10:26:35.907920 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0353bee2-4033-4493-9217-b5c4600d3d90-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0353bee2-4033-4493-9217-b5c4600d3d90" (UID: "0353bee2-4033-4493-9217-b5c4600d3d90"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.157038 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0353bee2-4033-4493-9217-b5c4600d3d90-logs\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.157080 4958 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0353bee2-4033-4493-9217-b5c4600d3d90-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.202207 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-scripts" (OuterVolumeSpecName: "scripts") pod "0353bee2-4033-4493-9217-b5c4600d3d90" (UID: "0353bee2-4033-4493-9217-b5c4600d3d90"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.202561 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0353bee2-4033-4493-9217-b5c4600d3d90-kube-api-access-cz5rb" (OuterVolumeSpecName: "kube-api-access-cz5rb") pod "0353bee2-4033-4493-9217-b5c4600d3d90" (UID: "0353bee2-4033-4493-9217-b5c4600d3d90"). InnerVolumeSpecName "kube-api-access-cz5rb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.211726 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4ce268c-ffb9-4eae-93e5-23a10ba96185-kube-api-access-k2jpl" (OuterVolumeSpecName: "kube-api-access-k2jpl") pod "a4ce268c-ffb9-4eae-93e5-23a10ba96185" (UID: "a4ce268c-ffb9-4eae-93e5-23a10ba96185"). InnerVolumeSpecName "kube-api-access-k2jpl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.216159 4958 generic.go:334] "Generic (PLEG): container finished" podID="113fc18e-8eb6-45a5-9625-1404f4d832c4" containerID="973323edcddc9628e51fba2a4a4b7496b7b376f89fbb0037cf0dafdcf8686936" exitCode=0
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.217258 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "0353bee2-4033-4493-9217-b5c4600d3d90" (UID: "0353bee2-4033-4493-9217-b5c4600d3d90"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.225655 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/137d864e-34d9-452c-91b0-179a93198b0f-kube-api-access-4tktm" (OuterVolumeSpecName: "kube-api-access-4tktm") pod "137d864e-34d9-452c-91b0-179a93198b0f" (UID: "137d864e-34d9-452c-91b0-179a93198b0f"). InnerVolumeSpecName "kube-api-access-4tktm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.230684 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.236873 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutrona095-account-delete-wwt74"
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.240768 4958 generic.go:334] "Generic (PLEG): container finished" podID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerID="3fbf7d32df1fc1b787081dc2dd704d72d22c756d3674cb93e42c1d2472d4bef9" exitCode=0
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.249325 4958 generic.go:334] "Generic (PLEG): container finished" podID="a4ce268c-ffb9-4eae-93e5-23a10ba96185" containerID="cfa8bb4cfda0cf43fd73a76f724816ad6f496fee99ffeab64a684c098a360ffa" exitCode=2
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.249443 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.253797 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.259182 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz5rb\" (UniqueName: \"kubernetes.io/projected/0353bee2-4033-4493-9217-b5c4600d3d90-kube-api-access-cz5rb\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.259212 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.259224 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tktm\" (UniqueName: \"kubernetes.io/projected/137d864e-34d9-452c-91b0-179a93198b0f-kube-api-access-4tktm\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.259234 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2jpl\" (UniqueName: \"kubernetes.io/projected/a4ce268c-ffb9-4eae-93e5-23a10ba96185-kube-api-access-k2jpl\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.259270 4958 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.261215 4958 generic.go:334] "Generic (PLEG): container finished" podID="37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" containerID="be4dcfae53ebf0f5679008f3ab2b2488e6eec364d8b4f3e7506d7a1e58ea2248" exitCode=0
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.266040 4958 generic.go:334] "Generic (PLEG): container finished" podID="8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6" containerID="b3241a70bcec5ec3a9a78b2a90bc1e75503cd83921db60624e31f3720317ce5b" exitCode=0
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.266467 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-scripts" (OuterVolumeSpecName: "scripts") pod "137d864e-34d9-452c-91b0-179a93198b0f" (UID: "137d864e-34d9-452c-91b0-179a93198b0f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.271965 4958 generic.go:334] "Generic (PLEG): container finished" podID="79797aa2-db7b-429a-91d9-3e181de3976c" containerID="a6094764dd390e4786a93ea3889e9a8d68d2c1d8357ff0e022f148ebe28f5c52" exitCode=0
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.281607 4958 generic.go:334] "Generic (PLEG): container finished" podID="aa361886-e7eb-413c-a4b4-cedaf1c86983" containerID="9edf6c1c15c8c61a73a71e42e83b06f5d03b0b267c8f2a39955fe70dd308453e" exitCode=0
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.281755 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5ffd4b498c-jxtxd"
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.281773 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi1865-account-delete-hmcpq"
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.282491 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d6888456-hv67t"
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.339466 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "137d864e-34d9-452c-91b0-179a93198b0f" (UID: "137d864e-34d9-452c-91b0-179a93198b0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.383727 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.384282 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.385578 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "207148af-4b76-49b6-80cc-883ec14bb268" (UID: "207148af-4b76-49b6-80cc-883ec14bb268"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.396258 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d93b36ff-312b-40e1-9e3a-b1981800da66" (UID: "d93b36ff-312b-40e1-9e3a-b1981800da66"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.448344 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "207148af-4b76-49b6-80cc-883ec14bb268" (UID: "207148af-4b76-49b6-80cc-883ec14bb268"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.449823 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-config-data" (OuterVolumeSpecName: "config-data") pod "d93b36ff-312b-40e1-9e3a-b1981800da66" (UID: "d93b36ff-312b-40e1-9e3a-b1981800da66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.478802 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4ce268c-ffb9-4eae-93e5-23a10ba96185-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4ce268c-ffb9-4eae-93e5-23a10ba96185" (UID: "a4ce268c-ffb9-4eae-93e5-23a10ba96185"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.488821 4958 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.488896 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ce268c-ffb9-4eae-93e5-23a10ba96185-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.488910 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.488922 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.488942 4958 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.496150 4958 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.520742 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4ce268c-ffb9-4eae-93e5-23a10ba96185-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "a4ce268c-ffb9-4eae-93e5-23a10ba96185" (UID: "a4ce268c-ffb9-4eae-93e5-23a10ba96185"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.527135 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0353bee2-4033-4493-9217-b5c4600d3d90" (UID: "0353bee2-4033-4493-9217-b5c4600d3d90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.547399 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-config-data" (OuterVolumeSpecName: "config-data") pod "0353bee2-4033-4493-9217-b5c4600d3d90" (UID: "0353bee2-4033-4493-9217-b5c4600d3d90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.601784 4958 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.601856 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.601877 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.601920 4958 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a4ce268c-ffb9-4eae-93e5-23a10ba96185-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.642989 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0353bee2-4033-4493-9217-b5c4600d3d90" (UID: "0353bee2-4033-4493-9217-b5c4600d3d90"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.662637 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-config-data" (OuterVolumeSpecName: "config-data") pod "207148af-4b76-49b6-80cc-883ec14bb268" (UID: "207148af-4b76-49b6-80cc-883ec14bb268"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.666722 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "137d864e-34d9-452c-91b0-179a93198b0f" (UID: "137d864e-34d9-452c-91b0-179a93198b0f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.679276 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "207148af-4b76-49b6-80cc-883ec14bb268" (UID: "207148af-4b76-49b6-80cc-883ec14bb268"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.685744 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d93b36ff-312b-40e1-9e3a-b1981800da66" (UID: "d93b36ff-312b-40e1-9e3a-b1981800da66"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.687042 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "137d864e-34d9-452c-91b0-179a93198b0f" (UID: "137d864e-34d9-452c-91b0-179a93198b0f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.704246 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.704296 4958 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0353bee2-4033-4493-9217-b5c4600d3d90-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.704311 4958 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.704320 4958 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.704330 4958 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.704339 4958 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/207148af-4b76-49b6-80cc-883ec14bb268-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: E1201 10:26:36.714532 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="058deee68ae35cac74169c2f8d75def62e5447aee109c42652d23307d321925a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 01 10:26:36 crc kubenswrapper[4958]: E1201 10:26:36.718534 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="058deee68ae35cac74169c2f8d75def62e5447aee109c42652d23307d321925a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.719365 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4ce268c-ffb9-4eae-93e5-23a10ba96185-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "a4ce268c-ffb9-4eae-93e5-23a10ba96185" (UID: "a4ce268c-ffb9-4eae-93e5-23a10ba96185"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: E1201 10:26:36.720287 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="058deee68ae35cac74169c2f8d75def62e5447aee109c42652d23307d321925a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 01 10:26:36 crc kubenswrapper[4958]: E1201 10:26:36.720386 4958 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="de0d4422-f9bb-4abf-8727-066813a9182e" containerName="nova-scheduler-scheduler"
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.721839 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-config-data" (OuterVolumeSpecName: "config-data") pod "137d864e-34d9-452c-91b0-179a93198b0f" (UID: "137d864e-34d9-452c-91b0-179a93198b0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.723217 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d93b36ff-312b-40e1-9e3a-b1981800da66" (UID: "d93b36ff-312b-40e1-9e3a-b1981800da66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.813745 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d93b36ff-312b-40e1-9e3a-b1981800da66-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.813811 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/137d864e-34d9-452c-91b0-179a93198b0f-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: I1201 10:26:36.813826 4958 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ce268c-ffb9-4eae-93e5-23a10ba96185-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:36 crc kubenswrapper[4958]: E1201 10:26:36.916144 4958 secret.go:188] Couldn't get secret openstack/glance-scripts: secret "glance-scripts" not found
Dec 01 10:26:36 crc kubenswrapper[4958]: E1201 10:26:36.916234 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-scripts podName:30e8c723-a7a0-4697-8369-bd224fcfdf3f nodeName:}" failed. No retries permitted until 2025-12-01 10:26:44.916212576 +0000 UTC m=+1652.425001613 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-scripts") pod "glance-default-external-api-0" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f") : secret "glance-scripts" not found
Dec 01 10:26:36 crc kubenswrapper[4958]: E1201 10:26:36.916280 4958 secret.go:188] Couldn't get secret openstack/glance-default-external-config-data: secret "glance-default-external-config-data" not found
Dec 01 10:26:36 crc kubenswrapper[4958]: E1201 10:26:36.916301 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-config-data podName:30e8c723-a7a0-4697-8369-bd224fcfdf3f nodeName:}" failed. No retries permitted until 2025-12-01 10:26:44.916295468 +0000 UTC m=+1652.425084495 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-config-data") pod "glance-default-external-api-0" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f") : secret "glance-default-external-config-data" not found
Dec 01 10:26:36 crc kubenswrapper[4958]: E1201 10:26:36.916336 4958 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Dec 01 10:26:36 crc kubenswrapper[4958]: E1201 10:26:36.916354 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-config-data podName:4fccd607-3bfb-4593-a6de-6a0fc52b34ea nodeName:}" failed. No retries permitted until 2025-12-01 10:26:44.91634868 +0000 UTC m=+1652.425137717 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-config-data") pod "rabbitmq-server-0" (UID: "4fccd607-3bfb-4593-a6de-6a0fc52b34ea") : configmap "rabbitmq-config-data" not found
Dec 01 10:26:37 crc kubenswrapper[4958]: E1201 10:26:37.086910 4958 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.282s"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.087467 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder259f-account-delete-zpd7t"]
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.087499 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder259f-account-delete-zpd7t"]
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.087525 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-744fbbd578-7c5pc" event={"ID":"b727a711-6b0b-44c6-917a-602f10dd0d6c","Type":"ContainerDied","Data":"8028a8027b503ff44704b40b230744482e49d1075b14eaa6e9f6e26b94442df3"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.087564 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8028a8027b503ff44704b40b230744482e49d1075b14eaa6e9f6e26b94442df3"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.087580 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"113fc18e-8eb6-45a5-9625-1404f4d832c4","Type":"ContainerDied","Data":"973323edcddc9628e51fba2a4a4b7496b7b376f89fbb0037cf0dafdcf8686936"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.087630 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"137d864e-34d9-452c-91b0-179a93198b0f","Type":"ContainerDied","Data":"243b5df147c66a046aab1ae08bb1828983fbea8e9dcb142a19c667c9ae774be5"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.087650 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutrona095-account-delete-wwt74" event={"ID":"34129d60-4706-41fe-aa19-f2a13f38713a","Type":"ContainerDied","Data":"b00a8d940cff610a1acd971604c1e00ecd082786945ae6a350a4ee8b6a4d3d34"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.087666 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89868f5c-dfd8-4619-9b7e-02a5b75916db","Type":"ContainerDied","Data":"3fbf7d32df1fc1b787081dc2dd704d72d22c756d3674cb93e42c1d2472d4bef9"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.087685 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a4ce268c-ffb9-4eae-93e5-23a10ba96185","Type":"ContainerDied","Data":"cfa8bb4cfda0cf43fd73a76f724816ad6f496fee99ffeab64a684c098a360ffa"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.087702 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a4ce268c-ffb9-4eae-93e5-23a10ba96185","Type":"ContainerDied","Data":"f9534d967249eaf5c794a26a6460d4a7182d9f383f0186adf6136ec7ddc6f49a"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.087715 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0353bee2-4033-4493-9217-b5c4600d3d90","Type":"ContainerDied","Data":"c91bcad9a74692806a75938fd71bbdd6bb01e025193303b768a42f67d9a38201"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.087732 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14","Type":"ContainerDied","Data":"be4dcfae53ebf0f5679008f3ab2b2488e6eec364d8b4f3e7506d7a1e58ea2248"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.087749 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6","Type":"ContainerDied","Data":"b3241a70bcec5ec3a9a78b2a90bc1e75503cd83921db60624e31f3720317ce5b"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.087766 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79797aa2-db7b-429a-91d9-3e181de3976c","Type":"ContainerDied","Data":"a6094764dd390e4786a93ea3889e9a8d68d2c1d8357ff0e022f148ebe28f5c52"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.087782 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"aa361886-e7eb-413c-a4b4-cedaf1c86983","Type":"ContainerDied","Data":"9edf6c1c15c8c61a73a71e42e83b06f5d03b0b267c8f2a39955fe70dd308453e"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.118298 4958 scope.go:117] "RemoveContainer" containerID="fe80751034968323a5dbd3a1f0b90c563022a0a7aefe1996053e0ce6fa5bd001"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.119446 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-744fbbd578-7c5pc"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.140382 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.163201 4958 scope.go:117] "RemoveContainer" containerID="9558003a0f0dbb9592dcb7cf64ce6596694eda3a98ea3abc65fe71be651100a5"
Dec 01 10:26:37 crc kubenswrapper[4958]: E1201 10:26:37.164538 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9558003a0f0dbb9592dcb7cf64ce6596694eda3a98ea3abc65fe71be651100a5\": container with ID starting with 9558003a0f0dbb9592dcb7cf64ce6596694eda3a98ea3abc65fe71be651100a5 not found: ID does not exist" containerID="9558003a0f0dbb9592dcb7cf64ce6596694eda3a98ea3abc65fe71be651100a5"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.164615 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9558003a0f0dbb9592dcb7cf64ce6596694eda3a98ea3abc65fe71be651100a5"} err="failed to get container status \"9558003a0f0dbb9592dcb7cf64ce6596694eda3a98ea3abc65fe71be651100a5\": rpc error: code = NotFound desc = could not find container \"9558003a0f0dbb9592dcb7cf64ce6596694eda3a98ea3abc65fe71be651100a5\": container with ID starting with 9558003a0f0dbb9592dcb7cf64ce6596694eda3a98ea3abc65fe71be651100a5 not found: ID does not exist"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.164650 4958 scope.go:117] "RemoveContainer" containerID="fe80751034968323a5dbd3a1f0b90c563022a0a7aefe1996053e0ce6fa5bd001"
Dec 01 10:26:37 crc kubenswrapper[4958]: E1201 10:26:37.165106 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe80751034968323a5dbd3a1f0b90c563022a0a7aefe1996053e0ce6fa5bd001\": container with ID starting with fe80751034968323a5dbd3a1f0b90c563022a0a7aefe1996053e0ce6fa5bd001 not found: ID does not exist" containerID="fe80751034968323a5dbd3a1f0b90c563022a0a7aefe1996053e0ce6fa5bd001"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.165234 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe80751034968323a5dbd3a1f0b90c563022a0a7aefe1996053e0ce6fa5bd001"} err="failed to get container status \"fe80751034968323a5dbd3a1f0b90c563022a0a7aefe1996053e0ce6fa5bd001\": rpc error: code = NotFound desc = could not find container \"fe80751034968323a5dbd3a1f0b90c563022a0a7aefe1996053e0ce6fa5bd001\": container with ID starting with fe80751034968323a5dbd3a1f0b90c563022a0a7aefe1996053e0ce6fa5bd001 not found: ID does not exist"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.165314 4958 scope.go:117] "RemoveContainer" containerID="65331563d699edd62e898ec079b7208d63c4336ed333b4a2cad00c932ee16ea5"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.225231 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-public-tls-certs\") pod \"b727a711-6b0b-44c6-917a-602f10dd0d6c\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.225327 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-combined-ca-bundle\") pod \"b727a711-6b0b-44c6-917a-602f10dd0d6c\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.225371 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-internal-tls-certs\") pod \"b727a711-6b0b-44c6-917a-602f10dd0d6c\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.225396 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b727a711-6b0b-44c6-917a-602f10dd0d6c-logs\") pod \"b727a711-6b0b-44c6-917a-602f10dd0d6c\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.225433 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-nova-metadata-tls-certs\") pod \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.225469 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-config-data-custom\") pod \"b727a711-6b0b-44c6-917a-602f10dd0d6c\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.225492 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-combined-ca-bundle\") pod \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.225524 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-logs\") pod \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.225544 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-config-data\") pod \"b727a711-6b0b-44c6-917a-602f10dd0d6c\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.225565 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8szj4\" (UniqueName: \"kubernetes.io/projected/b727a711-6b0b-44c6-917a-602f10dd0d6c-kube-api-access-8szj4\") pod \"b727a711-6b0b-44c6-917a-602f10dd0d6c\" (UID: \"b727a711-6b0b-44c6-917a-602f10dd0d6c\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.225617 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2gbt\" (UniqueName: \"kubernetes.io/projected/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-kube-api-access-q2gbt\") pod \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.225685 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-config-data\") pod \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\" (UID: \"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.232303 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-logs" (OuterVolumeSpecName: "logs") pod "37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" (UID: "37ea0b7c-fb6b-4802-9fff-4e3995c0ed14"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.233598 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.235300 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5ffd4b498c-jxtxd"]
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.237250 4958 scope.go:117] "RemoveContainer" containerID="0384d0b99f1562b0d40019ea6259c6fc06fa88bdab2bd09c91cd469a8040914c"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.241382 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b727a711-6b0b-44c6-917a-602f10dd0d6c-logs" (OuterVolumeSpecName: "logs") pod "b727a711-6b0b-44c6-917a-602f10dd0d6c" (UID: "b727a711-6b0b-44c6-917a-602f10dd0d6c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.244578 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-5ffd4b498c-jxtxd"]
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.263447 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.263626 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b727a711-6b0b-44c6-917a-602f10dd0d6c-kube-api-access-8szj4" (OuterVolumeSpecName: "kube-api-access-8szj4") pod "b727a711-6b0b-44c6-917a-602f10dd0d6c" (UID: "b727a711-6b0b-44c6-917a-602f10dd0d6c"). InnerVolumeSpecName "kube-api-access-8szj4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.270042 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-kube-api-access-q2gbt" (OuterVolumeSpecName: "kube-api-access-q2gbt") pod "37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" (UID: "37ea0b7c-fb6b-4802-9fff-4e3995c0ed14"). InnerVolumeSpecName "kube-api-access-q2gbt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.286469 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b727a711-6b0b-44c6-917a-602f10dd0d6c" (UID: "b727a711-6b0b-44c6-917a-602f10dd0d6c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.301382 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.311074 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.315347 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.316790 4958 scope.go:117] "RemoveContainer" containerID="b2c2ce181a7e61ce78834ea943cac16253a105ecdd868af50896db054a33e2cb"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.323875 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.329690 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-public-tls-certs\") pod \"79797aa2-db7b-429a-91d9-3e181de3976c\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.329805 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgzsx\" (UniqueName: \"kubernetes.io/projected/79797aa2-db7b-429a-91d9-3e181de3976c-kube-api-access-pgzsx\") pod \"79797aa2-db7b-429a-91d9-3e181de3976c\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.329859 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-config-data\") pod \"aa361886-e7eb-413c-a4b4-cedaf1c86983\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.329879 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wtjp\" (UniqueName: \"kubernetes.io/projected/aa361886-e7eb-413c-a4b4-cedaf1c86983-kube-api-access-8wtjp\") pod \"aa361886-e7eb-413c-a4b4-cedaf1c86983\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.329926 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-config-data\") pod \"79797aa2-db7b-429a-91d9-3e181de3976c\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.330019 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-combined-ca-bundle\") pod \"aa361886-e7eb-413c-a4b4-cedaf1c86983\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.330055 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa361886-e7eb-413c-a4b4-cedaf1c86983-etc-machine-id\") pod \"aa361886-e7eb-413c-a4b4-cedaf1c86983\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.330139 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-internal-tls-certs\") pod \"79797aa2-db7b-429a-91d9-3e181de3976c\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.330174 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79797aa2-db7b-429a-91d9-3e181de3976c-logs\") pod \"79797aa2-db7b-429a-91d9-3e181de3976c\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.330228 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-scripts\") pod \"aa361886-e7eb-413c-a4b4-cedaf1c86983\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.330262 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-combined-ca-bundle\") pod \"79797aa2-db7b-429a-91d9-3e181de3976c\" (UID: \"79797aa2-db7b-429a-91d9-3e181de3976c\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.330289 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-config-data-custom\") pod \"aa361886-e7eb-413c-a4b4-cedaf1c86983\" (UID: \"aa361886-e7eb-413c-a4b4-cedaf1c86983\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.330726 4958 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.330742 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-logs\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.330755 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8szj4\" (UniqueName: \"kubernetes.io/projected/b727a711-6b0b-44c6-917a-602f10dd0d6c-kube-api-access-8szj4\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.330766 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2gbt\" (UniqueName: \"kubernetes.io/projected/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-kube-api-access-q2gbt\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.330779 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b727a711-6b0b-44c6-917a-602f10dd0d6c-logs\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.340025 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.340140 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aa361886-e7eb-413c-a4b4-cedaf1c86983-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "aa361886-e7eb-413c-a4b4-cedaf1c86983" (UID: "aa361886-e7eb-413c-a4b4-cedaf1c86983"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.340316 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79797aa2-db7b-429a-91d9-3e181de3976c-logs" (OuterVolumeSpecName: "logs") pod "79797aa2-db7b-429a-91d9-3e181de3976c" (UID: "79797aa2-db7b-429a-91d9-3e181de3976c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.342884 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "aa361886-e7eb-413c-a4b4-cedaf1c86983" (UID: "aa361886-e7eb-413c-a4b4-cedaf1c86983"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.345889 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b727a711-6b0b-44c6-917a-602f10dd0d6c" (UID: "b727a711-6b0b-44c6-917a-602f10dd0d6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.346914 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79797aa2-db7b-429a-91d9-3e181de3976c-kube-api-access-pgzsx" (OuterVolumeSpecName: "kube-api-access-pgzsx") pod "79797aa2-db7b-429a-91d9-3e181de3976c" (UID: "79797aa2-db7b-429a-91d9-3e181de3976c"). InnerVolumeSpecName "kube-api-access-pgzsx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.358430 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"113fc18e-8eb6-45a5-9625-1404f4d832c4","Type":"ContainerDied","Data":"54a5314fe1f4d099fa8267176306ea6ab531f192e0c2df680daa0e2e98771027"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.358729 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.361190 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.363597 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-config-data" (OuterVolumeSpecName: "config-data") pod "37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" (UID: "37ea0b7c-fb6b-4802-9fff-4e3995c0ed14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.366698 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.386553 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa361886-e7eb-413c-a4b4-cedaf1c86983-kube-api-access-8wtjp" (OuterVolumeSpecName: "kube-api-access-8wtjp") pod "aa361886-e7eb-413c-a4b4-cedaf1c86983" (UID: "aa361886-e7eb-413c-a4b4-cedaf1c86983"). InnerVolumeSpecName "kube-api-access-8wtjp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.392867 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-scripts" (OuterVolumeSpecName: "scripts") pod "aa361886-e7eb-413c-a4b4-cedaf1c86983" (UID: "aa361886-e7eb-413c-a4b4-cedaf1c86983"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.393553 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-config-data" (OuterVolumeSpecName: "config-data") pod "b727a711-6b0b-44c6-917a-602f10dd0d6c" (UID: "b727a711-6b0b-44c6-917a-602f10dd0d6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.401094 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" (UID: "37ea0b7c-fb6b-4802-9fff-4e3995c0ed14"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.402488 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-config-data" (OuterVolumeSpecName: "config-data") pod "79797aa2-db7b-429a-91d9-3e181de3976c" (UID: "79797aa2-db7b-429a-91d9-3e181de3976c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.413363 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.432549 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutrona095-account-delete-wwt74"]
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.442087 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113fc18e-8eb6-45a5-9625-1404f4d832c4-config-data\") pod \"113fc18e-8eb6-45a5-9625-1404f4d832c4\" (UID: \"113fc18e-8eb6-45a5-9625-1404f4d832c4\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.442227 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7sdc\" (UniqueName: \"kubernetes.io/projected/113fc18e-8eb6-45a5-9625-1404f4d832c4-kube-api-access-w7sdc\") pod \"113fc18e-8eb6-45a5-9625-1404f4d832c4\" (UID: \"113fc18e-8eb6-45a5-9625-1404f4d832c4\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.442389 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113fc18e-8eb6-45a5-9625-1404f4d832c4-combined-ca-bundle\") pod \"113fc18e-8eb6-45a5-9625-1404f4d832c4\" (UID: \"113fc18e-8eb6-45a5-9625-1404f4d832c4\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.449036 4958 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa361886-e7eb-413c-a4b4-cedaf1c86983-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.454581 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79797aa2-db7b-429a-91d9-3e181de3976c-logs\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.457011 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.457122 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.457225 4958 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.457294 4958 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.457357 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.457421 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgzsx\" (UniqueName: \"kubernetes.io/projected/79797aa2-db7b-429a-91d9-3e181de3976c-kube-api-access-pgzsx\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.457477 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wtjp\" (UniqueName: \"kubernetes.io/projected/aa361886-e7eb-413c-a4b4-cedaf1c86983-kube-api-access-8wtjp\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.457568 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.457635 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.449605 4958 generic.go:334] "Generic (PLEG): container finished" podID="81c0931c-8919-4128-97bb-21c5872d5cf0" containerID="846a70f76927873aca913fa654d135f5d9f7402003f4d45aa4fb853953fe65db" exitCode=0
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.449663 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-85fc6f9f59-gdn47" event={"ID":"81c0931c-8919-4128-97bb-21c5872d5cf0","Type":"ContainerDied","Data":"846a70f76927873aca913fa654d135f5d9f7402003f4d45aa4fb853953fe65db"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.458089 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-85fc6f9f59-gdn47" event={"ID":"81c0931c-8919-4128-97bb-21c5872d5cf0","Type":"ContainerDied","Data":"e01ee9d2f2724ce03bb863b7e906c8b8b943bc2e3e319467052a9568c176065d"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.458137 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e01ee9d2f2724ce03bb863b7e906c8b8b943bc2e3e319467052a9568c176065d"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.458167 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutrona095-account-delete-wwt74"]
Dec 01 10:26:37
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.467749 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8d6888456-hv67t"]
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.470542 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/113fc18e-8eb6-45a5-9625-1404f4d832c4-kube-api-access-w7sdc" (OuterVolumeSpecName: "kube-api-access-w7sdc") pod "113fc18e-8eb6-45a5-9625-1404f4d832c4" (UID: "113fc18e-8eb6-45a5-9625-1404f4d832c4"). InnerVolumeSpecName "kube-api-access-w7sdc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.472746 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.480096 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-85fc6f9f59-gdn47"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.482881 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8d6888456-hv67t"]
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.489619 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.492677 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_69ae2c66-9d6d-4bc0-b3fe-ee729225e85f/ovn-northd/0.log"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.492714 4958 generic.go:334] "Generic (PLEG): container finished" podID="69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" containerID="4ae78c274c3294e9773e57bf2eec9477b3e77f5156d4259b5f42c1b6a1c6dd6a" exitCode=139
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.492741 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi1865-account-delete-hmcpq"]
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.492813 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f","Type":"ContainerDied","Data":"4ae78c274c3294e9773e57bf2eec9477b3e77f5156d4259b5f42c1b6a1c6dd6a"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.501932 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79797aa2-db7b-429a-91d9-3e181de3976c" (UID: "79797aa2-db7b-429a-91d9-3e181de3976c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.507053 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapi1865-account-delete-hmcpq"]
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.536740 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" (UID: "37ea0b7c-fb6b-4802-9fff-4e3995c0ed14"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.540759 4958 generic.go:334] "Generic (PLEG): container finished" podID="ceca985e-bec2-42fe-9758-edad828586c2" containerID="beb72f83d33471d431dae1e5fec0a30b928583fb28f59468f8008cfe43fce3bb" exitCode=0
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.540886 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" event={"ID":"ceca985e-bec2-42fe-9758-edad828586c2","Type":"ContainerDied","Data":"beb72f83d33471d431dae1e5fec0a30b928583fb28f59468f8008cfe43fce3bb"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.540933 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd" event={"ID":"ceca985e-bec2-42fe-9758-edad828586c2","Type":"ContainerDied","Data":"ee16d5363237e063ef2898c63d232312cd6a180134043c69dc9de164d7e26bee"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.541030 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.546033 4958 scope.go:117] "RemoveContainer" containerID="cfa8bb4cfda0cf43fd73a76f724816ad6f496fee99ffeab64a684c098a360ffa"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.548745 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30e8c723-a7a0-4697-8369-bd224fcfdf3f","Type":"ContainerDied","Data":"59f7d5726e3c9d6cd29a48f19e5d46961290d8f324c2aabd710ec135f667e5f3"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.548799 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.558511 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c0931c-8919-4128-97bb-21c5872d5cf0-combined-ca-bundle\") pod \"81c0931c-8919-4128-97bb-21c5872d5cf0\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.558557 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c0931c-8919-4128-97bb-21c5872d5cf0-config-data\") pod \"81c0931c-8919-4128-97bb-21c5872d5cf0\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.558606 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-config-data\") pod \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.558629 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2rnm\" (UniqueName: \"kubernetes.io/projected/03af271e-0af4-4681-a7b6-31b207d21143-kube-api-access-d2rnm\") pod \"03af271e-0af4-4681-a7b6-31b207d21143\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.558649 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/03af271e-0af4-4681-a7b6-31b207d21143-memcached-tls-certs\") pod \"03af271e-0af4-4681-a7b6-31b207d21143\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.558671 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-config-data\") pod \"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6\" (UID: \"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.558713 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30e8c723-a7a0-4697-8369-bd224fcfdf3f-logs\") pod \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.558755 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-combined-ca-bundle\") pod \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.560168 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceca985e-bec2-42fe-9758-edad828586c2-config-data\") pod \"ceca985e-bec2-42fe-9758-edad828586c2\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.560230 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.560309 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03af271e-0af4-4681-a7b6-31b207d21143-combined-ca-bundle\") pod \"03af271e-0af4-4681-a7b6-31b207d21143\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.560342 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm594\" (UniqueName: \"kubernetes.io/projected/ceca985e-bec2-42fe-9758-edad828586c2-kube-api-access-nm594\") pod \"ceca985e-bec2-42fe-9758-edad828586c2\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.560409 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-public-tls-certs\") pod \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.560435 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"aa361886-e7eb-413c-a4b4-cedaf1c86983","Type":"ContainerDied","Data":"c852bf51f34c89850c8ffbbf2aea50bb85f5125da81ea0979ef51563645afa6a"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.560476 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81c0931c-8919-4128-97bb-21c5872d5cf0-logs\") pod \"81c0931c-8919-4128-97bb-21c5872d5cf0\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.560500 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbfls\" (UniqueName: \"kubernetes.io/projected/81c0931c-8919-4128-97bb-21c5872d5cf0-kube-api-access-jbfls\") pod \"81c0931c-8919-4128-97bb-21c5872d5cf0\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.560547 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03af271e-0af4-4681-a7b6-31b207d21143-config-data\") pod \"03af271e-0af4-4681-a7b6-31b207d21143\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.560582 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03af271e-0af4-4681-a7b6-31b207d21143-kolla-config\") pod \"03af271e-0af4-4681-a7b6-31b207d21143\" (UID: \"03af271e-0af4-4681-a7b6-31b207d21143\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.560650 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ceca985e-bec2-42fe-9758-edad828586c2-config-data-custom\") pod \"ceca985e-bec2-42fe-9758-edad828586c2\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.560711 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-combined-ca-bundle\") pod \"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6\" (UID: \"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.560754 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qhrx\" (UniqueName: \"kubernetes.io/projected/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-kube-api-access-2qhrx\") pod \"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6\" (UID: \"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.560792 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceca985e-bec2-42fe-9758-edad828586c2-combined-ca-bundle\") pod \"ceca985e-bec2-42fe-9758-edad828586c2\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.560834 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81c0931c-8919-4128-97bb-21c5872d5cf0-config-data-custom\") pod \"81c0931c-8919-4128-97bb-21c5872d5cf0\" (UID: \"81c0931c-8919-4128-97bb-21c5872d5cf0\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.561106 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ceca985e-bec2-42fe-9758-edad828586c2-logs\") pod \"ceca985e-bec2-42fe-9758-edad828586c2\" (UID: \"ceca985e-bec2-42fe-9758-edad828586c2\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.561177 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t556\" (UniqueName: \"kubernetes.io/projected/30e8c723-a7a0-4697-8369-bd224fcfdf3f-kube-api-access-7t556\") pod \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.561231 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30e8c723-a7a0-4697-8369-bd224fcfdf3f-httpd-run\") pod \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.561294 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-scripts\") pod \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\" (UID: \"30e8c723-a7a0-4697-8369-bd224fcfdf3f\") "
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.561806 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7sdc\" (UniqueName: \"kubernetes.io/projected/113fc18e-8eb6-45a5-9625-1404f4d832c4-kube-api-access-w7sdc\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.561829 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.561860 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.576243 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30e8c723-a7a0-4697-8369-bd224fcfdf3f-logs" (OuterVolumeSpecName: "logs") pod "30e8c723-a7a0-4697-8369-bd224fcfdf3f" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.576314 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceca985e-bec2-42fe-9758-edad828586c2-logs" (OuterVolumeSpecName: "logs") pod "ceca985e-bec2-42fe-9758-edad828586c2" (UID: "ceca985e-bec2-42fe-9758-edad828586c2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.578217 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30e8c723-a7a0-4697-8369-bd224fcfdf3f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "30e8c723-a7a0-4697-8369-bd224fcfdf3f" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.560576 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.578662 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81c0931c-8919-4128-97bb-21c5872d5cf0-logs" (OuterVolumeSpecName: "logs") pod "81c0931c-8919-4128-97bb-21c5872d5cf0" (UID: "81c0931c-8919-4128-97bb-21c5872d5cf0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.578998 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03af271e-0af4-4681-a7b6-31b207d21143-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "03af271e-0af4-4681-a7b6-31b207d21143" (UID: "03af271e-0af4-4681-a7b6-31b207d21143"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.580293 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03af271e-0af4-4681-a7b6-31b207d21143-config-data" (OuterVolumeSpecName: "config-data") pod "03af271e-0af4-4681-a7b6-31b207d21143" (UID: "03af271e-0af4-4681-a7b6-31b207d21143"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.582121 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6","Type":"ContainerDied","Data":"d882b86fe2faf9de13ad90313437bba40d0c05bb0859e452a8ced38bfb65f447"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.582281 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.590381 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "30e8c723-a7a0-4697-8369-bd224fcfdf3f" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.590549 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81c0931c-8919-4128-97bb-21c5872d5cf0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "81c0931c-8919-4128-97bb-21c5872d5cf0" (UID: "81c0931c-8919-4128-97bb-21c5872d5cf0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.591277 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37ea0b7c-fb6b-4802-9fff-4e3995c0ed14","Type":"ContainerDied","Data":"5df8b89b368ce2bb3715c01ac36fd1cc19440e1747577a5d2b98f0687ec1ba5c"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.591413 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.592350 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03af271e-0af4-4681-a7b6-31b207d21143-kube-api-access-d2rnm" (OuterVolumeSpecName: "kube-api-access-d2rnm") pod "03af271e-0af4-4681-a7b6-31b207d21143" (UID: "03af271e-0af4-4681-a7b6-31b207d21143"). InnerVolumeSpecName "kube-api-access-d2rnm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.593083 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceca985e-bec2-42fe-9758-edad828586c2-kube-api-access-nm594" (OuterVolumeSpecName: "kube-api-access-nm594") pod "ceca985e-bec2-42fe-9758-edad828586c2" (UID: "ceca985e-bec2-42fe-9758-edad828586c2"). InnerVolumeSpecName "kube-api-access-nm594". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.593298 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-scripts" (OuterVolumeSpecName: "scripts") pod "30e8c723-a7a0-4697-8369-bd224fcfdf3f" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.600235 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceca985e-bec2-42fe-9758-edad828586c2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ceca985e-bec2-42fe-9758-edad828586c2" (UID: "ceca985e-bec2-42fe-9758-edad828586c2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.613315 4958 scope.go:117] "RemoveContainer" containerID="cfa8bb4cfda0cf43fd73a76f724816ad6f496fee99ffeab64a684c098a360ffa"
Dec 01 10:26:37 crc kubenswrapper[4958]: E1201 10:26:37.614676 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfa8bb4cfda0cf43fd73a76f724816ad6f496fee99ffeab64a684c098a360ffa\": container with ID starting with cfa8bb4cfda0cf43fd73a76f724816ad6f496fee99ffeab64a684c098a360ffa not found: ID does not exist" containerID="cfa8bb4cfda0cf43fd73a76f724816ad6f496fee99ffeab64a684c098a360ffa"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.614763 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa8bb4cfda0cf43fd73a76f724816ad6f496fee99ffeab64a684c098a360ffa"} err="failed to get container status \"cfa8bb4cfda0cf43fd73a76f724816ad6f496fee99ffeab64a684c098a360ffa\": rpc error: code = NotFound desc = could not find container \"cfa8bb4cfda0cf43fd73a76f724816ad6f496fee99ffeab64a684c098a360ffa\": container with ID starting with cfa8bb4cfda0cf43fd73a76f724816ad6f496fee99ffeab64a684c098a360ffa not found: ID does not exist"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.614799 4958 scope.go:117] "RemoveContainer" containerID="2f99a0981f6f8c18ad3b38420c231b904981b9a55694320467486987165f628d"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.618995 4958 generic.go:334] "Generic (PLEG): container finished" podID="03af271e-0af4-4681-a7b6-31b207d21143" containerID="4703ea3b8368f81ab5256cdde59bdad7b14ea5d5b61cedec5360686c336d5bb4" exitCode=0
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.619121 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"03af271e-0af4-4681-a7b6-31b207d21143","Type":"ContainerDied","Data":"4703ea3b8368f81ab5256cdde59bdad7b14ea5d5b61cedec5360686c336d5bb4"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.619146 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.619163 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"03af271e-0af4-4681-a7b6-31b207d21143","Type":"ContainerDied","Data":"b4104edfeb976d45810f49a7db34aabba86b48b811bd03a90c03d79c8d64492c"}
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.624517 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e8c723-a7a0-4697-8369-bd224fcfdf3f-kube-api-access-7t556" (OuterVolumeSpecName: "kube-api-access-7t556") pod "30e8c723-a7a0-4697-8369-bd224fcfdf3f" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f"). InnerVolumeSpecName "kube-api-access-7t556". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.624797 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-kube-api-access-2qhrx" (OuterVolumeSpecName: "kube-api-access-2qhrx") pod "8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6" (UID: "8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6"). InnerVolumeSpecName "kube-api-access-2qhrx". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:37 crc kubenswrapper[4958]: E1201 10:26:37.634611 4958 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 01 10:26:37 crc kubenswrapper[4958]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-12-01T10:26:30Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 01 10:26:37 crc kubenswrapper[4958]: /etc/init.d/functions: line 589: 421 Alarm clock "$@" Dec 01 10:26:37 crc kubenswrapper[4958]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-d9qck" message=< Dec 01 10:26:37 crc kubenswrapper[4958]: Exiting ovn-controller (1) [FAILED] Dec 01 10:26:37 crc kubenswrapper[4958]: Killing ovn-controller (1) [ OK ] Dec 01 10:26:37 crc kubenswrapper[4958]: Killing ovn-controller (1) with SIGKILL [ OK ] Dec 01 10:26:37 crc kubenswrapper[4958]: 2025-12-01T10:26:30Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 01 10:26:37 crc kubenswrapper[4958]: /etc/init.d/functions: line 589: 421 Alarm clock "$@" Dec 01 10:26:37 crc kubenswrapper[4958]: > Dec 01 10:26:37 crc kubenswrapper[4958]: E1201 10:26:37.634686 4958 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 01 10:26:37 crc kubenswrapper[4958]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-12-01T10:26:30Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 01 10:26:37 crc kubenswrapper[4958]: /etc/init.d/functions: line 589: 421 Alarm clock "$@" Dec 01 10:26:37 crc kubenswrapper[4958]: > pod="openstack/ovn-controller-d9qck" podUID="d13e880d-3817-4df9-8477-82349d7979b9" containerName="ovn-controller" containerID="cri-o://fa4030b1ae32540c59bee557a9b70c49c21c0ef4ba7c627b01f2454936ca8d86" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.634733 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-d9qck" podUID="d13e880d-3817-4df9-8477-82349d7979b9" containerName="ovn-controller" containerID="cri-o://fa4030b1ae32540c59bee557a9b70c49c21c0ef4ba7c627b01f2454936ca8d86" gracePeriod=21 Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.638115 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.638131 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79797aa2-db7b-429a-91d9-3e181de3976c","Type":"ContainerDied","Data":"638a7d1084fba5123fe154909b77078a77a1032d920622d2ef416d3afc8eb68d"} Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.653010 4958 generic.go:334] "Generic (PLEG): container finished" podID="e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" containerID="f28e0a79ecd1735ef6ab0d156677914d027c13581eb5f43efb38a35773f7c70d" exitCode=0 Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.653095 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0","Type":"ContainerDied","Data":"f28e0a79ecd1735ef6ab0d156677914d027c13581eb5f43efb38a35773f7c70d"} Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.653238 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-744fbbd578-7c5pc" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.665876 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qhrx\" (UniqueName: \"kubernetes.io/projected/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-kube-api-access-2qhrx\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.665909 4958 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81c0931c-8919-4128-97bb-21c5872d5cf0-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.665921 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ceca985e-bec2-42fe-9758-edad828586c2-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.665955 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t556\" (UniqueName: \"kubernetes.io/projected/30e8c723-a7a0-4697-8369-bd224fcfdf3f-kube-api-access-7t556\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.665966 4958 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30e8c723-a7a0-4697-8369-bd224fcfdf3f-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.665978 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.665987 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2rnm\" (UniqueName: \"kubernetes.io/projected/03af271e-0af4-4681-a7b6-31b207d21143-kube-api-access-d2rnm\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.665997 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30e8c723-a7a0-4697-8369-bd224fcfdf3f-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.666043 4958 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.666055 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm594\" (UniqueName: \"kubernetes.io/projected/ceca985e-bec2-42fe-9758-edad828586c2-kube-api-access-nm594\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.666065 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81c0931c-8919-4128-97bb-21c5872d5cf0-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.666075 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03af271e-0af4-4681-a7b6-31b207d21143-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.666086 4958 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03af271e-0af4-4681-a7b6-31b207d21143-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 
10:26:37.666115 4958 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ceca985e-bec2-42fe-9758-edad828586c2-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.682863 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b727a711-6b0b-44c6-917a-602f10dd0d6c" (UID: "b727a711-6b0b-44c6-917a-602f10dd0d6c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.708390 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.722830 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81c0931c-8919-4128-97bb-21c5872d5cf0-kube-api-access-jbfls" (OuterVolumeSpecName: "kube-api-access-jbfls") pod "81c0931c-8919-4128-97bb-21c5872d5cf0" (UID: "81c0931c-8919-4128-97bb-21c5872d5cf0"). InnerVolumeSpecName "kube-api-access-jbfls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.725913 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.871723 4958 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.871807 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbfls\" (UniqueName: \"kubernetes.io/projected/81c0931c-8919-4128-97bb-21c5872d5cf0-kube-api-access-jbfls\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: E1201 10:26:37.883710 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ae78c274c3294e9773e57bf2eec9477b3e77f5156d4259b5f42c1b6a1c6dd6a is running failed: container process not found" containerID="4ae78c274c3294e9773e57bf2eec9477b3e77f5156d4259b5f42c1b6a1c6dd6a" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.884617 4958 scope.go:117] "RemoveContainer" containerID="7f2380f1568f2f9c175fd6ffac5bc9a4006ac5eca2dd568b473ca89bfac956f6" Dec 01 10:26:37 crc kubenswrapper[4958]: E1201 10:26:37.886430 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ae78c274c3294e9773e57bf2eec9477b3e77f5156d4259b5f42c1b6a1c6dd6a is running failed: container process not found" containerID="4ae78c274c3294e9773e57bf2eec9477b3e77f5156d4259b5f42c1b6a1c6dd6a" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.888494 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-config-data" (OuterVolumeSpecName: "config-data") pod "8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6" (UID: "8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:37 crc kubenswrapper[4958]: E1201 10:26:37.893383 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ae78c274c3294e9773e57bf2eec9477b3e77f5156d4259b5f42c1b6a1c6dd6a is running failed: container process not found" containerID="4ae78c274c3294e9773e57bf2eec9477b3e77f5156d4259b5f42c1b6a1c6dd6a" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 01 10:26:37 crc kubenswrapper[4958]: E1201 10:26:37.893503 4958 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ae78c274c3294e9773e57bf2eec9477b3e77f5156d4259b5f42c1b6a1c6dd6a is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" containerName="ovn-northd" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.893749 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0353bee2-4033-4493-9217-b5c4600d3d90" path="/var/lib/kubelet/pods/0353bee2-4033-4493-9217-b5c4600d3d90/volumes" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.896101 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="137d864e-34d9-452c-91b0-179a93198b0f" path="/var/lib/kubelet/pods/137d864e-34d9-452c-91b0-179a93198b0f/volumes" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.896882 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="207148af-4b76-49b6-80cc-883ec14bb268" path="/var/lib/kubelet/pods/207148af-4b76-49b6-80cc-883ec14bb268/volumes" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.898878 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34129d60-4706-41fe-aa19-f2a13f38713a" path="/var/lib/kubelet/pods/34129d60-4706-41fe-aa19-f2a13f38713a/volumes" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.900224 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" path="/var/lib/kubelet/pods/37ea0b7c-fb6b-4802-9fff-4e3995c0ed14/volumes" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.902395 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66469344-8c32-45d9-afc4-91dcb9dbe807" path="/var/lib/kubelet/pods/66469344-8c32-45d9-afc4-91dcb9dbe807/volumes" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.903467 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4ce268c-ffb9-4eae-93e5-23a10ba96185" path="/var/lib/kubelet/pods/a4ce268c-ffb9-4eae-93e5-23a10ba96185/volumes" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.904119 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d93b36ff-312b-40e1-9e3a-b1981800da66" path="/var/lib/kubelet/pods/d93b36ff-312b-40e1-9e3a-b1981800da66/volumes" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.905394 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f20641f2-49c0-4492-8f1f-20b14a1f3bd3" path="/var/lib/kubelet/pods/f20641f2-49c0-4492-8f1f-20b14a1f3bd3/volumes" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.908128 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81c0931c-8919-4128-97bb-21c5872d5cf0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81c0931c-8919-4128-97bb-21c5872d5cf0" (UID: "81c0931c-8919-4128-97bb-21c5872d5cf0"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.908416 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30e8c723-a7a0-4697-8369-bd224fcfdf3f" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.916040 4958 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.925410 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "79797aa2-db7b-429a-91d9-3e181de3976c" (UID: "79797aa2-db7b-429a-91d9-3e181de3976c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.944110 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113fc18e-8eb6-45a5-9625-1404f4d832c4-config-data" (OuterVolumeSpecName: "config-data") pod "113fc18e-8eb6-45a5-9625-1404f4d832c4" (UID: "113fc18e-8eb6-45a5-9625-1404f4d832c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.944320 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113fc18e-8eb6-45a5-9625-1404f4d832c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "113fc18e-8eb6-45a5-9625-1404f4d832c4" (UID: "113fc18e-8eb6-45a5-9625-1404f4d832c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.945930 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa361886-e7eb-413c-a4b4-cedaf1c86983" (UID: "aa361886-e7eb-413c-a4b4-cedaf1c86983"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.947744 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6" (UID: "8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.951409 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceca985e-bec2-42fe-9758-edad828586c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ceca985e-bec2-42fe-9758-edad828586c2" (UID: "ceca985e-bec2-42fe-9758-edad828586c2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.964048 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-config-data" (OuterVolumeSpecName: "config-data") pod "30e8c723-a7a0-4697-8369-bd224fcfdf3f" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.983295 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03af271e-0af4-4681-a7b6-31b207d21143-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03af271e-0af4-4681-a7b6-31b207d21143" (UID: "03af271e-0af4-4681-a7b6-31b207d21143"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.985317 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b727a711-6b0b-44c6-917a-602f10dd0d6c" (UID: "b727a711-6b0b-44c6-917a-602f10dd0d6c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.992598 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "79797aa2-db7b-429a-91d9-3e181de3976c" (UID: "79797aa2-db7b-429a-91d9-3e181de3976c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.992992 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "30e8c723-a7a0-4697-8369-bd224fcfdf3f" (UID: "30e8c723-a7a0-4697-8369-bd224fcfdf3f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.994930 4958 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.995019 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.995047 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/113fc18e-8eb6-45a5-9625-1404f4d832c4-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.995061 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceca985e-bec2-42fe-9758-edad828586c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.995074 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.995086 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c0931c-8919-4128-97bb-21c5872d5cf0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.995102 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.995113 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.995125 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/113fc18e-8eb6-45a5-9625-1404f4d832c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.995144 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:37 crc kubenswrapper[4958]: I1201 10:26:37.995155 4958 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.034518 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-config-data" (OuterVolumeSpecName: "config-data") pod "aa361886-e7eb-413c-a4b4-cedaf1c86983" (UID: "aa361886-e7eb-413c-a4b4-cedaf1c86983"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.057987 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81c0931c-8919-4128-97bb-21c5872d5cf0-config-data" (OuterVolumeSpecName: "config-data") pod "81c0931c-8919-4128-97bb-21c5872d5cf0" (UID: "81c0931c-8919-4128-97bb-21c5872d5cf0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.061477 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03af271e-0af4-4681-a7b6-31b207d21143-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "03af271e-0af4-4681-a7b6-31b207d21143" (UID: "03af271e-0af4-4681-a7b6-31b207d21143"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.061663 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceca985e-bec2-42fe-9758-edad828586c2-config-data" (OuterVolumeSpecName: "config-data") pod "ceca985e-bec2-42fe-9758-edad828586c2" (UID: "ceca985e-bec2-42fe-9758-edad828586c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.097561 4958 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79797aa2-db7b-429a-91d9-3e181de3976c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.097610 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c0931c-8919-4128-97bb-21c5872d5cf0-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.097622 4958 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/03af271e-0af4-4681-a7b6-31b207d21143-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.097636 4958 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b727a711-6b0b-44c6-917a-602f10dd0d6c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.097649 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceca985e-bec2-42fe-9758-edad828586c2-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.097665 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03af271e-0af4-4681-a7b6-31b207d21143-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.097682 4958 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e8c723-a7a0-4697-8369-bd224fcfdf3f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.097693 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa361886-e7eb-413c-a4b4-cedaf1c86983-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 
10:26:38.266310 4958 scope.go:117] "RemoveContainer" containerID="973323edcddc9628e51fba2a4a4b7496b7b376f89fbb0037cf0dafdcf8686936" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.276193 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.420743 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_69ae2c66-9d6d-4bc0-b3fe-ee729225e85f/ovn-northd/0.log" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.420911 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.422770 4958 scope.go:117] "RemoveContainer" containerID="beb72f83d33471d431dae1e5fec0a30b928583fb28f59468f8008cfe43fce3bb" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.452629 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-plugins-conf\") pod \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.452706 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.452833 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-server-conf\") pod \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.452967 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-confd\") pod \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.453025 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-plugins\") pod \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.453304 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-erlang-cookie\") pod \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.453416 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-pod-info\") pod \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.453492 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-tls\") pod \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.453538 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw5mk\" (UniqueName: \"kubernetes.io/projected/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-kube-api-access-jw5mk\") pod \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.453578 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-erlang-cookie-secret\") pod \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.453624 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-config-data\") pod \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\" (UID: \"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.458864 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" (UID: "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.460664 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" (UID: "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.472987 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-kube-api-access-jw5mk" (OuterVolumeSpecName: "kube-api-access-jw5mk") pod "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" (UID: "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0"). InnerVolumeSpecName "kube-api-access-jw5mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.475704 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" (UID: "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.477123 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-pod-info" (OuterVolumeSpecName: "pod-info") pod "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" (UID: "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.479143 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" (UID: "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.458328 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" (UID: "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.483438 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" (UID: "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.494350 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-config-data" (OuterVolumeSpecName: "config-data") pod "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" (UID: "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.555217 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-ovn-northd-tls-certs\") pod \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.556520 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-scripts\") pod \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.556559 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-config\") pod \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.556643 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-ovn-rundir\") pod \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.556714 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-metrics-certs-tls-certs\") pod \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\" (UID: 
\"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.556791 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-combined-ca-bundle\") pod \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.556880 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6df4\" (UniqueName: \"kubernetes.io/projected/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-kube-api-access-v6df4\") pod \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\" (UID: \"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.557486 4958 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.557507 4958 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.557497 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" (UID: "69ae2c66-9d6d-4bc0-b3fe-ee729225e85f"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.557521 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw5mk\" (UniqueName: \"kubernetes.io/projected/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-kube-api-access-jw5mk\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.557593 4958 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.557607 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.557618 4958 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.557650 4958 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.557661 4958 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.557673 4958 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.560662 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-scripts" (OuterVolumeSpecName: "scripts") pod "69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" (UID: "69ae2c66-9d6d-4bc0-b3fe-ee729225e85f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.560906 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-config" (OuterVolumeSpecName: "config") pod "69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" (UID: "69ae2c66-9d6d-4bc0-b3fe-ee729225e85f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.571630 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-kube-api-access-v6df4" (OuterVolumeSpecName: "kube-api-access-v6df4") pod "69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" (UID: "69ae2c66-9d6d-4bc0-b3fe-ee729225e85f"). InnerVolumeSpecName "kube-api-access-v6df4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.574292 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-server-conf" (OuterVolumeSpecName: "server-conf") pod "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" (UID: "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.592123 4958 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.602941 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" (UID: "e19ffea8-2e96-4cff-a2ec-40646aaa4cc0"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.614345 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" (UID: "69ae2c66-9d6d-4bc0-b3fe-ee729225e85f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.659792 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.660270 4958 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.660286 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6df4\" (UniqueName: \"kubernetes.io/projected/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-kube-api-access-v6df4\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.660303 4958 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-server-conf\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.660318 4958 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.660333 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.660351 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.660365 4958 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.673302 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" (UID: "69ae2c66-9d6d-4bc0-b3fe-ee729225e85f"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.675658 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.676081 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.677493 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-d9qck_d13e880d-3817-4df9-8477-82349d7979b9/ovn-controller/0.log" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.677673 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-d9qck" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.687194 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.707870 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd"] Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.707948 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-6ccf6d66f4-d9wbd"] Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.716000 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.738472 4958 scope.go:117] "RemoveContainer" containerID="d6bf9c119f4304f0d9ea18db7018ff4e63d2085fdfbdf9fc47deea0cf585ac96" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.746617 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.754810 4958 generic.go:334] "Generic (PLEG): container finished" podID="46e6b589-937c-42c8-8004-49e39813d622" containerID="2ad714bd2234df6b38c1a86549393ee27d29b1dd419f67e7614dbc60f77966dc" exitCode=0 Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.754890 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.754919 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"46e6b589-937c-42c8-8004-49e39813d622","Type":"ContainerDied","Data":"2ad714bd2234df6b38c1a86549393ee27d29b1dd419f67e7614dbc60f77966dc"} Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.755719 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"46e6b589-937c-42c8-8004-49e39813d622","Type":"ContainerDied","Data":"759868a2b94ce22deec852fe7f7bb39d12c949b87b6fcbb91e5bc25c27c75d38"} Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.759339 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.761238 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/46e6b589-937c-42c8-8004-49e39813d622-secrets\") pod \"46e6b589-937c-42c8-8004-49e39813d622\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.761417 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"46e6b589-937c-42c8-8004-49e39813d622\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.761563 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d13e880d-3817-4df9-8477-82349d7979b9-var-log-ovn\") pod \"d13e880d-3817-4df9-8477-82349d7979b9\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.761675 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/46e6b589-937c-42c8-8004-49e39813d622-config-data-generated\") pod \"46e6b589-937c-42c8-8004-49e39813d622\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.761837 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75px8\" (UniqueName: \"kubernetes.io/projected/46e6b589-937c-42c8-8004-49e39813d622-kube-api-access-75px8\") pod \"46e6b589-937c-42c8-8004-49e39813d622\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.761940 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d13e880d-3817-4df9-8477-82349d7979b9-ovn-controller-tls-certs\") pod \"d13e880d-3817-4df9-8477-82349d7979b9\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.762030 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46e6b589-937c-42c8-8004-49e39813d622-operator-scripts\") pod \"46e6b589-937c-42c8-8004-49e39813d622\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.762150 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d13e880d-3817-4df9-8477-82349d7979b9-scripts\") pod \"d13e880d-3817-4df9-8477-82349d7979b9\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.762257 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13e880d-3817-4df9-8477-82349d7979b9-combined-ca-bundle\") pod \"d13e880d-3817-4df9-8477-82349d7979b9\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.762398 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d13e880d-3817-4df9-8477-82349d7979b9-var-run\") pod \"d13e880d-3817-4df9-8477-82349d7979b9\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.762644 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e6b589-937c-42c8-8004-49e39813d622-combined-ca-bundle\") pod \"46e6b589-937c-42c8-8004-49e39813d622\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.762859 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/46e6b589-937c-42c8-8004-49e39813d622-galera-tls-certs\") pod \"46e6b589-937c-42c8-8004-49e39813d622\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.762978 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d13e880d-3817-4df9-8477-82349d7979b9-var-run-ovn\") pod \"d13e880d-3817-4df9-8477-82349d7979b9\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.761958 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" (UID: "69ae2c66-9d6d-4bc0-b3fe-ee729225e85f"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.762182 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d13e880d-3817-4df9-8477-82349d7979b9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d13e880d-3817-4df9-8477-82349d7979b9" (UID: "d13e880d-3817-4df9-8477-82349d7979b9"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.763332 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/46e6b589-937c-42c8-8004-49e39813d622-kolla-config\") pod \"46e6b589-937c-42c8-8004-49e39813d622\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.763461 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4677l\" (UniqueName: \"kubernetes.io/projected/d13e880d-3817-4df9-8477-82349d7979b9-kube-api-access-4677l\") pod \"d13e880d-3817-4df9-8477-82349d7979b9\" (UID: \"d13e880d-3817-4df9-8477-82349d7979b9\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.763662 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/46e6b589-937c-42c8-8004-49e39813d622-config-data-default\") pod \"46e6b589-937c-42c8-8004-49e39813d622\" (UID: \"46e6b589-937c-42c8-8004-49e39813d622\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.764810 4958 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d13e880d-3817-4df9-8477-82349d7979b9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.764973 4958 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.765321 4958 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.771779 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46e6b589-937c-42c8-8004-49e39813d622-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "46e6b589-937c-42c8-8004-49e39813d622" (UID: "46e6b589-937c-42c8-8004-49e39813d622"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.772670 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46e6b589-937c-42c8-8004-49e39813d622-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "46e6b589-937c-42c8-8004-49e39813d622" (UID: "46e6b589-937c-42c8-8004-49e39813d622"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.773338 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e6b589-937c-42c8-8004-49e39813d622-secrets" (OuterVolumeSpecName: "secrets") pod "46e6b589-937c-42c8-8004-49e39813d622" (UID: "46e6b589-937c-42c8-8004-49e39813d622"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.774020 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d13e880d-3817-4df9-8477-82349d7979b9-var-run" (OuterVolumeSpecName: "var-run") pod "d13e880d-3817-4df9-8477-82349d7979b9" (UID: "d13e880d-3817-4df9-8477-82349d7979b9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.774077 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d13e880d-3817-4df9-8477-82349d7979b9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d13e880d-3817-4df9-8477-82349d7979b9" (UID: "d13e880d-3817-4df9-8477-82349d7979b9"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.775101 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-d9qck_d13e880d-3817-4df9-8477-82349d7979b9/ovn-controller/0.log" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.775141 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46e6b589-937c-42c8-8004-49e39813d622-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "46e6b589-937c-42c8-8004-49e39813d622" (UID: "46e6b589-937c-42c8-8004-49e39813d622"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.775169 4958 generic.go:334] "Generic (PLEG): container finished" podID="d13e880d-3817-4df9-8477-82349d7979b9" containerID="fa4030b1ae32540c59bee557a9b70c49c21c0ef4ba7c627b01f2454936ca8d86" exitCode=137 Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.775289 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-d9qck" event={"ID":"d13e880d-3817-4df9-8477-82349d7979b9","Type":"ContainerDied","Data":"fa4030b1ae32540c59bee557a9b70c49c21c0ef4ba7c627b01f2454936ca8d86"} Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.775350 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-d9qck" event={"ID":"d13e880d-3817-4df9-8477-82349d7979b9","Type":"ContainerDied","Data":"6a491b43154cbdc222a2f41fbd71eb15ca75f7155e91f5ceae0b2424285a1200"} Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.775450 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-d9qck" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.775629 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46e6b589-937c-42c8-8004-49e39813d622-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46e6b589-937c-42c8-8004-49e39813d622" (UID: "46e6b589-937c-42c8-8004-49e39813d622"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.776568 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d13e880d-3817-4df9-8477-82349d7979b9-scripts" (OuterVolumeSpecName: "scripts") pod "d13e880d-3817-4df9-8477-82349d7979b9" (UID: "d13e880d-3817-4df9-8477-82349d7979b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.776824 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.780651 4958 generic.go:334] "Generic (PLEG): container finished" podID="4fccd607-3bfb-4593-a6de-6a0fc52b34ea" containerID="79456c806f5f410ac0967e765960f7c80fe55e590cd1656e2879adefde292303" exitCode=0 Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.780726 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4fccd607-3bfb-4593-a6de-6a0fc52b34ea","Type":"ContainerDied","Data":"79456c806f5f410ac0967e765960f7c80fe55e590cd1656e2879adefde292303"} Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.781697 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "46e6b589-937c-42c8-8004-49e39813d622" (UID: "46e6b589-937c-42c8-8004-49e39813d622"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.783393 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46e6b589-937c-42c8-8004-49e39813d622-kube-api-access-75px8" (OuterVolumeSpecName: "kube-api-access-75px8") pod "46e6b589-937c-42c8-8004-49e39813d622" (UID: "46e6b589-937c-42c8-8004-49e39813d622"). InnerVolumeSpecName "kube-api-access-75px8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.784628 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13e880d-3817-4df9-8477-82349d7979b9-kube-api-access-4677l" (OuterVolumeSpecName: "kube-api-access-4677l") pod "d13e880d-3817-4df9-8477-82349d7979b9" (UID: "d13e880d-3817-4df9-8477-82349d7979b9"). InnerVolumeSpecName "kube-api-access-4677l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.794982 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.795112 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e19ffea8-2e96-4cff-a2ec-40646aaa4cc0","Type":"ContainerDied","Data":"cb063671ef5138a217b59e52fc226022685432b2509b16829b1a0976c6491c51"} Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.795273 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.806276 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d13e880d-3817-4df9-8477-82349d7979b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d13e880d-3817-4df9-8477-82349d7979b9" (UID: "d13e880d-3817-4df9-8477-82349d7979b9"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.812912 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_69ae2c66-9d6d-4bc0-b3fe-ee729225e85f/ovn-northd/0.log" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.813142 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"69ae2c66-9d6d-4bc0-b3fe-ee729225e85f","Type":"ContainerDied","Data":"9bd9d65cd1a5ac8d4faaa84e09d80739d7830091d82598bf1dba2eee246b3ce6"} Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.813361 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.820099 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5f6877488-9nzb2" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.820329 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f6877488-9nzb2" event={"ID":"caf6f35f-9e82-4899-801f-3b5e94189f6d","Type":"ContainerDied","Data":"f2be5b060fb4b1a90fb8d93662e78d565cae6c9dc122f1a9fe9e8cac346ac6b9"} Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.819764 4958 generic.go:334] "Generic (PLEG): container finished" podID="caf6f35f-9e82-4899-801f-3b5e94189f6d" containerID="f2be5b060fb4b1a90fb8d93662e78d565cae6c9dc122f1a9fe9e8cac346ac6b9" exitCode=0 Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.831288 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f6877488-9nzb2" event={"ID":"caf6f35f-9e82-4899-801f-3b5e94189f6d","Type":"ContainerDied","Data":"b3d19fa61514911c622a80f5db474f7b28764084c4fc66e35c69fb5f73b5a004"} Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.845086 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.848080 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-85fc6f9f59-gdn47" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.867730 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-config-data\") pod \"caf6f35f-9e82-4899-801f-3b5e94189f6d\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.867860 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-internal-tls-certs\") pod \"caf6f35f-9e82-4899-801f-3b5e94189f6d\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.867921 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-public-tls-certs\") pod \"caf6f35f-9e82-4899-801f-3b5e94189f6d\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.868079 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-fernet-keys\") pod \"caf6f35f-9e82-4899-801f-3b5e94189f6d\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.868249 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-scripts\") pod \"caf6f35f-9e82-4899-801f-3b5e94189f6d\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.868344 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b5vj\" (UniqueName: \"kubernetes.io/projected/caf6f35f-9e82-4899-801f-3b5e94189f6d-kube-api-access-6b5vj\") pod \"caf6f35f-9e82-4899-801f-3b5e94189f6d\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.868518 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-credential-keys\") pod \"caf6f35f-9e82-4899-801f-3b5e94189f6d\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.868605 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-combined-ca-bundle\") pod \"caf6f35f-9e82-4899-801f-3b5e94189f6d\" (UID: \"caf6f35f-9e82-4899-801f-3b5e94189f6d\") " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.869424 4958 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/46e6b589-937c-42c8-8004-49e39813d622-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.869463 4958 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.869477 4958 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" 
(UniqueName: \"kubernetes.io/empty-dir/46e6b589-937c-42c8-8004-49e39813d622-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.869488 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75px8\" (UniqueName: \"kubernetes.io/projected/46e6b589-937c-42c8-8004-49e39813d622-kube-api-access-75px8\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.869499 4958 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46e6b589-937c-42c8-8004-49e39813d622-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.869512 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d13e880d-3817-4df9-8477-82349d7979b9-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.869522 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13e880d-3817-4df9-8477-82349d7979b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.869531 4958 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d13e880d-3817-4df9-8477-82349d7979b9-var-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.869540 4958 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d13e880d-3817-4df9-8477-82349d7979b9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.869555 4958 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/46e6b589-937c-42c8-8004-49e39813d622-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.869576 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4677l\" (UniqueName: \"kubernetes.io/projected/d13e880d-3817-4df9-8477-82349d7979b9-kube-api-access-4677l\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.869588 4958 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/46e6b589-937c-42c8-8004-49e39813d622-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.880222 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "caf6f35f-9e82-4899-801f-3b5e94189f6d" (UID: "caf6f35f-9e82-4899-801f-3b5e94189f6d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.883425 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-scripts" (OuterVolumeSpecName: "scripts") pod "caf6f35f-9e82-4899-801f-3b5e94189f6d" (UID: "caf6f35f-9e82-4899-801f-3b5e94189f6d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.887279 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.896536 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "caf6f35f-9e82-4899-801f-3b5e94189f6d" (UID: "caf6f35f-9e82-4899-801f-3b5e94189f6d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.925771 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e6b589-937c-42c8-8004-49e39813d622-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46e6b589-937c-42c8-8004-49e39813d622" (UID: "46e6b589-937c-42c8-8004-49e39813d622"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.930828 4958 scope.go:117] "RemoveContainer" containerID="beb72f83d33471d431dae1e5fec0a30b928583fb28f59468f8008cfe43fce3bb" Dec 01 10:26:38 crc kubenswrapper[4958]: E1201 10:26:38.932215 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beb72f83d33471d431dae1e5fec0a30b928583fb28f59468f8008cfe43fce3bb\": container with ID starting with beb72f83d33471d431dae1e5fec0a30b928583fb28f59468f8008cfe43fce3bb not found: ID does not exist" containerID="beb72f83d33471d431dae1e5fec0a30b928583fb28f59468f8008cfe43fce3bb" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.932282 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb72f83d33471d431dae1e5fec0a30b928583fb28f59468f8008cfe43fce3bb"} err="failed to get container status \"beb72f83d33471d431dae1e5fec0a30b928583fb28f59468f8008cfe43fce3bb\": rpc error: code = NotFound desc = could not find container \"beb72f83d33471d431dae1e5fec0a30b928583fb28f59468f8008cfe43fce3bb\": container with ID starting with beb72f83d33471d431dae1e5fec0a30b928583fb28f59468f8008cfe43fce3bb not found: ID does not exist" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.932319 4958 scope.go:117] "RemoveContainer" containerID="d6bf9c119f4304f0d9ea18db7018ff4e63d2085fdfbdf9fc47deea0cf585ac96" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.934451 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-config-data" (OuterVolumeSpecName: "config-data") pod "caf6f35f-9e82-4899-801f-3b5e94189f6d" (UID: "caf6f35f-9e82-4899-801f-3b5e94189f6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.937203 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caf6f35f-9e82-4899-801f-3b5e94189f6d-kube-api-access-6b5vj" (OuterVolumeSpecName: "kube-api-access-6b5vj") pod "caf6f35f-9e82-4899-801f-3b5e94189f6d" (UID: "caf6f35f-9e82-4899-801f-3b5e94189f6d"). InnerVolumeSpecName "kube-api-access-6b5vj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.938295 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e6b589-937c-42c8-8004-49e39813d622-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "46e6b589-937c-42c8-8004-49e39813d622" (UID: "46e6b589-937c-42c8-8004-49e39813d622"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: E1201 10:26:38.942136 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6bf9c119f4304f0d9ea18db7018ff4e63d2085fdfbdf9fc47deea0cf585ac96\": container with ID starting with d6bf9c119f4304f0d9ea18db7018ff4e63d2085fdfbdf9fc47deea0cf585ac96 not found: ID does not exist" containerID="d6bf9c119f4304f0d9ea18db7018ff4e63d2085fdfbdf9fc47deea0cf585ac96" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.942202 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6bf9c119f4304f0d9ea18db7018ff4e63d2085fdfbdf9fc47deea0cf585ac96"} err="failed to get container status \"d6bf9c119f4304f0d9ea18db7018ff4e63d2085fdfbdf9fc47deea0cf585ac96\": rpc error: code = NotFound desc = could not find container \"d6bf9c119f4304f0d9ea18db7018ff4e63d2085fdfbdf9fc47deea0cf585ac96\": container with ID starting with d6bf9c119f4304f0d9ea18db7018ff4e63d2085fdfbdf9fc47deea0cf585ac96 not found: ID does not exist" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.942235 4958 scope.go:117] "RemoveContainer" containerID="9144cb56a423613b84457fe40b192f906f9bb58de4d5fb7b219e438f9744439a" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.947324 4958 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.971260 4958 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.971303 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.971316 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b5vj\" (UniqueName: \"kubernetes.io/projected/caf6f35f-9e82-4899-801f-3b5e94189f6d-kube-api-access-6b5vj\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.971330 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46e6b589-937c-42c8-8004-49e39813d622-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.971347 4958 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/46e6b589-937c-42c8-8004-49e39813d622-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.971358 4958 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-credential-keys\") on node \"crc\" 
DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.971369 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.971380 4958 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.981536 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d13e880d-3817-4df9-8477-82349d7979b9-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "d13e880d-3817-4df9-8477-82349d7979b9" (UID: "d13e880d-3817-4df9-8477-82349d7979b9"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.983190 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "caf6f35f-9e82-4899-801f-3b5e94189f6d" (UID: "caf6f35f-9e82-4899-801f-3b5e94189f6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:38 crc kubenswrapper[4958]: I1201 10:26:38.988692 4958 scope.go:117] "RemoveContainer" containerID="5df0d7f8f330d79dac217b44ff6f358c9415547f0327ca2889719598f61e6c85" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.016636 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-85fc6f9f59-gdn47"] Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.029722 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.040554 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-85fc6f9f59-gdn47"] Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.042127 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "caf6f35f-9e82-4899-801f-3b5e94189f6d" (UID: "caf6f35f-9e82-4899-801f-3b5e94189f6d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.045617 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "caf6f35f-9e82-4899-801f-3b5e94189f6d" (UID: "caf6f35f-9e82-4899-801f-3b5e94189f6d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.058886 4958 scope.go:117] "RemoveContainer" containerID="41cd43dae598fc0fb9dde4e0161ce7d640d805585cd73ef9e8521ddf5b485506" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.064044 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.073107 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.074367 4958 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.074380 4958 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf6f35f-9e82-4899-801f-3b5e94189f6d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.074396 4958 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d13e880d-3817-4df9-8477-82349d7979b9-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.098409 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.116242 4958 scope.go:117] "RemoveContainer" containerID="9edf6c1c15c8c61a73a71e42e83b06f5d03b0b267c8f2a39955fe70dd308453e" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.119143 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 10:26:39 crc kubenswrapper[4958]: E1201 10:26:39.145763 4958 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa361886_e7eb_413c_a4b4_cedaf1c86983.slice/crio-c852bf51f34c89850c8ffbbf2aea50bb85f5125da81ea0979ef51563645afa6a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa361886_e7eb_413c_a4b4_cedaf1c86983.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8477d4fe_3ea3_4bc3_a5e8_5a1aa6951ca6.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81c0931c_8919_4128_97bb_21c5872d5cf0.slice/crio-e01ee9d2f2724ce03bb863b7e906c8b8b943bc2e3e319467052a9568c176065d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30e8c723_a7a0_4697_8369_bd224fcfdf3f.slice/crio-59f7d5726e3c9d6cd29a48f19e5d46961290d8f324c2aabd710ec135f667e5f3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8477d4fe_3ea3_4bc3_a5e8_5a1aa6951ca6.slice/crio-d882b86fe2faf9de13ad90313437bba40d0c05bb0859e452a8ced38bfb65f447\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03af271e_0af4_4681_a7b6_31b207d21143.slice\": RecentStats: unable to find data in memory 
cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79797aa2_db7b_429a_91d9_3e181de3976c.slice/crio-638a7d1084fba5123fe154909b77078a77a1032d920622d2ef416d3afc8eb68d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb727a711_6b0b_44c6_917a_602f10dd0d6c.slice/crio-8028a8027b503ff44704b40b230744482e49d1075b14eaa6e9f6e26b94442df3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fccd607_3bfb_4593_a6de_6a0fc52b34ea.slice/crio-conmon-79456c806f5f410ac0967e765960f7c80fe55e590cd1656e2879adefde292303.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69ae2c66_9d6d_4bc0_b3fe_ee729225e85f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03af271e_0af4_4681_a7b6_31b207d21143.slice/crio-b4104edfeb976d45810f49a7db34aabba86b48b811bd03a90c03d79c8d64492c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode19ffea8_2e96_4cff_a2ec_40646aaa4cc0.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81c0931c_8919_4128_97bb_21c5872d5cf0.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30e8c723_a7a0_4697_8369_bd224fcfdf3f.slice\": RecentStats: unable to find data in memory cache]" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.153955 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.159249 4958 scope.go:117] "RemoveContainer" containerID="b3241a70bcec5ec3a9a78b2a90bc1e75503cd83921db60624e31f3720317ce5b" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.175250 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-plugins-conf\") pod \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.175320 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-plugins\") pod \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.175367 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-erlang-cookie\") pod \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.175430 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2bbc\" (UniqueName: \"kubernetes.io/projected/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-kube-api-access-g2bbc\") pod \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.175489 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-confd\") pod \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.175577 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-server-conf\") pod \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.175621 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-config-data\") pod \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.175656 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-pod-info\") pod \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.175693 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-tls\") pod \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.175764 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.175806 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-erlang-cookie-secret\") pod \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\" (UID: \"4fccd607-3bfb-4593-a6de-6a0fc52b34ea\") " Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.178372 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4fccd607-3bfb-4593-a6de-6a0fc52b34ea" (UID: "4fccd607-3bfb-4593-a6de-6a0fc52b34ea"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.178936 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4fccd607-3bfb-4593-a6de-6a0fc52b34ea" (UID: "4fccd607-3bfb-4593-a6de-6a0fc52b34ea"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.179546 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4fccd607-3bfb-4593-a6de-6a0fc52b34ea" (UID: "4fccd607-3bfb-4593-a6de-6a0fc52b34ea"). 
InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.181323 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-744fbbd578-7c5pc"] Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.207915 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-744fbbd578-7c5pc"] Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.211782 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "4fccd607-3bfb-4593-a6de-6a0fc52b34ea" (UID: "4fccd607-3bfb-4593-a6de-6a0fc52b34ea"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.212149 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4fccd607-3bfb-4593-a6de-6a0fc52b34ea" (UID: "4fccd607-3bfb-4593-a6de-6a0fc52b34ea"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.212454 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-kube-api-access-g2bbc" (OuterVolumeSpecName: "kube-api-access-g2bbc") pod "4fccd607-3bfb-4593-a6de-6a0fc52b34ea" (UID: "4fccd607-3bfb-4593-a6de-6a0fc52b34ea"). InnerVolumeSpecName "kube-api-access-g2bbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.213156 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-pod-info" (OuterVolumeSpecName: "pod-info") pod "4fccd607-3bfb-4593-a6de-6a0fc52b34ea" (UID: "4fccd607-3bfb-4593-a6de-6a0fc52b34ea"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.215895 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4fccd607-3bfb-4593-a6de-6a0fc52b34ea" (UID: "4fccd607-3bfb-4593-a6de-6a0fc52b34ea"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.216851 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.218170 4958 scope.go:117] "RemoveContainer" containerID="be4dcfae53ebf0f5679008f3ab2b2488e6eec364d8b4f3e7506d7a1e58ea2248" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.226647 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.233124 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-config-data" (OuterVolumeSpecName: "config-data") pod "4fccd607-3bfb-4593-a6de-6a0fc52b34ea" (UID: "4fccd607-3bfb-4593-a6de-6a0fc52b34ea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.234341 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-server-conf" (OuterVolumeSpecName: "server-conf") pod "4fccd607-3bfb-4593-a6de-6a0fc52b34ea" (UID: "4fccd607-3bfb-4593-a6de-6a0fc52b34ea"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.257021 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-d9qck"] Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.270030 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-d9qck"] Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.275867 4958 scope.go:117] "RemoveContainer" containerID="ca5f316669c8cd32825cef769b0e3e1c6eab56cc41d4a4cff5edd8cd75e0ad40" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.278588 4958 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-server-conf\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.278624 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.278637 4958 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.278649 4958 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.278684 4958 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.278699 4958 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.278713 4958 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.278727 4958 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.278741 4958 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.278753 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2bbc\" (UniqueName: 
\"kubernetes.io/projected/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-kube-api-access-g2bbc\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.280031 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5f6877488-9nzb2"] Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.299920 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5f6877488-9nzb2"] Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.300921 4958 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.305786 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.312907 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.318161 4958 scope.go:117] "RemoveContainer" containerID="4703ea3b8368f81ab5256cdde59bdad7b14ea5d5b61cedec5360686c336d5bb4" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.348993 4958 scope.go:117] "RemoveContainer" containerID="4703ea3b8368f81ab5256cdde59bdad7b14ea5d5b61cedec5360686c336d5bb4" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.349465 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4fccd607-3bfb-4593-a6de-6a0fc52b34ea" (UID: "4fccd607-3bfb-4593-a6de-6a0fc52b34ea"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:39 crc kubenswrapper[4958]: E1201 10:26:39.349624 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4703ea3b8368f81ab5256cdde59bdad7b14ea5d5b61cedec5360686c336d5bb4\": container with ID starting with 4703ea3b8368f81ab5256cdde59bdad7b14ea5d5b61cedec5360686c336d5bb4 not found: ID does not exist" containerID="4703ea3b8368f81ab5256cdde59bdad7b14ea5d5b61cedec5360686c336d5bb4" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.349663 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4703ea3b8368f81ab5256cdde59bdad7b14ea5d5b61cedec5360686c336d5bb4"} err="failed to get container status \"4703ea3b8368f81ab5256cdde59bdad7b14ea5d5b61cedec5360686c336d5bb4\": rpc error: code = NotFound desc = could not find container \"4703ea3b8368f81ab5256cdde59bdad7b14ea5d5b61cedec5360686c336d5bb4\": container with ID starting with 4703ea3b8368f81ab5256cdde59bdad7b14ea5d5b61cedec5360686c336d5bb4 not found: ID does not exist" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.349686 4958 scope.go:117] "RemoveContainer" containerID="a6094764dd390e4786a93ea3889e9a8d68d2c1d8357ff0e022f148ebe28f5c52" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.374576 4958 scope.go:117] "RemoveContainer" containerID="c5d92e349f45e4a0af02c0d2ddcebd7a3e9e9b1433ebbd079fd3f0a98c18c1b3" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.380453 4958 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.380498 4958 reconciler_common.go:293] "Volume detached for 
volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4fccd607-3bfb-4593-a6de-6a0fc52b34ea-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.401197 4958 scope.go:117] "RemoveContainer" containerID="2ad714bd2234df6b38c1a86549393ee27d29b1dd419f67e7614dbc60f77966dc" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.428389 4958 scope.go:117] "RemoveContainer" containerID="5eb2b4e830c413eddfc09e39969c0cdacdce600216e750fd9ece78362926b862" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.461251 4958 scope.go:117] "RemoveContainer" containerID="2ad714bd2234df6b38c1a86549393ee27d29b1dd419f67e7614dbc60f77966dc" Dec 01 10:26:39 crc kubenswrapper[4958]: E1201 10:26:39.461982 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ad714bd2234df6b38c1a86549393ee27d29b1dd419f67e7614dbc60f77966dc\": container with ID starting with 2ad714bd2234df6b38c1a86549393ee27d29b1dd419f67e7614dbc60f77966dc not found: ID does not exist" containerID="2ad714bd2234df6b38c1a86549393ee27d29b1dd419f67e7614dbc60f77966dc" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.462018 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ad714bd2234df6b38c1a86549393ee27d29b1dd419f67e7614dbc60f77966dc"} err="failed to get container status \"2ad714bd2234df6b38c1a86549393ee27d29b1dd419f67e7614dbc60f77966dc\": rpc error: code = NotFound desc = could not find container \"2ad714bd2234df6b38c1a86549393ee27d29b1dd419f67e7614dbc60f77966dc\": container with ID starting with 2ad714bd2234df6b38c1a86549393ee27d29b1dd419f67e7614dbc60f77966dc not found: ID does not exist" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.462049 4958 scope.go:117] "RemoveContainer" containerID="5eb2b4e830c413eddfc09e39969c0cdacdce600216e750fd9ece78362926b862" Dec 01 10:26:39 crc kubenswrapper[4958]: E1201 10:26:39.462363 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eb2b4e830c413eddfc09e39969c0cdacdce600216e750fd9ece78362926b862\": container with ID starting with 5eb2b4e830c413eddfc09e39969c0cdacdce600216e750fd9ece78362926b862 not found: ID does not exist" containerID="5eb2b4e830c413eddfc09e39969c0cdacdce600216e750fd9ece78362926b862" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.462390 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eb2b4e830c413eddfc09e39969c0cdacdce600216e750fd9ece78362926b862"} err="failed to get container status \"5eb2b4e830c413eddfc09e39969c0cdacdce600216e750fd9ece78362926b862\": rpc error: code = NotFound desc = could not find container \"5eb2b4e830c413eddfc09e39969c0cdacdce600216e750fd9ece78362926b862\": container with ID starting with 5eb2b4e830c413eddfc09e39969c0cdacdce600216e750fd9ece78362926b862 not found: ID does not exist" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.462403 4958 scope.go:117] "RemoveContainer" containerID="fa4030b1ae32540c59bee557a9b70c49c21c0ef4ba7c627b01f2454936ca8d86" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.501626 4958 scope.go:117] "RemoveContainer" containerID="fa4030b1ae32540c59bee557a9b70c49c21c0ef4ba7c627b01f2454936ca8d86" Dec 01 10:26:39 crc kubenswrapper[4958]: E1201 10:26:39.502520 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fa4030b1ae32540c59bee557a9b70c49c21c0ef4ba7c627b01f2454936ca8d86\": container with ID starting with fa4030b1ae32540c59bee557a9b70c49c21c0ef4ba7c627b01f2454936ca8d86 not found: ID does not exist" containerID="fa4030b1ae32540c59bee557a9b70c49c21c0ef4ba7c627b01f2454936ca8d86" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.502554 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa4030b1ae32540c59bee557a9b70c49c21c0ef4ba7c627b01f2454936ca8d86"} err="failed to get container status \"fa4030b1ae32540c59bee557a9b70c49c21c0ef4ba7c627b01f2454936ca8d86\": rpc error: code = NotFound desc = could not find container \"fa4030b1ae32540c59bee557a9b70c49c21c0ef4ba7c627b01f2454936ca8d86\": container with ID starting with fa4030b1ae32540c59bee557a9b70c49c21c0ef4ba7c627b01f2454936ca8d86 not found: ID does not exist" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.502587 4958 scope.go:117] "RemoveContainer" containerID="f28e0a79ecd1735ef6ab0d156677914d027c13581eb5f43efb38a35773f7c70d" Dec 01 10:26:39 crc kubenswrapper[4958]: E1201 10:26:39.520649 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 01 10:26:39 crc kubenswrapper[4958]: E1201 10:26:39.521063 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 01 10:26:39 crc kubenswrapper[4958]: E1201 10:26:39.521318 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 01 10:26:39 crc kubenswrapper[4958]: E1201 10:26:39.521356 4958 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-xr8kd" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovsdb-server" Dec 01 10:26:39 crc kubenswrapper[4958]: E1201 10:26:39.537788 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 01 10:26:39 crc kubenswrapper[4958]: E1201 10:26:39.549435 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 01 10:26:39 crc kubenswrapper[4958]: E1201 10:26:39.557185 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 01 10:26:39 crc kubenswrapper[4958]: E1201 10:26:39.557335 4958 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-xr8kd" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovs-vswitchd" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.559332 4958 scope.go:117] "RemoveContainer" containerID="b1abd27e2c2d388e6a6c1efd167095fea30180701b7db54c428f7e52cebd08ba" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.588453 4958 scope.go:117] "RemoveContainer" containerID="14dbc21dd417bde9c33ac37159b79364136213cf4be75108690ff918465fc7cc" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.614367 4958 scope.go:117] "RemoveContainer" containerID="4ae78c274c3294e9773e57bf2eec9477b3e77f5156d4259b5f42c1b6a1c6dd6a" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.638401 4958 scope.go:117] "RemoveContainer" containerID="f2be5b060fb4b1a90fb8d93662e78d565cae6c9dc122f1a9fe9e8cac346ac6b9" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.673585 4958 scope.go:117] "RemoveContainer" containerID="f2be5b060fb4b1a90fb8d93662e78d565cae6c9dc122f1a9fe9e8cac346ac6b9" Dec 01 10:26:39 crc kubenswrapper[4958]: E1201 10:26:39.674131 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2be5b060fb4b1a90fb8d93662e78d565cae6c9dc122f1a9fe9e8cac346ac6b9\": container with ID starting with f2be5b060fb4b1a90fb8d93662e78d565cae6c9dc122f1a9fe9e8cac346ac6b9 not found: ID does not exist" containerID="f2be5b060fb4b1a90fb8d93662e78d565cae6c9dc122f1a9fe9e8cac346ac6b9" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.674165 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2be5b060fb4b1a90fb8d93662e78d565cae6c9dc122f1a9fe9e8cac346ac6b9"} err="failed to get container status \"f2be5b060fb4b1a90fb8d93662e78d565cae6c9dc122f1a9fe9e8cac346ac6b9\": rpc error: code = NotFound desc = could not find container \"f2be5b060fb4b1a90fb8d93662e78d565cae6c9dc122f1a9fe9e8cac346ac6b9\": container with ID starting with f2be5b060fb4b1a90fb8d93662e78d565cae6c9dc122f1a9fe9e8cac346ac6b9 not found: ID does not exist" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.799021 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:26:39 crc kubenswrapper[4958]: E1201 10:26:39.799472 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" 
podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.813604 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03af271e-0af4-4681-a7b6-31b207d21143" path="/var/lib/kubelet/pods/03af271e-0af4-4681-a7b6-31b207d21143/volumes" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.814635 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="113fc18e-8eb6-45a5-9625-1404f4d832c4" path="/var/lib/kubelet/pods/113fc18e-8eb6-45a5-9625-1404f4d832c4/volumes" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.817070 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30e8c723-a7a0-4697-8369-bd224fcfdf3f" path="/var/lib/kubelet/pods/30e8c723-a7a0-4697-8369-bd224fcfdf3f/volumes" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.819023 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46e6b589-937c-42c8-8004-49e39813d622" path="/var/lib/kubelet/pods/46e6b589-937c-42c8-8004-49e39813d622/volumes" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.819872 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" path="/var/lib/kubelet/pods/69ae2c66-9d6d-4bc0-b3fe-ee729225e85f/volumes" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.821257 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81c0931c-8919-4128-97bb-21c5872d5cf0" path="/var/lib/kubelet/pods/81c0931c-8919-4128-97bb-21c5872d5cf0/volumes" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.821884 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6" path="/var/lib/kubelet/pods/8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6/volumes" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.822523 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa361886-e7eb-413c-a4b4-cedaf1c86983" path="/var/lib/kubelet/pods/aa361886-e7eb-413c-a4b4-cedaf1c86983/volumes" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.824621 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b727a711-6b0b-44c6-917a-602f10dd0d6c" path="/var/lib/kubelet/pods/b727a711-6b0b-44c6-917a-602f10dd0d6c/volumes" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.825526 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caf6f35f-9e82-4899-801f-3b5e94189f6d" path="/var/lib/kubelet/pods/caf6f35f-9e82-4899-801f-3b5e94189f6d/volumes" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.826573 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceca985e-bec2-42fe-9758-edad828586c2" path="/var/lib/kubelet/pods/ceca985e-bec2-42fe-9758-edad828586c2/volumes" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.828000 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13e880d-3817-4df9-8477-82349d7979b9" path="/var/lib/kubelet/pods/d13e880d-3817-4df9-8477-82349d7979b9/volumes" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.828896 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" path="/var/lib/kubelet/pods/e19ffea8-2e96-4cff-a2ec-40646aaa4cc0/volumes" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.886479 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"4fccd607-3bfb-4593-a6de-6a0fc52b34ea","Type":"ContainerDied","Data":"b24aff2f4c8339a977a2885fd8d2bee815fddbb82c358417bca3aa703d17065f"} Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.886563 4958 scope.go:117] "RemoveContainer" containerID="79456c806f5f410ac0967e765960f7c80fe55e590cd1656e2879adefde292303" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.886734 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.907660 4958 generic.go:334] "Generic (PLEG): container finished" podID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerID="a6020adfcd7117df2c055f1f046c4057d506ced6a9066aeab5c10fdf00691823" exitCode=0 Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.907737 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89868f5c-dfd8-4619-9b7e-02a5b75916db","Type":"ContainerDied","Data":"a6020adfcd7117df2c055f1f046c4057d506ced6a9066aeab5c10fdf00691823"} Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.930727 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.949192 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 10:26:39 crc kubenswrapper[4958]: I1201 10:26:39.954506 4958 scope.go:117] "RemoveContainer" containerID="a87bd53133187b0ab3e009cb4dad6caf4c75502b45000066abe1770b7f35fabb" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.119406 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="a4ce268c-ffb9-4eae-93e5-23a10ba96185" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.190:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.210403 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.305621 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-combined-ca-bundle\") pod \"89868f5c-dfd8-4619-9b7e-02a5b75916db\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.305693 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-ceilometer-tls-certs\") pod \"89868f5c-dfd8-4619-9b7e-02a5b75916db\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.305769 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89868f5c-dfd8-4619-9b7e-02a5b75916db-run-httpd\") pod \"89868f5c-dfd8-4619-9b7e-02a5b75916db\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.305793 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-scripts\") pod \"89868f5c-dfd8-4619-9b7e-02a5b75916db\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.305890 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4bbf\" (UniqueName: \"kubernetes.io/projected/89868f5c-dfd8-4619-9b7e-02a5b75916db-kube-api-access-c4bbf\") pod \"89868f5c-dfd8-4619-9b7e-02a5b75916db\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.305932 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89868f5c-dfd8-4619-9b7e-02a5b75916db-log-httpd\") pod \"89868f5c-dfd8-4619-9b7e-02a5b75916db\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.305989 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-config-data\") pod \"89868f5c-dfd8-4619-9b7e-02a5b75916db\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.306044 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-sg-core-conf-yaml\") pod \"89868f5c-dfd8-4619-9b7e-02a5b75916db\" (UID: \"89868f5c-dfd8-4619-9b7e-02a5b75916db\") " Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.306484 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89868f5c-dfd8-4619-9b7e-02a5b75916db-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "89868f5c-dfd8-4619-9b7e-02a5b75916db" (UID: "89868f5c-dfd8-4619-9b7e-02a5b75916db"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.306957 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89868f5c-dfd8-4619-9b7e-02a5b75916db-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "89868f5c-dfd8-4619-9b7e-02a5b75916db" (UID: "89868f5c-dfd8-4619-9b7e-02a5b75916db"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.311371 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89868f5c-dfd8-4619-9b7e-02a5b75916db-kube-api-access-c4bbf" (OuterVolumeSpecName: "kube-api-access-c4bbf") pod "89868f5c-dfd8-4619-9b7e-02a5b75916db" (UID: "89868f5c-dfd8-4619-9b7e-02a5b75916db"). InnerVolumeSpecName "kube-api-access-c4bbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.314661 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-scripts" (OuterVolumeSpecName: "scripts") pod "89868f5c-dfd8-4619-9b7e-02a5b75916db" (UID: "89868f5c-dfd8-4619-9b7e-02a5b75916db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.345295 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "89868f5c-dfd8-4619-9b7e-02a5b75916db" (UID: "89868f5c-dfd8-4619-9b7e-02a5b75916db"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.367687 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "89868f5c-dfd8-4619-9b7e-02a5b75916db" (UID: "89868f5c-dfd8-4619-9b7e-02a5b75916db"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.384132 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89868f5c-dfd8-4619-9b7e-02a5b75916db" (UID: "89868f5c-dfd8-4619-9b7e-02a5b75916db"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.408262 4958 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89868f5c-dfd8-4619-9b7e-02a5b75916db-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.408309 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.408324 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4bbf\" (UniqueName: \"kubernetes.io/projected/89868f5c-dfd8-4619-9b7e-02a5b75916db-kube-api-access-c4bbf\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.408340 4958 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89868f5c-dfd8-4619-9b7e-02a5b75916db-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.408351 4958 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.408363 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.408375 4958 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.421212 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-config-data" (OuterVolumeSpecName: "config-data") pod "89868f5c-dfd8-4619-9b7e-02a5b75916db" (UID: "89868f5c-dfd8-4619-9b7e-02a5b75916db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.442387 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-5ffd4b498c-jxtxd" podUID="d93b36ff-312b-40e1-9e3a-b1981800da66" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.165:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.444103 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-5ffd4b498c-jxtxd" podUID="d93b36ff-312b-40e1-9e3a-b1981800da66" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.165:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.472830 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.511016 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89868f5c-dfd8-4619-9b7e-02a5b75916db-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.611741 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de0d4422-f9bb-4abf-8727-066813a9182e-combined-ca-bundle\") pod \"de0d4422-f9bb-4abf-8727-066813a9182e\" (UID: \"de0d4422-f9bb-4abf-8727-066813a9182e\") " Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.611812 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de0d4422-f9bb-4abf-8727-066813a9182e-config-data\") pod \"de0d4422-f9bb-4abf-8727-066813a9182e\" (UID: \"de0d4422-f9bb-4abf-8727-066813a9182e\") " Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.611941 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qw8h\" (UniqueName: \"kubernetes.io/projected/de0d4422-f9bb-4abf-8727-066813a9182e-kube-api-access-7qw8h\") pod \"de0d4422-f9bb-4abf-8727-066813a9182e\" (UID: \"de0d4422-f9bb-4abf-8727-066813a9182e\") " Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.616404 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de0d4422-f9bb-4abf-8727-066813a9182e-kube-api-access-7qw8h" (OuterVolumeSpecName: "kube-api-access-7qw8h") pod "de0d4422-f9bb-4abf-8727-066813a9182e" (UID: "de0d4422-f9bb-4abf-8727-066813a9182e"). InnerVolumeSpecName "kube-api-access-7qw8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.633948 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de0d4422-f9bb-4abf-8727-066813a9182e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de0d4422-f9bb-4abf-8727-066813a9182e" (UID: "de0d4422-f9bb-4abf-8727-066813a9182e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.640933 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de0d4422-f9bb-4abf-8727-066813a9182e-config-data" (OuterVolumeSpecName: "config-data") pod "de0d4422-f9bb-4abf-8727-066813a9182e" (UID: "de0d4422-f9bb-4abf-8727-066813a9182e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.714035 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qw8h\" (UniqueName: \"kubernetes.io/projected/de0d4422-f9bb-4abf-8727-066813a9182e-kube-api-access-7qw8h\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.714099 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de0d4422-f9bb-4abf-8727-066813a9182e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.714121 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de0d4422-f9bb-4abf-8727-066813a9182e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.947700 4958 generic.go:334] "Generic (PLEG): container finished" podID="de0d4422-f9bb-4abf-8727-066813a9182e" containerID="058deee68ae35cac74169c2f8d75def62e5447aee109c42652d23307d321925a" exitCode=0 Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.947784 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.947773 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"de0d4422-f9bb-4abf-8727-066813a9182e","Type":"ContainerDied","Data":"058deee68ae35cac74169c2f8d75def62e5447aee109c42652d23307d321925a"} Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.947882 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"de0d4422-f9bb-4abf-8727-066813a9182e","Type":"ContainerDied","Data":"aacf6c67146e3ac1dc84d9564a6bca05321603499f5b30393501e9347b843b03"} Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.947912 4958 scope.go:117] "RemoveContainer" containerID="058deee68ae35cac74169c2f8d75def62e5447aee109c42652d23307d321925a" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.953222 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"89868f5c-dfd8-4619-9b7e-02a5b75916db","Type":"ContainerDied","Data":"4b04c97099c975e03c6bba9c7249d424fb6a89d33d51c4af29cee645cc680b1e"} Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.953352 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.987249 4958 scope.go:117] "RemoveContainer" containerID="058deee68ae35cac74169c2f8d75def62e5447aee109c42652d23307d321925a" Dec 01 10:26:40 crc kubenswrapper[4958]: E1201 10:26:40.996166 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"058deee68ae35cac74169c2f8d75def62e5447aee109c42652d23307d321925a\": container with ID starting with 058deee68ae35cac74169c2f8d75def62e5447aee109c42652d23307d321925a not found: ID does not exist" containerID="058deee68ae35cac74169c2f8d75def62e5447aee109c42652d23307d321925a" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.996273 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"058deee68ae35cac74169c2f8d75def62e5447aee109c42652d23307d321925a"} err="failed to get container status \"058deee68ae35cac74169c2f8d75def62e5447aee109c42652d23307d321925a\": rpc error: code = NotFound desc = could not find container \"058deee68ae35cac74169c2f8d75def62e5447aee109c42652d23307d321925a\": container with ID starting with 058deee68ae35cac74169c2f8d75def62e5447aee109c42652d23307d321925a not found: ID does not exist" Dec 01 10:26:40 crc kubenswrapper[4958]: I1201 10:26:40.996314 4958 scope.go:117] "RemoveContainer" containerID="d2b247b18d958afcd723731638919280c2072270eb4c5c41496ef47a22a4f23a" Dec 01 10:26:41 crc kubenswrapper[4958]: I1201 10:26:41.006450 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:26:41 crc kubenswrapper[4958]: I1201 10:26:41.011323 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 10:26:41 crc kubenswrapper[4958]: I1201 10:26:41.023507 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:26:41 crc kubenswrapper[4958]: I1201 10:26:41.030223 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:26:41 crc kubenswrapper[4958]: I1201 10:26:41.030395 4958 scope.go:117] "RemoveContainer" containerID="f54faead50baed98cc76dab5ac5bc05271f5999149758dd0bd4a4895989a1b1b" Dec 01 10:26:41 crc kubenswrapper[4958]: I1201 10:26:41.056818 4958 scope.go:117] "RemoveContainer" containerID="a6020adfcd7117df2c055f1f046c4057d506ced6a9066aeab5c10fdf00691823" Dec 01 10:26:41 crc kubenswrapper[4958]: I1201 10:26:41.086265 4958 scope.go:117] "RemoveContainer" containerID="3fbf7d32df1fc1b787081dc2dd704d72d22c756d3674cb93e42c1d2472d4bef9" Dec 01 10:26:41 crc kubenswrapper[4958]: I1201 10:26:41.809563 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fccd607-3bfb-4593-a6de-6a0fc52b34ea" path="/var/lib/kubelet/pods/4fccd607-3bfb-4593-a6de-6a0fc52b34ea/volumes" Dec 01 10:26:41 crc kubenswrapper[4958]: I1201 10:26:41.810260 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89868f5c-dfd8-4619-9b7e-02a5b75916db" path="/var/lib/kubelet/pods/89868f5c-dfd8-4619-9b7e-02a5b75916db/volumes" Dec 01 10:26:41 crc kubenswrapper[4958]: I1201 10:26:41.811516 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de0d4422-f9bb-4abf-8727-066813a9182e" path="/var/lib/kubelet/pods/de0d4422-f9bb-4abf-8727-066813a9182e/volumes" Dec 01 10:26:41 crc kubenswrapper[4958]: I1201 10:26:41.857094 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" 
containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 10:26:41 crc kubenswrapper[4958]: I1201 10:26:41.857277 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": dial tcp 10.217.0.200:8775: i/o timeout" Dec 01 10:26:42 crc kubenswrapper[4958]: I1201 10:26:42.069223 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-744fbbd578-7c5pc" podUID="b727a711-6b0b-44c6-917a-602f10dd0d6c" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 10:26:42 crc kubenswrapper[4958]: I1201 10:26:42.069357 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-744fbbd578-7c5pc" podUID="b727a711-6b0b-44c6-917a-602f10dd0d6c" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 10:26:44 crc kubenswrapper[4958]: E1201 10:26:44.516434 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 01 10:26:44 crc kubenswrapper[4958]: E1201 10:26:44.517226 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 01 10:26:44 crc kubenswrapper[4958]: E1201 10:26:44.517579 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 01 10:26:44 crc kubenswrapper[4958]: E1201 10:26:44.517638 4958 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-xr8kd" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovsdb-server" Dec 01 10:26:44 crc kubenswrapper[4958]: E1201 10:26:44.518379 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 01 10:26:44 crc kubenswrapper[4958]: E1201 10:26:44.521636 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 01 10:26:44 crc kubenswrapper[4958]: E1201 10:26:44.524281 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 01 10:26:44 crc kubenswrapper[4958]: E1201 10:26:44.524336 4958 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-xr8kd" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovs-vswitchd" Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.046367 4958 generic.go:334] "Generic (PLEG): container finished" podID="305e96f8-0597-4c64-9026-6d6f2aa454d4" containerID="7e4c8d5d756ae1f1f79baf6ab39a0885d96760214ce8bd8adfe3bf8d0265ce4e" exitCode=0 Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.046425 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-768454b56f-84xc8" event={"ID":"305e96f8-0597-4c64-9026-6d6f2aa454d4","Type":"ContainerDied","Data":"7e4c8d5d756ae1f1f79baf6ab39a0885d96760214ce8bd8adfe3bf8d0265ce4e"} Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.046782 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-768454b56f-84xc8" event={"ID":"305e96f8-0597-4c64-9026-6d6f2aa454d4","Type":"ContainerDied","Data":"dbfebfbd52c8c0ea998333ec0ccefde14e6c7bdf7bb724774781839cf2aaa114"} Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.046809 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbfebfbd52c8c0ea998333ec0ccefde14e6c7bdf7bb724774781839cf2aaa114" Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.059774 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.150143 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-public-tls-certs\") pod \"305e96f8-0597-4c64-9026-6d6f2aa454d4\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.150555 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-httpd-config\") pod \"305e96f8-0597-4c64-9026-6d6f2aa454d4\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.150740 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-combined-ca-bundle\") pod \"305e96f8-0597-4c64-9026-6d6f2aa454d4\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.150943 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-config\") pod \"305e96f8-0597-4c64-9026-6d6f2aa454d4\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.151054 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzjbr\" (UniqueName: \"kubernetes.io/projected/305e96f8-0597-4c64-9026-6d6f2aa454d4-kube-api-access-lzjbr\") pod \"305e96f8-0597-4c64-9026-6d6f2aa454d4\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.151117 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-internal-tls-certs\") pod \"305e96f8-0597-4c64-9026-6d6f2aa454d4\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.151161 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-ovndb-tls-certs\") pod \"305e96f8-0597-4c64-9026-6d6f2aa454d4\" (UID: \"305e96f8-0597-4c64-9026-6d6f2aa454d4\") " Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.170114 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "305e96f8-0597-4c64-9026-6d6f2aa454d4" (UID: "305e96f8-0597-4c64-9026-6d6f2aa454d4"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.174437 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/305e96f8-0597-4c64-9026-6d6f2aa454d4-kube-api-access-lzjbr" (OuterVolumeSpecName: "kube-api-access-lzjbr") pod "305e96f8-0597-4c64-9026-6d6f2aa454d4" (UID: "305e96f8-0597-4c64-9026-6d6f2aa454d4"). InnerVolumeSpecName "kube-api-access-lzjbr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.197052 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "305e96f8-0597-4c64-9026-6d6f2aa454d4" (UID: "305e96f8-0597-4c64-9026-6d6f2aa454d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.197446 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "305e96f8-0597-4c64-9026-6d6f2aa454d4" (UID: "305e96f8-0597-4c64-9026-6d6f2aa454d4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.198521 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-config" (OuterVolumeSpecName: "config") pod "305e96f8-0597-4c64-9026-6d6f2aa454d4" (UID: "305e96f8-0597-4c64-9026-6d6f2aa454d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.199416 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "305e96f8-0597-4c64-9026-6d6f2aa454d4" (UID: "305e96f8-0597-4c64-9026-6d6f2aa454d4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.251250 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "305e96f8-0597-4c64-9026-6d6f2aa454d4" (UID: "305e96f8-0597-4c64-9026-6d6f2aa454d4"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.254411 4958 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.254457 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.254537 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.254552 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzjbr\" (UniqueName: \"kubernetes.io/projected/305e96f8-0597-4c64-9026-6d6f2aa454d4-kube-api-access-lzjbr\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.254565 4958 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.254577 4958 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:47 crc kubenswrapper[4958]: I1201 10:26:47.254589 4958 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/305e96f8-0597-4c64-9026-6d6f2aa454d4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:48 crc kubenswrapper[4958]: I1201 10:26:48.060732 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-768454b56f-84xc8" Dec 01 10:26:48 crc kubenswrapper[4958]: I1201 10:26:48.101606 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-768454b56f-84xc8"] Dec 01 10:26:48 crc kubenswrapper[4958]: I1201 10:26:48.122136 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-768454b56f-84xc8"] Dec 01 10:26:49 crc kubenswrapper[4958]: E1201 10:26:49.517454 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 01 10:26:49 crc kubenswrapper[4958]: E1201 10:26:49.518567 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 01 10:26:49 crc kubenswrapper[4958]: E1201 10:26:49.518932 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 01 10:26:49 crc kubenswrapper[4958]: E1201 10:26:49.518985 4958 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-xr8kd" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovsdb-server" Dec 01 10:26:49 crc kubenswrapper[4958]: E1201 10:26:49.520421 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 01 10:26:49 crc kubenswrapper[4958]: E1201 10:26:49.522659 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 01 10:26:49 crc kubenswrapper[4958]: E1201 10:26:49.524425 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 01 10:26:49 crc kubenswrapper[4958]: E1201 10:26:49.524497 4958 prober.go:104] 
"Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-xr8kd" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovs-vswitchd" Dec 01 10:26:49 crc kubenswrapper[4958]: I1201 10:26:49.817806 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="305e96f8-0597-4c64-9026-6d6f2aa454d4" path="/var/lib/kubelet/pods/305e96f8-0597-4c64-9026-6d6f2aa454d4/volumes" Dec 01 10:26:53 crc kubenswrapper[4958]: I1201 10:26:53.805046 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:26:53 crc kubenswrapper[4958]: E1201 10:26:53.805924 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:26:54 crc kubenswrapper[4958]: E1201 10:26:54.516786 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 01 10:26:54 crc kubenswrapper[4958]: E1201 10:26:54.518049 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 01 10:26:54 crc kubenswrapper[4958]: E1201 10:26:54.518351 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 01 10:26:54 crc kubenswrapper[4958]: E1201 10:26:54.518601 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 01 10:26:54 crc kubenswrapper[4958]: E1201 10:26:54.518649 4958 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-xr8kd" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovsdb-server" Dec 01 10:26:54 crc kubenswrapper[4958]: E1201 10:26:54.519864 
4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 01 10:26:54 crc kubenswrapper[4958]: E1201 10:26:54.522047 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 01 10:26:54 crc kubenswrapper[4958]: E1201 10:26:54.522095 4958 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-xr8kd" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovs-vswitchd" Dec 01 10:26:58 crc kubenswrapper[4958]: I1201 10:26:58.982527 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xr8kd_c01e3885-db48-42db-aa00-ca08c6839dbd/ovs-vswitchd/0.log" Dec 01 10:26:58 crc kubenswrapper[4958]: I1201 10:26:58.984662 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.078215 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c01e3885-db48-42db-aa00-ca08c6839dbd-scripts\") pod \"c01e3885-db48-42db-aa00-ca08c6839dbd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.078282 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-var-lib\") pod \"c01e3885-db48-42db-aa00-ca08c6839dbd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.078425 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-var-lib" (OuterVolumeSpecName: "var-lib") pod "c01e3885-db48-42db-aa00-ca08c6839dbd" (UID: "c01e3885-db48-42db-aa00-ca08c6839dbd"). InnerVolumeSpecName "var-lib". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.079160 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhxws\" (UniqueName: \"kubernetes.io/projected/c01e3885-db48-42db-aa00-ca08c6839dbd-kube-api-access-fhxws\") pod \"c01e3885-db48-42db-aa00-ca08c6839dbd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.079212 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-etc-ovs\") pod \"c01e3885-db48-42db-aa00-ca08c6839dbd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.079249 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-var-log\") pod \"c01e3885-db48-42db-aa00-ca08c6839dbd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.079379 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "c01e3885-db48-42db-aa00-ca08c6839dbd" (UID: "c01e3885-db48-42db-aa00-ca08c6839dbd"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.079411 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-var-run\") pod \"c01e3885-db48-42db-aa00-ca08c6839dbd\" (UID: \"c01e3885-db48-42db-aa00-ca08c6839dbd\") " Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.079468 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-var-run" (OuterVolumeSpecName: "var-run") pod "c01e3885-db48-42db-aa00-ca08c6839dbd" (UID: "c01e3885-db48-42db-aa00-ca08c6839dbd"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.079485 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-var-log" (OuterVolumeSpecName: "var-log") pod "c01e3885-db48-42db-aa00-ca08c6839dbd" (UID: "c01e3885-db48-42db-aa00-ca08c6839dbd"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.080216 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c01e3885-db48-42db-aa00-ca08c6839dbd-scripts" (OuterVolumeSpecName: "scripts") pod "c01e3885-db48-42db-aa00-ca08c6839dbd" (UID: "c01e3885-db48-42db-aa00-ca08c6839dbd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.080452 4958 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-var-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.080481 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c01e3885-db48-42db-aa00-ca08c6839dbd-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.080490 4958 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-var-lib\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.080502 4958 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-etc-ovs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.080515 4958 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c01e3885-db48-42db-aa00-ca08c6839dbd-var-log\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.089080 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c01e3885-db48-42db-aa00-ca08c6839dbd-kube-api-access-fhxws" (OuterVolumeSpecName: "kube-api-access-fhxws") pod "c01e3885-db48-42db-aa00-ca08c6839dbd" (UID: "c01e3885-db48-42db-aa00-ca08c6839dbd"). InnerVolumeSpecName "kube-api-access-fhxws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.181943 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhxws\" (UniqueName: \"kubernetes.io/projected/c01e3885-db48-42db-aa00-ca08c6839dbd-kube-api-access-fhxws\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.184635 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xr8kd_c01e3885-db48-42db-aa00-ca08c6839dbd/ovs-vswitchd/0.log" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.186023 4958 generic.go:334] "Generic (PLEG): container finished" podID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" exitCode=137 Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.186095 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xr8kd" event={"ID":"c01e3885-db48-42db-aa00-ca08c6839dbd","Type":"ContainerDied","Data":"dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3"} Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.186123 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-xr8kd" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.186177 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xr8kd" event={"ID":"c01e3885-db48-42db-aa00-ca08c6839dbd","Type":"ContainerDied","Data":"424361937c57d12afb3581265890f622e71beeb3b40a6737a81c81ebb023671d"} Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.186208 4958 scope.go:117] "RemoveContainer" containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.224963 4958 scope.go:117] "RemoveContainer" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.235752 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-xr8kd"] Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.245450 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-xr8kd"] Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.264993 4958 scope.go:117] "RemoveContainer" containerID="78762812ba18d4b5f9cf919b541fba37920d013edeed6a8ec69eb0d463f33083" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.298146 4958 scope.go:117] "RemoveContainer" containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" Dec 01 10:26:59 crc kubenswrapper[4958]: E1201 10:26:59.299165 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3\": container with ID starting with dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3 not found: ID does not exist" containerID="dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.299288 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3"} err="failed to get container status \"dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3\": rpc error: code = NotFound desc = could not find container \"dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3\": container with ID starting with dbb23095e970b4272d23668034e613e2b2063f096460718996ddb2912b1d15d3 not found: ID does not exist" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.299362 4958 scope.go:117] "RemoveContainer" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" Dec 01 10:26:59 crc kubenswrapper[4958]: E1201 10:26:59.300367 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c\": container with ID starting with d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c not found: ID does not exist" containerID="d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.300413 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c"} err="failed to get container status \"d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c\": rpc error: code = NotFound desc = could not find container 
\"d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c\": container with ID starting with d18f479c7fedd5705b43b1aa676f8e51c5c20c9b97895104573b3b38f12b123c not found: ID does not exist" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.300458 4958 scope.go:117] "RemoveContainer" containerID="78762812ba18d4b5f9cf919b541fba37920d013edeed6a8ec69eb0d463f33083" Dec 01 10:26:59 crc kubenswrapper[4958]: E1201 10:26:59.300797 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78762812ba18d4b5f9cf919b541fba37920d013edeed6a8ec69eb0d463f33083\": container with ID starting with 78762812ba18d4b5f9cf919b541fba37920d013edeed6a8ec69eb0d463f33083 not found: ID does not exist" containerID="78762812ba18d4b5f9cf919b541fba37920d013edeed6a8ec69eb0d463f33083" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.300869 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78762812ba18d4b5f9cf919b541fba37920d013edeed6a8ec69eb0d463f33083"} err="failed to get container status \"78762812ba18d4b5f9cf919b541fba37920d013edeed6a8ec69eb0d463f33083\": rpc error: code = NotFound desc = could not find container \"78762812ba18d4b5f9cf919b541fba37920d013edeed6a8ec69eb0d463f33083\": container with ID starting with 78762812ba18d4b5f9cf919b541fba37920d013edeed6a8ec69eb0d463f33083 not found: ID does not exist" Dec 01 10:26:59 crc kubenswrapper[4958]: E1201 10:26:59.656775 4958 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13dcdb95_560c_4cef_90d8_5716e9bccf57.slice/crio-conmon-2b1bd8bfd2b7718248c0a4322bcf007f21088eaf6f456a3a5abdc4aa9dd4a573.scope\": RecentStats: unable to find data in memory cache]" Dec 01 10:26:59 crc kubenswrapper[4958]: I1201 10:26:59.818106 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" path="/var/lib/kubelet/pods/c01e3885-db48-42db-aa00-ca08c6839dbd/volumes" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.021107 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.097389 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift\") pod \"13dcdb95-560c-4cef-90d8-5716e9bccf57\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.097474 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/13dcdb95-560c-4cef-90d8-5716e9bccf57-cache\") pod \"13dcdb95-560c-4cef-90d8-5716e9bccf57\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.097569 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn54v\" (UniqueName: \"kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-kube-api-access-fn54v\") pod \"13dcdb95-560c-4cef-90d8-5716e9bccf57\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.097651 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/13dcdb95-560c-4cef-90d8-5716e9bccf57-lock\") pod \"13dcdb95-560c-4cef-90d8-5716e9bccf57\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.097680 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"13dcdb95-560c-4cef-90d8-5716e9bccf57\" (UID: \"13dcdb95-560c-4cef-90d8-5716e9bccf57\") " Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.098328 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13dcdb95-560c-4cef-90d8-5716e9bccf57-lock" (OuterVolumeSpecName: "lock") pod "13dcdb95-560c-4cef-90d8-5716e9bccf57" (UID: "13dcdb95-560c-4cef-90d8-5716e9bccf57"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.098586 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13dcdb95-560c-4cef-90d8-5716e9bccf57-cache" (OuterVolumeSpecName: "cache") pod "13dcdb95-560c-4cef-90d8-5716e9bccf57" (UID: "13dcdb95-560c-4cef-90d8-5716e9bccf57"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.104054 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "13dcdb95-560c-4cef-90d8-5716e9bccf57" (UID: "13dcdb95-560c-4cef-90d8-5716e9bccf57"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.105266 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-kube-api-access-fn54v" (OuterVolumeSpecName: "kube-api-access-fn54v") pod "13dcdb95-560c-4cef-90d8-5716e9bccf57" (UID: "13dcdb95-560c-4cef-90d8-5716e9bccf57"). InnerVolumeSpecName "kube-api-access-fn54v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.108028 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "swift") pod "13dcdb95-560c-4cef-90d8-5716e9bccf57" (UID: "13dcdb95-560c-4cef-90d8-5716e9bccf57"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.199642 4958 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.200230 4958 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/13dcdb95-560c-4cef-90d8-5716e9bccf57-cache\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.200333 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn54v\" (UniqueName: \"kubernetes.io/projected/13dcdb95-560c-4cef-90d8-5716e9bccf57-kube-api-access-fn54v\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.200424 4958 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/13dcdb95-560c-4cef-90d8-5716e9bccf57-lock\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.200563 4958 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.206127 4958 generic.go:334] "Generic (PLEG): container finished" podID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerID="2b1bd8bfd2b7718248c0a4322bcf007f21088eaf6f456a3a5abdc4aa9dd4a573" exitCode=137 Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.206280 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerDied","Data":"2b1bd8bfd2b7718248c0a4322bcf007f21088eaf6f456a3a5abdc4aa9dd4a573"} Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.206371 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"13dcdb95-560c-4cef-90d8-5716e9bccf57","Type":"ContainerDied","Data":"1c1b40356587456bb51e3ad44e98d58f98fcfc84e252bf116e2742fafa3e7267"} Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.206468 4958 scope.go:117] "RemoveContainer" containerID="2b1bd8bfd2b7718248c0a4322bcf007f21088eaf6f456a3a5abdc4aa9dd4a573" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.206777 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.220566 4958 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.256897 4958 scope.go:117] "RemoveContainer" containerID="f29a065b0ddf26169aabe7ae6ad8b6d3377b8f3389795449dc71e2742353d609" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.258567 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.266523 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.283892 4958 scope.go:117] "RemoveContainer" containerID="95afb3abf4bac2c438ad62074b66882bdaafff7f586ba8a6d0a76121f279399b" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.301983 4958 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.311396 4958 scope.go:117] "RemoveContainer" containerID="4200bda3ba54fd990388524d97e1f6fdcba56a926272e8d50c30f8f1b903d07a" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.334761 4958 scope.go:117] "RemoveContainer" containerID="e6b9b28943742935972d97a1f62f5f52e2fde01492530eaa77b9b898fd2c9d84" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.363908 4958 scope.go:117] "RemoveContainer" containerID="c3c204443ad6b610f9de1fb0f8c5342451fbd72eb746d9e8e03f81b85991f68b" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.389392 4958 scope.go:117] "RemoveContainer" containerID="45a6973b5ad0742944d3a43778b05e2378e4234af599c52ee4c8711abdb4a457" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.412979 4958 scope.go:117] "RemoveContainer" containerID="52f003c4452815db13106faf9941e71f61b166933018883cfa708153b71da78e" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.434306 4958 scope.go:117] "RemoveContainer" containerID="c0de0fa89c374e451fd344254fa736a0131c4473c5d9730472766afd7c8bfd5d" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.458616 4958 scope.go:117] "RemoveContainer" containerID="f33975248a4564b556aea0c21d6dc5782d263058390056a1d274154337979789" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.481466 4958 scope.go:117] "RemoveContainer" containerID="865cc8ca1f68b1fb3bd9007f63e5fd084c77e4b6a22f2df02a8a5b900bbf54d6" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.509357 4958 scope.go:117] "RemoveContainer" containerID="52603107e98f8df2078258e1d982b451d8d71b96f8736d24999132d033bea9e0" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.534547 4958 scope.go:117] "RemoveContainer" containerID="01c79ab63cc80d37a6df7c9d7d52b71938ac0e9677a2f6651e5410a4bf18f8d3" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.555998 4958 scope.go:117] "RemoveContainer" containerID="57bd52c9ca652b19ed4e33a202a232c60bfa9768c47969bd45f0322238f2ed9b" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.581681 4958 scope.go:117] "RemoveContainer" containerID="e259bff5027b46b6780434588aaaf37284c0af3fc24553b6ef8f973ea6ddfcef" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.606151 4958 scope.go:117] "RemoveContainer" containerID="2b1bd8bfd2b7718248c0a4322bcf007f21088eaf6f456a3a5abdc4aa9dd4a573" Dec 01 10:27:00 crc 
kubenswrapper[4958]: E1201 10:27:00.607032 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b1bd8bfd2b7718248c0a4322bcf007f21088eaf6f456a3a5abdc4aa9dd4a573\": container with ID starting with 2b1bd8bfd2b7718248c0a4322bcf007f21088eaf6f456a3a5abdc4aa9dd4a573 not found: ID does not exist" containerID="2b1bd8bfd2b7718248c0a4322bcf007f21088eaf6f456a3a5abdc4aa9dd4a573" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.607095 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b1bd8bfd2b7718248c0a4322bcf007f21088eaf6f456a3a5abdc4aa9dd4a573"} err="failed to get container status \"2b1bd8bfd2b7718248c0a4322bcf007f21088eaf6f456a3a5abdc4aa9dd4a573\": rpc error: code = NotFound desc = could not find container \"2b1bd8bfd2b7718248c0a4322bcf007f21088eaf6f456a3a5abdc4aa9dd4a573\": container with ID starting with 2b1bd8bfd2b7718248c0a4322bcf007f21088eaf6f456a3a5abdc4aa9dd4a573 not found: ID does not exist" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.607135 4958 scope.go:117] "RemoveContainer" containerID="f29a065b0ddf26169aabe7ae6ad8b6d3377b8f3389795449dc71e2742353d609" Dec 01 10:27:00 crc kubenswrapper[4958]: E1201 10:27:00.607804 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f29a065b0ddf26169aabe7ae6ad8b6d3377b8f3389795449dc71e2742353d609\": container with ID starting with f29a065b0ddf26169aabe7ae6ad8b6d3377b8f3389795449dc71e2742353d609 not found: ID does not exist" containerID="f29a065b0ddf26169aabe7ae6ad8b6d3377b8f3389795449dc71e2742353d609" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.607887 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f29a065b0ddf26169aabe7ae6ad8b6d3377b8f3389795449dc71e2742353d609"} err="failed to get container status \"f29a065b0ddf26169aabe7ae6ad8b6d3377b8f3389795449dc71e2742353d609\": rpc error: code = NotFound desc = could not find container \"f29a065b0ddf26169aabe7ae6ad8b6d3377b8f3389795449dc71e2742353d609\": container with ID starting with f29a065b0ddf26169aabe7ae6ad8b6d3377b8f3389795449dc71e2742353d609 not found: ID does not exist" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.607929 4958 scope.go:117] "RemoveContainer" containerID="95afb3abf4bac2c438ad62074b66882bdaafff7f586ba8a6d0a76121f279399b" Dec 01 10:27:00 crc kubenswrapper[4958]: E1201 10:27:00.608338 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95afb3abf4bac2c438ad62074b66882bdaafff7f586ba8a6d0a76121f279399b\": container with ID starting with 95afb3abf4bac2c438ad62074b66882bdaafff7f586ba8a6d0a76121f279399b not found: ID does not exist" containerID="95afb3abf4bac2c438ad62074b66882bdaafff7f586ba8a6d0a76121f279399b" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.608373 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95afb3abf4bac2c438ad62074b66882bdaafff7f586ba8a6d0a76121f279399b"} err="failed to get container status \"95afb3abf4bac2c438ad62074b66882bdaafff7f586ba8a6d0a76121f279399b\": rpc error: code = NotFound desc = could not find container \"95afb3abf4bac2c438ad62074b66882bdaafff7f586ba8a6d0a76121f279399b\": container with ID starting with 95afb3abf4bac2c438ad62074b66882bdaafff7f586ba8a6d0a76121f279399b not found: ID does not exist" Dec 01 10:27:00 crc kubenswrapper[4958]: 
I1201 10:27:00.608395 4958 scope.go:117] "RemoveContainer" containerID="4200bda3ba54fd990388524d97e1f6fdcba56a926272e8d50c30f8f1b903d07a" Dec 01 10:27:00 crc kubenswrapper[4958]: E1201 10:27:00.608690 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4200bda3ba54fd990388524d97e1f6fdcba56a926272e8d50c30f8f1b903d07a\": container with ID starting with 4200bda3ba54fd990388524d97e1f6fdcba56a926272e8d50c30f8f1b903d07a not found: ID does not exist" containerID="4200bda3ba54fd990388524d97e1f6fdcba56a926272e8d50c30f8f1b903d07a" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.608723 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4200bda3ba54fd990388524d97e1f6fdcba56a926272e8d50c30f8f1b903d07a"} err="failed to get container status \"4200bda3ba54fd990388524d97e1f6fdcba56a926272e8d50c30f8f1b903d07a\": rpc error: code = NotFound desc = could not find container \"4200bda3ba54fd990388524d97e1f6fdcba56a926272e8d50c30f8f1b903d07a\": container with ID starting with 4200bda3ba54fd990388524d97e1f6fdcba56a926272e8d50c30f8f1b903d07a not found: ID does not exist" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.608738 4958 scope.go:117] "RemoveContainer" containerID="e6b9b28943742935972d97a1f62f5f52e2fde01492530eaa77b9b898fd2c9d84" Dec 01 10:27:00 crc kubenswrapper[4958]: E1201 10:27:00.609181 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6b9b28943742935972d97a1f62f5f52e2fde01492530eaa77b9b898fd2c9d84\": container with ID starting with e6b9b28943742935972d97a1f62f5f52e2fde01492530eaa77b9b898fd2c9d84 not found: ID does not exist" containerID="e6b9b28943742935972d97a1f62f5f52e2fde01492530eaa77b9b898fd2c9d84" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.609207 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b9b28943742935972d97a1f62f5f52e2fde01492530eaa77b9b898fd2c9d84"} err="failed to get container status \"e6b9b28943742935972d97a1f62f5f52e2fde01492530eaa77b9b898fd2c9d84\": rpc error: code = NotFound desc = could not find container \"e6b9b28943742935972d97a1f62f5f52e2fde01492530eaa77b9b898fd2c9d84\": container with ID starting with e6b9b28943742935972d97a1f62f5f52e2fde01492530eaa77b9b898fd2c9d84 not found: ID does not exist" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.609221 4958 scope.go:117] "RemoveContainer" containerID="c3c204443ad6b610f9de1fb0f8c5342451fbd72eb746d9e8e03f81b85991f68b" Dec 01 10:27:00 crc kubenswrapper[4958]: E1201 10:27:00.609735 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3c204443ad6b610f9de1fb0f8c5342451fbd72eb746d9e8e03f81b85991f68b\": container with ID starting with c3c204443ad6b610f9de1fb0f8c5342451fbd72eb746d9e8e03f81b85991f68b not found: ID does not exist" containerID="c3c204443ad6b610f9de1fb0f8c5342451fbd72eb746d9e8e03f81b85991f68b" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.609771 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c204443ad6b610f9de1fb0f8c5342451fbd72eb746d9e8e03f81b85991f68b"} err="failed to get container status \"c3c204443ad6b610f9de1fb0f8c5342451fbd72eb746d9e8e03f81b85991f68b\": rpc error: code = NotFound desc = could not find container \"c3c204443ad6b610f9de1fb0f8c5342451fbd72eb746d9e8e03f81b85991f68b\": container 
with ID starting with c3c204443ad6b610f9de1fb0f8c5342451fbd72eb746d9e8e03f81b85991f68b not found: ID does not exist" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.609790 4958 scope.go:117] "RemoveContainer" containerID="45a6973b5ad0742944d3a43778b05e2378e4234af599c52ee4c8711abdb4a457" Dec 01 10:27:00 crc kubenswrapper[4958]: E1201 10:27:00.610431 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45a6973b5ad0742944d3a43778b05e2378e4234af599c52ee4c8711abdb4a457\": container with ID starting with 45a6973b5ad0742944d3a43778b05e2378e4234af599c52ee4c8711abdb4a457 not found: ID does not exist" containerID="45a6973b5ad0742944d3a43778b05e2378e4234af599c52ee4c8711abdb4a457" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.610475 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a6973b5ad0742944d3a43778b05e2378e4234af599c52ee4c8711abdb4a457"} err="failed to get container status \"45a6973b5ad0742944d3a43778b05e2378e4234af599c52ee4c8711abdb4a457\": rpc error: code = NotFound desc = could not find container \"45a6973b5ad0742944d3a43778b05e2378e4234af599c52ee4c8711abdb4a457\": container with ID starting with 45a6973b5ad0742944d3a43778b05e2378e4234af599c52ee4c8711abdb4a457 not found: ID does not exist" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.610502 4958 scope.go:117] "RemoveContainer" containerID="52f003c4452815db13106faf9941e71f61b166933018883cfa708153b71da78e" Dec 01 10:27:00 crc kubenswrapper[4958]: E1201 10:27:00.611422 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52f003c4452815db13106faf9941e71f61b166933018883cfa708153b71da78e\": container with ID starting with 52f003c4452815db13106faf9941e71f61b166933018883cfa708153b71da78e not found: ID does not exist" containerID="52f003c4452815db13106faf9941e71f61b166933018883cfa708153b71da78e" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.611465 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52f003c4452815db13106faf9941e71f61b166933018883cfa708153b71da78e"} err="failed to get container status \"52f003c4452815db13106faf9941e71f61b166933018883cfa708153b71da78e\": rpc error: code = NotFound desc = could not find container \"52f003c4452815db13106faf9941e71f61b166933018883cfa708153b71da78e\": container with ID starting with 52f003c4452815db13106faf9941e71f61b166933018883cfa708153b71da78e not found: ID does not exist" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.611492 4958 scope.go:117] "RemoveContainer" containerID="c0de0fa89c374e451fd344254fa736a0131c4473c5d9730472766afd7c8bfd5d" Dec 01 10:27:00 crc kubenswrapper[4958]: E1201 10:27:00.611868 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0de0fa89c374e451fd344254fa736a0131c4473c5d9730472766afd7c8bfd5d\": container with ID starting with c0de0fa89c374e451fd344254fa736a0131c4473c5d9730472766afd7c8bfd5d not found: ID does not exist" containerID="c0de0fa89c374e451fd344254fa736a0131c4473c5d9730472766afd7c8bfd5d" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.611929 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0de0fa89c374e451fd344254fa736a0131c4473c5d9730472766afd7c8bfd5d"} err="failed to get container status 
\"c0de0fa89c374e451fd344254fa736a0131c4473c5d9730472766afd7c8bfd5d\": rpc error: code = NotFound desc = could not find container \"c0de0fa89c374e451fd344254fa736a0131c4473c5d9730472766afd7c8bfd5d\": container with ID starting with c0de0fa89c374e451fd344254fa736a0131c4473c5d9730472766afd7c8bfd5d not found: ID does not exist" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.611965 4958 scope.go:117] "RemoveContainer" containerID="f33975248a4564b556aea0c21d6dc5782d263058390056a1d274154337979789" Dec 01 10:27:00 crc kubenswrapper[4958]: E1201 10:27:00.612373 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f33975248a4564b556aea0c21d6dc5782d263058390056a1d274154337979789\": container with ID starting with f33975248a4564b556aea0c21d6dc5782d263058390056a1d274154337979789 not found: ID does not exist" containerID="f33975248a4564b556aea0c21d6dc5782d263058390056a1d274154337979789" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.612412 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f33975248a4564b556aea0c21d6dc5782d263058390056a1d274154337979789"} err="failed to get container status \"f33975248a4564b556aea0c21d6dc5782d263058390056a1d274154337979789\": rpc error: code = NotFound desc = could not find container \"f33975248a4564b556aea0c21d6dc5782d263058390056a1d274154337979789\": container with ID starting with f33975248a4564b556aea0c21d6dc5782d263058390056a1d274154337979789 not found: ID does not exist" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.612432 4958 scope.go:117] "RemoveContainer" containerID="865cc8ca1f68b1fb3bd9007f63e5fd084c77e4b6a22f2df02a8a5b900bbf54d6" Dec 01 10:27:00 crc kubenswrapper[4958]: E1201 10:27:00.612837 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865cc8ca1f68b1fb3bd9007f63e5fd084c77e4b6a22f2df02a8a5b900bbf54d6\": container with ID starting with 865cc8ca1f68b1fb3bd9007f63e5fd084c77e4b6a22f2df02a8a5b900bbf54d6 not found: ID does not exist" containerID="865cc8ca1f68b1fb3bd9007f63e5fd084c77e4b6a22f2df02a8a5b900bbf54d6" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.612901 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865cc8ca1f68b1fb3bd9007f63e5fd084c77e4b6a22f2df02a8a5b900bbf54d6"} err="failed to get container status \"865cc8ca1f68b1fb3bd9007f63e5fd084c77e4b6a22f2df02a8a5b900bbf54d6\": rpc error: code = NotFound desc = could not find container \"865cc8ca1f68b1fb3bd9007f63e5fd084c77e4b6a22f2df02a8a5b900bbf54d6\": container with ID starting with 865cc8ca1f68b1fb3bd9007f63e5fd084c77e4b6a22f2df02a8a5b900bbf54d6 not found: ID does not exist" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.612925 4958 scope.go:117] "RemoveContainer" containerID="52603107e98f8df2078258e1d982b451d8d71b96f8736d24999132d033bea9e0" Dec 01 10:27:00 crc kubenswrapper[4958]: E1201 10:27:00.613223 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52603107e98f8df2078258e1d982b451d8d71b96f8736d24999132d033bea9e0\": container with ID starting with 52603107e98f8df2078258e1d982b451d8d71b96f8736d24999132d033bea9e0 not found: ID does not exist" containerID="52603107e98f8df2078258e1d982b451d8d71b96f8736d24999132d033bea9e0" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.613275 4958 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52603107e98f8df2078258e1d982b451d8d71b96f8736d24999132d033bea9e0"} err="failed to get container status \"52603107e98f8df2078258e1d982b451d8d71b96f8736d24999132d033bea9e0\": rpc error: code = NotFound desc = could not find container \"52603107e98f8df2078258e1d982b451d8d71b96f8736d24999132d033bea9e0\": container with ID starting with 52603107e98f8df2078258e1d982b451d8d71b96f8736d24999132d033bea9e0 not found: ID does not exist" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.613298 4958 scope.go:117] "RemoveContainer" containerID="01c79ab63cc80d37a6df7c9d7d52b71938ac0e9677a2f6651e5410a4bf18f8d3" Dec 01 10:27:00 crc kubenswrapper[4958]: E1201 10:27:00.613821 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01c79ab63cc80d37a6df7c9d7d52b71938ac0e9677a2f6651e5410a4bf18f8d3\": container with ID starting with 01c79ab63cc80d37a6df7c9d7d52b71938ac0e9677a2f6651e5410a4bf18f8d3 not found: ID does not exist" containerID="01c79ab63cc80d37a6df7c9d7d52b71938ac0e9677a2f6651e5410a4bf18f8d3" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.613873 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01c79ab63cc80d37a6df7c9d7d52b71938ac0e9677a2f6651e5410a4bf18f8d3"} err="failed to get container status \"01c79ab63cc80d37a6df7c9d7d52b71938ac0e9677a2f6651e5410a4bf18f8d3\": rpc error: code = NotFound desc = could not find container \"01c79ab63cc80d37a6df7c9d7d52b71938ac0e9677a2f6651e5410a4bf18f8d3\": container with ID starting with 01c79ab63cc80d37a6df7c9d7d52b71938ac0e9677a2f6651e5410a4bf18f8d3 not found: ID does not exist" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.613906 4958 scope.go:117] "RemoveContainer" containerID="57bd52c9ca652b19ed4e33a202a232c60bfa9768c47969bd45f0322238f2ed9b" Dec 01 10:27:00 crc kubenswrapper[4958]: E1201 10:27:00.614250 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57bd52c9ca652b19ed4e33a202a232c60bfa9768c47969bd45f0322238f2ed9b\": container with ID starting with 57bd52c9ca652b19ed4e33a202a232c60bfa9768c47969bd45f0322238f2ed9b not found: ID does not exist" containerID="57bd52c9ca652b19ed4e33a202a232c60bfa9768c47969bd45f0322238f2ed9b" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.614295 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57bd52c9ca652b19ed4e33a202a232c60bfa9768c47969bd45f0322238f2ed9b"} err="failed to get container status \"57bd52c9ca652b19ed4e33a202a232c60bfa9768c47969bd45f0322238f2ed9b\": rpc error: code = NotFound desc = could not find container \"57bd52c9ca652b19ed4e33a202a232c60bfa9768c47969bd45f0322238f2ed9b\": container with ID starting with 57bd52c9ca652b19ed4e33a202a232c60bfa9768c47969bd45f0322238f2ed9b not found: ID does not exist" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.614315 4958 scope.go:117] "RemoveContainer" containerID="e259bff5027b46b6780434588aaaf37284c0af3fc24553b6ef8f973ea6ddfcef" Dec 01 10:27:00 crc kubenswrapper[4958]: E1201 10:27:00.614748 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e259bff5027b46b6780434588aaaf37284c0af3fc24553b6ef8f973ea6ddfcef\": container with ID starting with e259bff5027b46b6780434588aaaf37284c0af3fc24553b6ef8f973ea6ddfcef not found: ID does not exist" 
containerID="e259bff5027b46b6780434588aaaf37284c0af3fc24553b6ef8f973ea6ddfcef" Dec 01 10:27:00 crc kubenswrapper[4958]: I1201 10:27:00.614788 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e259bff5027b46b6780434588aaaf37284c0af3fc24553b6ef8f973ea6ddfcef"} err="failed to get container status \"e259bff5027b46b6780434588aaaf37284c0af3fc24553b6ef8f973ea6ddfcef\": rpc error: code = NotFound desc = could not find container \"e259bff5027b46b6780434588aaaf37284c0af3fc24553b6ef8f973ea6ddfcef\": container with ID starting with e259bff5027b46b6780434588aaaf37284c0af3fc24553b6ef8f973ea6ddfcef not found: ID does not exist" Dec 01 10:27:01 crc kubenswrapper[4958]: I1201 10:27:01.808445 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" path="/var/lib/kubelet/pods/13dcdb95-560c-4cef-90d8-5716e9bccf57/volumes" Dec 01 10:27:05 crc kubenswrapper[4958]: I1201 10:27:05.442719 4958 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podbed7f6be-9254-406b-9ed4-3fff3b2eb531"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podbed7f6be-9254-406b-9ed4-3fff3b2eb531] : Timed out while waiting for systemd to remove kubepods-besteffort-podbed7f6be_9254_406b_9ed4_3fff3b2eb531.slice" Dec 01 10:27:05 crc kubenswrapper[4958]: E1201 10:27:05.443255 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podbed7f6be-9254-406b-9ed4-3fff3b2eb531] : unable to destroy cgroup paths for cgroup [kubepods besteffort podbed7f6be-9254-406b-9ed4-3fff3b2eb531] : Timed out while waiting for systemd to remove kubepods-besteffort-podbed7f6be_9254_406b_9ed4_3fff3b2eb531.slice" pod="openstack/openstack-cell1-galera-0" podUID="bed7f6be-9254-406b-9ed4-3fff3b2eb531" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.224568 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hnc5s"] Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225055 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="207148af-4b76-49b6-80cc-883ec14bb268" containerName="placement-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225083 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="207148af-4b76-49b6-80cc-883ec14bb268" containerName="placement-log" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225128 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerName="sg-core" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225139 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerName="sg-core" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225151 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66469344-8c32-45d9-afc4-91dcb9dbe807" containerName="mariadb-account-delete" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225163 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="66469344-8c32-45d9-afc4-91dcb9dbe807" containerName="mariadb-account-delete" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225178 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0353bee2-4033-4493-9217-b5c4600d3d90" containerName="glance-httpd" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225190 4958 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0353bee2-4033-4493-9217-b5c4600d3d90" containerName="glance-httpd" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225206 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bed7f6be-9254-406b-9ed4-3fff3b2eb531" containerName="mysql-bootstrap" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225214 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed7f6be-9254-406b-9ed4-3fff3b2eb531" containerName="mysql-bootstrap" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225224 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0353bee2-4033-4493-9217-b5c4600d3d90" containerName="glance-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225231 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="0353bee2-4033-4493-9217-b5c4600d3d90" containerName="glance-log" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225250 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46e6b589-937c-42c8-8004-49e39813d622" containerName="galera" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225257 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="46e6b589-937c-42c8-8004-49e39813d622" containerName="galera" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225273 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceca985e-bec2-42fe-9758-edad828586c2" containerName="barbican-keystone-listener-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225282 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceca985e-bec2-42fe-9758-edad828586c2" containerName="barbican-keystone-listener-log" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225294 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e8c723-a7a0-4697-8369-bd224fcfdf3f" containerName="glance-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225304 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e8c723-a7a0-4697-8369-bd224fcfdf3f" containerName="glance-log" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225314 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20641f2-49c0-4492-8f1f-20b14a1f3bd3" containerName="mariadb-account-delete" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225323 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20641f2-49c0-4492-8f1f-20b14a1f3bd3" containerName="mariadb-account-delete" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225341 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovsdb-server-init" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225349 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovsdb-server-init" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225363 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03af271e-0af4-4681-a7b6-31b207d21143" containerName="memcached" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225372 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="03af271e-0af4-4681-a7b6-31b207d21143" containerName="memcached" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225382 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" containerName="ovn-northd" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225389 4958 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" containerName="ovn-northd" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225402 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79797aa2-db7b-429a-91d9-3e181de3976c" containerName="nova-api-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225410 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="79797aa2-db7b-429a-91d9-3e181de3976c" containerName="nova-api-log" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225419 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adbb9949-a0ed-4f1a-805c-fedba7ec3f1b" containerName="mariadb-account-delete" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225427 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbb9949-a0ed-4f1a-805c-fedba7ec3f1b" containerName="mariadb-account-delete" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225441 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305e96f8-0597-4c64-9026-6d6f2aa454d4" containerName="neutron-api" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225449 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="305e96f8-0597-4c64-9026-6d6f2aa454d4" containerName="neutron-api" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225464 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79797aa2-db7b-429a-91d9-3e181de3976c" containerName="nova-api-api" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225472 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="79797aa2-db7b-429a-91d9-3e181de3976c" containerName="nova-api-api" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225483 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e8c723-a7a0-4697-8369-bd224fcfdf3f" containerName="glance-httpd" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225491 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e8c723-a7a0-4697-8369-bd224fcfdf3f" containerName="glance-httpd" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225503 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" containerName="setup-container" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225511 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" containerName="setup-container" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225525 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovsdb-server" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225533 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovsdb-server" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225543 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b727a711-6b0b-44c6-917a-602f10dd0d6c" containerName="barbican-api" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225551 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b727a711-6b0b-44c6-917a-602f10dd0d6c" containerName="barbican-api" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225562 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovs-vswitchd" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225570 4958 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovs-vswitchd" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225579 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6" containerName="nova-cell1-conductor-conductor" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225587 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6" containerName="nova-cell1-conductor-conductor" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225597 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debf8fae-f15c-4b09-b185-3f47c7e0491b" containerName="openstack-network-exporter" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225606 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="debf8fae-f15c-4b09-b185-3f47c7e0491b" containerName="openstack-network-exporter" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225620 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-replicator" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225628 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-replicator" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225639 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ce268c-ffb9-4eae-93e5-23a10ba96185" containerName="kube-state-metrics" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225648 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ce268c-ffb9-4eae-93e5-23a10ba96185" containerName="kube-state-metrics" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225664 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c0931c-8919-4128-97bb-21c5872d5cf0" containerName="barbican-worker-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225671 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="81c0931c-8919-4128-97bb-21c5872d5cf0" containerName="barbican-worker-log" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225685 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caf6f35f-9e82-4899-801f-3b5e94189f6d" containerName="keystone-api" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225693 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="caf6f35f-9e82-4899-801f-3b5e94189f6d" containerName="keystone-api" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225707 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa361886-e7eb-413c-a4b4-cedaf1c86983" containerName="cinder-scheduler" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225714 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa361886-e7eb-413c-a4b4-cedaf1c86983" containerName="cinder-scheduler" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225725 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13e880d-3817-4df9-8477-82349d7979b9" containerName="ovn-controller" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225734 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13e880d-3817-4df9-8477-82349d7979b9" containerName="ovn-controller" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225748 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="swift-recon-cron" Dec 01 
10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225755 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="swift-recon-cron" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225767 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fccd607-3bfb-4593-a6de-6a0fc52b34ea" containerName="rabbitmq" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225776 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fccd607-3bfb-4593-a6de-6a0fc52b34ea" containerName="rabbitmq" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225786 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="137d864e-34d9-452c-91b0-179a93198b0f" containerName="cinder-api" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225796 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="137d864e-34d9-452c-91b0-179a93198b0f" containerName="cinder-api" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225810 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d93b36ff-312b-40e1-9e3a-b1981800da66" containerName="proxy-httpd" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225817 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d93b36ff-312b-40e1-9e3a-b1981800da66" containerName="proxy-httpd" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225825 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f7f1749-5fab-4187-8675-01747669c1b7" containerName="mariadb-account-delete" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225835 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7f1749-5fab-4187-8675-01747669c1b7" containerName="mariadb-account-delete" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225871 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="account-auditor" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225881 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="account-auditor" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225897 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceca985e-bec2-42fe-9758-edad828586c2" containerName="barbican-keystone-listener" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225906 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceca985e-bec2-42fe-9758-edad828586c2" containerName="barbican-keystone-listener" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225917 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fccd607-3bfb-4593-a6de-6a0fc52b34ea" containerName="setup-container" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225925 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fccd607-3bfb-4593-a6de-6a0fc52b34ea" containerName="setup-container" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225941 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305e96f8-0597-4c64-9026-6d6f2aa454d4" containerName="neutron-httpd" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225950 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="305e96f8-0597-4c64-9026-6d6f2aa454d4" containerName="neutron-httpd" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225966 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" 
containerName="account-replicator" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225975 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="account-replicator" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.225984 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="207148af-4b76-49b6-80cc-883ec14bb268" containerName="placement-api" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.225992 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="207148af-4b76-49b6-80cc-883ec14bb268" containerName="placement-api" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226002 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerName="ceilometer-central-agent" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226011 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerName="ceilometer-central-agent" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226027 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="032af861-fde8-4b7a-929b-2ec7f5871474" containerName="openstack-network-exporter" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226035 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="032af861-fde8-4b7a-929b-2ec7f5871474" containerName="openstack-network-exporter" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226048 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" containerName="nova-metadata-metadata" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226058 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" containerName="nova-metadata-metadata" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226069 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="rsync" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226076 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="rsync" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226084 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46e6b589-937c-42c8-8004-49e39813d622" containerName="mysql-bootstrap" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226093 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="46e6b589-937c-42c8-8004-49e39813d622" containerName="mysql-bootstrap" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226106 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-server" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226114 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-server" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226125 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="container-server" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226133 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="container-server" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226143 4958 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="de0d4422-f9bb-4abf-8727-066813a9182e" containerName="nova-scheduler-scheduler" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226151 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="de0d4422-f9bb-4abf-8727-066813a9182e" containerName="nova-scheduler-scheduler" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226167 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="account-server" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226177 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="account-server" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226187 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-expirer" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226195 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-expirer" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226209 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="container-updater" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226217 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="container-updater" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226232 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7296e8da-30a1-4c69-978f-3411bda327f7" containerName="openstack-network-exporter" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226241 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="7296e8da-30a1-4c69-978f-3411bda327f7" containerName="openstack-network-exporter" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226251 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c0931c-8919-4128-97bb-21c5872d5cf0" containerName="barbican-worker" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226259 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="81c0931c-8919-4128-97bb-21c5872d5cf0" containerName="barbican-worker" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226274 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e18a1be-f832-499c-a518-50dad90cafc9" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226284 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e18a1be-f832-499c-a518-50dad90cafc9" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226298 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d93b36ff-312b-40e1-9e3a-b1981800da66" containerName="proxy-server" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226305 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d93b36ff-312b-40e1-9e3a-b1981800da66" containerName="proxy-server" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226313 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="137d864e-34d9-452c-91b0-179a93198b0f" containerName="cinder-api-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226321 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="137d864e-34d9-452c-91b0-179a93198b0f" containerName="cinder-api-log" Dec 01 10:27:06 crc 
kubenswrapper[4958]: E1201 10:27:06.226333 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerName="proxy-httpd" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226340 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerName="proxy-httpd" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226352 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bed7f6be-9254-406b-9ed4-3fff3b2eb531" containerName="galera" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226359 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed7f6be-9254-406b-9ed4-3fff3b2eb531" containerName="galera" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226370 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="container-replicator" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226378 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="container-replicator" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226391 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc" containerName="mariadb-account-delete" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226400 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc" containerName="mariadb-account-delete" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226414 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-auditor" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226421 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-auditor" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226430 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="113fc18e-8eb6-45a5-9625-1404f4d832c4" containerName="nova-cell0-conductor-conductor" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226438 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="113fc18e-8eb6-45a5-9625-1404f4d832c4" containerName="nova-cell0-conductor-conductor" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226449 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerName="ceilometer-notification-agent" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226456 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerName="ceilometer-notification-agent" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226470 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34129d60-4706-41fe-aa19-f2a13f38713a" containerName="mariadb-account-delete" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226478 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="34129d60-4706-41fe-aa19-f2a13f38713a" containerName="mariadb-account-delete" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226492 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="account-reaper" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226500 4958 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="account-reaper" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226509 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" containerName="rabbitmq" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226517 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" containerName="rabbitmq" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226526 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="032af861-fde8-4b7a-929b-2ec7f5871474" containerName="ovsdbserver-nb" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226536 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="032af861-fde8-4b7a-929b-2ec7f5871474" containerName="ovsdbserver-nb" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226546 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-updater" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226555 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-updater" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226568 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="container-auditor" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226576 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="container-auditor" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226587 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa361886-e7eb-413c-a4b4-cedaf1c86983" containerName="probe" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226594 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa361886-e7eb-413c-a4b4-cedaf1c86983" containerName="probe" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226606 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b727a711-6b0b-44c6-917a-602f10dd0d6c" containerName="barbican-api-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226614 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b727a711-6b0b-44c6-917a-602f10dd0d6c" containerName="barbican-api-log" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226624 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debf8fae-f15c-4b09-b185-3f47c7e0491b" containerName="ovsdbserver-sb" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226631 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="debf8fae-f15c-4b09-b185-3f47c7e0491b" containerName="ovsdbserver-sb" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226644 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" containerName="openstack-network-exporter" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226652 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" containerName="openstack-network-exporter" Dec 01 10:27:06 crc kubenswrapper[4958]: E1201 10:27:06.226663 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" containerName="nova-metadata-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226671 4958 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" containerName="nova-metadata-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226872 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-auditor" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226890 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d93b36ff-312b-40e1-9e3a-b1981800da66" containerName="proxy-httpd" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226903 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4ce268c-ffb9-4eae-93e5-23a10ba96185" containerName="kube-state-metrics" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226914 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="0353bee2-4033-4493-9217-b5c4600d3d90" containerName="glance-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226925 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-updater" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226935 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-expirer" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226949 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddc45b08-3fb8-4b79-8c8c-665a8f87f3cc" containerName="mariadb-account-delete" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226960 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fccd607-3bfb-4593-a6de-6a0fc52b34ea" containerName="rabbitmq" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226974 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="container-server" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.226986 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" containerName="ovn-northd" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227000 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="b727a711-6b0b-44c6-917a-602f10dd0d6c" containerName="barbican-api" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227015 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa361886-e7eb-413c-a4b4-cedaf1c86983" containerName="cinder-scheduler" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227024 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="b727a711-6b0b-44c6-917a-602f10dd0d6c" containerName="barbican-api-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227034 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="305e96f8-0597-4c64-9026-6d6f2aa454d4" containerName="neutron-httpd" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227045 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="bed7f6be-9254-406b-9ed4-3fff3b2eb531" containerName="galera" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227058 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="de0d4422-f9bb-4abf-8727-066813a9182e" containerName="nova-scheduler-scheduler" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227068 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="7296e8da-30a1-4c69-978f-3411bda327f7" containerName="openstack-network-exporter" Dec 01 10:27:06 
crc kubenswrapper[4958]: I1201 10:27:06.227078 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" containerName="nova-metadata-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227089 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e18a1be-f832-499c-a518-50dad90cafc9" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227099 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-replicator" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227112 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="81c0931c-8919-4128-97bb-21c5872d5cf0" containerName="barbican-worker-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227124 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f7f1749-5fab-4187-8675-01747669c1b7" containerName="mariadb-account-delete" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227134 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa361886-e7eb-413c-a4b4-cedaf1c86983" containerName="probe" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227141 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="0353bee2-4033-4493-9217-b5c4600d3d90" containerName="glance-httpd" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227151 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="46e6b589-937c-42c8-8004-49e39813d622" containerName="galera" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227164 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ea0b7c-fb6b-4802-9fff-4e3995c0ed14" containerName="nova-metadata-metadata" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227174 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="account-replicator" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227187 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="account-auditor" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227199 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerName="proxy-httpd" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227211 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="207148af-4b76-49b6-80cc-883ec14bb268" containerName="placement-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227224 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="account-reaper" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227234 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerName="sg-core" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227249 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="207148af-4b76-49b6-80cc-883ec14bb268" containerName="placement-api" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227262 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d93b36ff-312b-40e1-9e3a-b1981800da66" containerName="proxy-server" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227272 4958 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="032af861-fde8-4b7a-929b-2ec7f5871474" containerName="openstack-network-exporter" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227283 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e8c723-a7a0-4697-8369-bd224fcfdf3f" containerName="glance-httpd" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227295 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerName="ceilometer-notification-agent" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227303 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="e19ffea8-2e96-4cff-a2ec-40646aaa4cc0" containerName="rabbitmq" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227312 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="113fc18e-8eb6-45a5-9625-1404f4d832c4" containerName="nova-cell0-conductor-conductor" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227326 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="debf8fae-f15c-4b09-b185-3f47c7e0491b" containerName="ovsdbserver-sb" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227335 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="container-replicator" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227344 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="69ae2c66-9d6d-4bc0-b3fe-ee729225e85f" containerName="openstack-network-exporter" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227356 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceca985e-bec2-42fe-9758-edad828586c2" containerName="barbican-keystone-listener" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227365 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="34129d60-4706-41fe-aa19-f2a13f38713a" containerName="mariadb-account-delete" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227373 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="container-auditor" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227382 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="debf8fae-f15c-4b09-b185-3f47c7e0491b" containerName="openstack-network-exporter" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227394 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="adbb9949-a0ed-4f1a-805c-fedba7ec3f1b" containerName="mariadb-account-delete" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227407 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f20641f2-49c0-4492-8f1f-20b14a1f3bd3" containerName="mariadb-account-delete" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227417 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="object-server" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227428 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="swift-recon-cron" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227440 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="137d864e-34d9-452c-91b0-179a93198b0f" containerName="cinder-api-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227451 4958 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="032af861-fde8-4b7a-929b-2ec7f5871474" containerName="ovsdbserver-nb" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227465 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="305e96f8-0597-4c64-9026-6d6f2aa454d4" containerName="neutron-api" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227476 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="89868f5c-dfd8-4619-9b7e-02a5b75916db" containerName="ceilometer-central-agent" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227486 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="03af271e-0af4-4681-a7b6-31b207d21143" containerName="memcached" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227495 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="rsync" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227505 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d13e880d-3817-4df9-8477-82349d7979b9" containerName="ovn-controller" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227513 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="81c0931c-8919-4128-97bb-21c5872d5cf0" containerName="barbican-worker" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227523 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="79797aa2-db7b-429a-91d9-3e181de3976c" containerName="nova-api-api" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227535 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="account-server" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227543 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="caf6f35f-9e82-4899-801f-3b5e94189f6d" containerName="keystone-api" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227551 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="8477d4fe-3ea3-4bc3-a5e8-5a1aa6951ca6" containerName="nova-cell1-conductor-conductor" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227565 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovsdb-server" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227575 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="137d864e-34d9-452c-91b0-179a93198b0f" containerName="cinder-api" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227584 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="66469344-8c32-45d9-afc4-91dcb9dbe807" containerName="mariadb-account-delete" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227593 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceca985e-bec2-42fe-9758-edad828586c2" containerName="barbican-keystone-listener-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227606 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dcdb95-560c-4cef-90d8-5716e9bccf57" containerName="container-updater" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227614 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e8c723-a7a0-4697-8369-bd224fcfdf3f" containerName="glance-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.227625 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c01e3885-db48-42db-aa00-ca08c6839dbd" containerName="ovs-vswitchd" Dec 01 10:27:06 crc 
kubenswrapper[4958]: I1201 10:27:06.227633 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="79797aa2-db7b-429a-91d9-3e181de3976c" containerName="nova-api-log" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.228997 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hnc5s" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.247825 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hnc5s"] Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.278878 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.302183 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqc68\" (UniqueName: \"kubernetes.io/projected/1d575277-7cb1-4ca5-b045-5826949b1312-kube-api-access-lqc68\") pod \"community-operators-hnc5s\" (UID: \"1d575277-7cb1-4ca5-b045-5826949b1312\") " pod="openshift-marketplace/community-operators-hnc5s" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.302276 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d575277-7cb1-4ca5-b045-5826949b1312-catalog-content\") pod \"community-operators-hnc5s\" (UID: \"1d575277-7cb1-4ca5-b045-5826949b1312\") " pod="openshift-marketplace/community-operators-hnc5s" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.302558 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d575277-7cb1-4ca5-b045-5826949b1312-utilities\") pod \"community-operators-hnc5s\" (UID: \"1d575277-7cb1-4ca5-b045-5826949b1312\") " pod="openshift-marketplace/community-operators-hnc5s" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.321990 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.328384 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.405746 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d575277-7cb1-4ca5-b045-5826949b1312-utilities\") pod \"community-operators-hnc5s\" (UID: \"1d575277-7cb1-4ca5-b045-5826949b1312\") " pod="openshift-marketplace/community-operators-hnc5s" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.405887 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqc68\" (UniqueName: \"kubernetes.io/projected/1d575277-7cb1-4ca5-b045-5826949b1312-kube-api-access-lqc68\") pod \"community-operators-hnc5s\" (UID: \"1d575277-7cb1-4ca5-b045-5826949b1312\") " pod="openshift-marketplace/community-operators-hnc5s" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.405925 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d575277-7cb1-4ca5-b045-5826949b1312-catalog-content\") pod \"community-operators-hnc5s\" (UID: \"1d575277-7cb1-4ca5-b045-5826949b1312\") " pod="openshift-marketplace/community-operators-hnc5s" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.406598 
4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d575277-7cb1-4ca5-b045-5826949b1312-catalog-content\") pod \"community-operators-hnc5s\" (UID: \"1d575277-7cb1-4ca5-b045-5826949b1312\") " pod="openshift-marketplace/community-operators-hnc5s" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.406879 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d575277-7cb1-4ca5-b045-5826949b1312-utilities\") pod \"community-operators-hnc5s\" (UID: \"1d575277-7cb1-4ca5-b045-5826949b1312\") " pod="openshift-marketplace/community-operators-hnc5s" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.435113 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqc68\" (UniqueName: \"kubernetes.io/projected/1d575277-7cb1-4ca5-b045-5826949b1312-kube-api-access-lqc68\") pod \"community-operators-hnc5s\" (UID: \"1d575277-7cb1-4ca5-b045-5826949b1312\") " pod="openshift-marketplace/community-operators-hnc5s" Dec 01 10:27:06 crc kubenswrapper[4958]: I1201 10:27:06.562549 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hnc5s" Dec 01 10:27:07 crc kubenswrapper[4958]: I1201 10:27:07.103375 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hnc5s"] Dec 01 10:27:07 crc kubenswrapper[4958]: I1201 10:27:07.297926 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnc5s" event={"ID":"1d575277-7cb1-4ca5-b045-5826949b1312","Type":"ContainerStarted","Data":"4b4894bb450e72c609b7a4be988eb41ce4c213cf5ea07d8e649723de5c9db022"} Dec 01 10:27:07 crc kubenswrapper[4958]: I1201 10:27:07.797549 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:27:07 crc kubenswrapper[4958]: E1201 10:27:07.798129 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:27:07 crc kubenswrapper[4958]: I1201 10:27:07.809694 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bed7f6be-9254-406b-9ed4-3fff3b2eb531" path="/var/lib/kubelet/pods/bed7f6be-9254-406b-9ed4-3fff3b2eb531/volumes" Dec 01 10:27:08 crc kubenswrapper[4958]: I1201 10:27:08.310016 4958 generic.go:334] "Generic (PLEG): container finished" podID="1d575277-7cb1-4ca5-b045-5826949b1312" containerID="4b9ce5d13565114465e5b0c46364b6b86b2504a4a149ee94b8361a275e10ab29" exitCode=0 Dec 01 10:27:08 crc kubenswrapper[4958]: I1201 10:27:08.310174 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnc5s" event={"ID":"1d575277-7cb1-4ca5-b045-5826949b1312","Type":"ContainerDied","Data":"4b9ce5d13565114465e5b0c46364b6b86b2504a4a149ee94b8361a275e10ab29"} Dec 01 10:27:08 crc kubenswrapper[4958]: I1201 10:27:08.679284 4958 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod79797aa2-db7b-429a-91d9-3e181de3976c"] err="unable to destroy cgroup paths for cgroup 
[kubepods besteffort pod79797aa2-db7b-429a-91d9-3e181de3976c] : Timed out while waiting for systemd to remove kubepods-besteffort-pod79797aa2_db7b_429a_91d9_3e181de3976c.slice" Dec 01 10:27:08 crc kubenswrapper[4958]: E1201 10:27:08.679378 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod79797aa2-db7b-429a-91d9-3e181de3976c] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod79797aa2-db7b-429a-91d9-3e181de3976c] : Timed out while waiting for systemd to remove kubepods-besteffort-pod79797aa2_db7b_429a_91d9_3e181de3976c.slice" pod="openstack/nova-api-0" podUID="79797aa2-db7b-429a-91d9-3e181de3976c" Dec 01 10:27:09 crc kubenswrapper[4958]: I1201 10:27:09.323164 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnc5s" event={"ID":"1d575277-7cb1-4ca5-b045-5826949b1312","Type":"ContainerStarted","Data":"33dd6d6bbd8405d95aab9ffd83ac079871858c397f53dc6c3926e1a78ce2145e"} Dec 01 10:27:09 crc kubenswrapper[4958]: I1201 10:27:09.323193 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 10:27:09 crc kubenswrapper[4958]: I1201 10:27:09.384459 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:27:09 crc kubenswrapper[4958]: I1201 10:27:09.390986 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 10:27:09 crc kubenswrapper[4958]: I1201 10:27:09.810147 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79797aa2-db7b-429a-91d9-3e181de3976c" path="/var/lib/kubelet/pods/79797aa2-db7b-429a-91d9-3e181de3976c/volumes" Dec 01 10:27:10 crc kubenswrapper[4958]: I1201 10:27:10.336298 4958 generic.go:334] "Generic (PLEG): container finished" podID="1d575277-7cb1-4ca5-b045-5826949b1312" containerID="33dd6d6bbd8405d95aab9ffd83ac079871858c397f53dc6c3926e1a78ce2145e" exitCode=0 Dec 01 10:27:10 crc kubenswrapper[4958]: I1201 10:27:10.336384 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnc5s" event={"ID":"1d575277-7cb1-4ca5-b045-5826949b1312","Type":"ContainerDied","Data":"33dd6d6bbd8405d95aab9ffd83ac079871858c397f53dc6c3926e1a78ce2145e"} Dec 01 10:27:11 crc kubenswrapper[4958]: I1201 10:27:11.349323 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnc5s" event={"ID":"1d575277-7cb1-4ca5-b045-5826949b1312","Type":"ContainerStarted","Data":"210ccaf766236fb920006e014e3f2a4ceb0f5468ef0527a6676e6bd01023c10c"} Dec 01 10:27:11 crc kubenswrapper[4958]: I1201 10:27:11.376679 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hnc5s" podStartSLOduration=2.894711937 podStartE2EDuration="5.376653076s" podCreationTimestamp="2025-12-01 10:27:06 +0000 UTC" firstStartedPulling="2025-12-01 10:27:08.312929317 +0000 UTC m=+1675.821718354" lastFinishedPulling="2025-12-01 10:27:10.794870456 +0000 UTC m=+1678.303659493" observedRunningTime="2025-12-01 10:27:11.369675545 +0000 UTC m=+1678.878464582" watchObservedRunningTime="2025-12-01 10:27:11.376653076 +0000 UTC m=+1678.885442113" Dec 01 10:27:16 crc kubenswrapper[4958]: I1201 10:27:16.563611 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hnc5s" Dec 01 10:27:16 crc kubenswrapper[4958]: I1201 10:27:16.565681 4958 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hnc5s" Dec 01 10:27:16 crc kubenswrapper[4958]: I1201 10:27:16.611693 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hnc5s" Dec 01 10:27:17 crc kubenswrapper[4958]: I1201 10:27:17.471874 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hnc5s" Dec 01 10:27:17 crc kubenswrapper[4958]: I1201 10:27:17.523632 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hnc5s"] Dec 01 10:27:19 crc kubenswrapper[4958]: I1201 10:27:19.431889 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hnc5s" podUID="1d575277-7cb1-4ca5-b045-5826949b1312" containerName="registry-server" containerID="cri-o://210ccaf766236fb920006e014e3f2a4ceb0f5468ef0527a6676e6bd01023c10c" gracePeriod=2 Dec 01 10:27:20 crc kubenswrapper[4958]: I1201 10:27:20.446143 4958 generic.go:334] "Generic (PLEG): container finished" podID="1d575277-7cb1-4ca5-b045-5826949b1312" containerID="210ccaf766236fb920006e014e3f2a4ceb0f5468ef0527a6676e6bd01023c10c" exitCode=0 Dec 01 10:27:20 crc kubenswrapper[4958]: I1201 10:27:20.446193 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnc5s" event={"ID":"1d575277-7cb1-4ca5-b045-5826949b1312","Type":"ContainerDied","Data":"210ccaf766236fb920006e014e3f2a4ceb0f5468ef0527a6676e6bd01023c10c"} Dec 01 10:27:20 crc kubenswrapper[4958]: I1201 10:27:20.559430 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hnc5s" Dec 01 10:27:20 crc kubenswrapper[4958]: I1201 10:27:20.669805 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d575277-7cb1-4ca5-b045-5826949b1312-catalog-content\") pod \"1d575277-7cb1-4ca5-b045-5826949b1312\" (UID: \"1d575277-7cb1-4ca5-b045-5826949b1312\") " Dec 01 10:27:20 crc kubenswrapper[4958]: I1201 10:27:20.670112 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqc68\" (UniqueName: \"kubernetes.io/projected/1d575277-7cb1-4ca5-b045-5826949b1312-kube-api-access-lqc68\") pod \"1d575277-7cb1-4ca5-b045-5826949b1312\" (UID: \"1d575277-7cb1-4ca5-b045-5826949b1312\") " Dec 01 10:27:20 crc kubenswrapper[4958]: I1201 10:27:20.670175 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d575277-7cb1-4ca5-b045-5826949b1312-utilities\") pod \"1d575277-7cb1-4ca5-b045-5826949b1312\" (UID: \"1d575277-7cb1-4ca5-b045-5826949b1312\") " Dec 01 10:27:20 crc kubenswrapper[4958]: I1201 10:27:20.671427 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d575277-7cb1-4ca5-b045-5826949b1312-utilities" (OuterVolumeSpecName: "utilities") pod "1d575277-7cb1-4ca5-b045-5826949b1312" (UID: "1d575277-7cb1-4ca5-b045-5826949b1312"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:27:20 crc kubenswrapper[4958]: I1201 10:27:20.676506 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d575277-7cb1-4ca5-b045-5826949b1312-kube-api-access-lqc68" (OuterVolumeSpecName: "kube-api-access-lqc68") pod "1d575277-7cb1-4ca5-b045-5826949b1312" (UID: "1d575277-7cb1-4ca5-b045-5826949b1312"). InnerVolumeSpecName "kube-api-access-lqc68". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:27:20 crc kubenswrapper[4958]: I1201 10:27:20.735594 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d575277-7cb1-4ca5-b045-5826949b1312-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d575277-7cb1-4ca5-b045-5826949b1312" (UID: "1d575277-7cb1-4ca5-b045-5826949b1312"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:27:20 crc kubenswrapper[4958]: I1201 10:27:20.773111 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqc68\" (UniqueName: \"kubernetes.io/projected/1d575277-7cb1-4ca5-b045-5826949b1312-kube-api-access-lqc68\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:20 crc kubenswrapper[4958]: I1201 10:27:20.773166 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d575277-7cb1-4ca5-b045-5826949b1312-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:20 crc kubenswrapper[4958]: I1201 10:27:20.773178 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d575277-7cb1-4ca5-b045-5826949b1312-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:27:21 crc kubenswrapper[4958]: I1201 10:27:21.460541 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnc5s" event={"ID":"1d575277-7cb1-4ca5-b045-5826949b1312","Type":"ContainerDied","Data":"4b4894bb450e72c609b7a4be988eb41ce4c213cf5ea07d8e649723de5c9db022"} Dec 01 10:27:21 crc kubenswrapper[4958]: I1201 10:27:21.460592 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hnc5s" Dec 01 10:27:21 crc kubenswrapper[4958]: I1201 10:27:21.460619 4958 scope.go:117] "RemoveContainer" containerID="210ccaf766236fb920006e014e3f2a4ceb0f5468ef0527a6676e6bd01023c10c" Dec 01 10:27:21 crc kubenswrapper[4958]: I1201 10:27:21.489671 4958 scope.go:117] "RemoveContainer" containerID="33dd6d6bbd8405d95aab9ffd83ac079871858c397f53dc6c3926e1a78ce2145e" Dec 01 10:27:21 crc kubenswrapper[4958]: I1201 10:27:21.512442 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hnc5s"] Dec 01 10:27:21 crc kubenswrapper[4958]: I1201 10:27:21.522745 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hnc5s"] Dec 01 10:27:21 crc kubenswrapper[4958]: I1201 10:27:21.523876 4958 scope.go:117] "RemoveContainer" containerID="4b9ce5d13565114465e5b0c46364b6b86b2504a4a149ee94b8361a275e10ab29" Dec 01 10:27:21 crc kubenswrapper[4958]: I1201 10:27:21.798312 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:27:21 crc kubenswrapper[4958]: E1201 10:27:21.798830 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:27:21 crc kubenswrapper[4958]: I1201 10:27:21.820836 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d575277-7cb1-4ca5-b045-5826949b1312" path="/var/lib/kubelet/pods/1d575277-7cb1-4ca5-b045-5826949b1312/volumes" Dec 01 10:27:36 crc kubenswrapper[4958]: I1201 10:27:36.797465 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:27:36 crc kubenswrapper[4958]: E1201 10:27:36.798687 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:27:48 crc kubenswrapper[4958]: I1201 10:27:48.798054 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:27:48 crc kubenswrapper[4958]: E1201 10:27:48.798772 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:27:49 crc kubenswrapper[4958]: I1201 10:27:49.922468 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vvkvx"] Dec 01 10:27:49 crc kubenswrapper[4958]: E1201 10:27:49.923859 4958 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1d575277-7cb1-4ca5-b045-5826949b1312" containerName="extract-utilities" Dec 01 10:27:49 crc kubenswrapper[4958]: I1201 10:27:49.923881 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d575277-7cb1-4ca5-b045-5826949b1312" containerName="extract-utilities" Dec 01 10:27:49 crc kubenswrapper[4958]: E1201 10:27:49.923909 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d575277-7cb1-4ca5-b045-5826949b1312" containerName="extract-content" Dec 01 10:27:49 crc kubenswrapper[4958]: I1201 10:27:49.924098 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d575277-7cb1-4ca5-b045-5826949b1312" containerName="extract-content" Dec 01 10:27:49 crc kubenswrapper[4958]: E1201 10:27:49.924117 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d575277-7cb1-4ca5-b045-5826949b1312" containerName="registry-server" Dec 01 10:27:49 crc kubenswrapper[4958]: I1201 10:27:49.924126 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d575277-7cb1-4ca5-b045-5826949b1312" containerName="registry-server" Dec 01 10:27:49 crc kubenswrapper[4958]: I1201 10:27:49.924381 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d575277-7cb1-4ca5-b045-5826949b1312" containerName="registry-server" Dec 01 10:27:49 crc kubenswrapper[4958]: I1201 10:27:49.926747 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvkvx" Dec 01 10:27:49 crc kubenswrapper[4958]: I1201 10:27:49.939239 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvkvx"] Dec 01 10:27:50 crc kubenswrapper[4958]: I1201 10:27:50.093870 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56-utilities\") pod \"redhat-marketplace-vvkvx\" (UID: \"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56\") " pod="openshift-marketplace/redhat-marketplace-vvkvx" Dec 01 10:27:50 crc kubenswrapper[4958]: I1201 10:27:50.093966 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5l75\" (UniqueName: \"kubernetes.io/projected/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56-kube-api-access-h5l75\") pod \"redhat-marketplace-vvkvx\" (UID: \"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56\") " pod="openshift-marketplace/redhat-marketplace-vvkvx" Dec 01 10:27:50 crc kubenswrapper[4958]: I1201 10:27:50.094001 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56-catalog-content\") pod \"redhat-marketplace-vvkvx\" (UID: \"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56\") " pod="openshift-marketplace/redhat-marketplace-vvkvx" Dec 01 10:27:50 crc kubenswrapper[4958]: I1201 10:27:50.195661 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56-utilities\") pod \"redhat-marketplace-vvkvx\" (UID: \"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56\") " pod="openshift-marketplace/redhat-marketplace-vvkvx" Dec 01 10:27:50 crc kubenswrapper[4958]: I1201 10:27:50.195733 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5l75\" (UniqueName: \"kubernetes.io/projected/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56-kube-api-access-h5l75\") pod 
\"redhat-marketplace-vvkvx\" (UID: \"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56\") " pod="openshift-marketplace/redhat-marketplace-vvkvx" Dec 01 10:27:50 crc kubenswrapper[4958]: I1201 10:27:50.195770 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56-catalog-content\") pod \"redhat-marketplace-vvkvx\" (UID: \"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56\") " pod="openshift-marketplace/redhat-marketplace-vvkvx" Dec 01 10:27:50 crc kubenswrapper[4958]: I1201 10:27:50.196606 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56-utilities\") pod \"redhat-marketplace-vvkvx\" (UID: \"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56\") " pod="openshift-marketplace/redhat-marketplace-vvkvx" Dec 01 10:27:50 crc kubenswrapper[4958]: I1201 10:27:50.196683 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56-catalog-content\") pod \"redhat-marketplace-vvkvx\" (UID: \"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56\") " pod="openshift-marketplace/redhat-marketplace-vvkvx" Dec 01 10:27:50 crc kubenswrapper[4958]: I1201 10:27:50.218972 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5l75\" (UniqueName: \"kubernetes.io/projected/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56-kube-api-access-h5l75\") pod \"redhat-marketplace-vvkvx\" (UID: \"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56\") " pod="openshift-marketplace/redhat-marketplace-vvkvx" Dec 01 10:27:50 crc kubenswrapper[4958]: I1201 10:27:50.256250 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvkvx" Dec 01 10:27:50 crc kubenswrapper[4958]: I1201 10:27:50.780713 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvkvx"] Dec 01 10:27:51 crc kubenswrapper[4958]: I1201 10:27:51.803817 4958 generic.go:334] "Generic (PLEG): container finished" podID="1d06ccad-c540-4235-9bbe-a3fd3d6ebd56" containerID="80336645b41bf6749d91e65d58b4cd81c14ac59d4ab9a4b952b6a7bd07905c28" exitCode=0 Dec 01 10:27:51 crc kubenswrapper[4958]: I1201 10:27:51.816006 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvkvx" event={"ID":"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56","Type":"ContainerDied","Data":"80336645b41bf6749d91e65d58b4cd81c14ac59d4ab9a4b952b6a7bd07905c28"} Dec 01 10:27:51 crc kubenswrapper[4958]: I1201 10:27:51.816075 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvkvx" event={"ID":"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56","Type":"ContainerStarted","Data":"e0e6b61d50dbfe161eb4ca0f95e5da6e499467116b20f49668e0b2d94b9e9a25"} Dec 01 10:27:52 crc kubenswrapper[4958]: I1201 10:27:52.823702 4958 generic.go:334] "Generic (PLEG): container finished" podID="1d06ccad-c540-4235-9bbe-a3fd3d6ebd56" containerID="2e2509fb00ada5a5f63ec1a8ea8189c5f99cc9f1186c713de65fb0bd14bcc97b" exitCode=0 Dec 01 10:27:52 crc kubenswrapper[4958]: I1201 10:27:52.824189 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvkvx" event={"ID":"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56","Type":"ContainerDied","Data":"2e2509fb00ada5a5f63ec1a8ea8189c5f99cc9f1186c713de65fb0bd14bcc97b"} Dec 01 10:27:53 crc kubenswrapper[4958]: I1201 10:27:53.840270 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvkvx" event={"ID":"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56","Type":"ContainerStarted","Data":"e3fcfe4d379112530f3eee83a635162543c2569ebd5c31bf6aad46d7a8b66868"} Dec 01 10:27:53 crc kubenswrapper[4958]: I1201 10:27:53.865675 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vvkvx" podStartSLOduration=3.173818453 podStartE2EDuration="4.865648033s" podCreationTimestamp="2025-12-01 10:27:49 +0000 UTC" firstStartedPulling="2025-12-01 10:27:51.808095986 +0000 UTC m=+1719.316885023" lastFinishedPulling="2025-12-01 10:27:53.499925566 +0000 UTC m=+1721.008714603" observedRunningTime="2025-12-01 10:27:53.864484509 +0000 UTC m=+1721.373273556" watchObservedRunningTime="2025-12-01 10:27:53.865648033 +0000 UTC m=+1721.374437090" Dec 01 10:28:00 crc kubenswrapper[4958]: I1201 10:28:00.257638 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vvkvx" Dec 01 10:28:00 crc kubenswrapper[4958]: I1201 10:28:00.258689 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vvkvx" Dec 01 10:28:00 crc kubenswrapper[4958]: I1201 10:28:00.306300 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vvkvx" Dec 01 10:28:00 crc kubenswrapper[4958]: I1201 10:28:00.960399 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vvkvx" Dec 01 10:28:01 crc kubenswrapper[4958]: I1201 10:28:01.017587 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-vvkvx"] Dec 01 10:28:01 crc kubenswrapper[4958]: I1201 10:28:01.797671 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:28:01 crc kubenswrapper[4958]: E1201 10:28:01.798428 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:28:02 crc kubenswrapper[4958]: I1201 10:28:02.933737 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vvkvx" podUID="1d06ccad-c540-4235-9bbe-a3fd3d6ebd56" containerName="registry-server" containerID="cri-o://e3fcfe4d379112530f3eee83a635162543c2569ebd5c31bf6aad46d7a8b66868" gracePeriod=2 Dec 01 10:28:03 crc kubenswrapper[4958]: I1201 10:28:03.906932 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvkvx" Dec 01 10:28:03 crc kubenswrapper[4958]: I1201 10:28:03.952164 4958 generic.go:334] "Generic (PLEG): container finished" podID="1d06ccad-c540-4235-9bbe-a3fd3d6ebd56" containerID="e3fcfe4d379112530f3eee83a635162543c2569ebd5c31bf6aad46d7a8b66868" exitCode=0 Dec 01 10:28:03 crc kubenswrapper[4958]: I1201 10:28:03.952229 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvkvx" event={"ID":"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56","Type":"ContainerDied","Data":"e3fcfe4d379112530f3eee83a635162543c2569ebd5c31bf6aad46d7a8b66868"} Dec 01 10:28:03 crc kubenswrapper[4958]: I1201 10:28:03.952245 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvkvx" Dec 01 10:28:03 crc kubenswrapper[4958]: I1201 10:28:03.952278 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvkvx" event={"ID":"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56","Type":"ContainerDied","Data":"e0e6b61d50dbfe161eb4ca0f95e5da6e499467116b20f49668e0b2d94b9e9a25"} Dec 01 10:28:03 crc kubenswrapper[4958]: I1201 10:28:03.952305 4958 scope.go:117] "RemoveContainer" containerID="e3fcfe4d379112530f3eee83a635162543c2569ebd5c31bf6aad46d7a8b66868" Dec 01 10:28:03 crc kubenswrapper[4958]: I1201 10:28:03.978206 4958 scope.go:117] "RemoveContainer" containerID="2e2509fb00ada5a5f63ec1a8ea8189c5f99cc9f1186c713de65fb0bd14bcc97b" Dec 01 10:28:04 crc kubenswrapper[4958]: I1201 10:28:04.002999 4958 scope.go:117] "RemoveContainer" containerID="80336645b41bf6749d91e65d58b4cd81c14ac59d4ab9a4b952b6a7bd07905c28" Dec 01 10:28:04 crc kubenswrapper[4958]: I1201 10:28:04.027254 4958 scope.go:117] "RemoveContainer" containerID="e3fcfe4d379112530f3eee83a635162543c2569ebd5c31bf6aad46d7a8b66868" Dec 01 10:28:04 crc kubenswrapper[4958]: E1201 10:28:04.028032 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3fcfe4d379112530f3eee83a635162543c2569ebd5c31bf6aad46d7a8b66868\": container with ID starting with e3fcfe4d379112530f3eee83a635162543c2569ebd5c31bf6aad46d7a8b66868 not found: ID does not exist" containerID="e3fcfe4d379112530f3eee83a635162543c2569ebd5c31bf6aad46d7a8b66868" Dec 01 10:28:04 crc kubenswrapper[4958]: I1201 10:28:04.028103 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3fcfe4d379112530f3eee83a635162543c2569ebd5c31bf6aad46d7a8b66868"} err="failed to get container status \"e3fcfe4d379112530f3eee83a635162543c2569ebd5c31bf6aad46d7a8b66868\": rpc error: code = NotFound desc = could not find container \"e3fcfe4d379112530f3eee83a635162543c2569ebd5c31bf6aad46d7a8b66868\": container with ID starting with e3fcfe4d379112530f3eee83a635162543c2569ebd5c31bf6aad46d7a8b66868 not found: ID does not exist" Dec 01 10:28:04 crc kubenswrapper[4958]: I1201 10:28:04.028141 4958 scope.go:117] "RemoveContainer" containerID="2e2509fb00ada5a5f63ec1a8ea8189c5f99cc9f1186c713de65fb0bd14bcc97b" Dec 01 10:28:04 crc kubenswrapper[4958]: E1201 10:28:04.028693 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e2509fb00ada5a5f63ec1a8ea8189c5f99cc9f1186c713de65fb0bd14bcc97b\": container with ID starting with 2e2509fb00ada5a5f63ec1a8ea8189c5f99cc9f1186c713de65fb0bd14bcc97b not found: ID does not exist" containerID="2e2509fb00ada5a5f63ec1a8ea8189c5f99cc9f1186c713de65fb0bd14bcc97b" Dec 01 10:28:04 crc kubenswrapper[4958]: I1201 10:28:04.028742 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e2509fb00ada5a5f63ec1a8ea8189c5f99cc9f1186c713de65fb0bd14bcc97b"} err="failed to get container status \"2e2509fb00ada5a5f63ec1a8ea8189c5f99cc9f1186c713de65fb0bd14bcc97b\": rpc error: code = NotFound desc = could not find container \"2e2509fb00ada5a5f63ec1a8ea8189c5f99cc9f1186c713de65fb0bd14bcc97b\": container with ID starting with 2e2509fb00ada5a5f63ec1a8ea8189c5f99cc9f1186c713de65fb0bd14bcc97b not found: ID does not exist" Dec 01 10:28:04 crc kubenswrapper[4958]: I1201 10:28:04.028772 4958 scope.go:117] "RemoveContainer" 
containerID="80336645b41bf6749d91e65d58b4cd81c14ac59d4ab9a4b952b6a7bd07905c28" Dec 01 10:28:04 crc kubenswrapper[4958]: E1201 10:28:04.029351 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80336645b41bf6749d91e65d58b4cd81c14ac59d4ab9a4b952b6a7bd07905c28\": container with ID starting with 80336645b41bf6749d91e65d58b4cd81c14ac59d4ab9a4b952b6a7bd07905c28 not found: ID does not exist" containerID="80336645b41bf6749d91e65d58b4cd81c14ac59d4ab9a4b952b6a7bd07905c28" Dec 01 10:28:04 crc kubenswrapper[4958]: I1201 10:28:04.029406 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80336645b41bf6749d91e65d58b4cd81c14ac59d4ab9a4b952b6a7bd07905c28"} err="failed to get container status \"80336645b41bf6749d91e65d58b4cd81c14ac59d4ab9a4b952b6a7bd07905c28\": rpc error: code = NotFound desc = could not find container \"80336645b41bf6749d91e65d58b4cd81c14ac59d4ab9a4b952b6a7bd07905c28\": container with ID starting with 80336645b41bf6749d91e65d58b4cd81c14ac59d4ab9a4b952b6a7bd07905c28 not found: ID does not exist" Dec 01 10:28:04 crc kubenswrapper[4958]: I1201 10:28:04.055222 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56-utilities\") pod \"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56\" (UID: \"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56\") " Dec 01 10:28:04 crc kubenswrapper[4958]: I1201 10:28:04.055405 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56-catalog-content\") pod \"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56\" (UID: \"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56\") " Dec 01 10:28:04 crc kubenswrapper[4958]: I1201 10:28:04.055510 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5l75\" (UniqueName: \"kubernetes.io/projected/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56-kube-api-access-h5l75\") pod \"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56\" (UID: \"1d06ccad-c540-4235-9bbe-a3fd3d6ebd56\") " Dec 01 10:28:04 crc kubenswrapper[4958]: I1201 10:28:04.056351 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56-utilities" (OuterVolumeSpecName: "utilities") pod "1d06ccad-c540-4235-9bbe-a3fd3d6ebd56" (UID: "1d06ccad-c540-4235-9bbe-a3fd3d6ebd56"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:28:04 crc kubenswrapper[4958]: I1201 10:28:04.064209 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56-kube-api-access-h5l75" (OuterVolumeSpecName: "kube-api-access-h5l75") pod "1d06ccad-c540-4235-9bbe-a3fd3d6ebd56" (UID: "1d06ccad-c540-4235-9bbe-a3fd3d6ebd56"). InnerVolumeSpecName "kube-api-access-h5l75". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:28:04 crc kubenswrapper[4958]: I1201 10:28:04.080537 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d06ccad-c540-4235-9bbe-a3fd3d6ebd56" (UID: "1d06ccad-c540-4235-9bbe-a3fd3d6ebd56"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:28:04 crc kubenswrapper[4958]: I1201 10:28:04.157769 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:28:04 crc kubenswrapper[4958]: I1201 10:28:04.157862 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5l75\" (UniqueName: \"kubernetes.io/projected/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56-kube-api-access-h5l75\") on node \"crc\" DevicePath \"\"" Dec 01 10:28:04 crc kubenswrapper[4958]: I1201 10:28:04.157881 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:28:04 crc kubenswrapper[4958]: I1201 10:28:04.295472 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvkvx"] Dec 01 10:28:04 crc kubenswrapper[4958]: I1201 10:28:04.301567 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvkvx"] Dec 01 10:28:05 crc kubenswrapper[4958]: I1201 10:28:05.808212 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d06ccad-c540-4235-9bbe-a3fd3d6ebd56" path="/var/lib/kubelet/pods/1d06ccad-c540-4235-9bbe-a3fd3d6ebd56/volumes" Dec 01 10:28:13 crc kubenswrapper[4958]: I1201 10:28:13.803292 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:28:13 crc kubenswrapper[4958]: E1201 10:28:13.803994 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:28:18 crc kubenswrapper[4958]: I1201 10:28:18.460357 4958 scope.go:117] "RemoveContainer" containerID="7da282837f15c5c330a6de482a830c5d5983b27e8b8422d3b3404e3df5d4677c" Dec 01 10:28:18 crc kubenswrapper[4958]: I1201 10:28:18.499445 4958 scope.go:117] "RemoveContainer" containerID="c0bfa711eee72c17bb35be6e2fb314864b58daa10417660b9896ddb1a68d6efb" Dec 01 10:28:18 crc kubenswrapper[4958]: I1201 10:28:18.548073 4958 scope.go:117] "RemoveContainer" containerID="1398aa8836f87ff8e2050f7a48826330e2d07470c0e7bec124b310a45ec5904e" Dec 01 10:28:18 crc kubenswrapper[4958]: I1201 10:28:18.580315 4958 scope.go:117] "RemoveContainer" containerID="31b1419cc86dd7611c8a535d42a0b25a61d4f290db0f3338f6d6fe69be403ae5" Dec 01 10:28:18 crc kubenswrapper[4958]: I1201 10:28:18.604155 4958 scope.go:117] "RemoveContainer" containerID="deb1c57b48cdb68fd86978602ee1e6c2e55a1fe310735f3a32fa71f717d847f5" Dec 01 10:28:18 crc kubenswrapper[4958]: I1201 10:28:18.635515 4958 scope.go:117] "RemoveContainer" containerID="c5e27fb4d7e81848fd212dbd23876fccd05db89724a17e1dceaecd9899ecd408" Dec 01 10:28:18 crc kubenswrapper[4958]: I1201 10:28:18.662143 4958 scope.go:117] "RemoveContainer" containerID="430f0f91f844715880d343270142d34691e2b25ebdb7c3450f79ba078ccef680" Dec 01 10:28:18 crc kubenswrapper[4958]: I1201 10:28:18.698371 4958 scope.go:117] "RemoveContainer" 
containerID="dbbea99c45871ffc560c5fecdb1ada0ff900af7e88a825fd2a8b7541c66b703d" Dec 01 10:28:18 crc kubenswrapper[4958]: I1201 10:28:18.735617 4958 scope.go:117] "RemoveContainer" containerID="2c5862a636d2560b2ccb30504701cdf2c56d0b52932f967751e00af5eddbd64b" Dec 01 10:28:18 crc kubenswrapper[4958]: I1201 10:28:18.775616 4958 scope.go:117] "RemoveContainer" containerID="78e6e5e6b60b39e59394d92a770491f46e85b9abae994ce35591f156c97b948a" Dec 01 10:28:18 crc kubenswrapper[4958]: I1201 10:28:18.800620 4958 scope.go:117] "RemoveContainer" containerID="ca54c1a38a9a153d513651af777d3c608aa78a98d64d4cff903ae8750c918ff9" Dec 01 10:28:18 crc kubenswrapper[4958]: I1201 10:28:18.821342 4958 scope.go:117] "RemoveContainer" containerID="a093c09932dcf809929bd91f191d41a237cb98ecb47ecd5032d496a5cc147ca2" Dec 01 10:28:18 crc kubenswrapper[4958]: I1201 10:28:18.847955 4958 scope.go:117] "RemoveContainer" containerID="d934e39a11b02d3b7b7d7d1ff2ed4bef1dea9bdfc1beaa6858543bf4d3d6b1f1" Dec 01 10:28:18 crc kubenswrapper[4958]: I1201 10:28:18.872034 4958 scope.go:117] "RemoveContainer" containerID="70bb5873249a6ef3c7a1f7875843e58f1ab41d3b195ac1f99ab710c1efd9105f" Dec 01 10:28:18 crc kubenswrapper[4958]: I1201 10:28:18.894044 4958 scope.go:117] "RemoveContainer" containerID="763193f6deea6c4c07d81eb81043b214061144fe250629bb1739bcf78abea07b" Dec 01 10:28:18 crc kubenswrapper[4958]: I1201 10:28:18.916349 4958 scope.go:117] "RemoveContainer" containerID="836428302d34363977d7a765ecb4dde51fd074295bc44cfb1a06531999e062bf" Dec 01 10:28:28 crc kubenswrapper[4958]: I1201 10:28:28.798238 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:28:28 crc kubenswrapper[4958]: E1201 10:28:28.799106 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:28:42 crc kubenswrapper[4958]: I1201 10:28:42.798103 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:28:42 crc kubenswrapper[4958]: E1201 10:28:42.799158 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:28:57 crc kubenswrapper[4958]: I1201 10:28:57.798449 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:28:57 crc kubenswrapper[4958]: E1201 10:28:57.799399 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:29:09 crc 
kubenswrapper[4958]: I1201 10:29:09.797582 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:29:09 crc kubenswrapper[4958]: E1201 10:29:09.798766 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:29:19 crc kubenswrapper[4958]: I1201 10:29:19.481505 4958 scope.go:117] "RemoveContainer" containerID="846a70f76927873aca913fa654d135f5d9f7402003f4d45aa4fb853953fe65db" Dec 01 10:29:19 crc kubenswrapper[4958]: I1201 10:29:19.518631 4958 scope.go:117] "RemoveContainer" containerID="a0cfc1461041500cfd7cf071381e4add7d3a6ba36c83834977f794ded000977b" Dec 01 10:29:19 crc kubenswrapper[4958]: I1201 10:29:19.588098 4958 scope.go:117] "RemoveContainer" containerID="f8e01024d3d1f7366ea417fc2cc49c8d648cbdcab196552e1b491adcaca19787" Dec 01 10:29:19 crc kubenswrapper[4958]: I1201 10:29:19.640541 4958 scope.go:117] "RemoveContainer" containerID="bc7066c75cdf8d7cc442eef94e9c6386e682d8da59456b2e98254a41c5f417c9" Dec 01 10:29:19 crc kubenswrapper[4958]: I1201 10:29:19.679506 4958 scope.go:117] "RemoveContainer" containerID="944f885a93b277fd5b446c26c7b9c7db505641ce99d66fc0eee7275265da1587" Dec 01 10:29:19 crc kubenswrapper[4958]: I1201 10:29:19.710831 4958 scope.go:117] "RemoveContainer" containerID="a5061e1758cadcbddcf9d7fd1235664eaefd5fcc99c2a0707ae4e34496238238" Dec 01 10:29:19 crc kubenswrapper[4958]: I1201 10:29:19.733832 4958 scope.go:117] "RemoveContainer" containerID="4b8027571ba9ea338e49eff27a4a3599eeda73550a750f774a719fece40e20fe" Dec 01 10:29:19 crc kubenswrapper[4958]: I1201 10:29:19.776952 4958 scope.go:117] "RemoveContainer" containerID="2c69fba39bcc23fd51b0e44fe893c7a695b06edee1de7171ac930180b17f1ebe" Dec 01 10:29:19 crc kubenswrapper[4958]: I1201 10:29:19.803509 4958 scope.go:117] "RemoveContainer" containerID="8d7386e6d905cc7e682cbc9fcbfef1b78be385a0409c687cf999a6495aba6952" Dec 01 10:29:19 crc kubenswrapper[4958]: I1201 10:29:19.825616 4958 scope.go:117] "RemoveContainer" containerID="0e6afa27cac7b16ea18ffb082028d7d4a5d1af380112b0080ebda5f523c8d154" Dec 01 10:29:19 crc kubenswrapper[4958]: I1201 10:29:19.846029 4958 scope.go:117] "RemoveContainer" containerID="7e4c8d5d756ae1f1f79baf6ab39a0885d96760214ce8bd8adfe3bf8d0265ce4e" Dec 01 10:29:19 crc kubenswrapper[4958]: I1201 10:29:19.872004 4958 scope.go:117] "RemoveContainer" containerID="67ff930450c58e6b7b81689867e0c06157ecda7661b4fa1b9f842961e72aa562" Dec 01 10:29:19 crc kubenswrapper[4958]: I1201 10:29:19.928695 4958 scope.go:117] "RemoveContainer" containerID="01d50f6c5d2b0216863f0e7c400427a1bc20e5a49e0b2002b3e0226d6cf84d7c" Dec 01 10:29:19 crc kubenswrapper[4958]: I1201 10:29:19.988658 4958 scope.go:117] "RemoveContainer" containerID="b1011c74e73144076e25dd1c4df809bab366b279b9f967a7d28bfcadc81ae769" Dec 01 10:29:20 crc kubenswrapper[4958]: I1201 10:29:20.021612 4958 scope.go:117] "RemoveContainer" containerID="943e338fab2cd272e835a178a99551fb42c61777d5c489b740c986254d67e962" Dec 01 10:29:23 crc kubenswrapper[4958]: I1201 10:29:23.805190 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:29:23 crc 
kubenswrapper[4958]: E1201 10:29:23.807030 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:29:36 crc kubenswrapper[4958]: I1201 10:29:36.797373 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:29:36 crc kubenswrapper[4958]: E1201 10:29:36.798263 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:29:51 crc kubenswrapper[4958]: I1201 10:29:51.798034 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:29:51 crc kubenswrapper[4958]: E1201 10:29:51.799140 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.234124 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt"] Dec 01 10:30:00 crc kubenswrapper[4958]: E1201 10:30:00.235301 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d06ccad-c540-4235-9bbe-a3fd3d6ebd56" containerName="extract-utilities" Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.235324 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d06ccad-c540-4235-9bbe-a3fd3d6ebd56" containerName="extract-utilities" Dec 01 10:30:00 crc kubenswrapper[4958]: E1201 10:30:00.235352 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d06ccad-c540-4235-9bbe-a3fd3d6ebd56" containerName="registry-server" Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.235361 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d06ccad-c540-4235-9bbe-a3fd3d6ebd56" containerName="registry-server" Dec 01 10:30:00 crc kubenswrapper[4958]: E1201 10:30:00.235385 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d06ccad-c540-4235-9bbe-a3fd3d6ebd56" containerName="extract-content" Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.235391 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d06ccad-c540-4235-9bbe-a3fd3d6ebd56" containerName="extract-content" Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.235572 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d06ccad-c540-4235-9bbe-a3fd3d6ebd56" containerName="registry-server" Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.236350 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt" Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.239235 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.239677 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.248749 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt"] Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.427513 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec14779d-958a-438c-95e8-721e994a1d6d-secret-volume\") pod \"collect-profiles-29409750-gbtnt\" (UID: \"ec14779d-958a-438c-95e8-721e994a1d6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt" Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.427573 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec14779d-958a-438c-95e8-721e994a1d6d-config-volume\") pod \"collect-profiles-29409750-gbtnt\" (UID: \"ec14779d-958a-438c-95e8-721e994a1d6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt" Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.427655 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gxwm\" (UniqueName: \"kubernetes.io/projected/ec14779d-958a-438c-95e8-721e994a1d6d-kube-api-access-2gxwm\") pod \"collect-profiles-29409750-gbtnt\" (UID: \"ec14779d-958a-438c-95e8-721e994a1d6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt" Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.529053 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec14779d-958a-438c-95e8-721e994a1d6d-secret-volume\") pod \"collect-profiles-29409750-gbtnt\" (UID: \"ec14779d-958a-438c-95e8-721e994a1d6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt" Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.529161 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec14779d-958a-438c-95e8-721e994a1d6d-config-volume\") pod \"collect-profiles-29409750-gbtnt\" (UID: \"ec14779d-958a-438c-95e8-721e994a1d6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt" Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.529240 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gxwm\" (UniqueName: \"kubernetes.io/projected/ec14779d-958a-438c-95e8-721e994a1d6d-kube-api-access-2gxwm\") pod \"collect-profiles-29409750-gbtnt\" (UID: \"ec14779d-958a-438c-95e8-721e994a1d6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt" Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.530642 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec14779d-958a-438c-95e8-721e994a1d6d-config-volume\") pod 
\"collect-profiles-29409750-gbtnt\" (UID: \"ec14779d-958a-438c-95e8-721e994a1d6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt" Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.544341 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec14779d-958a-438c-95e8-721e994a1d6d-secret-volume\") pod \"collect-profiles-29409750-gbtnt\" (UID: \"ec14779d-958a-438c-95e8-721e994a1d6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt" Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.550683 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gxwm\" (UniqueName: \"kubernetes.io/projected/ec14779d-958a-438c-95e8-721e994a1d6d-kube-api-access-2gxwm\") pod \"collect-profiles-29409750-gbtnt\" (UID: \"ec14779d-958a-438c-95e8-721e994a1d6d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt" Dec 01 10:30:00 crc kubenswrapper[4958]: I1201 10:30:00.563625 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt" Dec 01 10:30:01 crc kubenswrapper[4958]: I1201 10:30:01.038145 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt"] Dec 01 10:30:01 crc kubenswrapper[4958]: I1201 10:30:01.454639 4958 generic.go:334] "Generic (PLEG): container finished" podID="ec14779d-958a-438c-95e8-721e994a1d6d" containerID="dbb51fdf4dcf5aa0e37ed33343a6ed85a8a48df098be737a2509423bcc49c5d2" exitCode=0 Dec 01 10:30:01 crc kubenswrapper[4958]: I1201 10:30:01.454722 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt" event={"ID":"ec14779d-958a-438c-95e8-721e994a1d6d","Type":"ContainerDied","Data":"dbb51fdf4dcf5aa0e37ed33343a6ed85a8a48df098be737a2509423bcc49c5d2"} Dec 01 10:30:01 crc kubenswrapper[4958]: I1201 10:30:01.455040 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt" event={"ID":"ec14779d-958a-438c-95e8-721e994a1d6d","Type":"ContainerStarted","Data":"b8c1a48eae84634fe2f0bf70d19b506b3341e1bdb9a32bbc45fbc0773a3af886"} Dec 01 10:30:02 crc kubenswrapper[4958]: I1201 10:30:02.791814 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt" Dec 01 10:30:02 crc kubenswrapper[4958]: I1201 10:30:02.796906 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:30:02 crc kubenswrapper[4958]: E1201 10:30:02.797263 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:30:02 crc kubenswrapper[4958]: I1201 10:30:02.977277 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec14779d-958a-438c-95e8-721e994a1d6d-config-volume\") pod \"ec14779d-958a-438c-95e8-721e994a1d6d\" (UID: \"ec14779d-958a-438c-95e8-721e994a1d6d\") " Dec 01 10:30:02 crc kubenswrapper[4958]: I1201 10:30:02.977413 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec14779d-958a-438c-95e8-721e994a1d6d-secret-volume\") pod \"ec14779d-958a-438c-95e8-721e994a1d6d\" (UID: \"ec14779d-958a-438c-95e8-721e994a1d6d\") " Dec 01 10:30:02 crc kubenswrapper[4958]: I1201 10:30:02.977544 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gxwm\" (UniqueName: \"kubernetes.io/projected/ec14779d-958a-438c-95e8-721e994a1d6d-kube-api-access-2gxwm\") pod \"ec14779d-958a-438c-95e8-721e994a1d6d\" (UID: \"ec14779d-958a-438c-95e8-721e994a1d6d\") " Dec 01 10:30:02 crc kubenswrapper[4958]: I1201 10:30:02.978732 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec14779d-958a-438c-95e8-721e994a1d6d-config-volume" (OuterVolumeSpecName: "config-volume") pod "ec14779d-958a-438c-95e8-721e994a1d6d" (UID: "ec14779d-958a-438c-95e8-721e994a1d6d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:30:02 crc kubenswrapper[4958]: I1201 10:30:02.984801 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec14779d-958a-438c-95e8-721e994a1d6d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ec14779d-958a-438c-95e8-721e994a1d6d" (UID: "ec14779d-958a-438c-95e8-721e994a1d6d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:30:02 crc kubenswrapper[4958]: I1201 10:30:02.998686 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec14779d-958a-438c-95e8-721e994a1d6d-kube-api-access-2gxwm" (OuterVolumeSpecName: "kube-api-access-2gxwm") pod "ec14779d-958a-438c-95e8-721e994a1d6d" (UID: "ec14779d-958a-438c-95e8-721e994a1d6d"). InnerVolumeSpecName "kube-api-access-2gxwm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:30:03 crc kubenswrapper[4958]: I1201 10:30:03.079740 4958 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec14779d-958a-438c-95e8-721e994a1d6d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:30:03 crc kubenswrapper[4958]: I1201 10:30:03.079818 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gxwm\" (UniqueName: \"kubernetes.io/projected/ec14779d-958a-438c-95e8-721e994a1d6d-kube-api-access-2gxwm\") on node \"crc\" DevicePath \"\"" Dec 01 10:30:03 crc kubenswrapper[4958]: I1201 10:30:03.079839 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec14779d-958a-438c-95e8-721e994a1d6d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:30:03 crc kubenswrapper[4958]: I1201 10:30:03.477291 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt" event={"ID":"ec14779d-958a-438c-95e8-721e994a1d6d","Type":"ContainerDied","Data":"b8c1a48eae84634fe2f0bf70d19b506b3341e1bdb9a32bbc45fbc0773a3af886"} Dec 01 10:30:03 crc kubenswrapper[4958]: I1201 10:30:03.477760 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8c1a48eae84634fe2f0bf70d19b506b3341e1bdb9a32bbc45fbc0773a3af886" Dec 01 10:30:03 crc kubenswrapper[4958]: I1201 10:30:03.477382 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt" Dec 01 10:30:14 crc kubenswrapper[4958]: I1201 10:30:14.798322 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:30:14 crc kubenswrapper[4958]: E1201 10:30:14.799633 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:30:20 crc kubenswrapper[4958]: I1201 10:30:20.273219 4958 scope.go:117] "RemoveContainer" containerID="e80dc32b41343dbfc30bfd6b78f32bf55eb0ee1658dc3ebf612e8b8f2e8503f1" Dec 01 10:30:20 crc kubenswrapper[4958]: I1201 10:30:20.309974 4958 scope.go:117] "RemoveContainer" containerID="75ef5854c112506bddc38901be3f8f0d961a7821a0620ac683dbab287bec31b2" Dec 01 10:30:20 crc kubenswrapper[4958]: I1201 10:30:20.376260 4958 scope.go:117] "RemoveContainer" containerID="912676adc0639820ea8a098452e8e4b85618ac2b2c6eda07e38c9a1e111eefc8" Dec 01 10:30:20 crc kubenswrapper[4958]: I1201 10:30:20.399583 4958 scope.go:117] "RemoveContainer" containerID="10ffeb225a861504e1ad9f727e8c89225a65e1049e0b9c646991e351339be630" Dec 01 10:30:20 crc kubenswrapper[4958]: I1201 10:30:20.435254 4958 scope.go:117] "RemoveContainer" containerID="5ba1d7fe6b72c2cd6f3b2f648908da870b194812c52cf3a428bff1214abe4914" Dec 01 10:30:20 crc kubenswrapper[4958]: I1201 10:30:20.474436 4958 scope.go:117] "RemoveContainer" containerID="077cb1d015bf3a6a753139b2136a56d260fff0902dafa0b2362075ef162e2616" Dec 01 10:30:20 crc kubenswrapper[4958]: I1201 10:30:20.511625 4958 scope.go:117] "RemoveContainer" 
containerID="cfeb39b686f44466325e4103c9715785337ae56a4d6010ba306c05d7d1433916" Dec 01 10:30:20 crc kubenswrapper[4958]: I1201 10:30:20.540723 4958 scope.go:117] "RemoveContainer" containerID="7d7a5225745487604efb9f143377215aa48f67ee4250f1a8e483b04bf43b8176" Dec 01 10:30:20 crc kubenswrapper[4958]: I1201 10:30:20.565127 4958 scope.go:117] "RemoveContainer" containerID="5bda70cb089e79ebe24f795c82ed089dd6bc77e03b0ab3ec28946a52904bf64b" Dec 01 10:30:27 crc kubenswrapper[4958]: I1201 10:30:27.798116 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:30:27 crc kubenswrapper[4958]: E1201 10:30:27.799265 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:30:40 crc kubenswrapper[4958]: I1201 10:30:40.797787 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:30:41 crc kubenswrapper[4958]: I1201 10:30:41.014830 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"76d51111155250b47587fb99acb61eecae311759f1a6cd0cb3e2d28ffb1cec45"} Dec 01 10:31:20 crc kubenswrapper[4958]: I1201 10:31:20.774835 4958 scope.go:117] "RemoveContainer" containerID="63c03bfaba87edf547b131ec62039fd6a3dce67973e3a6dd92756348441874d8" Dec 01 10:31:20 crc kubenswrapper[4958]: I1201 10:31:20.839257 4958 scope.go:117] "RemoveContainer" containerID="0c45fd6464c1e425a10c1b58a4752dccdff825de5fe5bc44db608fe475975eb9" Dec 01 10:32:20 crc kubenswrapper[4958]: I1201 10:32:20.960044 4958 scope.go:117] "RemoveContainer" containerID="05071c092989d02335035ebd45342eedf9166ca955a2f85c2d8550907c62b4ea" Dec 01 10:32:58 crc kubenswrapper[4958]: I1201 10:32:58.210510 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:32:58 crc kubenswrapper[4958]: I1201 10:32:58.211460 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:33:21 crc kubenswrapper[4958]: I1201 10:33:21.045624 4958 scope.go:117] "RemoveContainer" containerID="8168f6cbe0bc798da68a2e2d2fe7de44eece7fb512c444ab5a26c173069687de" Dec 01 10:33:28 crc kubenswrapper[4958]: I1201 10:33:28.211944 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:33:28 crc kubenswrapper[4958]: I1201 10:33:28.215066 4958 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:33:42 crc kubenswrapper[4958]: I1201 10:33:42.678479 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4f8qt"] Dec 01 10:33:42 crc kubenswrapper[4958]: E1201 10:33:42.679660 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec14779d-958a-438c-95e8-721e994a1d6d" containerName="collect-profiles" Dec 01 10:33:42 crc kubenswrapper[4958]: I1201 10:33:42.679682 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec14779d-958a-438c-95e8-721e994a1d6d" containerName="collect-profiles" Dec 01 10:33:42 crc kubenswrapper[4958]: I1201 10:33:42.679911 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec14779d-958a-438c-95e8-721e994a1d6d" containerName="collect-profiles" Dec 01 10:33:42 crc kubenswrapper[4958]: I1201 10:33:42.682018 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4f8qt" Dec 01 10:33:42 crc kubenswrapper[4958]: I1201 10:33:42.706103 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4f8qt"] Dec 01 10:33:42 crc kubenswrapper[4958]: I1201 10:33:42.750437 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/590d4089-bce0-4fcf-8dec-74f7b96675ce-catalog-content\") pod \"redhat-operators-4f8qt\" (UID: \"590d4089-bce0-4fcf-8dec-74f7b96675ce\") " pod="openshift-marketplace/redhat-operators-4f8qt" Dec 01 10:33:42 crc kubenswrapper[4958]: I1201 10:33:42.750704 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/590d4089-bce0-4fcf-8dec-74f7b96675ce-utilities\") pod \"redhat-operators-4f8qt\" (UID: \"590d4089-bce0-4fcf-8dec-74f7b96675ce\") " pod="openshift-marketplace/redhat-operators-4f8qt" Dec 01 10:33:42 crc kubenswrapper[4958]: I1201 10:33:42.750933 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9q4j\" (UniqueName: \"kubernetes.io/projected/590d4089-bce0-4fcf-8dec-74f7b96675ce-kube-api-access-p9q4j\") pod \"redhat-operators-4f8qt\" (UID: \"590d4089-bce0-4fcf-8dec-74f7b96675ce\") " pod="openshift-marketplace/redhat-operators-4f8qt" Dec 01 10:33:42 crc kubenswrapper[4958]: I1201 10:33:42.852870 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/590d4089-bce0-4fcf-8dec-74f7b96675ce-utilities\") pod \"redhat-operators-4f8qt\" (UID: \"590d4089-bce0-4fcf-8dec-74f7b96675ce\") " pod="openshift-marketplace/redhat-operators-4f8qt" Dec 01 10:33:42 crc kubenswrapper[4958]: I1201 10:33:42.851925 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/590d4089-bce0-4fcf-8dec-74f7b96675ce-utilities\") pod \"redhat-operators-4f8qt\" (UID: \"590d4089-bce0-4fcf-8dec-74f7b96675ce\") " pod="openshift-marketplace/redhat-operators-4f8qt" Dec 01 10:33:42 crc kubenswrapper[4958]: I1201 10:33:42.853044 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-p9q4j\" (UniqueName: \"kubernetes.io/projected/590d4089-bce0-4fcf-8dec-74f7b96675ce-kube-api-access-p9q4j\") pod \"redhat-operators-4f8qt\" (UID: \"590d4089-bce0-4fcf-8dec-74f7b96675ce\") " pod="openshift-marketplace/redhat-operators-4f8qt" Dec 01 10:33:42 crc kubenswrapper[4958]: I1201 10:33:42.853947 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/590d4089-bce0-4fcf-8dec-74f7b96675ce-catalog-content\") pod \"redhat-operators-4f8qt\" (UID: \"590d4089-bce0-4fcf-8dec-74f7b96675ce\") " pod="openshift-marketplace/redhat-operators-4f8qt" Dec 01 10:33:42 crc kubenswrapper[4958]: I1201 10:33:42.854392 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/590d4089-bce0-4fcf-8dec-74f7b96675ce-catalog-content\") pod \"redhat-operators-4f8qt\" (UID: \"590d4089-bce0-4fcf-8dec-74f7b96675ce\") " pod="openshift-marketplace/redhat-operators-4f8qt" Dec 01 10:33:42 crc kubenswrapper[4958]: I1201 10:33:42.892466 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9q4j\" (UniqueName: \"kubernetes.io/projected/590d4089-bce0-4fcf-8dec-74f7b96675ce-kube-api-access-p9q4j\") pod \"redhat-operators-4f8qt\" (UID: \"590d4089-bce0-4fcf-8dec-74f7b96675ce\") " pod="openshift-marketplace/redhat-operators-4f8qt" Dec 01 10:33:43 crc kubenswrapper[4958]: I1201 10:33:43.031435 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4f8qt" Dec 01 10:33:43 crc kubenswrapper[4958]: I1201 10:33:43.529046 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4f8qt"] Dec 01 10:33:43 crc kubenswrapper[4958]: I1201 10:33:43.956354 4958 generic.go:334] "Generic (PLEG): container finished" podID="590d4089-bce0-4fcf-8dec-74f7b96675ce" containerID="fcf8311b84539ff0cd04af0c55a2f50ad32a42f2ad7a789a1a8b45de84070fd6" exitCode=0 Dec 01 10:33:43 crc kubenswrapper[4958]: I1201 10:33:43.956487 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4f8qt" event={"ID":"590d4089-bce0-4fcf-8dec-74f7b96675ce","Type":"ContainerDied","Data":"fcf8311b84539ff0cd04af0c55a2f50ad32a42f2ad7a789a1a8b45de84070fd6"} Dec 01 10:33:43 crc kubenswrapper[4958]: I1201 10:33:43.956876 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4f8qt" event={"ID":"590d4089-bce0-4fcf-8dec-74f7b96675ce","Type":"ContainerStarted","Data":"51c8736d00339e7a9621ed3cdf4c794b1f8f7575e9dcf71da589383e2fc3e3d0"} Dec 01 10:33:43 crc kubenswrapper[4958]: I1201 10:33:43.958993 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:33:54 crc kubenswrapper[4958]: I1201 10:33:54.056596 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4f8qt" event={"ID":"590d4089-bce0-4fcf-8dec-74f7b96675ce","Type":"ContainerStarted","Data":"b2546978c9857b6f555f42f0b1c681a91522eceb55a613474c8abb4f0b8a9b7e"} Dec 01 10:33:55 crc kubenswrapper[4958]: I1201 10:33:55.067757 4958 generic.go:334] "Generic (PLEG): container finished" podID="590d4089-bce0-4fcf-8dec-74f7b96675ce" containerID="b2546978c9857b6f555f42f0b1c681a91522eceb55a613474c8abb4f0b8a9b7e" exitCode=0 Dec 01 10:33:55 crc kubenswrapper[4958]: I1201 10:33:55.067874 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-4f8qt" event={"ID":"590d4089-bce0-4fcf-8dec-74f7b96675ce","Type":"ContainerDied","Data":"b2546978c9857b6f555f42f0b1c681a91522eceb55a613474c8abb4f0b8a9b7e"} Dec 01 10:33:56 crc kubenswrapper[4958]: I1201 10:33:56.077928 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4f8qt" event={"ID":"590d4089-bce0-4fcf-8dec-74f7b96675ce","Type":"ContainerStarted","Data":"793ac3c60db3cdec3119f5eda9e48bc3ebf00734b7139e5434275e17240334ca"} Dec 01 10:33:56 crc kubenswrapper[4958]: I1201 10:33:56.106257 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4f8qt" podStartSLOduration=2.327257409 podStartE2EDuration="14.106211255s" podCreationTimestamp="2025-12-01 10:33:42 +0000 UTC" firstStartedPulling="2025-12-01 10:33:43.95866485 +0000 UTC m=+2071.467453877" lastFinishedPulling="2025-12-01 10:33:55.737618686 +0000 UTC m=+2083.246407723" observedRunningTime="2025-12-01 10:33:56.096897297 +0000 UTC m=+2083.605686354" watchObservedRunningTime="2025-12-01 10:33:56.106211255 +0000 UTC m=+2083.615000292" Dec 01 10:33:58 crc kubenswrapper[4958]: I1201 10:33:58.210452 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:33:58 crc kubenswrapper[4958]: I1201 10:33:58.211876 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:33:58 crc kubenswrapper[4958]: I1201 10:33:58.212043 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 10:33:58 crc kubenswrapper[4958]: I1201 10:33:58.212997 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"76d51111155250b47587fb99acb61eecae311759f1a6cd0cb3e2d28ffb1cec45"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:33:58 crc kubenswrapper[4958]: I1201 10:33:58.213174 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://76d51111155250b47587fb99acb61eecae311759f1a6cd0cb3e2d28ffb1cec45" gracePeriod=600 Dec 01 10:33:59 crc kubenswrapper[4958]: I1201 10:33:59.105362 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="76d51111155250b47587fb99acb61eecae311759f1a6cd0cb3e2d28ffb1cec45" exitCode=0 Dec 01 10:33:59 crc kubenswrapper[4958]: I1201 10:33:59.105460 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"76d51111155250b47587fb99acb61eecae311759f1a6cd0cb3e2d28ffb1cec45"} Dec 01 10:33:59 crc 
kubenswrapper[4958]: I1201 10:33:59.106629 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"} Dec 01 10:33:59 crc kubenswrapper[4958]: I1201 10:33:59.106680 4958 scope.go:117] "RemoveContainer" containerID="25eb1dd9cc216d1e92dd2f3498bb3b4809300fecdb99552887261ba798b810c4" Dec 01 10:34:03 crc kubenswrapper[4958]: I1201 10:34:03.031699 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4f8qt" Dec 01 10:34:03 crc kubenswrapper[4958]: I1201 10:34:03.032103 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4f8qt" Dec 01 10:34:03 crc kubenswrapper[4958]: I1201 10:34:03.078868 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4f8qt" Dec 01 10:34:03 crc kubenswrapper[4958]: I1201 10:34:03.188083 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4f8qt" Dec 01 10:34:03 crc kubenswrapper[4958]: I1201 10:34:03.264233 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4f8qt"] Dec 01 10:34:03 crc kubenswrapper[4958]: I1201 10:34:03.316295 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hmn6l"] Dec 01 10:34:03 crc kubenswrapper[4958]: I1201 10:34:03.316609 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hmn6l" podUID="d4b4db44-22c2-4597-8353-34af1d55f49a" containerName="registry-server" containerID="cri-o://23dda11c5bdd6f5013671460531d5744391968c28fd8ff6f67a840cc00b2703b" gracePeriod=2 Dec 01 10:34:03 crc kubenswrapper[4958]: I1201 10:34:03.826615 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hmn6l" Dec 01 10:34:03 crc kubenswrapper[4958]: I1201 10:34:03.919314 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b4db44-22c2-4597-8353-34af1d55f49a-catalog-content\") pod \"d4b4db44-22c2-4597-8353-34af1d55f49a\" (UID: \"d4b4db44-22c2-4597-8353-34af1d55f49a\") " Dec 01 10:34:03 crc kubenswrapper[4958]: I1201 10:34:03.919400 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b4db44-22c2-4597-8353-34af1d55f49a-utilities\") pod \"d4b4db44-22c2-4597-8353-34af1d55f49a\" (UID: \"d4b4db44-22c2-4597-8353-34af1d55f49a\") " Dec 01 10:34:03 crc kubenswrapper[4958]: I1201 10:34:03.919478 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv7wv\" (UniqueName: \"kubernetes.io/projected/d4b4db44-22c2-4597-8353-34af1d55f49a-kube-api-access-sv7wv\") pod \"d4b4db44-22c2-4597-8353-34af1d55f49a\" (UID: \"d4b4db44-22c2-4597-8353-34af1d55f49a\") " Dec 01 10:34:03 crc kubenswrapper[4958]: I1201 10:34:03.922440 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4b4db44-22c2-4597-8353-34af1d55f49a-utilities" (OuterVolumeSpecName: "utilities") pod "d4b4db44-22c2-4597-8353-34af1d55f49a" (UID: "d4b4db44-22c2-4597-8353-34af1d55f49a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:03 crc kubenswrapper[4958]: I1201 10:34:03.927902 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b4db44-22c2-4597-8353-34af1d55f49a-kube-api-access-sv7wv" (OuterVolumeSpecName: "kube-api-access-sv7wv") pod "d4b4db44-22c2-4597-8353-34af1d55f49a" (UID: "d4b4db44-22c2-4597-8353-34af1d55f49a"). InnerVolumeSpecName "kube-api-access-sv7wv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:04 crc kubenswrapper[4958]: I1201 10:34:04.024200 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv7wv\" (UniqueName: \"kubernetes.io/projected/d4b4db44-22c2-4597-8353-34af1d55f49a-kube-api-access-sv7wv\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:04 crc kubenswrapper[4958]: I1201 10:34:04.024252 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b4db44-22c2-4597-8353-34af1d55f49a-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:04 crc kubenswrapper[4958]: I1201 10:34:04.035531 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4b4db44-22c2-4597-8353-34af1d55f49a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4b4db44-22c2-4597-8353-34af1d55f49a" (UID: "d4b4db44-22c2-4597-8353-34af1d55f49a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:04 crc kubenswrapper[4958]: I1201 10:34:04.126374 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b4db44-22c2-4597-8353-34af1d55f49a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:04 crc kubenswrapper[4958]: I1201 10:34:04.155299 4958 generic.go:334] "Generic (PLEG): container finished" podID="d4b4db44-22c2-4597-8353-34af1d55f49a" containerID="23dda11c5bdd6f5013671460531d5744391968c28fd8ff6f67a840cc00b2703b" exitCode=0 Dec 01 10:34:04 crc kubenswrapper[4958]: I1201 10:34:04.155382 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmn6l" event={"ID":"d4b4db44-22c2-4597-8353-34af1d55f49a","Type":"ContainerDied","Data":"23dda11c5bdd6f5013671460531d5744391968c28fd8ff6f67a840cc00b2703b"} Dec 01 10:34:04 crc kubenswrapper[4958]: I1201 10:34:04.155694 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmn6l" event={"ID":"d4b4db44-22c2-4597-8353-34af1d55f49a","Type":"ContainerDied","Data":"815e0d1cbaba30e7ff2f730304eeee7a55ccc5701c1a89a35b58beb2d8ad60c4"} Dec 01 10:34:04 crc kubenswrapper[4958]: I1201 10:34:04.155729 4958 scope.go:117] "RemoveContainer" containerID="23dda11c5bdd6f5013671460531d5744391968c28fd8ff6f67a840cc00b2703b" Dec 01 10:34:04 crc kubenswrapper[4958]: I1201 10:34:04.155427 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmn6l" Dec 01 10:34:04 crc kubenswrapper[4958]: I1201 10:34:04.184304 4958 scope.go:117] "RemoveContainer" containerID="b6655ebe7fcad8af0e9dc1f4e77b533423b7e796a4a8640cfc7993be8ac0ee1b" Dec 01 10:34:04 crc kubenswrapper[4958]: I1201 10:34:04.207285 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hmn6l"] Dec 01 10:34:04 crc kubenswrapper[4958]: I1201 10:34:04.214731 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hmn6l"] Dec 01 10:34:04 crc kubenswrapper[4958]: I1201 10:34:04.231933 4958 scope.go:117] "RemoveContainer" containerID="ac53b611ca612c64fc4f4a774d8b160fa5ba5120cb58c8bb1f8e5b85f7102ada" Dec 01 10:34:04 crc kubenswrapper[4958]: I1201 10:34:04.256961 4958 scope.go:117] "RemoveContainer" containerID="23dda11c5bdd6f5013671460531d5744391968c28fd8ff6f67a840cc00b2703b" Dec 01 10:34:04 crc kubenswrapper[4958]: E1201 10:34:04.257663 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23dda11c5bdd6f5013671460531d5744391968c28fd8ff6f67a840cc00b2703b\": container with ID starting with 23dda11c5bdd6f5013671460531d5744391968c28fd8ff6f67a840cc00b2703b not found: ID does not exist" containerID="23dda11c5bdd6f5013671460531d5744391968c28fd8ff6f67a840cc00b2703b" Dec 01 10:34:04 crc kubenswrapper[4958]: I1201 10:34:04.257723 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23dda11c5bdd6f5013671460531d5744391968c28fd8ff6f67a840cc00b2703b"} err="failed to get container status \"23dda11c5bdd6f5013671460531d5744391968c28fd8ff6f67a840cc00b2703b\": rpc error: code = NotFound desc = could not find container \"23dda11c5bdd6f5013671460531d5744391968c28fd8ff6f67a840cc00b2703b\": container with ID starting with 23dda11c5bdd6f5013671460531d5744391968c28fd8ff6f67a840cc00b2703b not found: ID does not exist" Dec 01 10:34:04 crc 
kubenswrapper[4958]: I1201 10:34:04.257761 4958 scope.go:117] "RemoveContainer" containerID="b6655ebe7fcad8af0e9dc1f4e77b533423b7e796a4a8640cfc7993be8ac0ee1b" Dec 01 10:34:04 crc kubenswrapper[4958]: E1201 10:34:04.258226 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6655ebe7fcad8af0e9dc1f4e77b533423b7e796a4a8640cfc7993be8ac0ee1b\": container with ID starting with b6655ebe7fcad8af0e9dc1f4e77b533423b7e796a4a8640cfc7993be8ac0ee1b not found: ID does not exist" containerID="b6655ebe7fcad8af0e9dc1f4e77b533423b7e796a4a8640cfc7993be8ac0ee1b" Dec 01 10:34:04 crc kubenswrapper[4958]: I1201 10:34:04.258260 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6655ebe7fcad8af0e9dc1f4e77b533423b7e796a4a8640cfc7993be8ac0ee1b"} err="failed to get container status \"b6655ebe7fcad8af0e9dc1f4e77b533423b7e796a4a8640cfc7993be8ac0ee1b\": rpc error: code = NotFound desc = could not find container \"b6655ebe7fcad8af0e9dc1f4e77b533423b7e796a4a8640cfc7993be8ac0ee1b\": container with ID starting with b6655ebe7fcad8af0e9dc1f4e77b533423b7e796a4a8640cfc7993be8ac0ee1b not found: ID does not exist" Dec 01 10:34:04 crc kubenswrapper[4958]: I1201 10:34:04.258277 4958 scope.go:117] "RemoveContainer" containerID="ac53b611ca612c64fc4f4a774d8b160fa5ba5120cb58c8bb1f8e5b85f7102ada" Dec 01 10:34:04 crc kubenswrapper[4958]: E1201 10:34:04.258573 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac53b611ca612c64fc4f4a774d8b160fa5ba5120cb58c8bb1f8e5b85f7102ada\": container with ID starting with ac53b611ca612c64fc4f4a774d8b160fa5ba5120cb58c8bb1f8e5b85f7102ada not found: ID does not exist" containerID="ac53b611ca612c64fc4f4a774d8b160fa5ba5120cb58c8bb1f8e5b85f7102ada" Dec 01 10:34:04 crc kubenswrapper[4958]: I1201 10:34:04.258604 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac53b611ca612c64fc4f4a774d8b160fa5ba5120cb58c8bb1f8e5b85f7102ada"} err="failed to get container status \"ac53b611ca612c64fc4f4a774d8b160fa5ba5120cb58c8bb1f8e5b85f7102ada\": rpc error: code = NotFound desc = could not find container \"ac53b611ca612c64fc4f4a774d8b160fa5ba5120cb58c8bb1f8e5b85f7102ada\": container with ID starting with ac53b611ca612c64fc4f4a774d8b160fa5ba5120cb58c8bb1f8e5b85f7102ada not found: ID does not exist" Dec 01 10:34:05 crc kubenswrapper[4958]: I1201 10:34:05.809337 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b4db44-22c2-4597-8353-34af1d55f49a" path="/var/lib/kubelet/pods/d4b4db44-22c2-4597-8353-34af1d55f49a/volumes" Dec 01 10:35:58 crc kubenswrapper[4958]: I1201 10:35:58.211506 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:35:58 crc kubenswrapper[4958]: I1201 10:35:58.212244 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:36:28 crc kubenswrapper[4958]: I1201 10:36:28.210752 4958 patch_prober.go:28] interesting 
pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:36:28 crc kubenswrapper[4958]: I1201 10:36:28.211669 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:36:36 crc kubenswrapper[4958]: I1201 10:36:36.587685 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4mrt7"] Dec 01 10:36:36 crc kubenswrapper[4958]: E1201 10:36:36.588999 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b4db44-22c2-4597-8353-34af1d55f49a" containerName="registry-server" Dec 01 10:36:36 crc kubenswrapper[4958]: I1201 10:36:36.589018 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b4db44-22c2-4597-8353-34af1d55f49a" containerName="registry-server" Dec 01 10:36:36 crc kubenswrapper[4958]: E1201 10:36:36.589037 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b4db44-22c2-4597-8353-34af1d55f49a" containerName="extract-content" Dec 01 10:36:36 crc kubenswrapper[4958]: I1201 10:36:36.589046 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b4db44-22c2-4597-8353-34af1d55f49a" containerName="extract-content" Dec 01 10:36:36 crc kubenswrapper[4958]: E1201 10:36:36.589070 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b4db44-22c2-4597-8353-34af1d55f49a" containerName="extract-utilities" Dec 01 10:36:36 crc kubenswrapper[4958]: I1201 10:36:36.589079 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b4db44-22c2-4597-8353-34af1d55f49a" containerName="extract-utilities" Dec 01 10:36:36 crc kubenswrapper[4958]: I1201 10:36:36.589274 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b4db44-22c2-4597-8353-34af1d55f49a" containerName="registry-server" Dec 01 10:36:36 crc kubenswrapper[4958]: I1201 10:36:36.590968 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4mrt7" Dec 01 10:36:36 crc kubenswrapper[4958]: I1201 10:36:36.660002 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4mrt7"] Dec 01 10:36:36 crc kubenswrapper[4958]: I1201 10:36:36.777951 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1496bc40-bc52-4b9c-a8e6-3aaab40a5390-utilities\") pod \"certified-operators-4mrt7\" (UID: \"1496bc40-bc52-4b9c-a8e6-3aaab40a5390\") " pod="openshift-marketplace/certified-operators-4mrt7" Dec 01 10:36:36 crc kubenswrapper[4958]: I1201 10:36:36.778071 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx5dt\" (UniqueName: \"kubernetes.io/projected/1496bc40-bc52-4b9c-a8e6-3aaab40a5390-kube-api-access-gx5dt\") pod \"certified-operators-4mrt7\" (UID: \"1496bc40-bc52-4b9c-a8e6-3aaab40a5390\") " pod="openshift-marketplace/certified-operators-4mrt7" Dec 01 10:36:36 crc kubenswrapper[4958]: I1201 10:36:36.778326 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1496bc40-bc52-4b9c-a8e6-3aaab40a5390-catalog-content\") pod \"certified-operators-4mrt7\" (UID: \"1496bc40-bc52-4b9c-a8e6-3aaab40a5390\") " pod="openshift-marketplace/certified-operators-4mrt7" Dec 01 10:36:36 crc kubenswrapper[4958]: I1201 10:36:36.879427 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1496bc40-bc52-4b9c-a8e6-3aaab40a5390-catalog-content\") pod \"certified-operators-4mrt7\" (UID: \"1496bc40-bc52-4b9c-a8e6-3aaab40a5390\") " pod="openshift-marketplace/certified-operators-4mrt7" Dec 01 10:36:36 crc kubenswrapper[4958]: I1201 10:36:36.880000 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1496bc40-bc52-4b9c-a8e6-3aaab40a5390-utilities\") pod \"certified-operators-4mrt7\" (UID: \"1496bc40-bc52-4b9c-a8e6-3aaab40a5390\") " pod="openshift-marketplace/certified-operators-4mrt7" Dec 01 10:36:36 crc kubenswrapper[4958]: I1201 10:36:36.880214 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx5dt\" (UniqueName: \"kubernetes.io/projected/1496bc40-bc52-4b9c-a8e6-3aaab40a5390-kube-api-access-gx5dt\") pod \"certified-operators-4mrt7\" (UID: \"1496bc40-bc52-4b9c-a8e6-3aaab40a5390\") " pod="openshift-marketplace/certified-operators-4mrt7" Dec 01 10:36:36 crc kubenswrapper[4958]: I1201 10:36:36.880513 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1496bc40-bc52-4b9c-a8e6-3aaab40a5390-catalog-content\") pod \"certified-operators-4mrt7\" (UID: \"1496bc40-bc52-4b9c-a8e6-3aaab40a5390\") " pod="openshift-marketplace/certified-operators-4mrt7" Dec 01 10:36:36 crc kubenswrapper[4958]: I1201 10:36:36.880793 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1496bc40-bc52-4b9c-a8e6-3aaab40a5390-utilities\") pod \"certified-operators-4mrt7\" (UID: \"1496bc40-bc52-4b9c-a8e6-3aaab40a5390\") " pod="openshift-marketplace/certified-operators-4mrt7" Dec 01 10:36:36 crc kubenswrapper[4958]: I1201 10:36:36.907135 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gx5dt\" (UniqueName: \"kubernetes.io/projected/1496bc40-bc52-4b9c-a8e6-3aaab40a5390-kube-api-access-gx5dt\") pod \"certified-operators-4mrt7\" (UID: \"1496bc40-bc52-4b9c-a8e6-3aaab40a5390\") " pod="openshift-marketplace/certified-operators-4mrt7" Dec 01 10:36:36 crc kubenswrapper[4958]: I1201 10:36:36.934651 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4mrt7" Dec 01 10:36:37 crc kubenswrapper[4958]: I1201 10:36:37.566552 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4mrt7"] Dec 01 10:36:37 crc kubenswrapper[4958]: I1201 10:36:37.623450 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4mrt7" event={"ID":"1496bc40-bc52-4b9c-a8e6-3aaab40a5390","Type":"ContainerStarted","Data":"43da71d10f5c71f550750dd3b0e4af928f72d1f822644a162e92390117b567e8"} Dec 01 10:36:38 crc kubenswrapper[4958]: I1201 10:36:38.700351 4958 generic.go:334] "Generic (PLEG): container finished" podID="1496bc40-bc52-4b9c-a8e6-3aaab40a5390" containerID="3102e2979b97bd46c32c20ad6c6b4fe563ec3121b1b76210382cdff9681682c5" exitCode=0 Dec 01 10:36:38 crc kubenswrapper[4958]: I1201 10:36:38.700408 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4mrt7" event={"ID":"1496bc40-bc52-4b9c-a8e6-3aaab40a5390","Type":"ContainerDied","Data":"3102e2979b97bd46c32c20ad6c6b4fe563ec3121b1b76210382cdff9681682c5"} Dec 01 10:36:39 crc kubenswrapper[4958]: I1201 10:36:39.719717 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4mrt7" event={"ID":"1496bc40-bc52-4b9c-a8e6-3aaab40a5390","Type":"ContainerStarted","Data":"174a16659ab22498e17e7e6376fca659fe13112971c14cbc3d52d4d2314823f3"} Dec 01 10:36:40 crc kubenswrapper[4958]: I1201 10:36:40.730291 4958 generic.go:334] "Generic (PLEG): container finished" podID="1496bc40-bc52-4b9c-a8e6-3aaab40a5390" containerID="174a16659ab22498e17e7e6376fca659fe13112971c14cbc3d52d4d2314823f3" exitCode=0 Dec 01 10:36:40 crc kubenswrapper[4958]: I1201 10:36:40.731151 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4mrt7" event={"ID":"1496bc40-bc52-4b9c-a8e6-3aaab40a5390","Type":"ContainerDied","Data":"174a16659ab22498e17e7e6376fca659fe13112971c14cbc3d52d4d2314823f3"} Dec 01 10:36:42 crc kubenswrapper[4958]: I1201 10:36:42.756607 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4mrt7" event={"ID":"1496bc40-bc52-4b9c-a8e6-3aaab40a5390","Type":"ContainerStarted","Data":"bdc2f9c4d388a4cf008c9e2f7e474e44f8dbe5b43b9fca6579e5305208223404"} Dec 01 10:36:42 crc kubenswrapper[4958]: I1201 10:36:42.784401 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4mrt7" podStartSLOduration=3.952382884 podStartE2EDuration="6.784360571s" podCreationTimestamp="2025-12-01 10:36:36 +0000 UTC" firstStartedPulling="2025-12-01 10:36:38.704371392 +0000 UTC m=+2246.213160429" lastFinishedPulling="2025-12-01 10:36:41.536349039 +0000 UTC m=+2249.045138116" observedRunningTime="2025-12-01 10:36:42.781630533 +0000 UTC m=+2250.290419590" watchObservedRunningTime="2025-12-01 10:36:42.784360571 +0000 UTC m=+2250.293149608" Dec 01 10:36:46 crc kubenswrapper[4958]: I1201 10:36:46.935650 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-4mrt7" Dec 01 10:36:46 crc kubenswrapper[4958]: I1201 10:36:46.936258 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4mrt7" Dec 01 10:36:47 crc kubenswrapper[4958]: I1201 10:36:47.024053 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4mrt7" Dec 01 10:36:47 crc kubenswrapper[4958]: I1201 10:36:47.870092 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4mrt7" Dec 01 10:36:48 crc kubenswrapper[4958]: I1201 10:36:48.542525 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4mrt7"] Dec 01 10:36:49 crc kubenswrapper[4958]: I1201 10:36:49.830564 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4mrt7" podUID="1496bc40-bc52-4b9c-a8e6-3aaab40a5390" containerName="registry-server" containerID="cri-o://bdc2f9c4d388a4cf008c9e2f7e474e44f8dbe5b43b9fca6579e5305208223404" gracePeriod=2 Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.296937 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4mrt7" Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.420421 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1496bc40-bc52-4b9c-a8e6-3aaab40a5390-catalog-content\") pod \"1496bc40-bc52-4b9c-a8e6-3aaab40a5390\" (UID: \"1496bc40-bc52-4b9c-a8e6-3aaab40a5390\") " Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.420493 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx5dt\" (UniqueName: \"kubernetes.io/projected/1496bc40-bc52-4b9c-a8e6-3aaab40a5390-kube-api-access-gx5dt\") pod \"1496bc40-bc52-4b9c-a8e6-3aaab40a5390\" (UID: \"1496bc40-bc52-4b9c-a8e6-3aaab40a5390\") " Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.420600 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1496bc40-bc52-4b9c-a8e6-3aaab40a5390-utilities\") pod \"1496bc40-bc52-4b9c-a8e6-3aaab40a5390\" (UID: \"1496bc40-bc52-4b9c-a8e6-3aaab40a5390\") " Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.421905 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1496bc40-bc52-4b9c-a8e6-3aaab40a5390-utilities" (OuterVolumeSpecName: "utilities") pod "1496bc40-bc52-4b9c-a8e6-3aaab40a5390" (UID: "1496bc40-bc52-4b9c-a8e6-3aaab40a5390"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.429105 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1496bc40-bc52-4b9c-a8e6-3aaab40a5390-kube-api-access-gx5dt" (OuterVolumeSpecName: "kube-api-access-gx5dt") pod "1496bc40-bc52-4b9c-a8e6-3aaab40a5390" (UID: "1496bc40-bc52-4b9c-a8e6-3aaab40a5390"). InnerVolumeSpecName "kube-api-access-gx5dt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.522892 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1496bc40-bc52-4b9c-a8e6-3aaab40a5390-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.522973 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx5dt\" (UniqueName: \"kubernetes.io/projected/1496bc40-bc52-4b9c-a8e6-3aaab40a5390-kube-api-access-gx5dt\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.846126 4958 generic.go:334] "Generic (PLEG): container finished" podID="1496bc40-bc52-4b9c-a8e6-3aaab40a5390" containerID="bdc2f9c4d388a4cf008c9e2f7e474e44f8dbe5b43b9fca6579e5305208223404" exitCode=0 Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.846187 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4mrt7" event={"ID":"1496bc40-bc52-4b9c-a8e6-3aaab40a5390","Type":"ContainerDied","Data":"bdc2f9c4d388a4cf008c9e2f7e474e44f8dbe5b43b9fca6579e5305208223404"} Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.846230 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4mrt7" event={"ID":"1496bc40-bc52-4b9c-a8e6-3aaab40a5390","Type":"ContainerDied","Data":"43da71d10f5c71f550750dd3b0e4af928f72d1f822644a162e92390117b567e8"} Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.846336 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4mrt7" Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.846369 4958 scope.go:117] "RemoveContainer" containerID="bdc2f9c4d388a4cf008c9e2f7e474e44f8dbe5b43b9fca6579e5305208223404" Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.858572 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1496bc40-bc52-4b9c-a8e6-3aaab40a5390-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1496bc40-bc52-4b9c-a8e6-3aaab40a5390" (UID: "1496bc40-bc52-4b9c-a8e6-3aaab40a5390"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.888833 4958 scope.go:117] "RemoveContainer" containerID="174a16659ab22498e17e7e6376fca659fe13112971c14cbc3d52d4d2314823f3" Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.922222 4958 scope.go:117] "RemoveContainer" containerID="3102e2979b97bd46c32c20ad6c6b4fe563ec3121b1b76210382cdff9681682c5" Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.930047 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1496bc40-bc52-4b9c-a8e6-3aaab40a5390-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.951258 4958 scope.go:117] "RemoveContainer" containerID="bdc2f9c4d388a4cf008c9e2f7e474e44f8dbe5b43b9fca6579e5305208223404" Dec 01 10:36:50 crc kubenswrapper[4958]: E1201 10:36:50.952185 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdc2f9c4d388a4cf008c9e2f7e474e44f8dbe5b43b9fca6579e5305208223404\": container with ID starting with bdc2f9c4d388a4cf008c9e2f7e474e44f8dbe5b43b9fca6579e5305208223404 not found: ID does not exist" containerID="bdc2f9c4d388a4cf008c9e2f7e474e44f8dbe5b43b9fca6579e5305208223404" Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.952316 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdc2f9c4d388a4cf008c9e2f7e474e44f8dbe5b43b9fca6579e5305208223404"} err="failed to get container status \"bdc2f9c4d388a4cf008c9e2f7e474e44f8dbe5b43b9fca6579e5305208223404\": rpc error: code = NotFound desc = could not find container \"bdc2f9c4d388a4cf008c9e2f7e474e44f8dbe5b43b9fca6579e5305208223404\": container with ID starting with bdc2f9c4d388a4cf008c9e2f7e474e44f8dbe5b43b9fca6579e5305208223404 not found: ID does not exist" Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.952457 4958 scope.go:117] "RemoveContainer" containerID="174a16659ab22498e17e7e6376fca659fe13112971c14cbc3d52d4d2314823f3" Dec 01 10:36:50 crc kubenswrapper[4958]: E1201 10:36:50.953024 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"174a16659ab22498e17e7e6376fca659fe13112971c14cbc3d52d4d2314823f3\": container with ID starting with 174a16659ab22498e17e7e6376fca659fe13112971c14cbc3d52d4d2314823f3 not found: ID does not exist" containerID="174a16659ab22498e17e7e6376fca659fe13112971c14cbc3d52d4d2314823f3" Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.953157 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"174a16659ab22498e17e7e6376fca659fe13112971c14cbc3d52d4d2314823f3"} err="failed to get container status \"174a16659ab22498e17e7e6376fca659fe13112971c14cbc3d52d4d2314823f3\": rpc error: code = NotFound desc = could not find container \"174a16659ab22498e17e7e6376fca659fe13112971c14cbc3d52d4d2314823f3\": container with ID starting with 174a16659ab22498e17e7e6376fca659fe13112971c14cbc3d52d4d2314823f3 not found: ID does not exist" Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.953280 4958 scope.go:117] "RemoveContainer" containerID="3102e2979b97bd46c32c20ad6c6b4fe563ec3121b1b76210382cdff9681682c5" Dec 01 10:36:50 crc kubenswrapper[4958]: E1201 10:36:50.953860 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3102e2979b97bd46c32c20ad6c6b4fe563ec3121b1b76210382cdff9681682c5\": container with ID starting with 3102e2979b97bd46c32c20ad6c6b4fe563ec3121b1b76210382cdff9681682c5 not found: ID does not exist" containerID="3102e2979b97bd46c32c20ad6c6b4fe563ec3121b1b76210382cdff9681682c5" Dec 01 10:36:50 crc kubenswrapper[4958]: I1201 10:36:50.953987 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3102e2979b97bd46c32c20ad6c6b4fe563ec3121b1b76210382cdff9681682c5"} err="failed to get container status \"3102e2979b97bd46c32c20ad6c6b4fe563ec3121b1b76210382cdff9681682c5\": rpc error: code = NotFound desc = could not find container \"3102e2979b97bd46c32c20ad6c6b4fe563ec3121b1b76210382cdff9681682c5\": container with ID starting with 3102e2979b97bd46c32c20ad6c6b4fe563ec3121b1b76210382cdff9681682c5 not found: ID does not exist" Dec 01 10:36:51 crc kubenswrapper[4958]: I1201 10:36:51.189017 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4mrt7"] Dec 01 10:36:51 crc kubenswrapper[4958]: I1201 10:36:51.195108 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4mrt7"] Dec 01 10:36:51 crc kubenswrapper[4958]: I1201 10:36:51.810743 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1496bc40-bc52-4b9c-a8e6-3aaab40a5390" path="/var/lib/kubelet/pods/1496bc40-bc52-4b9c-a8e6-3aaab40a5390/volumes" Dec 01 10:36:58 crc kubenswrapper[4958]: I1201 10:36:58.210914 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:36:58 crc kubenswrapper[4958]: I1201 10:36:58.211642 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:36:58 crc kubenswrapper[4958]: I1201 10:36:58.211707 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 10:36:58 crc kubenswrapper[4958]: I1201 10:36:58.212569 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:36:58 crc kubenswrapper[4958]: I1201 10:36:58.212644 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641" gracePeriod=600 Dec 01 10:36:58 crc kubenswrapper[4958]: E1201 10:36:58.347098 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:36:58 crc kubenswrapper[4958]: I1201 10:36:58.932204 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641" exitCode=0 Dec 01 10:36:58 crc kubenswrapper[4958]: I1201 10:36:58.932291 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"} Dec 01 10:36:58 crc kubenswrapper[4958]: I1201 10:36:58.932390 4958 scope.go:117] "RemoveContainer" containerID="76d51111155250b47587fb99acb61eecae311759f1a6cd0cb3e2d28ffb1cec45" Dec 01 10:36:58 crc kubenswrapper[4958]: I1201 10:36:58.933372 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641" Dec 01 10:36:58 crc kubenswrapper[4958]: E1201 10:36:58.933900 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:37:09 crc kubenswrapper[4958]: I1201 10:37:09.573152 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r9lcf"] Dec 01 10:37:09 crc kubenswrapper[4958]: E1201 10:37:09.574624 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1496bc40-bc52-4b9c-a8e6-3aaab40a5390" containerName="extract-content" Dec 01 10:37:09 crc kubenswrapper[4958]: I1201 10:37:09.574668 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1496bc40-bc52-4b9c-a8e6-3aaab40a5390" containerName="extract-content" Dec 01 10:37:09 crc kubenswrapper[4958]: E1201 10:37:09.574707 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1496bc40-bc52-4b9c-a8e6-3aaab40a5390" containerName="extract-utilities" Dec 01 10:37:09 crc kubenswrapper[4958]: I1201 10:37:09.574725 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1496bc40-bc52-4b9c-a8e6-3aaab40a5390" containerName="extract-utilities" Dec 01 10:37:09 crc kubenswrapper[4958]: E1201 10:37:09.574771 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1496bc40-bc52-4b9c-a8e6-3aaab40a5390" containerName="registry-server" Dec 01 10:37:09 crc kubenswrapper[4958]: I1201 10:37:09.574788 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1496bc40-bc52-4b9c-a8e6-3aaab40a5390" containerName="registry-server" Dec 01 10:37:09 crc kubenswrapper[4958]: I1201 10:37:09.575265 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="1496bc40-bc52-4b9c-a8e6-3aaab40a5390" containerName="registry-server" Dec 01 10:37:09 crc kubenswrapper[4958]: I1201 10:37:09.577343 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r9lcf" Dec 01 10:37:09 crc kubenswrapper[4958]: I1201 10:37:09.587431 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r9lcf"] Dec 01 10:37:09 crc kubenswrapper[4958]: I1201 10:37:09.679235 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b620059d-d8d6-49d6-abeb-b1fe1f505d34-catalog-content\") pod \"community-operators-r9lcf\" (UID: \"b620059d-d8d6-49d6-abeb-b1fe1f505d34\") " pod="openshift-marketplace/community-operators-r9lcf" Dec 01 10:37:09 crc kubenswrapper[4958]: I1201 10:37:09.679311 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk4jp\" (UniqueName: \"kubernetes.io/projected/b620059d-d8d6-49d6-abeb-b1fe1f505d34-kube-api-access-rk4jp\") pod \"community-operators-r9lcf\" (UID: \"b620059d-d8d6-49d6-abeb-b1fe1f505d34\") " pod="openshift-marketplace/community-operators-r9lcf" Dec 01 10:37:09 crc kubenswrapper[4958]: I1201 10:37:09.679345 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b620059d-d8d6-49d6-abeb-b1fe1f505d34-utilities\") pod \"community-operators-r9lcf\" (UID: \"b620059d-d8d6-49d6-abeb-b1fe1f505d34\") " pod="openshift-marketplace/community-operators-r9lcf" Dec 01 10:37:09 crc kubenswrapper[4958]: I1201 10:37:09.780715 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b620059d-d8d6-49d6-abeb-b1fe1f505d34-catalog-content\") pod \"community-operators-r9lcf\" (UID: \"b620059d-d8d6-49d6-abeb-b1fe1f505d34\") " pod="openshift-marketplace/community-operators-r9lcf" Dec 01 10:37:09 crc kubenswrapper[4958]: I1201 10:37:09.780777 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk4jp\" (UniqueName: \"kubernetes.io/projected/b620059d-d8d6-49d6-abeb-b1fe1f505d34-kube-api-access-rk4jp\") pod \"community-operators-r9lcf\" (UID: \"b620059d-d8d6-49d6-abeb-b1fe1f505d34\") " pod="openshift-marketplace/community-operators-r9lcf" Dec 01 10:37:09 crc kubenswrapper[4958]: I1201 10:37:09.780822 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b620059d-d8d6-49d6-abeb-b1fe1f505d34-utilities\") pod \"community-operators-r9lcf\" (UID: \"b620059d-d8d6-49d6-abeb-b1fe1f505d34\") " pod="openshift-marketplace/community-operators-r9lcf" Dec 01 10:37:09 crc kubenswrapper[4958]: I1201 10:37:09.781555 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b620059d-d8d6-49d6-abeb-b1fe1f505d34-catalog-content\") pod \"community-operators-r9lcf\" (UID: \"b620059d-d8d6-49d6-abeb-b1fe1f505d34\") " pod="openshift-marketplace/community-operators-r9lcf" Dec 01 10:37:09 crc kubenswrapper[4958]: I1201 10:37:09.781592 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b620059d-d8d6-49d6-abeb-b1fe1f505d34-utilities\") pod \"community-operators-r9lcf\" (UID: \"b620059d-d8d6-49d6-abeb-b1fe1f505d34\") " pod="openshift-marketplace/community-operators-r9lcf" Dec 01 10:37:09 crc kubenswrapper[4958]: I1201 10:37:09.802368 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rk4jp\" (UniqueName: \"kubernetes.io/projected/b620059d-d8d6-49d6-abeb-b1fe1f505d34-kube-api-access-rk4jp\") pod \"community-operators-r9lcf\" (UID: \"b620059d-d8d6-49d6-abeb-b1fe1f505d34\") " pod="openshift-marketplace/community-operators-r9lcf" Dec 01 10:37:09 crc kubenswrapper[4958]: I1201 10:37:09.900765 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r9lcf" Dec 01 10:37:10 crc kubenswrapper[4958]: I1201 10:37:10.231980 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r9lcf"] Dec 01 10:37:11 crc kubenswrapper[4958]: I1201 10:37:11.076685 4958 generic.go:334] "Generic (PLEG): container finished" podID="b620059d-d8d6-49d6-abeb-b1fe1f505d34" containerID="ce74dc84ac31da4fe1d550c93bfed94fa390a5570e08bc0c449b34118169318b" exitCode=0 Dec 01 10:37:11 crc kubenswrapper[4958]: I1201 10:37:11.076816 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9lcf" event={"ID":"b620059d-d8d6-49d6-abeb-b1fe1f505d34","Type":"ContainerDied","Data":"ce74dc84ac31da4fe1d550c93bfed94fa390a5570e08bc0c449b34118169318b"} Dec 01 10:37:11 crc kubenswrapper[4958]: I1201 10:37:11.077204 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9lcf" event={"ID":"b620059d-d8d6-49d6-abeb-b1fe1f505d34","Type":"ContainerStarted","Data":"d0f2b4a8ec390767b2a6e43cf496e89db9e7f1e4aa3e8460f9ccacee20bb1684"} Dec 01 10:37:12 crc kubenswrapper[4958]: I1201 10:37:12.087788 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9lcf" event={"ID":"b620059d-d8d6-49d6-abeb-b1fe1f505d34","Type":"ContainerStarted","Data":"5d5c1527af16ea9a02e8d09df568ad5c4c77ab30471fc98ce10c4ebb3701a1d7"} Dec 01 10:37:12 crc kubenswrapper[4958]: I1201 10:37:12.797943 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641" Dec 01 10:37:12 crc kubenswrapper[4958]: E1201 10:37:12.798405 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:37:13 crc kubenswrapper[4958]: I1201 10:37:13.102563 4958 generic.go:334] "Generic (PLEG): container finished" podID="b620059d-d8d6-49d6-abeb-b1fe1f505d34" containerID="5d5c1527af16ea9a02e8d09df568ad5c4c77ab30471fc98ce10c4ebb3701a1d7" exitCode=0 Dec 01 10:37:13 crc kubenswrapper[4958]: I1201 10:37:13.102629 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9lcf" event={"ID":"b620059d-d8d6-49d6-abeb-b1fe1f505d34","Type":"ContainerDied","Data":"5d5c1527af16ea9a02e8d09df568ad5c4c77ab30471fc98ce10c4ebb3701a1d7"} Dec 01 10:37:14 crc kubenswrapper[4958]: I1201 10:37:14.115255 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9lcf" event={"ID":"b620059d-d8d6-49d6-abeb-b1fe1f505d34","Type":"ContainerStarted","Data":"22afdc2a7af7d1d235f9f511df00b91b0082a5270933e40da25004a10b367e7e"} Dec 01 10:37:14 crc kubenswrapper[4958]: I1201 10:37:14.154086 4958 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r9lcf" podStartSLOduration=2.70043079 podStartE2EDuration="5.154050518s" podCreationTimestamp="2025-12-01 10:37:09 +0000 UTC" firstStartedPulling="2025-12-01 10:37:11.081358196 +0000 UTC m=+2278.590147233" lastFinishedPulling="2025-12-01 10:37:13.534977914 +0000 UTC m=+2281.043766961" observedRunningTime="2025-12-01 10:37:14.149407935 +0000 UTC m=+2281.658197012" watchObservedRunningTime="2025-12-01 10:37:14.154050518 +0000 UTC m=+2281.662839565" Dec 01 10:37:19 crc kubenswrapper[4958]: I1201 10:37:19.902401 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r9lcf" Dec 01 10:37:19 crc kubenswrapper[4958]: I1201 10:37:19.903431 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r9lcf" Dec 01 10:37:19 crc kubenswrapper[4958]: I1201 10:37:19.993640 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r9lcf" Dec 01 10:37:20 crc kubenswrapper[4958]: I1201 10:37:20.264043 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r9lcf" Dec 01 10:37:20 crc kubenswrapper[4958]: I1201 10:37:20.640008 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r9lcf"] Dec 01 10:37:22 crc kubenswrapper[4958]: I1201 10:37:22.207496 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r9lcf" podUID="b620059d-d8d6-49d6-abeb-b1fe1f505d34" containerName="registry-server" containerID="cri-o://22afdc2a7af7d1d235f9f511df00b91b0082a5270933e40da25004a10b367e7e" gracePeriod=2 Dec 01 10:37:23 crc kubenswrapper[4958]: I1201 10:37:23.276861 4958 generic.go:334] "Generic (PLEG): container finished" podID="b620059d-d8d6-49d6-abeb-b1fe1f505d34" containerID="22afdc2a7af7d1d235f9f511df00b91b0082a5270933e40da25004a10b367e7e" exitCode=0 Dec 01 10:37:23 crc kubenswrapper[4958]: I1201 10:37:23.276909 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9lcf" event={"ID":"b620059d-d8d6-49d6-abeb-b1fe1f505d34","Type":"ContainerDied","Data":"22afdc2a7af7d1d235f9f511df00b91b0082a5270933e40da25004a10b367e7e"} Dec 01 10:37:23 crc kubenswrapper[4958]: I1201 10:37:23.380646 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r9lcf" Dec 01 10:37:23 crc kubenswrapper[4958]: I1201 10:37:23.463887 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk4jp\" (UniqueName: \"kubernetes.io/projected/b620059d-d8d6-49d6-abeb-b1fe1f505d34-kube-api-access-rk4jp\") pod \"b620059d-d8d6-49d6-abeb-b1fe1f505d34\" (UID: \"b620059d-d8d6-49d6-abeb-b1fe1f505d34\") " Dec 01 10:37:23 crc kubenswrapper[4958]: I1201 10:37:23.463944 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b620059d-d8d6-49d6-abeb-b1fe1f505d34-catalog-content\") pod \"b620059d-d8d6-49d6-abeb-b1fe1f505d34\" (UID: \"b620059d-d8d6-49d6-abeb-b1fe1f505d34\") " Dec 01 10:37:23 crc kubenswrapper[4958]: I1201 10:37:23.464039 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b620059d-d8d6-49d6-abeb-b1fe1f505d34-utilities\") pod \"b620059d-d8d6-49d6-abeb-b1fe1f505d34\" (UID: \"b620059d-d8d6-49d6-abeb-b1fe1f505d34\") " Dec 01 10:37:23 crc kubenswrapper[4958]: I1201 10:37:23.468189 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b620059d-d8d6-49d6-abeb-b1fe1f505d34-utilities" (OuterVolumeSpecName: "utilities") pod "b620059d-d8d6-49d6-abeb-b1fe1f505d34" (UID: "b620059d-d8d6-49d6-abeb-b1fe1f505d34"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:37:23 crc kubenswrapper[4958]: I1201 10:37:23.470424 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b620059d-d8d6-49d6-abeb-b1fe1f505d34-kube-api-access-rk4jp" (OuterVolumeSpecName: "kube-api-access-rk4jp") pod "b620059d-d8d6-49d6-abeb-b1fe1f505d34" (UID: "b620059d-d8d6-49d6-abeb-b1fe1f505d34"). InnerVolumeSpecName "kube-api-access-rk4jp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:37:23 crc kubenswrapper[4958]: I1201 10:37:23.534488 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b620059d-d8d6-49d6-abeb-b1fe1f505d34-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b620059d-d8d6-49d6-abeb-b1fe1f505d34" (UID: "b620059d-d8d6-49d6-abeb-b1fe1f505d34"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:37:23 crc kubenswrapper[4958]: I1201 10:37:23.565493 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk4jp\" (UniqueName: \"kubernetes.io/projected/b620059d-d8d6-49d6-abeb-b1fe1f505d34-kube-api-access-rk4jp\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:23 crc kubenswrapper[4958]: I1201 10:37:23.565527 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b620059d-d8d6-49d6-abeb-b1fe1f505d34-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:23 crc kubenswrapper[4958]: I1201 10:37:23.565543 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b620059d-d8d6-49d6-abeb-b1fe1f505d34-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:37:24 crc kubenswrapper[4958]: I1201 10:37:24.292824 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9lcf" event={"ID":"b620059d-d8d6-49d6-abeb-b1fe1f505d34","Type":"ContainerDied","Data":"d0f2b4a8ec390767b2a6e43cf496e89db9e7f1e4aa3e8460f9ccacee20bb1684"} Dec 01 10:37:24 crc kubenswrapper[4958]: I1201 10:37:24.292970 4958 scope.go:117] "RemoveContainer" containerID="22afdc2a7af7d1d235f9f511df00b91b0082a5270933e40da25004a10b367e7e" Dec 01 10:37:24 crc kubenswrapper[4958]: I1201 10:37:24.292896 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r9lcf" Dec 01 10:37:24 crc kubenswrapper[4958]: I1201 10:37:24.352223 4958 scope.go:117] "RemoveContainer" containerID="5d5c1527af16ea9a02e8d09df568ad5c4c77ab30471fc98ce10c4ebb3701a1d7" Dec 01 10:37:24 crc kubenswrapper[4958]: I1201 10:37:24.371679 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r9lcf"] Dec 01 10:37:24 crc kubenswrapper[4958]: I1201 10:37:24.381533 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r9lcf"] Dec 01 10:37:24 crc kubenswrapper[4958]: I1201 10:37:24.382703 4958 scope.go:117] "RemoveContainer" containerID="ce74dc84ac31da4fe1d550c93bfed94fa390a5570e08bc0c449b34118169318b" Dec 01 10:37:25 crc kubenswrapper[4958]: I1201 10:37:25.817111 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b620059d-d8d6-49d6-abeb-b1fe1f505d34" path="/var/lib/kubelet/pods/b620059d-d8d6-49d6-abeb-b1fe1f505d34/volumes" Dec 01 10:37:27 crc kubenswrapper[4958]: I1201 10:37:27.492208 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641" Dec 01 10:37:27 crc kubenswrapper[4958]: E1201 10:37:27.493039 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:37:40 crc kubenswrapper[4958]: I1201 10:37:40.798809 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641" Dec 01 10:37:40 crc kubenswrapper[4958]: E1201 10:37:40.800190 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:37:55 crc kubenswrapper[4958]: I1201 10:37:55.797754 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641" Dec 01 10:37:55 crc kubenswrapper[4958]: E1201 10:37:55.798701 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:38:06 crc kubenswrapper[4958]: I1201 10:38:06.222618 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vhvzc"] Dec 01 10:38:06 crc kubenswrapper[4958]: E1201 10:38:06.229832 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b620059d-d8d6-49d6-abeb-b1fe1f505d34" containerName="registry-server" Dec 01 10:38:06 crc kubenswrapper[4958]: I1201 10:38:06.229954 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b620059d-d8d6-49d6-abeb-b1fe1f505d34" containerName="registry-server" Dec 01 10:38:06 crc kubenswrapper[4958]: E1201 10:38:06.230015 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b620059d-d8d6-49d6-abeb-b1fe1f505d34" containerName="extract-content" Dec 01 10:38:06 crc kubenswrapper[4958]: I1201 10:38:06.230034 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b620059d-d8d6-49d6-abeb-b1fe1f505d34" containerName="extract-content" Dec 01 10:38:06 crc kubenswrapper[4958]: E1201 10:38:06.230077 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b620059d-d8d6-49d6-abeb-b1fe1f505d34" containerName="extract-utilities" Dec 01 10:38:06 crc kubenswrapper[4958]: I1201 10:38:06.230101 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b620059d-d8d6-49d6-abeb-b1fe1f505d34" containerName="extract-utilities" Dec 01 10:38:06 crc kubenswrapper[4958]: I1201 10:38:06.230449 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="b620059d-d8d6-49d6-abeb-b1fe1f505d34" containerName="registry-server" Dec 01 10:38:06 crc kubenswrapper[4958]: I1201 10:38:06.233231 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vhvzc" Dec 01 10:38:06 crc kubenswrapper[4958]: I1201 10:38:06.254829 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vhvzc"] Dec 01 10:38:06 crc kubenswrapper[4958]: I1201 10:38:06.343960 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91b5b869-275c-43f7-ad71-ee725a3a5094-utilities\") pod \"redhat-marketplace-vhvzc\" (UID: \"91b5b869-275c-43f7-ad71-ee725a3a5094\") " pod="openshift-marketplace/redhat-marketplace-vhvzc" Dec 01 10:38:06 crc kubenswrapper[4958]: I1201 10:38:06.344275 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91b5b869-275c-43f7-ad71-ee725a3a5094-catalog-content\") pod \"redhat-marketplace-vhvzc\" (UID: \"91b5b869-275c-43f7-ad71-ee725a3a5094\") " pod="openshift-marketplace/redhat-marketplace-vhvzc" Dec 01 10:38:06 crc kubenswrapper[4958]: I1201 10:38:06.344564 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7mk8\" (UniqueName: \"kubernetes.io/projected/91b5b869-275c-43f7-ad71-ee725a3a5094-kube-api-access-g7mk8\") pod \"redhat-marketplace-vhvzc\" (UID: \"91b5b869-275c-43f7-ad71-ee725a3a5094\") " pod="openshift-marketplace/redhat-marketplace-vhvzc" Dec 01 10:38:06 crc kubenswrapper[4958]: I1201 10:38:06.445362 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91b5b869-275c-43f7-ad71-ee725a3a5094-utilities\") pod \"redhat-marketplace-vhvzc\" (UID: \"91b5b869-275c-43f7-ad71-ee725a3a5094\") " pod="openshift-marketplace/redhat-marketplace-vhvzc" Dec 01 10:38:06 crc kubenswrapper[4958]: I1201 10:38:06.445462 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91b5b869-275c-43f7-ad71-ee725a3a5094-catalog-content\") pod \"redhat-marketplace-vhvzc\" (UID: \"91b5b869-275c-43f7-ad71-ee725a3a5094\") " pod="openshift-marketplace/redhat-marketplace-vhvzc" Dec 01 10:38:06 crc kubenswrapper[4958]: I1201 10:38:06.445553 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7mk8\" (UniqueName: \"kubernetes.io/projected/91b5b869-275c-43f7-ad71-ee725a3a5094-kube-api-access-g7mk8\") pod \"redhat-marketplace-vhvzc\" (UID: \"91b5b869-275c-43f7-ad71-ee725a3a5094\") " pod="openshift-marketplace/redhat-marketplace-vhvzc" Dec 01 10:38:06 crc kubenswrapper[4958]: I1201 10:38:06.447250 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91b5b869-275c-43f7-ad71-ee725a3a5094-utilities\") pod \"redhat-marketplace-vhvzc\" (UID: \"91b5b869-275c-43f7-ad71-ee725a3a5094\") " pod="openshift-marketplace/redhat-marketplace-vhvzc" Dec 01 10:38:06 crc kubenswrapper[4958]: I1201 10:38:06.447365 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91b5b869-275c-43f7-ad71-ee725a3a5094-catalog-content\") pod \"redhat-marketplace-vhvzc\" (UID: \"91b5b869-275c-43f7-ad71-ee725a3a5094\") " pod="openshift-marketplace/redhat-marketplace-vhvzc" Dec 01 10:38:06 crc kubenswrapper[4958]: I1201 10:38:06.471861 4958 operation_generator.go:637] "MountVolume.SetUp 
Dec 01 10:38:06 crc kubenswrapper[4958]: I1201 10:38:06.636524 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vhvzc"
Dec 01 10:38:07 crc kubenswrapper[4958]: I1201 10:38:07.092611 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vhvzc"]
Dec 01 10:38:07 crc kubenswrapper[4958]: I1201 10:38:07.917496 4958 generic.go:334] "Generic (PLEG): container finished" podID="91b5b869-275c-43f7-ad71-ee725a3a5094" containerID="35590d57d6a856c7f88d332be710e9cf53d2cf11116fd4a33eb4cc6ec6ef7b02" exitCode=0
Dec 01 10:38:07 crc kubenswrapper[4958]: I1201 10:38:07.917648 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhvzc" event={"ID":"91b5b869-275c-43f7-ad71-ee725a3a5094","Type":"ContainerDied","Data":"35590d57d6a856c7f88d332be710e9cf53d2cf11116fd4a33eb4cc6ec6ef7b02"}
Dec 01 10:38:07 crc kubenswrapper[4958]: I1201 10:38:07.918013 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhvzc" event={"ID":"91b5b869-275c-43f7-ad71-ee725a3a5094","Type":"ContainerStarted","Data":"8e6b1e0c18759d7c99fdeea78a82e6e3e3cf599fcd50c98114ea04feec9e81fe"}
Dec 01 10:38:08 crc kubenswrapper[4958]: I1201 10:38:08.799076 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:38:08 crc kubenswrapper[4958]: E1201 10:38:08.799596 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:38:08 crc kubenswrapper[4958]: I1201 10:38:08.928487 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhvzc" event={"ID":"91b5b869-275c-43f7-ad71-ee725a3a5094","Type":"ContainerStarted","Data":"ed1963db637c18829b27f97e172eb41c919ee0c4710730dcf1e89a0838bb5253"}
Dec 01 10:38:09 crc kubenswrapper[4958]: I1201 10:38:09.950354 4958 generic.go:334] "Generic (PLEG): container finished" podID="91b5b869-275c-43f7-ad71-ee725a3a5094" containerID="ed1963db637c18829b27f97e172eb41c919ee0c4710730dcf1e89a0838bb5253" exitCode=0
Dec 01 10:38:09 crc kubenswrapper[4958]: I1201 10:38:09.950924 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhvzc" event={"ID":"91b5b869-275c-43f7-ad71-ee725a3a5094","Type":"ContainerDied","Data":"ed1963db637c18829b27f97e172eb41c919ee0c4710730dcf1e89a0838bb5253"}
Dec 01 10:38:10 crc kubenswrapper[4958]: I1201 10:38:10.961781 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhvzc" event={"ID":"91b5b869-275c-43f7-ad71-ee725a3a5094","Type":"ContainerStarted","Data":"758ae6943cb6c869fd817f3be9f5f030b10dd251a42ef1b175507aaa0dcf0211"}
Dec 01 10:38:10 crc kubenswrapper[4958]: I1201 10:38:10.986975 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vhvzc" podStartSLOduration=2.377251619 podStartE2EDuration="4.986937719s" podCreationTimestamp="2025-12-01 10:38:06 +0000 UTC" firstStartedPulling="2025-12-01 10:38:07.920565799 +0000 UTC m=+2335.429354846" lastFinishedPulling="2025-12-01 10:38:10.530251869 +0000 UTC m=+2338.039040946" observedRunningTime="2025-12-01 10:38:10.980438682 +0000 UTC m=+2338.489227729" watchObservedRunningTime="2025-12-01 10:38:10.986937719 +0000 UTC m=+2338.495726766"
Dec 01 10:38:16 crc kubenswrapper[4958]: I1201 10:38:16.637251 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vhvzc"
Dec 01 10:38:16 crc kubenswrapper[4958]: I1201 10:38:16.638240 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vhvzc"
Dec 01 10:38:16 crc kubenswrapper[4958]: I1201 10:38:16.722039 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vhvzc"
Dec 01 10:38:17 crc kubenswrapper[4958]: I1201 10:38:17.114131 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vhvzc"
Dec 01 10:38:17 crc kubenswrapper[4958]: I1201 10:38:17.193117 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vhvzc"]
Dec 01 10:38:19 crc kubenswrapper[4958]: I1201 10:38:19.056620 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vhvzc" podUID="91b5b869-275c-43f7-ad71-ee725a3a5094" containerName="registry-server" containerID="cri-o://758ae6943cb6c869fd817f3be9f5f030b10dd251a42ef1b175507aaa0dcf0211" gracePeriod=2
Dec 01 10:38:19 crc kubenswrapper[4958]: I1201 10:38:19.474400 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vhvzc"
Dec 01 10:38:19 crc kubenswrapper[4958]: I1201 10:38:19.578918 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91b5b869-275c-43f7-ad71-ee725a3a5094-utilities\") pod \"91b5b869-275c-43f7-ad71-ee725a3a5094\" (UID: \"91b5b869-275c-43f7-ad71-ee725a3a5094\") "
Dec 01 10:38:19 crc kubenswrapper[4958]: I1201 10:38:19.579030 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91b5b869-275c-43f7-ad71-ee725a3a5094-catalog-content\") pod \"91b5b869-275c-43f7-ad71-ee725a3a5094\" (UID: \"91b5b869-275c-43f7-ad71-ee725a3a5094\") "
Dec 01 10:38:19 crc kubenswrapper[4958]: I1201 10:38:19.579076 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7mk8\" (UniqueName: \"kubernetes.io/projected/91b5b869-275c-43f7-ad71-ee725a3a5094-kube-api-access-g7mk8\") pod \"91b5b869-275c-43f7-ad71-ee725a3a5094\" (UID: \"91b5b869-275c-43f7-ad71-ee725a3a5094\") "
Dec 01 10:38:19 crc kubenswrapper[4958]: I1201 10:38:19.580205 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91b5b869-275c-43f7-ad71-ee725a3a5094-utilities" (OuterVolumeSpecName: "utilities") pod "91b5b869-275c-43f7-ad71-ee725a3a5094" (UID: "91b5b869-275c-43f7-ad71-ee725a3a5094"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:38:19 crc kubenswrapper[4958]: I1201 10:38:19.588551 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91b5b869-275c-43f7-ad71-ee725a3a5094-kube-api-access-g7mk8" (OuterVolumeSpecName: "kube-api-access-g7mk8") pod "91b5b869-275c-43f7-ad71-ee725a3a5094" (UID: "91b5b869-275c-43f7-ad71-ee725a3a5094"). InnerVolumeSpecName "kube-api-access-g7mk8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:38:19 crc kubenswrapper[4958]: I1201 10:38:19.598954 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91b5b869-275c-43f7-ad71-ee725a3a5094-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91b5b869-275c-43f7-ad71-ee725a3a5094" (UID: "91b5b869-275c-43f7-ad71-ee725a3a5094"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:38:19 crc kubenswrapper[4958]: I1201 10:38:19.682182 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91b5b869-275c-43f7-ad71-ee725a3a5094-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 10:38:19 crc kubenswrapper[4958]: I1201 10:38:19.682241 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91b5b869-275c-43f7-ad71-ee725a3a5094-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 10:38:19 crc kubenswrapper[4958]: I1201 10:38:19.682268 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7mk8\" (UniqueName: \"kubernetes.io/projected/91b5b869-275c-43f7-ad71-ee725a3a5094-kube-api-access-g7mk8\") on node \"crc\" DevicePath \"\""
Dec 01 10:38:20 crc kubenswrapper[4958]: I1201 10:38:20.071826 4958 generic.go:334] "Generic (PLEG): container finished" podID="91b5b869-275c-43f7-ad71-ee725a3a5094" containerID="758ae6943cb6c869fd817f3be9f5f030b10dd251a42ef1b175507aaa0dcf0211" exitCode=0
Dec 01 10:38:20 crc kubenswrapper[4958]: I1201 10:38:20.071940 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vhvzc"
Dec 01 10:38:20 crc kubenswrapper[4958]: I1201 10:38:20.071951 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhvzc" event={"ID":"91b5b869-275c-43f7-ad71-ee725a3a5094","Type":"ContainerDied","Data":"758ae6943cb6c869fd817f3be9f5f030b10dd251a42ef1b175507aaa0dcf0211"}
Dec 01 10:38:20 crc kubenswrapper[4958]: I1201 10:38:20.072214 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vhvzc" event={"ID":"91b5b869-275c-43f7-ad71-ee725a3a5094","Type":"ContainerDied","Data":"8e6b1e0c18759d7c99fdeea78a82e6e3e3cf599fcd50c98114ea04feec9e81fe"}
Dec 01 10:38:20 crc kubenswrapper[4958]: I1201 10:38:20.072299 4958 scope.go:117] "RemoveContainer" containerID="758ae6943cb6c869fd817f3be9f5f030b10dd251a42ef1b175507aaa0dcf0211"
Dec 01 10:38:20 crc kubenswrapper[4958]: I1201 10:38:20.110269 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vhvzc"]
Dec 01 10:38:20 crc kubenswrapper[4958]: I1201 10:38:20.118175 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vhvzc"]
Dec 01 10:38:20 crc kubenswrapper[4958]: I1201 10:38:20.118338 4958 scope.go:117] "RemoveContainer" containerID="ed1963db637c18829b27f97e172eb41c919ee0c4710730dcf1e89a0838bb5253"
Dec 01 10:38:20 crc kubenswrapper[4958]: I1201 10:38:20.163237 4958 scope.go:117] "RemoveContainer" containerID="35590d57d6a856c7f88d332be710e9cf53d2cf11116fd4a33eb4cc6ec6ef7b02"
Dec 01 10:38:20 crc kubenswrapper[4958]: I1201 10:38:20.202937 4958 scope.go:117] "RemoveContainer" containerID="758ae6943cb6c869fd817f3be9f5f030b10dd251a42ef1b175507aaa0dcf0211"
Dec 01 10:38:20 crc kubenswrapper[4958]: E1201 10:38:20.203790 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"758ae6943cb6c869fd817f3be9f5f030b10dd251a42ef1b175507aaa0dcf0211\": container with ID starting with 758ae6943cb6c869fd817f3be9f5f030b10dd251a42ef1b175507aaa0dcf0211 not found: ID does not exist" containerID="758ae6943cb6c869fd817f3be9f5f030b10dd251a42ef1b175507aaa0dcf0211"
Dec 01 10:38:20 crc kubenswrapper[4958]: I1201 10:38:20.203866 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"758ae6943cb6c869fd817f3be9f5f030b10dd251a42ef1b175507aaa0dcf0211"} err="failed to get container status \"758ae6943cb6c869fd817f3be9f5f030b10dd251a42ef1b175507aaa0dcf0211\": rpc error: code = NotFound desc = could not find container \"758ae6943cb6c869fd817f3be9f5f030b10dd251a42ef1b175507aaa0dcf0211\": container with ID starting with 758ae6943cb6c869fd817f3be9f5f030b10dd251a42ef1b175507aaa0dcf0211 not found: ID does not exist"
Dec 01 10:38:20 crc kubenswrapper[4958]: I1201 10:38:20.203934 4958 scope.go:117] "RemoveContainer" containerID="ed1963db637c18829b27f97e172eb41c919ee0c4710730dcf1e89a0838bb5253"
Dec 01 10:38:20 crc kubenswrapper[4958]: E1201 10:38:20.204903 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed1963db637c18829b27f97e172eb41c919ee0c4710730dcf1e89a0838bb5253\": container with ID starting with ed1963db637c18829b27f97e172eb41c919ee0c4710730dcf1e89a0838bb5253 not found: ID does not exist" containerID="ed1963db637c18829b27f97e172eb41c919ee0c4710730dcf1e89a0838bb5253"
Dec 01 10:38:20 crc kubenswrapper[4958]: I1201 10:38:20.205000 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1963db637c18829b27f97e172eb41c919ee0c4710730dcf1e89a0838bb5253"} err="failed to get container status \"ed1963db637c18829b27f97e172eb41c919ee0c4710730dcf1e89a0838bb5253\": rpc error: code = NotFound desc = could not find container \"ed1963db637c18829b27f97e172eb41c919ee0c4710730dcf1e89a0838bb5253\": container with ID starting with ed1963db637c18829b27f97e172eb41c919ee0c4710730dcf1e89a0838bb5253 not found: ID does not exist"
Dec 01 10:38:20 crc kubenswrapper[4958]: I1201 10:38:20.205066 4958 scope.go:117] "RemoveContainer" containerID="35590d57d6a856c7f88d332be710e9cf53d2cf11116fd4a33eb4cc6ec6ef7b02"
Dec 01 10:38:20 crc kubenswrapper[4958]: E1201 10:38:20.205605 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35590d57d6a856c7f88d332be710e9cf53d2cf11116fd4a33eb4cc6ec6ef7b02\": container with ID starting with 35590d57d6a856c7f88d332be710e9cf53d2cf11116fd4a33eb4cc6ec6ef7b02 not found: ID does not exist" containerID="35590d57d6a856c7f88d332be710e9cf53d2cf11116fd4a33eb4cc6ec6ef7b02"
Dec 01 10:38:20 crc kubenswrapper[4958]: I1201 10:38:20.205657 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35590d57d6a856c7f88d332be710e9cf53d2cf11116fd4a33eb4cc6ec6ef7b02"} err="failed to get container status \"35590d57d6a856c7f88d332be710e9cf53d2cf11116fd4a33eb4cc6ec6ef7b02\": rpc error: code = NotFound desc = could not find container \"35590d57d6a856c7f88d332be710e9cf53d2cf11116fd4a33eb4cc6ec6ef7b02\": container with ID starting with 35590d57d6a856c7f88d332be710e9cf53d2cf11116fd4a33eb4cc6ec6ef7b02 not found: ID does not exist"
Dec 01 10:38:20 crc kubenswrapper[4958]: I1201 10:38:20.797837 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:38:20 crc kubenswrapper[4958]: E1201 10:38:20.798800 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:38:21 crc kubenswrapper[4958]: I1201 10:38:21.814883 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91b5b869-275c-43f7-ad71-ee725a3a5094" path="/var/lib/kubelet/pods/91b5b869-275c-43f7-ad71-ee725a3a5094/volumes"
Dec 01 10:38:34 crc kubenswrapper[4958]: I1201 10:38:34.797988 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:38:34 crc kubenswrapper[4958]: E1201 10:38:34.798754 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:38:49 crc kubenswrapper[4958]: I1201 10:38:49.801042 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:38:49 crc kubenswrapper[4958]: E1201 10:38:49.803029 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:39:03 crc kubenswrapper[4958]: I1201 10:39:03.805561 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:39:03 crc kubenswrapper[4958]: E1201 10:39:03.806895 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:39:14 crc kubenswrapper[4958]: I1201 10:39:14.798389 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:39:14 crc kubenswrapper[4958]: E1201 10:39:14.803063 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:39:28 crc kubenswrapper[4958]: I1201 10:39:28.798329 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:39:28 crc kubenswrapper[4958]: E1201 10:39:28.799409 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:39:43 crc kubenswrapper[4958]: I1201 10:39:43.803013 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:39:43 crc kubenswrapper[4958]: E1201 10:39:43.806107 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:39:57 crc kubenswrapper[4958]: I1201 10:39:57.797463 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:39:57 crc kubenswrapper[4958]: E1201 10:39:57.798206 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:40:12 crc kubenswrapper[4958]: I1201 10:40:12.797593 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:40:12 crc kubenswrapper[4958]: E1201 10:40:12.798468 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:40:23 crc kubenswrapper[4958]: I1201 10:40:23.801098 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:40:23 crc kubenswrapper[4958]: E1201 10:40:23.801737 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:40:34 crc kubenswrapper[4958]: I1201 10:40:34.798244 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:40:34 crc kubenswrapper[4958]: E1201 10:40:34.801030 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:40:48 crc kubenswrapper[4958]: I1201 10:40:48.798454 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:40:48 crc kubenswrapper[4958]: E1201 10:40:48.799528 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:41:00 crc kubenswrapper[4958]: I1201 10:41:00.798129 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:41:00 crc kubenswrapper[4958]: E1201 10:41:00.798807 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:41:11 crc kubenswrapper[4958]: I1201 10:41:11.797878 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:41:11 crc kubenswrapper[4958]: E1201 10:41:11.798685 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:41:23 crc kubenswrapper[4958]: I1201 10:41:23.803427 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:41:23 crc kubenswrapper[4958]: E1201 10:41:23.809329 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:41:36 crc kubenswrapper[4958]: I1201 10:41:36.798640 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:41:36 crc kubenswrapper[4958]: E1201 10:41:36.799685 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:41:48 crc kubenswrapper[4958]: I1201 10:41:48.797873 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:41:48 crc kubenswrapper[4958]: E1201 10:41:48.799265 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:42:00 crc kubenswrapper[4958]: I1201 10:42:00.797987 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641"
Dec 01 10:42:01 crc kubenswrapper[4958]: I1201 10:42:01.258365 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"a26b199ae9ae8499211a10335d1de1af24fb3bf49c890ed8b481857c5fc329c6"}
Dec 01 10:43:49 crc kubenswrapper[4958]: I1201 10:43:49.321614 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-86s57"]
Dec 01 10:43:49 crc kubenswrapper[4958]: E1201 10:43:49.324245 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91b5b869-275c-43f7-ad71-ee725a3a5094" containerName="extract-utilities"
Dec 01 10:43:49 crc kubenswrapper[4958]: I1201 10:43:49.324387 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b5b869-275c-43f7-ad71-ee725a3a5094" containerName="extract-utilities"
Dec 01 10:43:49 crc kubenswrapper[4958]: E1201 10:43:49.324485 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91b5b869-275c-43f7-ad71-ee725a3a5094" containerName="extract-content"
Dec 01 10:43:49 crc kubenswrapper[4958]: I1201 10:43:49.324564 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b5b869-275c-43f7-ad71-ee725a3a5094" containerName="extract-content"
Dec 01 10:43:49 crc kubenswrapper[4958]: E1201 10:43:49.324763 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91b5b869-275c-43f7-ad71-ee725a3a5094" containerName="registry-server"
Dec 01 10:43:49 crc kubenswrapper[4958]: I1201 10:43:49.324873 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b5b869-275c-43f7-ad71-ee725a3a5094" containerName="registry-server"
Dec 01 10:43:49 crc kubenswrapper[4958]: I1201 10:43:49.325542 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="91b5b869-275c-43f7-ad71-ee725a3a5094" containerName="registry-server"
Dec 01 10:43:49 crc kubenswrapper[4958]: I1201 10:43:49.331184 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86s57"
Dec 01 10:43:49 crc kubenswrapper[4958]: I1201 10:43:49.340948 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-86s57"]
Dec 01 10:43:49 crc kubenswrapper[4958]: I1201 10:43:49.418927 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0-catalog-content\") pod \"redhat-operators-86s57\" (UID: \"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0\") " pod="openshift-marketplace/redhat-operators-86s57"
Dec 01 10:43:49 crc kubenswrapper[4958]: I1201 10:43:49.419009 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0-utilities\") pod \"redhat-operators-86s57\" (UID: \"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0\") " pod="openshift-marketplace/redhat-operators-86s57"
Dec 01 10:43:49 crc kubenswrapper[4958]: I1201 10:43:49.419050 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxc7h\" (UniqueName: \"kubernetes.io/projected/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0-kube-api-access-nxc7h\") pod \"redhat-operators-86s57\" (UID: \"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0\") " pod="openshift-marketplace/redhat-operators-86s57"
Dec 01 10:43:49 crc kubenswrapper[4958]: I1201 10:43:49.519962 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxc7h\" (UniqueName: \"kubernetes.io/projected/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0-kube-api-access-nxc7h\") pod \"redhat-operators-86s57\" (UID: \"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0\") " pod="openshift-marketplace/redhat-operators-86s57"
Dec 01 10:43:49 crc kubenswrapper[4958]: I1201 10:43:49.520088 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0-catalog-content\") pod \"redhat-operators-86s57\" (UID: \"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0\") " pod="openshift-marketplace/redhat-operators-86s57"
Dec 01 10:43:49 crc kubenswrapper[4958]: I1201 10:43:49.520131 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0-utilities\") pod \"redhat-operators-86s57\" (UID: \"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0\") " pod="openshift-marketplace/redhat-operators-86s57"
Dec 01 10:43:49 crc kubenswrapper[4958]: I1201 10:43:49.520784 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0-utilities\") pod \"redhat-operators-86s57\" (UID: \"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0\") " pod="openshift-marketplace/redhat-operators-86s57"
Dec 01 10:43:49 crc kubenswrapper[4958]: I1201 10:43:49.521136 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0-catalog-content\") pod \"redhat-operators-86s57\" (UID: \"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0\") " pod="openshift-marketplace/redhat-operators-86s57"
Dec 01 10:43:49 crc kubenswrapper[4958]: I1201 10:43:49.543584 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxc7h\" (UniqueName: \"kubernetes.io/projected/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0-kube-api-access-nxc7h\") pod \"redhat-operators-86s57\" (UID: \"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0\") " pod="openshift-marketplace/redhat-operators-86s57"
Dec 01 10:43:49 crc kubenswrapper[4958]: I1201 10:43:49.663861 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86s57"
Dec 01 10:43:50 crc kubenswrapper[4958]: I1201 10:43:50.141395 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-86s57"]
Dec 01 10:43:50 crc kubenswrapper[4958]: I1201 10:43:50.435984 4958 generic.go:334] "Generic (PLEG): container finished" podID="13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0" containerID="f8fcdfdcd9a9160c0a3fb0b3c964f23016fb30dfbad325dbe1ec6d2cfd0ceeea" exitCode=0
Dec 01 10:43:50 crc kubenswrapper[4958]: I1201 10:43:50.436086 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86s57" event={"ID":"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0","Type":"ContainerDied","Data":"f8fcdfdcd9a9160c0a3fb0b3c964f23016fb30dfbad325dbe1ec6d2cfd0ceeea"}
Dec 01 10:43:50 crc kubenswrapper[4958]: I1201 10:43:50.436403 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86s57" event={"ID":"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0","Type":"ContainerStarted","Data":"1df60f0490c76c4f014f932eaf020e67165d76ef35ea5d0b288481c73f4b6fcf"}
Dec 01 10:43:50 crc kubenswrapper[4958]: I1201 10:43:50.437978 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 10:43:51 crc kubenswrapper[4958]: I1201 10:43:51.449255 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86s57" event={"ID":"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0","Type":"ContainerStarted","Data":"f7a68fa70dad57d643edaae5dfadb4eb0f7e72e74ad075406c73261fd5ab181b"}
Dec 01 10:43:53 crc kubenswrapper[4958]: I1201 10:43:53.494604 4958 generic.go:334] "Generic (PLEG): container finished" podID="13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0" containerID="f7a68fa70dad57d643edaae5dfadb4eb0f7e72e74ad075406c73261fd5ab181b" exitCode=0
Dec 01 10:43:53 crc kubenswrapper[4958]: I1201 10:43:53.494687 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86s57" event={"ID":"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0","Type":"ContainerDied","Data":"f7a68fa70dad57d643edaae5dfadb4eb0f7e72e74ad075406c73261fd5ab181b"}
Dec 01 10:43:55 crc kubenswrapper[4958]: I1201 10:43:55.521934 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86s57" event={"ID":"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0","Type":"ContainerStarted","Data":"65733deff49bca6bdcba7dae24234ceda23a27d512c5c7e6afb749ef0fc111dc"}
Dec 01 10:43:55 crc kubenswrapper[4958]: I1201 10:43:55.558365 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-86s57" podStartSLOduration=2.370131445 podStartE2EDuration="6.558328991s" podCreationTimestamp="2025-12-01 10:43:49 +0000 UTC" firstStartedPulling="2025-12-01 10:43:50.437739591 +0000 UTC m=+2677.946528628" lastFinishedPulling="2025-12-01 10:43:54.625937127 +0000 UTC m=+2682.134726174" observedRunningTime="2025-12-01 10:43:55.55478081 +0000 UTC m=+2683.063569857" watchObservedRunningTime="2025-12-01 10:43:55.558328991 +0000 UTC m=+2683.067118038"
Dec 01 10:43:59 crc kubenswrapper[4958]: I1201 10:43:59.665245 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-86s57"
Dec 01 10:43:59 crc kubenswrapper[4958]: I1201 10:43:59.665698 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-86s57"
Dec 01 10:44:00 crc kubenswrapper[4958]: I1201 10:44:00.708651 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-86s57" podUID="13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0" containerName="registry-server" probeResult="failure" output=<
Dec 01 10:44:00 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s
Dec 01 10:44:00 crc kubenswrapper[4958]: >
Dec 01 10:44:09 crc kubenswrapper[4958]: I1201 10:44:09.727095 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-86s57"
Dec 01 10:44:09 crc kubenswrapper[4958]: I1201 10:44:09.791998 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-86s57"
Dec 01 10:44:09 crc kubenswrapper[4958]: I1201 10:44:09.977421 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-86s57"]
Dec 01 10:44:11 crc kubenswrapper[4958]: I1201 10:44:11.675832 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-86s57" podUID="13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0" containerName="registry-server" containerID="cri-o://65733deff49bca6bdcba7dae24234ceda23a27d512c5c7e6afb749ef0fc111dc" gracePeriod=2
Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.132497 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86s57"
Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.241478 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxc7h\" (UniqueName: \"kubernetes.io/projected/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0-kube-api-access-nxc7h\") pod \"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0\" (UID: \"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0\") "
Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.242136 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0-utilities\") pod \"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0\" (UID: \"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0\") "
Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.242369 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0-catalog-content\") pod \"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0\" (UID: \"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0\") "
Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.243131 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0-utilities" (OuterVolumeSpecName: "utilities") pod "13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0" (UID: "13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.250503 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0-kube-api-access-nxc7h" (OuterVolumeSpecName: "kube-api-access-nxc7h") pod "13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0" (UID: "13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0"). InnerVolumeSpecName "kube-api-access-nxc7h". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.344870 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxc7h\" (UniqueName: \"kubernetes.io/projected/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0-kube-api-access-nxc7h\") on node \"crc\" DevicePath \"\"" Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.345303 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.386733 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0" (UID: "13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.446336 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.687746 4958 generic.go:334] "Generic (PLEG): container finished" podID="13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0" containerID="65733deff49bca6bdcba7dae24234ceda23a27d512c5c7e6afb749ef0fc111dc" exitCode=0 Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.687813 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86s57" Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.687809 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86s57" event={"ID":"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0","Type":"ContainerDied","Data":"65733deff49bca6bdcba7dae24234ceda23a27d512c5c7e6afb749ef0fc111dc"} Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.688065 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86s57" event={"ID":"13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0","Type":"ContainerDied","Data":"1df60f0490c76c4f014f932eaf020e67165d76ef35ea5d0b288481c73f4b6fcf"} Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.688117 4958 scope.go:117] "RemoveContainer" containerID="65733deff49bca6bdcba7dae24234ceda23a27d512c5c7e6afb749ef0fc111dc" Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.724373 4958 scope.go:117] "RemoveContainer" containerID="f7a68fa70dad57d643edaae5dfadb4eb0f7e72e74ad075406c73261fd5ab181b" Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.746096 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-86s57"] Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.747267 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-86s57"] Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.757493 4958 scope.go:117] "RemoveContainer" containerID="f8fcdfdcd9a9160c0a3fb0b3c964f23016fb30dfbad325dbe1ec6d2cfd0ceeea" Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.788140 4958 scope.go:117] "RemoveContainer" containerID="65733deff49bca6bdcba7dae24234ceda23a27d512c5c7e6afb749ef0fc111dc" Dec 01 10:44:12 crc kubenswrapper[4958]: E1201 10:44:12.788863 4958 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65733deff49bca6bdcba7dae24234ceda23a27d512c5c7e6afb749ef0fc111dc\": container with ID starting with 65733deff49bca6bdcba7dae24234ceda23a27d512c5c7e6afb749ef0fc111dc not found: ID does not exist" containerID="65733deff49bca6bdcba7dae24234ceda23a27d512c5c7e6afb749ef0fc111dc" Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.788907 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65733deff49bca6bdcba7dae24234ceda23a27d512c5c7e6afb749ef0fc111dc"} err="failed to get container status \"65733deff49bca6bdcba7dae24234ceda23a27d512c5c7e6afb749ef0fc111dc\": rpc error: code = NotFound desc = could not find container \"65733deff49bca6bdcba7dae24234ceda23a27d512c5c7e6afb749ef0fc111dc\": container with ID starting with 65733deff49bca6bdcba7dae24234ceda23a27d512c5c7e6afb749ef0fc111dc not found: ID does not exist" Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.788937 4958 scope.go:117] "RemoveContainer" containerID="f7a68fa70dad57d643edaae5dfadb4eb0f7e72e74ad075406c73261fd5ab181b" Dec 01 10:44:12 crc kubenswrapper[4958]: E1201 10:44:12.789526 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7a68fa70dad57d643edaae5dfadb4eb0f7e72e74ad075406c73261fd5ab181b\": container with ID starting with f7a68fa70dad57d643edaae5dfadb4eb0f7e72e74ad075406c73261fd5ab181b not found: ID does not exist" containerID="f7a68fa70dad57d643edaae5dfadb4eb0f7e72e74ad075406c73261fd5ab181b" Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.789557 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7a68fa70dad57d643edaae5dfadb4eb0f7e72e74ad075406c73261fd5ab181b"} err="failed to get container status \"f7a68fa70dad57d643edaae5dfadb4eb0f7e72e74ad075406c73261fd5ab181b\": rpc error: code = NotFound desc = could not find container \"f7a68fa70dad57d643edaae5dfadb4eb0f7e72e74ad075406c73261fd5ab181b\": container with ID starting with f7a68fa70dad57d643edaae5dfadb4eb0f7e72e74ad075406c73261fd5ab181b not found: ID does not exist" Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.789579 4958 scope.go:117] "RemoveContainer" containerID="f8fcdfdcd9a9160c0a3fb0b3c964f23016fb30dfbad325dbe1ec6d2cfd0ceeea" Dec 01 10:44:12 crc kubenswrapper[4958]: E1201 10:44:12.790025 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8fcdfdcd9a9160c0a3fb0b3c964f23016fb30dfbad325dbe1ec6d2cfd0ceeea\": container with ID starting with f8fcdfdcd9a9160c0a3fb0b3c964f23016fb30dfbad325dbe1ec6d2cfd0ceeea not found: ID does not exist" containerID="f8fcdfdcd9a9160c0a3fb0b3c964f23016fb30dfbad325dbe1ec6d2cfd0ceeea" Dec 01 10:44:12 crc kubenswrapper[4958]: I1201 10:44:12.790056 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8fcdfdcd9a9160c0a3fb0b3c964f23016fb30dfbad325dbe1ec6d2cfd0ceeea"} err="failed to get container status \"f8fcdfdcd9a9160c0a3fb0b3c964f23016fb30dfbad325dbe1ec6d2cfd0ceeea\": rpc error: code = NotFound desc = could not find container \"f8fcdfdcd9a9160c0a3fb0b3c964f23016fb30dfbad325dbe1ec6d2cfd0ceeea\": container with ID starting with f8fcdfdcd9a9160c0a3fb0b3c964f23016fb30dfbad325dbe1ec6d2cfd0ceeea not found: ID does not exist" Dec 01 10:44:13 crc kubenswrapper[4958]: I1201 10:44:13.808929 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0" path="/var/lib/kubelet/pods/13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0/volumes" Dec 01 10:44:28 crc kubenswrapper[4958]: I1201 10:44:28.210346 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:44:28 crc kubenswrapper[4958]: I1201 10:44:28.210980 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:44:58 crc kubenswrapper[4958]: I1201 10:44:58.210032 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:44:58 crc kubenswrapper[4958]: I1201 10:44:58.210491 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.177368 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455"] Dec 01 10:45:00 crc kubenswrapper[4958]: E1201 10:45:00.178023 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0" containerName="registry-server" Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.178075 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0" containerName="registry-server" Dec 01 10:45:00 crc kubenswrapper[4958]: E1201 10:45:00.178107 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0" containerName="extract-content" Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.178120 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0" containerName="extract-content" Dec 01 10:45:00 crc kubenswrapper[4958]: E1201 10:45:00.178153 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0" containerName="extract-utilities" Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.178169 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0" containerName="extract-utilities" Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.178450 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c2f2bd-c7f8-46ec-9aa8-e4b7a619a5d0" containerName="registry-server" Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.179320 4958 util.go:30] "No sandbox for pod can be found. 
Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.182165 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.182220 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.195179 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455"]
Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.344324 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25c13797-0e6d-44c2-a6fb-4bfb19907946-config-volume\") pod \"collect-profiles-29409765-pr455\" (UID: \"25c13797-0e6d-44c2-a6fb-4bfb19907946\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455"
Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.344734 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25c13797-0e6d-44c2-a6fb-4bfb19907946-secret-volume\") pod \"collect-profiles-29409765-pr455\" (UID: \"25c13797-0e6d-44c2-a6fb-4bfb19907946\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455"
Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.344882 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkqkv\" (UniqueName: \"kubernetes.io/projected/25c13797-0e6d-44c2-a6fb-4bfb19907946-kube-api-access-vkqkv\") pod \"collect-profiles-29409765-pr455\" (UID: \"25c13797-0e6d-44c2-a6fb-4bfb19907946\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455"
Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.459670 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkqkv\" (UniqueName: \"kubernetes.io/projected/25c13797-0e6d-44c2-a6fb-4bfb19907946-kube-api-access-vkqkv\") pod \"collect-profiles-29409765-pr455\" (UID: \"25c13797-0e6d-44c2-a6fb-4bfb19907946\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455"
Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.459763 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25c13797-0e6d-44c2-a6fb-4bfb19907946-config-volume\") pod \"collect-profiles-29409765-pr455\" (UID: \"25c13797-0e6d-44c2-a6fb-4bfb19907946\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455"
Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.459861 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25c13797-0e6d-44c2-a6fb-4bfb19907946-secret-volume\") pod \"collect-profiles-29409765-pr455\" (UID: \"25c13797-0e6d-44c2-a6fb-4bfb19907946\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455"
Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.462000 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25c13797-0e6d-44c2-a6fb-4bfb19907946-config-volume\") pod \"collect-profiles-29409765-pr455\" (UID: \"25c13797-0e6d-44c2-a6fb-4bfb19907946\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455"
Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.466645 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25c13797-0e6d-44c2-a6fb-4bfb19907946-secret-volume\") pod \"collect-profiles-29409765-pr455\" (UID: \"25c13797-0e6d-44c2-a6fb-4bfb19907946\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455"
Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.479006 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkqkv\" (UniqueName: \"kubernetes.io/projected/25c13797-0e6d-44c2-a6fb-4bfb19907946-kube-api-access-vkqkv\") pod \"collect-profiles-29409765-pr455\" (UID: \"25c13797-0e6d-44c2-a6fb-4bfb19907946\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455"
Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.502451 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455"
Dec 01 10:45:00 crc kubenswrapper[4958]: I1201 10:45:00.751511 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455"]
Dec 01 10:45:01 crc kubenswrapper[4958]: I1201 10:45:01.200769 4958 generic.go:334] "Generic (PLEG): container finished" podID="25c13797-0e6d-44c2-a6fb-4bfb19907946" containerID="1525ca706432f1241fa5f8f53918d4474da847aaf4cf78e681433c6dfd421bcf" exitCode=0
Dec 01 10:45:01 crc kubenswrapper[4958]: I1201 10:45:01.200866 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455" event={"ID":"25c13797-0e6d-44c2-a6fb-4bfb19907946","Type":"ContainerDied","Data":"1525ca706432f1241fa5f8f53918d4474da847aaf4cf78e681433c6dfd421bcf"}
Dec 01 10:45:01 crc kubenswrapper[4958]: I1201 10:45:01.200943 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455" event={"ID":"25c13797-0e6d-44c2-a6fb-4bfb19907946","Type":"ContainerStarted","Data":"febda9fd5529a8e7c66b0e95bf75fc43d8ebe1849565511026ace43d2e38da24"}
Dec 01 10:45:02 crc kubenswrapper[4958]: I1201 10:45:02.557832 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455"
Dec 01 10:45:02 crc kubenswrapper[4958]: I1201 10:45:02.694927 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkqkv\" (UniqueName: \"kubernetes.io/projected/25c13797-0e6d-44c2-a6fb-4bfb19907946-kube-api-access-vkqkv\") pod \"25c13797-0e6d-44c2-a6fb-4bfb19907946\" (UID: \"25c13797-0e6d-44c2-a6fb-4bfb19907946\") "
Dec 01 10:45:02 crc kubenswrapper[4958]: I1201 10:45:02.695035 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25c13797-0e6d-44c2-a6fb-4bfb19907946-config-volume\") pod \"25c13797-0e6d-44c2-a6fb-4bfb19907946\" (UID: \"25c13797-0e6d-44c2-a6fb-4bfb19907946\") "
Dec 01 10:45:02 crc kubenswrapper[4958]: I1201 10:45:02.695090 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25c13797-0e6d-44c2-a6fb-4bfb19907946-secret-volume\") pod \"25c13797-0e6d-44c2-a6fb-4bfb19907946\" (UID: \"25c13797-0e6d-44c2-a6fb-4bfb19907946\") "
Dec 01 10:45:02 crc kubenswrapper[4958]: I1201 10:45:02.696085 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25c13797-0e6d-44c2-a6fb-4bfb19907946-config-volume" (OuterVolumeSpecName: "config-volume") pod "25c13797-0e6d-44c2-a6fb-4bfb19907946" (UID: "25c13797-0e6d-44c2-a6fb-4bfb19907946"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 10:45:02 crc kubenswrapper[4958]: I1201 10:45:02.700600 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c13797-0e6d-44c2-a6fb-4bfb19907946-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "25c13797-0e6d-44c2-a6fb-4bfb19907946" (UID: "25c13797-0e6d-44c2-a6fb-4bfb19907946"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:45:02 crc kubenswrapper[4958]: I1201 10:45:02.700941 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25c13797-0e6d-44c2-a6fb-4bfb19907946-kube-api-access-vkqkv" (OuterVolumeSpecName: "kube-api-access-vkqkv") pod "25c13797-0e6d-44c2-a6fb-4bfb19907946" (UID: "25c13797-0e6d-44c2-a6fb-4bfb19907946"). InnerVolumeSpecName "kube-api-access-vkqkv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:45:02 crc kubenswrapper[4958]: I1201 10:45:02.796434 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkqkv\" (UniqueName: \"kubernetes.io/projected/25c13797-0e6d-44c2-a6fb-4bfb19907946-kube-api-access-vkqkv\") on node \"crc\" DevicePath \"\""
Dec 01 10:45:02 crc kubenswrapper[4958]: I1201 10:45:02.796493 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25c13797-0e6d-44c2-a6fb-4bfb19907946-config-volume\") on node \"crc\" DevicePath \"\""
Dec 01 10:45:02 crc kubenswrapper[4958]: I1201 10:45:02.796505 4958 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/25c13797-0e6d-44c2-a6fb-4bfb19907946-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 01 10:45:03 crc kubenswrapper[4958]: I1201 10:45:03.220215 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455" event={"ID":"25c13797-0e6d-44c2-a6fb-4bfb19907946","Type":"ContainerDied","Data":"febda9fd5529a8e7c66b0e95bf75fc43d8ebe1849565511026ace43d2e38da24"}
Dec 01 10:45:03 crc kubenswrapper[4958]: I1201 10:45:03.220527 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="febda9fd5529a8e7c66b0e95bf75fc43d8ebe1849565511026ace43d2e38da24"
Dec 01 10:45:03 crc kubenswrapper[4958]: I1201 10:45:03.220266 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455"
Dec 01 10:45:03 crc kubenswrapper[4958]: I1201 10:45:03.655439 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5"]
Dec 01 10:45:03 crc kubenswrapper[4958]: I1201 10:45:03.663041 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409720-84vv5"]
Dec 01 10:45:03 crc kubenswrapper[4958]: I1201 10:45:03.809180 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="954a15b1-bcae-44b9-ad5b-e004ed4d79be" path="/var/lib/kubelet/pods/954a15b1-bcae-44b9-ad5b-e004ed4d79be/volumes"
Dec 01 10:45:21 crc kubenswrapper[4958]: I1201 10:45:21.434280 4958 scope.go:117] "RemoveContainer" containerID="31900a7ee5f6e78fb0673cec346e8eee4d3a131eb91794d86e5fd79aa549e126"
Dec 01 10:45:28 crc kubenswrapper[4958]: I1201 10:45:28.210961 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 10:45:28 crc kubenswrapper[4958]: I1201 10:45:28.211303 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 10:45:28 crc kubenswrapper[4958]: I1201 10:45:28.211361 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7"
Dec 01 10:45:28 crc kubenswrapper[4958]: I1201 10:45:28.212108 4958 kuberuntime_manager.go:1027] "Message for Container of pod"
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a26b199ae9ae8499211a10335d1de1af24fb3bf49c890ed8b481857c5fc329c6"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:45:28 crc kubenswrapper[4958]: I1201 10:45:28.212168 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://a26b199ae9ae8499211a10335d1de1af24fb3bf49c890ed8b481857c5fc329c6" gracePeriod=600 Dec 01 10:45:28 crc kubenswrapper[4958]: I1201 10:45:28.577675 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="a26b199ae9ae8499211a10335d1de1af24fb3bf49c890ed8b481857c5fc329c6" exitCode=0 Dec 01 10:45:28 crc kubenswrapper[4958]: I1201 10:45:28.577757 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"a26b199ae9ae8499211a10335d1de1af24fb3bf49c890ed8b481857c5fc329c6"} Dec 01 10:45:28 crc kubenswrapper[4958]: I1201 10:45:28.577866 4958 scope.go:117] "RemoveContainer" containerID="e1a4cf5b7bbebe8e12fce0f2767b6254a612068f5ca067443633e24898b02641" Dec 01 10:45:29 crc kubenswrapper[4958]: I1201 10:45:29.590832 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"} Dec 01 10:47:28 crc kubenswrapper[4958]: I1201 10:47:28.210831 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:47:28 crc kubenswrapper[4958]: I1201 10:47:28.211418 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:47:43 crc kubenswrapper[4958]: I1201 10:47:43.759161 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wbt64"] Dec 01 10:47:43 crc kubenswrapper[4958]: E1201 10:47:43.760355 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25c13797-0e6d-44c2-a6fb-4bfb19907946" containerName="collect-profiles" Dec 01 10:47:43 crc kubenswrapper[4958]: I1201 10:47:43.760376 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c13797-0e6d-44c2-a6fb-4bfb19907946" containerName="collect-profiles" Dec 01 10:47:43 crc kubenswrapper[4958]: I1201 10:47:43.760612 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="25c13797-0e6d-44c2-a6fb-4bfb19907946" containerName="collect-profiles" Dec 01 10:47:43 crc kubenswrapper[4958]: I1201 10:47:43.764776 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbt64" Dec 01 10:47:43 crc kubenswrapper[4958]: I1201 10:47:43.786242 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wbt64"] Dec 01 10:47:43 crc kubenswrapper[4958]: I1201 10:47:43.825738 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848dbf98-bd66-46ae-840d-121b574b8e8a-utilities\") pod \"certified-operators-wbt64\" (UID: \"848dbf98-bd66-46ae-840d-121b574b8e8a\") " pod="openshift-marketplace/certified-operators-wbt64" Dec 01 10:47:43 crc kubenswrapper[4958]: I1201 10:47:43.826166 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848dbf98-bd66-46ae-840d-121b574b8e8a-catalog-content\") pod \"certified-operators-wbt64\" (UID: \"848dbf98-bd66-46ae-840d-121b574b8e8a\") " pod="openshift-marketplace/certified-operators-wbt64" Dec 01 10:47:43 crc kubenswrapper[4958]: I1201 10:47:43.826306 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfsf8\" (UniqueName: \"kubernetes.io/projected/848dbf98-bd66-46ae-840d-121b574b8e8a-kube-api-access-mfsf8\") pod \"certified-operators-wbt64\" (UID: \"848dbf98-bd66-46ae-840d-121b574b8e8a\") " pod="openshift-marketplace/certified-operators-wbt64" Dec 01 10:47:43 crc kubenswrapper[4958]: I1201 10:47:43.927030 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfsf8\" (UniqueName: \"kubernetes.io/projected/848dbf98-bd66-46ae-840d-121b574b8e8a-kube-api-access-mfsf8\") pod \"certified-operators-wbt64\" (UID: \"848dbf98-bd66-46ae-840d-121b574b8e8a\") " pod="openshift-marketplace/certified-operators-wbt64" Dec 01 10:47:43 crc kubenswrapper[4958]: I1201 10:47:43.927187 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848dbf98-bd66-46ae-840d-121b574b8e8a-utilities\") pod \"certified-operators-wbt64\" (UID: \"848dbf98-bd66-46ae-840d-121b574b8e8a\") " pod="openshift-marketplace/certified-operators-wbt64" Dec 01 10:47:43 crc kubenswrapper[4958]: I1201 10:47:43.927273 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848dbf98-bd66-46ae-840d-121b574b8e8a-catalog-content\") pod \"certified-operators-wbt64\" (UID: \"848dbf98-bd66-46ae-840d-121b574b8e8a\") " pod="openshift-marketplace/certified-operators-wbt64" Dec 01 10:47:43 crc kubenswrapper[4958]: I1201 10:47:43.927905 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848dbf98-bd66-46ae-840d-121b574b8e8a-utilities\") pod \"certified-operators-wbt64\" (UID: \"848dbf98-bd66-46ae-840d-121b574b8e8a\") " pod="openshift-marketplace/certified-operators-wbt64" Dec 01 10:47:43 crc kubenswrapper[4958]: I1201 10:47:43.927990 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848dbf98-bd66-46ae-840d-121b574b8e8a-catalog-content\") pod \"certified-operators-wbt64\" (UID: \"848dbf98-bd66-46ae-840d-121b574b8e8a\") " pod="openshift-marketplace/certified-operators-wbt64" Dec 01 10:47:43 crc kubenswrapper[4958]: I1201 10:47:43.951572 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mfsf8\" (UniqueName: \"kubernetes.io/projected/848dbf98-bd66-46ae-840d-121b574b8e8a-kube-api-access-mfsf8\") pod \"certified-operators-wbt64\" (UID: \"848dbf98-bd66-46ae-840d-121b574b8e8a\") " pod="openshift-marketplace/certified-operators-wbt64" Dec 01 10:47:44 crc kubenswrapper[4958]: I1201 10:47:44.086956 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wbt64" Dec 01 10:47:44 crc kubenswrapper[4958]: I1201 10:47:44.581442 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wbt64"] Dec 01 10:47:44 crc kubenswrapper[4958]: I1201 10:47:44.854291 4958 generic.go:334] "Generic (PLEG): container finished" podID="848dbf98-bd66-46ae-840d-121b574b8e8a" containerID="d42c44cd4b711433943ece6cc72946347438fa622730a130306a1a4d7799d19d" exitCode=0 Dec 01 10:47:44 crc kubenswrapper[4958]: I1201 10:47:44.854360 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbt64" event={"ID":"848dbf98-bd66-46ae-840d-121b574b8e8a","Type":"ContainerDied","Data":"d42c44cd4b711433943ece6cc72946347438fa622730a130306a1a4d7799d19d"} Dec 01 10:47:44 crc kubenswrapper[4958]: I1201 10:47:44.854393 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbt64" event={"ID":"848dbf98-bd66-46ae-840d-121b574b8e8a","Type":"ContainerStarted","Data":"98f6720aee7e0d7c37aef667fd92661ee92ab1a5268379abefd21d5d1130b142"} Dec 01 10:47:46 crc kubenswrapper[4958]: I1201 10:47:46.874881 4958 generic.go:334] "Generic (PLEG): container finished" podID="848dbf98-bd66-46ae-840d-121b574b8e8a" containerID="639883776f1986e1a5724d95b39597e681632b51a7c17cd330df9716ad64a405" exitCode=0 Dec 01 10:47:46 crc kubenswrapper[4958]: I1201 10:47:46.874940 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbt64" event={"ID":"848dbf98-bd66-46ae-840d-121b574b8e8a","Type":"ContainerDied","Data":"639883776f1986e1a5724d95b39597e681632b51a7c17cd330df9716ad64a405"} Dec 01 10:47:47 crc kubenswrapper[4958]: I1201 10:47:47.887260 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbt64" event={"ID":"848dbf98-bd66-46ae-840d-121b574b8e8a","Type":"ContainerStarted","Data":"d6cc03917a867d3d9a9b312195ad53212a6c8b233c90c41a94b07f583507a3ae"} Dec 01 10:47:54 crc kubenswrapper[4958]: I1201 10:47:54.087670 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wbt64" Dec 01 10:47:54 crc kubenswrapper[4958]: I1201 10:47:54.088479 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wbt64" Dec 01 10:47:54 crc kubenswrapper[4958]: I1201 10:47:54.165893 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wbt64" Dec 01 10:47:54 crc kubenswrapper[4958]: I1201 10:47:54.205342 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wbt64" podStartSLOduration=8.759933031 podStartE2EDuration="11.205319624s" podCreationTimestamp="2025-12-01 10:47:43 +0000 UTC" firstStartedPulling="2025-12-01 10:47:44.856697172 +0000 UTC m=+2912.365486209" lastFinishedPulling="2025-12-01 10:47:47.302083755 +0000 UTC m=+2914.810872802" observedRunningTime="2025-12-01 
10:47:47.913196351 +0000 UTC m=+2915.421985428" watchObservedRunningTime="2025-12-01 10:47:54.205319624 +0000 UTC m=+2921.714108661" Dec 01 10:47:55 crc kubenswrapper[4958]: I1201 10:47:55.007271 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wbt64" Dec 01 10:47:55 crc kubenswrapper[4958]: I1201 10:47:55.911063 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5mgf2"] Dec 01 10:47:55 crc kubenswrapper[4958]: I1201 10:47:55.914068 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5mgf2" Dec 01 10:47:55 crc kubenswrapper[4958]: I1201 10:47:55.930369 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5mgf2"] Dec 01 10:47:56 crc kubenswrapper[4958]: I1201 10:47:56.035147 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b280fffc-1859-46eb-99c9-c56df00e2144-catalog-content\") pod \"community-operators-5mgf2\" (UID: \"b280fffc-1859-46eb-99c9-c56df00e2144\") " pod="openshift-marketplace/community-operators-5mgf2" Dec 01 10:47:56 crc kubenswrapper[4958]: I1201 10:47:56.036078 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b280fffc-1859-46eb-99c9-c56df00e2144-utilities\") pod \"community-operators-5mgf2\" (UID: \"b280fffc-1859-46eb-99c9-c56df00e2144\") " pod="openshift-marketplace/community-operators-5mgf2" Dec 01 10:47:56 crc kubenswrapper[4958]: I1201 10:47:56.036577 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8nsr\" (UniqueName: \"kubernetes.io/projected/b280fffc-1859-46eb-99c9-c56df00e2144-kube-api-access-d8nsr\") pod \"community-operators-5mgf2\" (UID: \"b280fffc-1859-46eb-99c9-c56df00e2144\") " pod="openshift-marketplace/community-operators-5mgf2" Dec 01 10:47:56 crc kubenswrapper[4958]: I1201 10:47:56.138933 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8nsr\" (UniqueName: \"kubernetes.io/projected/b280fffc-1859-46eb-99c9-c56df00e2144-kube-api-access-d8nsr\") pod \"community-operators-5mgf2\" (UID: \"b280fffc-1859-46eb-99c9-c56df00e2144\") " pod="openshift-marketplace/community-operators-5mgf2" Dec 01 10:47:56 crc kubenswrapper[4958]: I1201 10:47:56.139046 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b280fffc-1859-46eb-99c9-c56df00e2144-catalog-content\") pod \"community-operators-5mgf2\" (UID: \"b280fffc-1859-46eb-99c9-c56df00e2144\") " pod="openshift-marketplace/community-operators-5mgf2" Dec 01 10:47:56 crc kubenswrapper[4958]: I1201 10:47:56.139084 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b280fffc-1859-46eb-99c9-c56df00e2144-utilities\") pod \"community-operators-5mgf2\" (UID: \"b280fffc-1859-46eb-99c9-c56df00e2144\") " pod="openshift-marketplace/community-operators-5mgf2" Dec 01 10:47:56 crc kubenswrapper[4958]: I1201 10:47:56.139696 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b280fffc-1859-46eb-99c9-c56df00e2144-utilities\") pod 
\"community-operators-5mgf2\" (UID: \"b280fffc-1859-46eb-99c9-c56df00e2144\") " pod="openshift-marketplace/community-operators-5mgf2" Dec 01 10:47:56 crc kubenswrapper[4958]: I1201 10:47:56.140153 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b280fffc-1859-46eb-99c9-c56df00e2144-catalog-content\") pod \"community-operators-5mgf2\" (UID: \"b280fffc-1859-46eb-99c9-c56df00e2144\") " pod="openshift-marketplace/community-operators-5mgf2" Dec 01 10:47:56 crc kubenswrapper[4958]: I1201 10:47:56.160321 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8nsr\" (UniqueName: \"kubernetes.io/projected/b280fffc-1859-46eb-99c9-c56df00e2144-kube-api-access-d8nsr\") pod \"community-operators-5mgf2\" (UID: \"b280fffc-1859-46eb-99c9-c56df00e2144\") " pod="openshift-marketplace/community-operators-5mgf2" Dec 01 10:47:56 crc kubenswrapper[4958]: I1201 10:47:56.245422 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5mgf2" Dec 01 10:47:56 crc kubenswrapper[4958]: I1201 10:47:56.756539 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5mgf2"] Dec 01 10:47:56 crc kubenswrapper[4958]: I1201 10:47:56.978011 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5mgf2" event={"ID":"b280fffc-1859-46eb-99c9-c56df00e2144","Type":"ContainerStarted","Data":"1180080f9f3ce89333a70c13456df50e14de15028891b0474aefe3beeec127f1"} Dec 01 10:47:57 crc kubenswrapper[4958]: I1201 10:47:57.986925 4958 generic.go:334] "Generic (PLEG): container finished" podID="b280fffc-1859-46eb-99c9-c56df00e2144" containerID="e09a49314bf912f9affa26814c33793372280379dee8d836177c520c4e395025" exitCode=0 Dec 01 10:47:57 crc kubenswrapper[4958]: I1201 10:47:57.986993 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5mgf2" event={"ID":"b280fffc-1859-46eb-99c9-c56df00e2144","Type":"ContainerDied","Data":"e09a49314bf912f9affa26814c33793372280379dee8d836177c520c4e395025"} Dec 01 10:47:58 crc kubenswrapper[4958]: I1201 10:47:58.210581 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:47:58 crc kubenswrapper[4958]: I1201 10:47:58.210667 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:47:58 crc kubenswrapper[4958]: I1201 10:47:58.305329 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wbt64"] Dec 01 10:47:58 crc kubenswrapper[4958]: I1201 10:47:58.305821 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wbt64" podUID="848dbf98-bd66-46ae-840d-121b574b8e8a" containerName="registry-server" containerID="cri-o://d6cc03917a867d3d9a9b312195ad53212a6c8b233c90c41a94b07f583507a3ae" gracePeriod=2 Dec 01 10:47:58 crc kubenswrapper[4958]: I1201 10:47:58.767162 4958 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wbt64" Dec 01 10:47:58 crc kubenswrapper[4958]: I1201 10:47:58.781373 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848dbf98-bd66-46ae-840d-121b574b8e8a-catalog-content\") pod \"848dbf98-bd66-46ae-840d-121b574b8e8a\" (UID: \"848dbf98-bd66-46ae-840d-121b574b8e8a\") " Dec 01 10:47:58 crc kubenswrapper[4958]: I1201 10:47:58.781437 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfsf8\" (UniqueName: \"kubernetes.io/projected/848dbf98-bd66-46ae-840d-121b574b8e8a-kube-api-access-mfsf8\") pod \"848dbf98-bd66-46ae-840d-121b574b8e8a\" (UID: \"848dbf98-bd66-46ae-840d-121b574b8e8a\") " Dec 01 10:47:58 crc kubenswrapper[4958]: I1201 10:47:58.781517 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848dbf98-bd66-46ae-840d-121b574b8e8a-utilities\") pod \"848dbf98-bd66-46ae-840d-121b574b8e8a\" (UID: \"848dbf98-bd66-46ae-840d-121b574b8e8a\") " Dec 01 10:47:58 crc kubenswrapper[4958]: I1201 10:47:58.783220 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/848dbf98-bd66-46ae-840d-121b574b8e8a-utilities" (OuterVolumeSpecName: "utilities") pod "848dbf98-bd66-46ae-840d-121b574b8e8a" (UID: "848dbf98-bd66-46ae-840d-121b574b8e8a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:47:58 crc kubenswrapper[4958]: I1201 10:47:58.797160 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/848dbf98-bd66-46ae-840d-121b574b8e8a-kube-api-access-mfsf8" (OuterVolumeSpecName: "kube-api-access-mfsf8") pod "848dbf98-bd66-46ae-840d-121b574b8e8a" (UID: "848dbf98-bd66-46ae-840d-121b574b8e8a"). InnerVolumeSpecName "kube-api-access-mfsf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:47:58 crc kubenswrapper[4958]: I1201 10:47:58.853621 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/848dbf98-bd66-46ae-840d-121b574b8e8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "848dbf98-bd66-46ae-840d-121b574b8e8a" (UID: "848dbf98-bd66-46ae-840d-121b574b8e8a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:47:58 crc kubenswrapper[4958]: I1201 10:47:58.883547 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848dbf98-bd66-46ae-840d-121b574b8e8a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:58 crc kubenswrapper[4958]: I1201 10:47:58.883624 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfsf8\" (UniqueName: \"kubernetes.io/projected/848dbf98-bd66-46ae-840d-121b574b8e8a-kube-api-access-mfsf8\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:58 crc kubenswrapper[4958]: I1201 10:47:58.883638 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848dbf98-bd66-46ae-840d-121b574b8e8a-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:47:59 crc kubenswrapper[4958]: I1201 10:47:59.000433 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5mgf2" event={"ID":"b280fffc-1859-46eb-99c9-c56df00e2144","Type":"ContainerStarted","Data":"df9fb8cd5403c0cef74b70126096db1965a67e7f7c3634c0baec1f3b959b269d"} Dec 01 10:47:59 crc kubenswrapper[4958]: I1201 10:47:59.006304 4958 generic.go:334] "Generic (PLEG): container finished" podID="848dbf98-bd66-46ae-840d-121b574b8e8a" containerID="d6cc03917a867d3d9a9b312195ad53212a6c8b233c90c41a94b07f583507a3ae" exitCode=0 Dec 01 10:47:59 crc kubenswrapper[4958]: I1201 10:47:59.006385 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbt64" event={"ID":"848dbf98-bd66-46ae-840d-121b574b8e8a","Type":"ContainerDied","Data":"d6cc03917a867d3d9a9b312195ad53212a6c8b233c90c41a94b07f583507a3ae"} Dec 01 10:47:59 crc kubenswrapper[4958]: I1201 10:47:59.006410 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbt64" Dec 01 10:47:59 crc kubenswrapper[4958]: I1201 10:47:59.006624 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbt64" event={"ID":"848dbf98-bd66-46ae-840d-121b574b8e8a","Type":"ContainerDied","Data":"98f6720aee7e0d7c37aef667fd92661ee92ab1a5268379abefd21d5d1130b142"} Dec 01 10:47:59 crc kubenswrapper[4958]: I1201 10:47:59.006641 4958 scope.go:117] "RemoveContainer" containerID="d6cc03917a867d3d9a9b312195ad53212a6c8b233c90c41a94b07f583507a3ae" Dec 01 10:47:59 crc kubenswrapper[4958]: I1201 10:47:59.033646 4958 scope.go:117] "RemoveContainer" containerID="639883776f1986e1a5724d95b39597e681632b51a7c17cd330df9716ad64a405" Dec 01 10:47:59 crc kubenswrapper[4958]: I1201 10:47:59.042503 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wbt64"] Dec 01 10:47:59 crc kubenswrapper[4958]: I1201 10:47:59.048080 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wbt64"] Dec 01 10:47:59 crc kubenswrapper[4958]: I1201 10:47:59.061348 4958 scope.go:117] "RemoveContainer" containerID="d42c44cd4b711433943ece6cc72946347438fa622730a130306a1a4d7799d19d" Dec 01 10:47:59 crc kubenswrapper[4958]: I1201 10:47:59.095382 4958 scope.go:117] "RemoveContainer" containerID="d6cc03917a867d3d9a9b312195ad53212a6c8b233c90c41a94b07f583507a3ae" Dec 01 10:47:59 crc kubenswrapper[4958]: E1201 10:47:59.098393 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6cc03917a867d3d9a9b312195ad53212a6c8b233c90c41a94b07f583507a3ae\": container with ID starting with d6cc03917a867d3d9a9b312195ad53212a6c8b233c90c41a94b07f583507a3ae not found: ID does not exist" containerID="d6cc03917a867d3d9a9b312195ad53212a6c8b233c90c41a94b07f583507a3ae" Dec 01 10:47:59 crc kubenswrapper[4958]: I1201 10:47:59.098488 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6cc03917a867d3d9a9b312195ad53212a6c8b233c90c41a94b07f583507a3ae"} err="failed to get container status \"d6cc03917a867d3d9a9b312195ad53212a6c8b233c90c41a94b07f583507a3ae\": rpc error: code = NotFound desc = could not find container \"d6cc03917a867d3d9a9b312195ad53212a6c8b233c90c41a94b07f583507a3ae\": container with ID starting with d6cc03917a867d3d9a9b312195ad53212a6c8b233c90c41a94b07f583507a3ae not found: ID does not exist" Dec 01 10:47:59 crc kubenswrapper[4958]: I1201 10:47:59.098560 4958 scope.go:117] "RemoveContainer" containerID="639883776f1986e1a5724d95b39597e681632b51a7c17cd330df9716ad64a405" Dec 01 10:47:59 crc kubenswrapper[4958]: E1201 10:47:59.099057 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"639883776f1986e1a5724d95b39597e681632b51a7c17cd330df9716ad64a405\": container with ID starting with 639883776f1986e1a5724d95b39597e681632b51a7c17cd330df9716ad64a405 not found: ID does not exist" containerID="639883776f1986e1a5724d95b39597e681632b51a7c17cd330df9716ad64a405" Dec 01 10:47:59 crc kubenswrapper[4958]: I1201 10:47:59.099104 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"639883776f1986e1a5724d95b39597e681632b51a7c17cd330df9716ad64a405"} err="failed to get container status \"639883776f1986e1a5724d95b39597e681632b51a7c17cd330df9716ad64a405\": rpc error: code = NotFound desc = could not find 
container \"639883776f1986e1a5724d95b39597e681632b51a7c17cd330df9716ad64a405\": container with ID starting with 639883776f1986e1a5724d95b39597e681632b51a7c17cd330df9716ad64a405 not found: ID does not exist" Dec 01 10:47:59 crc kubenswrapper[4958]: I1201 10:47:59.099141 4958 scope.go:117] "RemoveContainer" containerID="d42c44cd4b711433943ece6cc72946347438fa622730a130306a1a4d7799d19d" Dec 01 10:47:59 crc kubenswrapper[4958]: E1201 10:47:59.099417 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d42c44cd4b711433943ece6cc72946347438fa622730a130306a1a4d7799d19d\": container with ID starting with d42c44cd4b711433943ece6cc72946347438fa622730a130306a1a4d7799d19d not found: ID does not exist" containerID="d42c44cd4b711433943ece6cc72946347438fa622730a130306a1a4d7799d19d" Dec 01 10:47:59 crc kubenswrapper[4958]: I1201 10:47:59.099455 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d42c44cd4b711433943ece6cc72946347438fa622730a130306a1a4d7799d19d"} err="failed to get container status \"d42c44cd4b711433943ece6cc72946347438fa622730a130306a1a4d7799d19d\": rpc error: code = NotFound desc = could not find container \"d42c44cd4b711433943ece6cc72946347438fa622730a130306a1a4d7799d19d\": container with ID starting with d42c44cd4b711433943ece6cc72946347438fa622730a130306a1a4d7799d19d not found: ID does not exist" Dec 01 10:47:59 crc kubenswrapper[4958]: I1201 10:47:59.821650 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="848dbf98-bd66-46ae-840d-121b574b8e8a" path="/var/lib/kubelet/pods/848dbf98-bd66-46ae-840d-121b574b8e8a/volumes" Dec 01 10:48:00 crc kubenswrapper[4958]: I1201 10:48:00.026038 4958 generic.go:334] "Generic (PLEG): container finished" podID="b280fffc-1859-46eb-99c9-c56df00e2144" containerID="df9fb8cd5403c0cef74b70126096db1965a67e7f7c3634c0baec1f3b959b269d" exitCode=0 Dec 01 10:48:00 crc kubenswrapper[4958]: I1201 10:48:00.026126 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5mgf2" event={"ID":"b280fffc-1859-46eb-99c9-c56df00e2144","Type":"ContainerDied","Data":"df9fb8cd5403c0cef74b70126096db1965a67e7f7c3634c0baec1f3b959b269d"} Dec 01 10:48:01 crc kubenswrapper[4958]: I1201 10:48:01.037286 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5mgf2" event={"ID":"b280fffc-1859-46eb-99c9-c56df00e2144","Type":"ContainerStarted","Data":"d6bec1aa68e05a523cb10011c57245f6cd800ab7015b557a9cab9808d18ae187"} Dec 01 10:48:01 crc kubenswrapper[4958]: I1201 10:48:01.057797 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5mgf2" podStartSLOduration=3.585144545 podStartE2EDuration="6.057769252s" podCreationTimestamp="2025-12-01 10:47:55 +0000 UTC" firstStartedPulling="2025-12-01 10:47:57.98907967 +0000 UTC m=+2925.497868717" lastFinishedPulling="2025-12-01 10:48:00.461704387 +0000 UTC m=+2927.970493424" observedRunningTime="2025-12-01 10:48:01.055510778 +0000 UTC m=+2928.564299825" watchObservedRunningTime="2025-12-01 10:48:01.057769252 +0000 UTC m=+2928.566558289" Dec 01 10:48:06 crc kubenswrapper[4958]: I1201 10:48:06.246377 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5mgf2" Dec 01 10:48:06 crc kubenswrapper[4958]: I1201 10:48:06.247132 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-5mgf2" Dec 01 10:48:06 crc kubenswrapper[4958]: I1201 10:48:06.295548 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5mgf2" Dec 01 10:48:07 crc kubenswrapper[4958]: I1201 10:48:07.140333 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5mgf2" Dec 01 10:48:07 crc kubenswrapper[4958]: I1201 10:48:07.189423 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5mgf2"] Dec 01 10:48:09 crc kubenswrapper[4958]: I1201 10:48:09.101589 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5mgf2" podUID="b280fffc-1859-46eb-99c9-c56df00e2144" containerName="registry-server" containerID="cri-o://d6bec1aa68e05a523cb10011c57245f6cd800ab7015b557a9cab9808d18ae187" gracePeriod=2 Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.019301 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5mgf2" Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.111142 4958 generic.go:334] "Generic (PLEG): container finished" podID="b280fffc-1859-46eb-99c9-c56df00e2144" containerID="d6bec1aa68e05a523cb10011c57245f6cd800ab7015b557a9cab9808d18ae187" exitCode=0 Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.111201 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5mgf2" event={"ID":"b280fffc-1859-46eb-99c9-c56df00e2144","Type":"ContainerDied","Data":"d6bec1aa68e05a523cb10011c57245f6cd800ab7015b557a9cab9808d18ae187"} Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.111255 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5mgf2" event={"ID":"b280fffc-1859-46eb-99c9-c56df00e2144","Type":"ContainerDied","Data":"1180080f9f3ce89333a70c13456df50e14de15028891b0474aefe3beeec127f1"} Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.111282 4958 scope.go:117] "RemoveContainer" containerID="d6bec1aa68e05a523cb10011c57245f6cd800ab7015b557a9cab9808d18ae187" Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.111477 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5mgf2" Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.136339 4958 scope.go:117] "RemoveContainer" containerID="df9fb8cd5403c0cef74b70126096db1965a67e7f7c3634c0baec1f3b959b269d" Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.146109 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8nsr\" (UniqueName: \"kubernetes.io/projected/b280fffc-1859-46eb-99c9-c56df00e2144-kube-api-access-d8nsr\") pod \"b280fffc-1859-46eb-99c9-c56df00e2144\" (UID: \"b280fffc-1859-46eb-99c9-c56df00e2144\") " Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.146179 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b280fffc-1859-46eb-99c9-c56df00e2144-catalog-content\") pod \"b280fffc-1859-46eb-99c9-c56df00e2144\" (UID: \"b280fffc-1859-46eb-99c9-c56df00e2144\") " Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.146285 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b280fffc-1859-46eb-99c9-c56df00e2144-utilities\") pod \"b280fffc-1859-46eb-99c9-c56df00e2144\" (UID: \"b280fffc-1859-46eb-99c9-c56df00e2144\") " Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.148226 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b280fffc-1859-46eb-99c9-c56df00e2144-utilities" (OuterVolumeSpecName: "utilities") pod "b280fffc-1859-46eb-99c9-c56df00e2144" (UID: "b280fffc-1859-46eb-99c9-c56df00e2144"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.154026 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b280fffc-1859-46eb-99c9-c56df00e2144-kube-api-access-d8nsr" (OuterVolumeSpecName: "kube-api-access-d8nsr") pod "b280fffc-1859-46eb-99c9-c56df00e2144" (UID: "b280fffc-1859-46eb-99c9-c56df00e2144"). InnerVolumeSpecName "kube-api-access-d8nsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.161325 4958 scope.go:117] "RemoveContainer" containerID="e09a49314bf912f9affa26814c33793372280379dee8d836177c520c4e395025" Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.224057 4958 scope.go:117] "RemoveContainer" containerID="d6bec1aa68e05a523cb10011c57245f6cd800ab7015b557a9cab9808d18ae187" Dec 01 10:48:10 crc kubenswrapper[4958]: E1201 10:48:10.226995 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6bec1aa68e05a523cb10011c57245f6cd800ab7015b557a9cab9808d18ae187\": container with ID starting with d6bec1aa68e05a523cb10011c57245f6cd800ab7015b557a9cab9808d18ae187 not found: ID does not exist" containerID="d6bec1aa68e05a523cb10011c57245f6cd800ab7015b557a9cab9808d18ae187" Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.227165 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6bec1aa68e05a523cb10011c57245f6cd800ab7015b557a9cab9808d18ae187"} err="failed to get container status \"d6bec1aa68e05a523cb10011c57245f6cd800ab7015b557a9cab9808d18ae187\": rpc error: code = NotFound desc = could not find container \"d6bec1aa68e05a523cb10011c57245f6cd800ab7015b557a9cab9808d18ae187\": container with ID starting with d6bec1aa68e05a523cb10011c57245f6cd800ab7015b557a9cab9808d18ae187 not found: ID does not exist" Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.227206 4958 scope.go:117] "RemoveContainer" containerID="df9fb8cd5403c0cef74b70126096db1965a67e7f7c3634c0baec1f3b959b269d" Dec 01 10:48:10 crc kubenswrapper[4958]: E1201 10:48:10.227734 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df9fb8cd5403c0cef74b70126096db1965a67e7f7c3634c0baec1f3b959b269d\": container with ID starting with df9fb8cd5403c0cef74b70126096db1965a67e7f7c3634c0baec1f3b959b269d not found: ID does not exist" containerID="df9fb8cd5403c0cef74b70126096db1965a67e7f7c3634c0baec1f3b959b269d" Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.227758 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df9fb8cd5403c0cef74b70126096db1965a67e7f7c3634c0baec1f3b959b269d"} err="failed to get container status \"df9fb8cd5403c0cef74b70126096db1965a67e7f7c3634c0baec1f3b959b269d\": rpc error: code = NotFound desc = could not find container \"df9fb8cd5403c0cef74b70126096db1965a67e7f7c3634c0baec1f3b959b269d\": container with ID starting with df9fb8cd5403c0cef74b70126096db1965a67e7f7c3634c0baec1f3b959b269d not found: ID does not exist" Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.227772 4958 scope.go:117] "RemoveContainer" containerID="e09a49314bf912f9affa26814c33793372280379dee8d836177c520c4e395025" Dec 01 10:48:10 crc kubenswrapper[4958]: E1201 10:48:10.228101 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e09a49314bf912f9affa26814c33793372280379dee8d836177c520c4e395025\": container with ID starting with e09a49314bf912f9affa26814c33793372280379dee8d836177c520c4e395025 not found: ID does not exist" containerID="e09a49314bf912f9affa26814c33793372280379dee8d836177c520c4e395025" Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.228122 4958 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e09a49314bf912f9affa26814c33793372280379dee8d836177c520c4e395025"} err="failed to get container status \"e09a49314bf912f9affa26814c33793372280379dee8d836177c520c4e395025\": rpc error: code = NotFound desc = could not find container \"e09a49314bf912f9affa26814c33793372280379dee8d836177c520c4e395025\": container with ID starting with e09a49314bf912f9affa26814c33793372280379dee8d836177c520c4e395025 not found: ID does not exist" Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.235865 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b280fffc-1859-46eb-99c9-c56df00e2144-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b280fffc-1859-46eb-99c9-c56df00e2144" (UID: "b280fffc-1859-46eb-99c9-c56df00e2144"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.247932 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b280fffc-1859-46eb-99c9-c56df00e2144-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.247966 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8nsr\" (UniqueName: \"kubernetes.io/projected/b280fffc-1859-46eb-99c9-c56df00e2144-kube-api-access-d8nsr\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.247979 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b280fffc-1859-46eb-99c9-c56df00e2144-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.452478 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5mgf2"] Dec 01 10:48:10 crc kubenswrapper[4958]: I1201 10:48:10.459093 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5mgf2"] Dec 01 10:48:11 crc kubenswrapper[4958]: I1201 10:48:11.809192 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b280fffc-1859-46eb-99c9-c56df00e2144" path="/var/lib/kubelet/pods/b280fffc-1859-46eb-99c9-c56df00e2144/volumes" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.156459 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-snwkz"] Dec 01 10:48:13 crc kubenswrapper[4958]: E1201 10:48:13.157941 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848dbf98-bd66-46ae-840d-121b574b8e8a" containerName="extract-utilities" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.158079 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="848dbf98-bd66-46ae-840d-121b574b8e8a" containerName="extract-utilities" Dec 01 10:48:13 crc kubenswrapper[4958]: E1201 10:48:13.158217 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848dbf98-bd66-46ae-840d-121b574b8e8a" containerName="extract-content" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.158349 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="848dbf98-bd66-46ae-840d-121b574b8e8a" containerName="extract-content" Dec 01 10:48:13 crc kubenswrapper[4958]: E1201 10:48:13.158483 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b280fffc-1859-46eb-99c9-c56df00e2144" containerName="extract-utilities" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.158609 
4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b280fffc-1859-46eb-99c9-c56df00e2144" containerName="extract-utilities" Dec 01 10:48:13 crc kubenswrapper[4958]: E1201 10:48:13.158720 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b280fffc-1859-46eb-99c9-c56df00e2144" containerName="registry-server" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.158828 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b280fffc-1859-46eb-99c9-c56df00e2144" containerName="registry-server" Dec 01 10:48:13 crc kubenswrapper[4958]: E1201 10:48:13.158988 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b280fffc-1859-46eb-99c9-c56df00e2144" containerName="extract-content" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.159104 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b280fffc-1859-46eb-99c9-c56df00e2144" containerName="extract-content" Dec 01 10:48:13 crc kubenswrapper[4958]: E1201 10:48:13.159238 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848dbf98-bd66-46ae-840d-121b574b8e8a" containerName="registry-server" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.159362 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="848dbf98-bd66-46ae-840d-121b574b8e8a" containerName="registry-server" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.159761 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="b280fffc-1859-46eb-99c9-c56df00e2144" containerName="registry-server" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.159953 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="848dbf98-bd66-46ae-840d-121b574b8e8a" containerName="registry-server" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.161499 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snwkz" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.179489 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-snwkz"] Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.203383 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da4e5437-ae76-4823-84e2-98a7fa270836-catalog-content\") pod \"redhat-marketplace-snwkz\" (UID: \"da4e5437-ae76-4823-84e2-98a7fa270836\") " pod="openshift-marketplace/redhat-marketplace-snwkz" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.203455 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzjpd\" (UniqueName: \"kubernetes.io/projected/da4e5437-ae76-4823-84e2-98a7fa270836-kube-api-access-dzjpd\") pod \"redhat-marketplace-snwkz\" (UID: \"da4e5437-ae76-4823-84e2-98a7fa270836\") " pod="openshift-marketplace/redhat-marketplace-snwkz" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.203489 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da4e5437-ae76-4823-84e2-98a7fa270836-utilities\") pod \"redhat-marketplace-snwkz\" (UID: \"da4e5437-ae76-4823-84e2-98a7fa270836\") " pod="openshift-marketplace/redhat-marketplace-snwkz" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.305218 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da4e5437-ae76-4823-84e2-98a7fa270836-catalog-content\") pod \"redhat-marketplace-snwkz\" (UID: \"da4e5437-ae76-4823-84e2-98a7fa270836\") " pod="openshift-marketplace/redhat-marketplace-snwkz" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.305292 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzjpd\" (UniqueName: \"kubernetes.io/projected/da4e5437-ae76-4823-84e2-98a7fa270836-kube-api-access-dzjpd\") pod \"redhat-marketplace-snwkz\" (UID: \"da4e5437-ae76-4823-84e2-98a7fa270836\") " pod="openshift-marketplace/redhat-marketplace-snwkz" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.305322 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da4e5437-ae76-4823-84e2-98a7fa270836-utilities\") pod \"redhat-marketplace-snwkz\" (UID: \"da4e5437-ae76-4823-84e2-98a7fa270836\") " pod="openshift-marketplace/redhat-marketplace-snwkz" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.305783 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da4e5437-ae76-4823-84e2-98a7fa270836-utilities\") pod \"redhat-marketplace-snwkz\" (UID: \"da4e5437-ae76-4823-84e2-98a7fa270836\") " pod="openshift-marketplace/redhat-marketplace-snwkz" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.305798 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da4e5437-ae76-4823-84e2-98a7fa270836-catalog-content\") pod \"redhat-marketplace-snwkz\" (UID: \"da4e5437-ae76-4823-84e2-98a7fa270836\") " pod="openshift-marketplace/redhat-marketplace-snwkz" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.325691 4958 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dzjpd\" (UniqueName: \"kubernetes.io/projected/da4e5437-ae76-4823-84e2-98a7fa270836-kube-api-access-dzjpd\") pod \"redhat-marketplace-snwkz\" (UID: \"da4e5437-ae76-4823-84e2-98a7fa270836\") " pod="openshift-marketplace/redhat-marketplace-snwkz" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.492243 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snwkz" Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.971806 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-snwkz"] Dec 01 10:48:14 crc kubenswrapper[4958]: I1201 10:48:14.143693 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snwkz" event={"ID":"da4e5437-ae76-4823-84e2-98a7fa270836","Type":"ContainerStarted","Data":"12b8209015e35f6fa2bd14b32be2a21f549dd36bdcc5a09f57127415db8fa0ad"} Dec 01 10:48:15 crc kubenswrapper[4958]: I1201 10:48:15.151135 4958 generic.go:334] "Generic (PLEG): container finished" podID="da4e5437-ae76-4823-84e2-98a7fa270836" containerID="9d69d016fb12f8bea09db777b5f518bd27291cdf1d6c80ddf9813d34ee1dbbd6" exitCode=0 Dec 01 10:48:15 crc kubenswrapper[4958]: I1201 10:48:15.151183 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snwkz" event={"ID":"da4e5437-ae76-4823-84e2-98a7fa270836","Type":"ContainerDied","Data":"9d69d016fb12f8bea09db777b5f518bd27291cdf1d6c80ddf9813d34ee1dbbd6"} Dec 01 10:48:16 crc kubenswrapper[4958]: I1201 10:48:16.165860 4958 generic.go:334] "Generic (PLEG): container finished" podID="da4e5437-ae76-4823-84e2-98a7fa270836" containerID="46f850cff4511e72165628b6c0abd069f829373da5e5e24fc45c100447f2a343" exitCode=0 Dec 01 10:48:16 crc kubenswrapper[4958]: I1201 10:48:16.165923 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snwkz" event={"ID":"da4e5437-ae76-4823-84e2-98a7fa270836","Type":"ContainerDied","Data":"46f850cff4511e72165628b6c0abd069f829373da5e5e24fc45c100447f2a343"} Dec 01 10:48:17 crc kubenswrapper[4958]: I1201 10:48:17.175303 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snwkz" event={"ID":"da4e5437-ae76-4823-84e2-98a7fa270836","Type":"ContainerStarted","Data":"9552a5823c3655737e62303abbbcca6b84bcfe2e1273bbb52233970f708a0ce3"} Dec 01 10:48:17 crc kubenswrapper[4958]: I1201 10:48:17.200566 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-snwkz" podStartSLOduration=2.555151753 podStartE2EDuration="4.200547324s" podCreationTimestamp="2025-12-01 10:48:13 +0000 UTC" firstStartedPulling="2025-12-01 10:48:15.155119297 +0000 UTC m=+2942.663908334" lastFinishedPulling="2025-12-01 10:48:16.800514868 +0000 UTC m=+2944.309303905" observedRunningTime="2025-12-01 10:48:17.194554014 +0000 UTC m=+2944.703343051" watchObservedRunningTime="2025-12-01 10:48:17.200547324 +0000 UTC m=+2944.709336361" Dec 01 10:48:23 crc kubenswrapper[4958]: I1201 10:48:23.493157 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-snwkz" Dec 01 10:48:23 crc kubenswrapper[4958]: I1201 10:48:23.493701 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-snwkz" Dec 01 10:48:23 crc kubenswrapper[4958]: I1201 10:48:23.649030 4958 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-snwkz" Dec 01 10:48:24 crc kubenswrapper[4958]: I1201 10:48:24.271741 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-snwkz" Dec 01 10:48:24 crc kubenswrapper[4958]: I1201 10:48:24.318858 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-snwkz"] Dec 01 10:48:26 crc kubenswrapper[4958]: I1201 10:48:26.247983 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-snwkz" podUID="da4e5437-ae76-4823-84e2-98a7fa270836" containerName="registry-server" containerID="cri-o://9552a5823c3655737e62303abbbcca6b84bcfe2e1273bbb52233970f708a0ce3" gracePeriod=2 Dec 01 10:48:26 crc kubenswrapper[4958]: I1201 10:48:26.695237 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snwkz" Dec 01 10:48:26 crc kubenswrapper[4958]: I1201 10:48:26.799311 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da4e5437-ae76-4823-84e2-98a7fa270836-utilities\") pod \"da4e5437-ae76-4823-84e2-98a7fa270836\" (UID: \"da4e5437-ae76-4823-84e2-98a7fa270836\") " Dec 01 10:48:26 crc kubenswrapper[4958]: I1201 10:48:26.799420 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da4e5437-ae76-4823-84e2-98a7fa270836-catalog-content\") pod \"da4e5437-ae76-4823-84e2-98a7fa270836\" (UID: \"da4e5437-ae76-4823-84e2-98a7fa270836\") " Dec 01 10:48:26 crc kubenswrapper[4958]: I1201 10:48:26.800320 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da4e5437-ae76-4823-84e2-98a7fa270836-utilities" (OuterVolumeSpecName: "utilities") pod "da4e5437-ae76-4823-84e2-98a7fa270836" (UID: "da4e5437-ae76-4823-84e2-98a7fa270836"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:48:26 crc kubenswrapper[4958]: I1201 10:48:26.801301 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzjpd\" (UniqueName: \"kubernetes.io/projected/da4e5437-ae76-4823-84e2-98a7fa270836-kube-api-access-dzjpd\") pod \"da4e5437-ae76-4823-84e2-98a7fa270836\" (UID: \"da4e5437-ae76-4823-84e2-98a7fa270836\") " Dec 01 10:48:26 crc kubenswrapper[4958]: I1201 10:48:26.802256 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da4e5437-ae76-4823-84e2-98a7fa270836-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:26 crc kubenswrapper[4958]: I1201 10:48:26.811210 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da4e5437-ae76-4823-84e2-98a7fa270836-kube-api-access-dzjpd" (OuterVolumeSpecName: "kube-api-access-dzjpd") pod "da4e5437-ae76-4823-84e2-98a7fa270836" (UID: "da4e5437-ae76-4823-84e2-98a7fa270836"). InnerVolumeSpecName "kube-api-access-dzjpd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:48:26 crc kubenswrapper[4958]: I1201 10:48:26.835655 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da4e5437-ae76-4823-84e2-98a7fa270836-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da4e5437-ae76-4823-84e2-98a7fa270836" (UID: "da4e5437-ae76-4823-84e2-98a7fa270836"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:48:26 crc kubenswrapper[4958]: I1201 10:48:26.903120 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da4e5437-ae76-4823-84e2-98a7fa270836-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:26 crc kubenswrapper[4958]: I1201 10:48:26.904189 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzjpd\" (UniqueName: \"kubernetes.io/projected/da4e5437-ae76-4823-84e2-98a7fa270836-kube-api-access-dzjpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:48:27 crc kubenswrapper[4958]: I1201 10:48:27.258926 4958 generic.go:334] "Generic (PLEG): container finished" podID="da4e5437-ae76-4823-84e2-98a7fa270836" containerID="9552a5823c3655737e62303abbbcca6b84bcfe2e1273bbb52233970f708a0ce3" exitCode=0 Dec 01 10:48:27 crc kubenswrapper[4958]: I1201 10:48:27.259002 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snwkz" Dec 01 10:48:27 crc kubenswrapper[4958]: I1201 10:48:27.259003 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snwkz" event={"ID":"da4e5437-ae76-4823-84e2-98a7fa270836","Type":"ContainerDied","Data":"9552a5823c3655737e62303abbbcca6b84bcfe2e1273bbb52233970f708a0ce3"} Dec 01 10:48:27 crc kubenswrapper[4958]: I1201 10:48:27.259099 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snwkz" event={"ID":"da4e5437-ae76-4823-84e2-98a7fa270836","Type":"ContainerDied","Data":"12b8209015e35f6fa2bd14b32be2a21f549dd36bdcc5a09f57127415db8fa0ad"} Dec 01 10:48:27 crc kubenswrapper[4958]: I1201 10:48:27.259134 4958 scope.go:117] "RemoveContainer" containerID="9552a5823c3655737e62303abbbcca6b84bcfe2e1273bbb52233970f708a0ce3" Dec 01 10:48:27 crc kubenswrapper[4958]: I1201 10:48:27.297013 4958 scope.go:117] "RemoveContainer" containerID="46f850cff4511e72165628b6c0abd069f829373da5e5e24fc45c100447f2a343" Dec 01 10:48:27 crc kubenswrapper[4958]: I1201 10:48:27.307056 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-snwkz"] Dec 01 10:48:27 crc kubenswrapper[4958]: I1201 10:48:27.318824 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-snwkz"] Dec 01 10:48:27 crc kubenswrapper[4958]: I1201 10:48:27.322985 4958 scope.go:117] "RemoveContainer" containerID="9d69d016fb12f8bea09db777b5f518bd27291cdf1d6c80ddf9813d34ee1dbbd6" Dec 01 10:48:27 crc kubenswrapper[4958]: I1201 10:48:27.353249 4958 scope.go:117] "RemoveContainer" containerID="9552a5823c3655737e62303abbbcca6b84bcfe2e1273bbb52233970f708a0ce3" Dec 01 10:48:27 crc kubenswrapper[4958]: E1201 10:48:27.353744 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9552a5823c3655737e62303abbbcca6b84bcfe2e1273bbb52233970f708a0ce3\": container with ID starting with 
9552a5823c3655737e62303abbbcca6b84bcfe2e1273bbb52233970f708a0ce3 not found: ID does not exist" containerID="9552a5823c3655737e62303abbbcca6b84bcfe2e1273bbb52233970f708a0ce3" Dec 01 10:48:27 crc kubenswrapper[4958]: I1201 10:48:27.353781 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9552a5823c3655737e62303abbbcca6b84bcfe2e1273bbb52233970f708a0ce3"} err="failed to get container status \"9552a5823c3655737e62303abbbcca6b84bcfe2e1273bbb52233970f708a0ce3\": rpc error: code = NotFound desc = could not find container \"9552a5823c3655737e62303abbbcca6b84bcfe2e1273bbb52233970f708a0ce3\": container with ID starting with 9552a5823c3655737e62303abbbcca6b84bcfe2e1273bbb52233970f708a0ce3 not found: ID does not exist" Dec 01 10:48:27 crc kubenswrapper[4958]: I1201 10:48:27.353806 4958 scope.go:117] "RemoveContainer" containerID="46f850cff4511e72165628b6c0abd069f829373da5e5e24fc45c100447f2a343" Dec 01 10:48:27 crc kubenswrapper[4958]: E1201 10:48:27.354169 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46f850cff4511e72165628b6c0abd069f829373da5e5e24fc45c100447f2a343\": container with ID starting with 46f850cff4511e72165628b6c0abd069f829373da5e5e24fc45c100447f2a343 not found: ID does not exist" containerID="46f850cff4511e72165628b6c0abd069f829373da5e5e24fc45c100447f2a343" Dec 01 10:48:27 crc kubenswrapper[4958]: I1201 10:48:27.354194 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46f850cff4511e72165628b6c0abd069f829373da5e5e24fc45c100447f2a343"} err="failed to get container status \"46f850cff4511e72165628b6c0abd069f829373da5e5e24fc45c100447f2a343\": rpc error: code = NotFound desc = could not find container \"46f850cff4511e72165628b6c0abd069f829373da5e5e24fc45c100447f2a343\": container with ID starting with 46f850cff4511e72165628b6c0abd069f829373da5e5e24fc45c100447f2a343 not found: ID does not exist" Dec 01 10:48:27 crc kubenswrapper[4958]: I1201 10:48:27.354209 4958 scope.go:117] "RemoveContainer" containerID="9d69d016fb12f8bea09db777b5f518bd27291cdf1d6c80ddf9813d34ee1dbbd6" Dec 01 10:48:27 crc kubenswrapper[4958]: E1201 10:48:27.354420 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d69d016fb12f8bea09db777b5f518bd27291cdf1d6c80ddf9813d34ee1dbbd6\": container with ID starting with 9d69d016fb12f8bea09db777b5f518bd27291cdf1d6c80ddf9813d34ee1dbbd6 not found: ID does not exist" containerID="9d69d016fb12f8bea09db777b5f518bd27291cdf1d6c80ddf9813d34ee1dbbd6" Dec 01 10:48:27 crc kubenswrapper[4958]: I1201 10:48:27.354440 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d69d016fb12f8bea09db777b5f518bd27291cdf1d6c80ddf9813d34ee1dbbd6"} err="failed to get container status \"9d69d016fb12f8bea09db777b5f518bd27291cdf1d6c80ddf9813d34ee1dbbd6\": rpc error: code = NotFound desc = could not find container \"9d69d016fb12f8bea09db777b5f518bd27291cdf1d6c80ddf9813d34ee1dbbd6\": container with ID starting with 9d69d016fb12f8bea09db777b5f518bd27291cdf1d6c80ddf9813d34ee1dbbd6 not found: ID does not exist" Dec 01 10:48:27 crc kubenswrapper[4958]: I1201 10:48:27.809663 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da4e5437-ae76-4823-84e2-98a7fa270836" path="/var/lib/kubelet/pods/da4e5437-ae76-4823-84e2-98a7fa270836/volumes" Dec 01 10:48:28 crc kubenswrapper[4958]: I1201 10:48:28.210331 
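[Annotation] Every kubenswrapper line in this log has the same two-layer shape: a journald prefix (month, day, time, host, unit[pid]) followed by a klog header (severity letter I/W/E, MMDD plus wall time, pid, file:line) and then the structured message. A small Go sketch of splitting those fields apart with a regular expression; the layout is assumed only from the entries visible here:

package main

import (
	"fmt"
	"regexp"
)

// Assumed layout, taken from the entries in this log:
//   <Mon DD HH:MM:SS> <host> kubenswrapper[<pid>]: <I|W|E><MMDD HH:MM:SS.micros> <pid> <file:line>] <message>
var entry = regexp.MustCompile(
	`^(?P<journal>\w{3} \d{2} \d{2}:\d{2}:\d{2}) (?P<host>\S+) kubenswrapper\[\d+\]: ` +
		`(?P<sev>[IWE])(?P<klogts>\d{4} \d{2}:\d{2}:\d{2}\.\d+)\s+\d+ (?P<src>[\w./]+:\d+)\] (?P<msg>.*)$`)

func main() {
	line := `Dec 01 10:48:13 crc kubenswrapper[4958]: I1201 10:48:13.492243 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snwkz"`
	m := entry.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("no match")
		return
	}
	for i, name := range entry.SubexpNames() {
		if name != "" {
			fmt.Printf("%-8s %s\n", name, m[i])
		}
	}
}

Grouping on the extracted src or the pod="..." field in msg is usually enough to separate interleaved stories, such as the marketplace pod teardown above from the machine-config-daemon crash loop that follows.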
Dec 01 10:48:28 crc kubenswrapper[4958]: I1201 10:48:28.210331 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 10:48:28 crc kubenswrapper[4958]: I1201 10:48:28.210404 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 10:48:28 crc kubenswrapper[4958]: I1201 10:48:28.210459 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7"
Dec 01 10:48:28 crc kubenswrapper[4958]: I1201 10:48:28.211105 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 10:48:28 crc kubenswrapper[4958]: I1201 10:48:28.211164 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2" gracePeriod=600
Dec 01 10:48:28 crc kubenswrapper[4958]: E1201 10:48:28.838063 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:48:29 crc kubenswrapper[4958]: I1201 10:48:29.282504 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2" exitCode=0
Dec 01 10:48:29 crc kubenswrapper[4958]: I1201 10:48:29.282559 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"}
Dec 01 10:48:29 crc kubenswrapper[4958]: I1201 10:48:29.282600 4958 scope.go:117] "RemoveContainer" containerID="a26b199ae9ae8499211a10335d1de1af24fb3bf49c890ed8b481857c5fc329c6"
Dec 01 10:48:29 crc kubenswrapper[4958]: I1201 10:48:29.283386 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:48:29 crc kubenswrapper[4958]: E1201 10:48:29.283624 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:48:40 crc kubenswrapper[4958]: I1201 10:48:40.798158 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:48:40 crc kubenswrapper[4958]: E1201 10:48:40.799194 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:48:53 crc kubenswrapper[4958]: I1201 10:48:53.804175 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:48:53 crc kubenswrapper[4958]: E1201 10:48:53.806108 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:49:08 crc kubenswrapper[4958]: I1201 10:49:08.798491 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:49:08 crc kubenswrapper[4958]: E1201 10:49:08.799695 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:49:23 crc kubenswrapper[4958]: I1201 10:49:23.805491 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:49:23 crc kubenswrapper[4958]: E1201 10:49:23.806664 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:49:34 crc kubenswrapper[4958]: I1201 10:49:34.798440 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:49:34 crc kubenswrapper[4958]: E1201 10:49:34.799222 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:49:45 crc kubenswrapper[4958]: I1201 10:49:45.798446 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:49:45 crc kubenswrapper[4958]: E1201 10:49:45.799436 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:49:59 crc kubenswrapper[4958]: I1201 10:49:59.798704 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:49:59 crc kubenswrapper[4958]: E1201 10:49:59.799841 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:50:10 crc kubenswrapper[4958]: I1201 10:50:10.797451 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:50:10 crc kubenswrapper[4958]: E1201 10:50:10.798101 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:50:23 crc kubenswrapper[4958]: I1201 10:50:23.807813 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:50:23 crc kubenswrapper[4958]: E1201 10:50:23.808725 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:50:34 crc kubenswrapper[4958]: I1201 10:50:34.797162 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:50:34 crc kubenswrapper[4958]: E1201 10:50:34.798240 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:50:46 crc kubenswrapper[4958]: I1201 10:50:46.797246 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:50:46 crc kubenswrapper[4958]: E1201 10:50:46.797795 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:50:57 crc kubenswrapper[4958]: I1201 10:50:57.797775 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:50:57 crc kubenswrapper[4958]: E1201 10:50:57.799714 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:51:09 crc kubenswrapper[4958]: I1201 10:51:09.798381 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:51:09 crc kubenswrapper[4958]: E1201 10:51:09.799343 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:51:22 crc kubenswrapper[4958]: I1201 10:51:22.798600 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:51:22 crc kubenswrapper[4958]: E1201 10:51:22.799521 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:51:36 crc kubenswrapper[4958]: I1201 10:51:36.800081 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:51:36 crc kubenswrapper[4958]: E1201 10:51:36.801110 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:51:48 crc kubenswrapper[4958]: I1201 10:51:48.798408 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:51:48 crc kubenswrapper[4958]: E1201 10:51:48.799637 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:52:01 crc kubenswrapper[4958]: I1201 10:52:01.800307 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:52:01 crc kubenswrapper[4958]: E1201 10:52:01.801235 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:52:13 crc kubenswrapper[4958]: I1201 10:52:13.808418 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:52:13 crc kubenswrapper[4958]: E1201 10:52:13.809810 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:52:25 crc kubenswrapper[4958]: I1201 10:52:25.798037 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:52:25 crc kubenswrapper[4958]: E1201 10:52:25.798981 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:52:39 crc kubenswrapper[4958]: I1201 10:52:39.799728 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:52:39 crc kubenswrapper[4958]: E1201 10:52:39.801001 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 10:52:50 crc kubenswrapper[4958]: I1201 10:52:50.797654 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2"
Dec 01 10:52:50 crc kubenswrapper[4958]: E1201 10:52:50.798558 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
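[Annotation] The long run of paired entries above, a "RemoveContainer" followed immediately by a CrashLoopBackOff refusal every 10 to 15 seconds from 10:48:40 through 10:52:50, is the sync loop re-evaluating the pod while the restart back-off window is still open; nothing actually restarts until the window expires (the successful ContainerStarted appears below at 10:53:31). Kubernetes documents the crash-loop delay as exponential, starting at 10s and doubling up to the 5m cap quoted in every message here, and this daemon had already reached the cap. A toy Go sketch of that schedule; the exact jitter and reset rules are kubelet internals not visible in this log:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Documented CrashLoopBackOff schedule: start at 10s, double per
	// restart, cap at five minutes ("back-off 5m0s" in the entries above).
	delay, maxDelay := 10*time.Second, 5*time.Minute
	for restart := 1; restart <= 7; restart++ {
		fmt.Printf("restart %d: wait %v\n", restart, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
	// Prints: 10s, 20s, 40s, 1m20s, 2m40s, 5m0s, 5m0s.
}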
pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:53:03 crc kubenswrapper[4958]: I1201 10:53:03.804449 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2" Dec 01 10:53:03 crc kubenswrapper[4958]: E1201 10:53:03.805410 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:53:17 crc kubenswrapper[4958]: I1201 10:53:17.798148 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2" Dec 01 10:53:17 crc kubenswrapper[4958]: E1201 10:53:17.798826 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:53:30 crc kubenswrapper[4958]: I1201 10:53:30.797199 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2" Dec 01 10:53:31 crc kubenswrapper[4958]: I1201 10:53:31.639016 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"de8a121e644b9e2765cd1734598fd4207dfc7217fc68214d3cecf8201618b039"} Dec 01 10:55:58 crc kubenswrapper[4958]: I1201 10:55:58.210966 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:55:58 crc kubenswrapper[4958]: I1201 10:55:58.211576 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:56:28 crc kubenswrapper[4958]: I1201 10:56:28.210974 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:56:28 crc kubenswrapper[4958]: I1201 10:56:28.211524 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:56:58 crc kubenswrapper[4958]: I1201 10:56:58.210567 4958 
patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:56:58 crc kubenswrapper[4958]: I1201 10:56:58.211155 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:56:58 crc kubenswrapper[4958]: I1201 10:56:58.211237 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 10:56:58 crc kubenswrapper[4958]: I1201 10:56:58.213145 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de8a121e644b9e2765cd1734598fd4207dfc7217fc68214d3cecf8201618b039"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:56:58 crc kubenswrapper[4958]: I1201 10:56:58.213247 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://de8a121e644b9e2765cd1734598fd4207dfc7217fc68214d3cecf8201618b039" gracePeriod=600 Dec 01 10:56:58 crc kubenswrapper[4958]: I1201 10:56:58.803506 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="de8a121e644b9e2765cd1734598fd4207dfc7217fc68214d3cecf8201618b039" exitCode=0 Dec 01 10:56:58 crc kubenswrapper[4958]: I1201 10:56:58.803530 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"de8a121e644b9e2765cd1734598fd4207dfc7217fc68214d3cecf8201618b039"} Dec 01 10:56:58 crc kubenswrapper[4958]: I1201 10:56:58.803915 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd"} Dec 01 10:56:58 crc kubenswrapper[4958]: I1201 10:56:58.803937 4958 scope.go:117] "RemoveContainer" containerID="a3c38cb0ec4d5f6f4b92f0e03cab8c08deb426f7ed9f12edb61ebbb0304dd8f2" Dec 01 10:58:08 crc kubenswrapper[4958]: I1201 10:58:08.860402 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-grkh6"] Dec 01 10:58:08 crc kubenswrapper[4958]: E1201 10:58:08.861484 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da4e5437-ae76-4823-84e2-98a7fa270836" containerName="extract-content" Dec 01 10:58:08 crc kubenswrapper[4958]: I1201 10:58:08.861503 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="da4e5437-ae76-4823-84e2-98a7fa270836" containerName="extract-content" Dec 01 10:58:08 crc kubenswrapper[4958]: E1201 10:58:08.861527 4958 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="da4e5437-ae76-4823-84e2-98a7fa270836" containerName="registry-server" Dec 01 10:58:08 crc kubenswrapper[4958]: I1201 10:58:08.861537 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="da4e5437-ae76-4823-84e2-98a7fa270836" containerName="registry-server" Dec 01 10:58:08 crc kubenswrapper[4958]: E1201 10:58:08.861549 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da4e5437-ae76-4823-84e2-98a7fa270836" containerName="extract-utilities" Dec 01 10:58:08 crc kubenswrapper[4958]: I1201 10:58:08.861557 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="da4e5437-ae76-4823-84e2-98a7fa270836" containerName="extract-utilities" Dec 01 10:58:08 crc kubenswrapper[4958]: I1201 10:58:08.861733 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="da4e5437-ae76-4823-84e2-98a7fa270836" containerName="registry-server" Dec 01 10:58:08 crc kubenswrapper[4958]: I1201 10:58:08.863328 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-grkh6" Dec 01 10:58:08 crc kubenswrapper[4958]: I1201 10:58:08.868439 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-grkh6"] Dec 01 10:58:09 crc kubenswrapper[4958]: I1201 10:58:09.028040 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b21f37f3-f44a-4560-bed1-190ceb7b3d2f-utilities\") pod \"certified-operators-grkh6\" (UID: \"b21f37f3-f44a-4560-bed1-190ceb7b3d2f\") " pod="openshift-marketplace/certified-operators-grkh6" Dec 01 10:58:09 crc kubenswrapper[4958]: I1201 10:58:09.028172 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp5v9\" (UniqueName: \"kubernetes.io/projected/b21f37f3-f44a-4560-bed1-190ceb7b3d2f-kube-api-access-jp5v9\") pod \"certified-operators-grkh6\" (UID: \"b21f37f3-f44a-4560-bed1-190ceb7b3d2f\") " pod="openshift-marketplace/certified-operators-grkh6" Dec 01 10:58:09 crc kubenswrapper[4958]: I1201 10:58:09.028410 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b21f37f3-f44a-4560-bed1-190ceb7b3d2f-catalog-content\") pod \"certified-operators-grkh6\" (UID: \"b21f37f3-f44a-4560-bed1-190ceb7b3d2f\") " pod="openshift-marketplace/certified-operators-grkh6" Dec 01 10:58:09 crc kubenswrapper[4958]: I1201 10:58:09.130589 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b21f37f3-f44a-4560-bed1-190ceb7b3d2f-utilities\") pod \"certified-operators-grkh6\" (UID: \"b21f37f3-f44a-4560-bed1-190ceb7b3d2f\") " pod="openshift-marketplace/certified-operators-grkh6" Dec 01 10:58:09 crc kubenswrapper[4958]: I1201 10:58:09.130712 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp5v9\" (UniqueName: \"kubernetes.io/projected/b21f37f3-f44a-4560-bed1-190ceb7b3d2f-kube-api-access-jp5v9\") pod \"certified-operators-grkh6\" (UID: \"b21f37f3-f44a-4560-bed1-190ceb7b3d2f\") " pod="openshift-marketplace/certified-operators-grkh6" Dec 01 10:58:09 crc kubenswrapper[4958]: I1201 10:58:09.130828 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b21f37f3-f44a-4560-bed1-190ceb7b3d2f-catalog-content\") pod 
\"certified-operators-grkh6\" (UID: \"b21f37f3-f44a-4560-bed1-190ceb7b3d2f\") " pod="openshift-marketplace/certified-operators-grkh6" Dec 01 10:58:09 crc kubenswrapper[4958]: I1201 10:58:09.132156 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b21f37f3-f44a-4560-bed1-190ceb7b3d2f-catalog-content\") pod \"certified-operators-grkh6\" (UID: \"b21f37f3-f44a-4560-bed1-190ceb7b3d2f\") " pod="openshift-marketplace/certified-operators-grkh6" Dec 01 10:58:09 crc kubenswrapper[4958]: I1201 10:58:09.132316 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b21f37f3-f44a-4560-bed1-190ceb7b3d2f-utilities\") pod \"certified-operators-grkh6\" (UID: \"b21f37f3-f44a-4560-bed1-190ceb7b3d2f\") " pod="openshift-marketplace/certified-operators-grkh6" Dec 01 10:58:09 crc kubenswrapper[4958]: I1201 10:58:09.155387 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp5v9\" (UniqueName: \"kubernetes.io/projected/b21f37f3-f44a-4560-bed1-190ceb7b3d2f-kube-api-access-jp5v9\") pod \"certified-operators-grkh6\" (UID: \"b21f37f3-f44a-4560-bed1-190ceb7b3d2f\") " pod="openshift-marketplace/certified-operators-grkh6" Dec 01 10:58:09 crc kubenswrapper[4958]: I1201 10:58:09.210272 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-grkh6" Dec 01 10:58:09 crc kubenswrapper[4958]: I1201 10:58:09.746961 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-grkh6"] Dec 01 10:58:09 crc kubenswrapper[4958]: W1201 10:58:09.760936 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb21f37f3_f44a_4560_bed1_190ceb7b3d2f.slice/crio-8187ecde3e432bc44e2b9dc360c0667abf966e7407c829e6c0e0950527189d9f WatchSource:0}: Error finding container 8187ecde3e432bc44e2b9dc360c0667abf966e7407c829e6c0e0950527189d9f: Status 404 returned error can't find the container with id 8187ecde3e432bc44e2b9dc360c0667abf966e7407c829e6c0e0950527189d9f Dec 01 10:58:10 crc kubenswrapper[4958]: I1201 10:58:10.508129 4958 generic.go:334] "Generic (PLEG): container finished" podID="b21f37f3-f44a-4560-bed1-190ceb7b3d2f" containerID="9408924debddb30a0de690de21a2108fce73a837c13de05ea158e524de50cbe5" exitCode=0 Dec 01 10:58:10 crc kubenswrapper[4958]: I1201 10:58:10.508460 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grkh6" event={"ID":"b21f37f3-f44a-4560-bed1-190ceb7b3d2f","Type":"ContainerDied","Data":"9408924debddb30a0de690de21a2108fce73a837c13de05ea158e524de50cbe5"} Dec 01 10:58:10 crc kubenswrapper[4958]: I1201 10:58:10.508505 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grkh6" event={"ID":"b21f37f3-f44a-4560-bed1-190ceb7b3d2f","Type":"ContainerStarted","Data":"8187ecde3e432bc44e2b9dc360c0667abf966e7407c829e6c0e0950527189d9f"} Dec 01 10:58:10 crc kubenswrapper[4958]: I1201 10:58:10.512056 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:58:11 crc kubenswrapper[4958]: I1201 10:58:11.519182 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grkh6" 
event={"ID":"b21f37f3-f44a-4560-bed1-190ceb7b3d2f","Type":"ContainerStarted","Data":"fcb9fbcafcbdacaa05b7a591e11f94b960cfbbda5c01a60fd26e29ee094c625c"} Dec 01 10:58:12 crc kubenswrapper[4958]: I1201 10:58:12.529756 4958 generic.go:334] "Generic (PLEG): container finished" podID="b21f37f3-f44a-4560-bed1-190ceb7b3d2f" containerID="fcb9fbcafcbdacaa05b7a591e11f94b960cfbbda5c01a60fd26e29ee094c625c" exitCode=0 Dec 01 10:58:12 crc kubenswrapper[4958]: I1201 10:58:12.529797 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grkh6" event={"ID":"b21f37f3-f44a-4560-bed1-190ceb7b3d2f","Type":"ContainerDied","Data":"fcb9fbcafcbdacaa05b7a591e11f94b960cfbbda5c01a60fd26e29ee094c625c"} Dec 01 10:58:13 crc kubenswrapper[4958]: I1201 10:58:13.539693 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grkh6" event={"ID":"b21f37f3-f44a-4560-bed1-190ceb7b3d2f","Type":"ContainerStarted","Data":"e5cf95e0a49e624521ee364a8498cf71d752a1f649805a74ce705d0a32473f7d"} Dec 01 10:58:13 crc kubenswrapper[4958]: I1201 10:58:13.561213 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-grkh6" podStartSLOduration=2.720615165 podStartE2EDuration="5.561170414s" podCreationTimestamp="2025-12-01 10:58:08 +0000 UTC" firstStartedPulling="2025-12-01 10:58:10.511225947 +0000 UTC m=+3538.020015014" lastFinishedPulling="2025-12-01 10:58:13.351781186 +0000 UTC m=+3540.860570263" observedRunningTime="2025-12-01 10:58:13.559662812 +0000 UTC m=+3541.068451849" watchObservedRunningTime="2025-12-01 10:58:13.561170414 +0000 UTC m=+3541.069959451" Dec 01 10:58:19 crc kubenswrapper[4958]: I1201 10:58:19.210770 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-grkh6" Dec 01 10:58:19 crc kubenswrapper[4958]: I1201 10:58:19.211379 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-grkh6" Dec 01 10:58:19 crc kubenswrapper[4958]: I1201 10:58:19.253925 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-grkh6" Dec 01 10:58:19 crc kubenswrapper[4958]: I1201 10:58:19.637236 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-grkh6" Dec 01 10:58:19 crc kubenswrapper[4958]: I1201 10:58:19.682204 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-grkh6"] Dec 01 10:58:21 crc kubenswrapper[4958]: I1201 10:58:21.610491 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-grkh6" podUID="b21f37f3-f44a-4560-bed1-190ceb7b3d2f" containerName="registry-server" containerID="cri-o://e5cf95e0a49e624521ee364a8498cf71d752a1f649805a74ce705d0a32473f7d" gracePeriod=2 Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.014329 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zzzc4"] Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.016412 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zzzc4" Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.030340 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zzzc4"] Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.151661 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c23bec71-7e13-42d4-9f5c-23903e86f112-utilities\") pod \"community-operators-zzzc4\" (UID: \"c23bec71-7e13-42d4-9f5c-23903e86f112\") " pod="openshift-marketplace/community-operators-zzzc4" Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.151823 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c23bec71-7e13-42d4-9f5c-23903e86f112-catalog-content\") pod \"community-operators-zzzc4\" (UID: \"c23bec71-7e13-42d4-9f5c-23903e86f112\") " pod="openshift-marketplace/community-operators-zzzc4" Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.152098 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8qfh\" (UniqueName: \"kubernetes.io/projected/c23bec71-7e13-42d4-9f5c-23903e86f112-kube-api-access-c8qfh\") pod \"community-operators-zzzc4\" (UID: \"c23bec71-7e13-42d4-9f5c-23903e86f112\") " pod="openshift-marketplace/community-operators-zzzc4" Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.254087 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c23bec71-7e13-42d4-9f5c-23903e86f112-utilities\") pod \"community-operators-zzzc4\" (UID: \"c23bec71-7e13-42d4-9f5c-23903e86f112\") " pod="openshift-marketplace/community-operators-zzzc4" Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.254155 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c23bec71-7e13-42d4-9f5c-23903e86f112-catalog-content\") pod \"community-operators-zzzc4\" (UID: \"c23bec71-7e13-42d4-9f5c-23903e86f112\") " pod="openshift-marketplace/community-operators-zzzc4" Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.254230 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8qfh\" (UniqueName: \"kubernetes.io/projected/c23bec71-7e13-42d4-9f5c-23903e86f112-kube-api-access-c8qfh\") pod \"community-operators-zzzc4\" (UID: \"c23bec71-7e13-42d4-9f5c-23903e86f112\") " pod="openshift-marketplace/community-operators-zzzc4" Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.254607 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c23bec71-7e13-42d4-9f5c-23903e86f112-utilities\") pod \"community-operators-zzzc4\" (UID: \"c23bec71-7e13-42d4-9f5c-23903e86f112\") " pod="openshift-marketplace/community-operators-zzzc4" Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.254879 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c23bec71-7e13-42d4-9f5c-23903e86f112-catalog-content\") pod \"community-operators-zzzc4\" (UID: \"c23bec71-7e13-42d4-9f5c-23903e86f112\") " pod="openshift-marketplace/community-operators-zzzc4" Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.279938 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-c8qfh\" (UniqueName: \"kubernetes.io/projected/c23bec71-7e13-42d4-9f5c-23903e86f112-kube-api-access-c8qfh\") pod \"community-operators-zzzc4\" (UID: \"c23bec71-7e13-42d4-9f5c-23903e86f112\") " pod="openshift-marketplace/community-operators-zzzc4" Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.339362 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zzzc4" Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.626892 4958 generic.go:334] "Generic (PLEG): container finished" podID="b21f37f3-f44a-4560-bed1-190ceb7b3d2f" containerID="e5cf95e0a49e624521ee364a8498cf71d752a1f649805a74ce705d0a32473f7d" exitCode=0 Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.626948 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grkh6" event={"ID":"b21f37f3-f44a-4560-bed1-190ceb7b3d2f","Type":"ContainerDied","Data":"e5cf95e0a49e624521ee364a8498cf71d752a1f649805a74ce705d0a32473f7d"} Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.653144 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zzzc4"] Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.868484 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-grkh6" Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.964817 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b21f37f3-f44a-4560-bed1-190ceb7b3d2f-utilities\") pod \"b21f37f3-f44a-4560-bed1-190ceb7b3d2f\" (UID: \"b21f37f3-f44a-4560-bed1-190ceb7b3d2f\") " Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.965237 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b21f37f3-f44a-4560-bed1-190ceb7b3d2f-catalog-content\") pod \"b21f37f3-f44a-4560-bed1-190ceb7b3d2f\" (UID: \"b21f37f3-f44a-4560-bed1-190ceb7b3d2f\") " Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.965296 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp5v9\" (UniqueName: \"kubernetes.io/projected/b21f37f3-f44a-4560-bed1-190ceb7b3d2f-kube-api-access-jp5v9\") pod \"b21f37f3-f44a-4560-bed1-190ceb7b3d2f\" (UID: \"b21f37f3-f44a-4560-bed1-190ceb7b3d2f\") " Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.966702 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b21f37f3-f44a-4560-bed1-190ceb7b3d2f-utilities" (OuterVolumeSpecName: "utilities") pod "b21f37f3-f44a-4560-bed1-190ceb7b3d2f" (UID: "b21f37f3-f44a-4560-bed1-190ceb7b3d2f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:58:22 crc kubenswrapper[4958]: I1201 10:58:22.970686 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b21f37f3-f44a-4560-bed1-190ceb7b3d2f-kube-api-access-jp5v9" (OuterVolumeSpecName: "kube-api-access-jp5v9") pod "b21f37f3-f44a-4560-bed1-190ceb7b3d2f" (UID: "b21f37f3-f44a-4560-bed1-190ceb7b3d2f"). InnerVolumeSpecName "kube-api-access-jp5v9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:58:23 crc kubenswrapper[4958]: I1201 10:58:23.017106 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b21f37f3-f44a-4560-bed1-190ceb7b3d2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b21f37f3-f44a-4560-bed1-190ceb7b3d2f" (UID: "b21f37f3-f44a-4560-bed1-190ceb7b3d2f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:58:23 crc kubenswrapper[4958]: I1201 10:58:23.068221 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b21f37f3-f44a-4560-bed1-190ceb7b3d2f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:58:23 crc kubenswrapper[4958]: I1201 10:58:23.068260 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b21f37f3-f44a-4560-bed1-190ceb7b3d2f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:58:23 crc kubenswrapper[4958]: I1201 10:58:23.068275 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp5v9\" (UniqueName: \"kubernetes.io/projected/b21f37f3-f44a-4560-bed1-190ceb7b3d2f-kube-api-access-jp5v9\") on node \"crc\" DevicePath \"\"" Dec 01 10:58:23 crc kubenswrapper[4958]: I1201 10:58:23.638670 4958 generic.go:334] "Generic (PLEG): container finished" podID="c23bec71-7e13-42d4-9f5c-23903e86f112" containerID="3f38eb48fa73f2d6533f7bbe3b90f1451e6ff4f10a93a166f701c928c9132c99" exitCode=0 Dec 01 10:58:23 crc kubenswrapper[4958]: I1201 10:58:23.638777 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzzc4" event={"ID":"c23bec71-7e13-42d4-9f5c-23903e86f112","Type":"ContainerDied","Data":"3f38eb48fa73f2d6533f7bbe3b90f1451e6ff4f10a93a166f701c928c9132c99"} Dec 01 10:58:23 crc kubenswrapper[4958]: I1201 10:58:23.638813 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzzc4" event={"ID":"c23bec71-7e13-42d4-9f5c-23903e86f112","Type":"ContainerStarted","Data":"24dc302f3131843b918a1a72192c07697de5c055102dc64730d8a3bcf9a246c6"} Dec 01 10:58:23 crc kubenswrapper[4958]: I1201 10:58:23.645629 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grkh6" event={"ID":"b21f37f3-f44a-4560-bed1-190ceb7b3d2f","Type":"ContainerDied","Data":"8187ecde3e432bc44e2b9dc360c0667abf966e7407c829e6c0e0950527189d9f"} Dec 01 10:58:23 crc kubenswrapper[4958]: I1201 10:58:23.645713 4958 scope.go:117] "RemoveContainer" containerID="e5cf95e0a49e624521ee364a8498cf71d752a1f649805a74ce705d0a32473f7d" Dec 01 10:58:23 crc kubenswrapper[4958]: I1201 10:58:23.645725 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-grkh6" Dec 01 10:58:23 crc kubenswrapper[4958]: I1201 10:58:23.667674 4958 scope.go:117] "RemoveContainer" containerID="fcb9fbcafcbdacaa05b7a591e11f94b960cfbbda5c01a60fd26e29ee094c625c" Dec 01 10:58:23 crc kubenswrapper[4958]: I1201 10:58:23.706137 4958 scope.go:117] "RemoveContainer" containerID="9408924debddb30a0de690de21a2108fce73a837c13de05ea158e524de50cbe5" Dec 01 10:58:23 crc kubenswrapper[4958]: I1201 10:58:23.711933 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-grkh6"] Dec 01 10:58:23 crc kubenswrapper[4958]: I1201 10:58:23.720856 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-grkh6"] Dec 01 10:58:23 crc kubenswrapper[4958]: I1201 10:58:23.809110 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b21f37f3-f44a-4560-bed1-190ceb7b3d2f" path="/var/lib/kubelet/pods/b21f37f3-f44a-4560-bed1-190ceb7b3d2f/volumes" Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.130024 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hspl2"] Dec 01 10:58:28 crc kubenswrapper[4958]: E1201 10:58:28.130832 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b21f37f3-f44a-4560-bed1-190ceb7b3d2f" containerName="extract-utilities" Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.130868 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b21f37f3-f44a-4560-bed1-190ceb7b3d2f" containerName="extract-utilities" Dec 01 10:58:28 crc kubenswrapper[4958]: E1201 10:58:28.130888 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b21f37f3-f44a-4560-bed1-190ceb7b3d2f" containerName="extract-content" Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.130896 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b21f37f3-f44a-4560-bed1-190ceb7b3d2f" containerName="extract-content" Dec 01 10:58:28 crc kubenswrapper[4958]: E1201 10:58:28.130931 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b21f37f3-f44a-4560-bed1-190ceb7b3d2f" containerName="registry-server" Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.130943 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b21f37f3-f44a-4560-bed1-190ceb7b3d2f" containerName="registry-server" Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.131468 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="b21f37f3-f44a-4560-bed1-190ceb7b3d2f" containerName="registry-server" Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.132980 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hspl2" Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.150069 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hspl2"] Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.190164 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrdrh\" (UniqueName: \"kubernetes.io/projected/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff-kube-api-access-zrdrh\") pod \"redhat-marketplace-hspl2\" (UID: \"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff\") " pod="openshift-marketplace/redhat-marketplace-hspl2" Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.190403 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff-utilities\") pod \"redhat-marketplace-hspl2\" (UID: \"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff\") " pod="openshift-marketplace/redhat-marketplace-hspl2" Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.190524 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff-catalog-content\") pod \"redhat-marketplace-hspl2\" (UID: \"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff\") " pod="openshift-marketplace/redhat-marketplace-hspl2" Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.293188 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrdrh\" (UniqueName: \"kubernetes.io/projected/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff-kube-api-access-zrdrh\") pod \"redhat-marketplace-hspl2\" (UID: \"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff\") " pod="openshift-marketplace/redhat-marketplace-hspl2" Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.293262 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff-utilities\") pod \"redhat-marketplace-hspl2\" (UID: \"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff\") " pod="openshift-marketplace/redhat-marketplace-hspl2" Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.293299 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff-catalog-content\") pod \"redhat-marketplace-hspl2\" (UID: \"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff\") " pod="openshift-marketplace/redhat-marketplace-hspl2" Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.293783 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff-catalog-content\") pod \"redhat-marketplace-hspl2\" (UID: \"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff\") " pod="openshift-marketplace/redhat-marketplace-hspl2" Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.294351 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff-utilities\") pod \"redhat-marketplace-hspl2\" (UID: \"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff\") " pod="openshift-marketplace/redhat-marketplace-hspl2" Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.323797 4958 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zrdrh\" (UniqueName: \"kubernetes.io/projected/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff-kube-api-access-zrdrh\") pod \"redhat-marketplace-hspl2\" (UID: \"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff\") " pod="openshift-marketplace/redhat-marketplace-hspl2" Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.455416 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hspl2" Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.682063 4958 generic.go:334] "Generic (PLEG): container finished" podID="c23bec71-7e13-42d4-9f5c-23903e86f112" containerID="53c5fc571c6af6d611d1af255f267f625734ddecd054aa2023a48c712755b471" exitCode=0 Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.682243 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzzc4" event={"ID":"c23bec71-7e13-42d4-9f5c-23903e86f112","Type":"ContainerDied","Data":"53c5fc571c6af6d611d1af255f267f625734ddecd054aa2023a48c712755b471"} Dec 01 10:58:28 crc kubenswrapper[4958]: I1201 10:58:28.912694 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hspl2"] Dec 01 10:58:29 crc kubenswrapper[4958]: I1201 10:58:29.691758 4958 generic.go:334] "Generic (PLEG): container finished" podID="94f9e2cf-2eeb-4540-8ba6-cc135580b1ff" containerID="053538ea22da2354519b9de6f2251dde0748f91028d0281e04367c0612b1808f" exitCode=0 Dec 01 10:58:29 crc kubenswrapper[4958]: I1201 10:58:29.691821 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hspl2" event={"ID":"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff","Type":"ContainerDied","Data":"053538ea22da2354519b9de6f2251dde0748f91028d0281e04367c0612b1808f"} Dec 01 10:58:29 crc kubenswrapper[4958]: I1201 10:58:29.692124 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hspl2" event={"ID":"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff","Type":"ContainerStarted","Data":"8c392323310e44ce20b31bfd04fbee48f83a976df820116c21a9626487e0f8c0"} Dec 01 10:58:29 crc kubenswrapper[4958]: I1201 10:58:29.694679 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzzc4" event={"ID":"c23bec71-7e13-42d4-9f5c-23903e86f112","Type":"ContainerStarted","Data":"aeae2e2ae7bbf7485b7b6f2e99d587f374b979efcefd827893a324d77be9ce97"} Dec 01 10:58:29 crc kubenswrapper[4958]: I1201 10:58:29.742367 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zzzc4" podStartSLOduration=3.017041017 podStartE2EDuration="8.742329998s" podCreationTimestamp="2025-12-01 10:58:21 +0000 UTC" firstStartedPulling="2025-12-01 10:58:23.642309081 +0000 UTC m=+3551.151098128" lastFinishedPulling="2025-12-01 10:58:29.367598062 +0000 UTC m=+3556.876387109" observedRunningTime="2025-12-01 10:58:29.741592577 +0000 UTC m=+3557.250381624" watchObservedRunningTime="2025-12-01 10:58:29.742329998 +0000 UTC m=+3557.251119045" Dec 01 10:58:30 crc kubenswrapper[4958]: I1201 10:58:30.705975 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hspl2" event={"ID":"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff","Type":"ContainerDied","Data":"98ed238fb3f98194e9d5b6de6d7ca098a1f9b7bcd3db5b2f886446150de69f03"} Dec 01 10:58:30 crc kubenswrapper[4958]: I1201 10:58:30.705907 4958 generic.go:334] "Generic (PLEG): container finished" 
podID="94f9e2cf-2eeb-4540-8ba6-cc135580b1ff" containerID="98ed238fb3f98194e9d5b6de6d7ca098a1f9b7bcd3db5b2f886446150de69f03" exitCode=0 Dec 01 10:58:31 crc kubenswrapper[4958]: I1201 10:58:31.716413 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hspl2" event={"ID":"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff","Type":"ContainerStarted","Data":"729e2a48f19016ab5e26ac929b5af094dd5ac26a21259bea9052bdc97147b622"} Dec 01 10:58:31 crc kubenswrapper[4958]: I1201 10:58:31.756465 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hspl2" podStartSLOduration=2.208399797 podStartE2EDuration="3.756438106s" podCreationTimestamp="2025-12-01 10:58:28 +0000 UTC" firstStartedPulling="2025-12-01 10:58:29.693593825 +0000 UTC m=+3557.202382862" lastFinishedPulling="2025-12-01 10:58:31.241632134 +0000 UTC m=+3558.750421171" observedRunningTime="2025-12-01 10:58:31.748382689 +0000 UTC m=+3559.257171746" watchObservedRunningTime="2025-12-01 10:58:31.756438106 +0000 UTC m=+3559.265227143" Dec 01 10:58:32 crc kubenswrapper[4958]: I1201 10:58:32.339811 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zzzc4" Dec 01 10:58:32 crc kubenswrapper[4958]: I1201 10:58:32.339987 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zzzc4" Dec 01 10:58:32 crc kubenswrapper[4958]: I1201 10:58:32.396712 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zzzc4" Dec 01 10:58:38 crc kubenswrapper[4958]: I1201 10:58:38.455814 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hspl2" Dec 01 10:58:38 crc kubenswrapper[4958]: I1201 10:58:38.456239 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hspl2" Dec 01 10:58:38 crc kubenswrapper[4958]: I1201 10:58:38.524657 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hspl2" Dec 01 10:58:38 crc kubenswrapper[4958]: I1201 10:58:38.829080 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hspl2" Dec 01 10:58:39 crc kubenswrapper[4958]: I1201 10:58:39.766136 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hspl2"] Dec 01 10:58:40 crc kubenswrapper[4958]: I1201 10:58:40.786271 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hspl2" podUID="94f9e2cf-2eeb-4540-8ba6-cc135580b1ff" containerName="registry-server" containerID="cri-o://729e2a48f19016ab5e26ac929b5af094dd5ac26a21259bea9052bdc97147b622" gracePeriod=2 Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.773312 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hspl2" Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.800750 4958 generic.go:334] "Generic (PLEG): container finished" podID="94f9e2cf-2eeb-4540-8ba6-cc135580b1ff" containerID="729e2a48f19016ab5e26ac929b5af094dd5ac26a21259bea9052bdc97147b622" exitCode=0 Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.800917 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hspl2" Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.826626 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hspl2" event={"ID":"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff","Type":"ContainerDied","Data":"729e2a48f19016ab5e26ac929b5af094dd5ac26a21259bea9052bdc97147b622"} Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.826801 4958 scope.go:117] "RemoveContainer" containerID="729e2a48f19016ab5e26ac929b5af094dd5ac26a21259bea9052bdc97147b622" Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.826870 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hspl2" event={"ID":"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff","Type":"ContainerDied","Data":"8c392323310e44ce20b31bfd04fbee48f83a976df820116c21a9626487e0f8c0"} Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.832995 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrdrh\" (UniqueName: \"kubernetes.io/projected/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff-kube-api-access-zrdrh\") pod \"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff\" (UID: \"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff\") " Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.833130 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff-utilities\") pod \"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff\" (UID: \"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff\") " Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.833254 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff-catalog-content\") pod \"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff\" (UID: \"94f9e2cf-2eeb-4540-8ba6-cc135580b1ff\") " Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.836586 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff-utilities" (OuterVolumeSpecName: "utilities") pod "94f9e2cf-2eeb-4540-8ba6-cc135580b1ff" (UID: "94f9e2cf-2eeb-4540-8ba6-cc135580b1ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.842101 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff-kube-api-access-zrdrh" (OuterVolumeSpecName: "kube-api-access-zrdrh") pod "94f9e2cf-2eeb-4540-8ba6-cc135580b1ff" (UID: "94f9e2cf-2eeb-4540-8ba6-cc135580b1ff"). InnerVolumeSpecName "kube-api-access-zrdrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.849034 4958 scope.go:117] "RemoveContainer" containerID="98ed238fb3f98194e9d5b6de6d7ca098a1f9b7bcd3db5b2f886446150de69f03" Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.852795 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94f9e2cf-2eeb-4540-8ba6-cc135580b1ff" (UID: "94f9e2cf-2eeb-4540-8ba6-cc135580b1ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.887161 4958 scope.go:117] "RemoveContainer" containerID="053538ea22da2354519b9de6f2251dde0748f91028d0281e04367c0612b1808f" Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.916633 4958 scope.go:117] "RemoveContainer" containerID="729e2a48f19016ab5e26ac929b5af094dd5ac26a21259bea9052bdc97147b622" Dec 01 10:58:41 crc kubenswrapper[4958]: E1201 10:58:41.917341 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"729e2a48f19016ab5e26ac929b5af094dd5ac26a21259bea9052bdc97147b622\": container with ID starting with 729e2a48f19016ab5e26ac929b5af094dd5ac26a21259bea9052bdc97147b622 not found: ID does not exist" containerID="729e2a48f19016ab5e26ac929b5af094dd5ac26a21259bea9052bdc97147b622" Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.917395 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"729e2a48f19016ab5e26ac929b5af094dd5ac26a21259bea9052bdc97147b622"} err="failed to get container status \"729e2a48f19016ab5e26ac929b5af094dd5ac26a21259bea9052bdc97147b622\": rpc error: code = NotFound desc = could not find container \"729e2a48f19016ab5e26ac929b5af094dd5ac26a21259bea9052bdc97147b622\": container with ID starting with 729e2a48f19016ab5e26ac929b5af094dd5ac26a21259bea9052bdc97147b622 not found: ID does not exist" Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.917419 4958 scope.go:117] "RemoveContainer" containerID="98ed238fb3f98194e9d5b6de6d7ca098a1f9b7bcd3db5b2f886446150de69f03" Dec 01 10:58:41 crc kubenswrapper[4958]: E1201 10:58:41.917913 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98ed238fb3f98194e9d5b6de6d7ca098a1f9b7bcd3db5b2f886446150de69f03\": container with ID starting with 98ed238fb3f98194e9d5b6de6d7ca098a1f9b7bcd3db5b2f886446150de69f03 not found: ID does not exist" containerID="98ed238fb3f98194e9d5b6de6d7ca098a1f9b7bcd3db5b2f886446150de69f03" Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.917971 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ed238fb3f98194e9d5b6de6d7ca098a1f9b7bcd3db5b2f886446150de69f03"} err="failed to get container status \"98ed238fb3f98194e9d5b6de6d7ca098a1f9b7bcd3db5b2f886446150de69f03\": rpc error: code = NotFound desc = could not find container \"98ed238fb3f98194e9d5b6de6d7ca098a1f9b7bcd3db5b2f886446150de69f03\": container with ID starting with 98ed238fb3f98194e9d5b6de6d7ca098a1f9b7bcd3db5b2f886446150de69f03 not found: ID does not exist" Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.918006 4958 scope.go:117] "RemoveContainer" containerID="053538ea22da2354519b9de6f2251dde0748f91028d0281e04367c0612b1808f" Dec 01 10:58:41 crc kubenswrapper[4958]: E1201 10:58:41.918358 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"053538ea22da2354519b9de6f2251dde0748f91028d0281e04367c0612b1808f\": container with ID starting with 053538ea22da2354519b9de6f2251dde0748f91028d0281e04367c0612b1808f not found: ID does not exist" containerID="053538ea22da2354519b9de6f2251dde0748f91028d0281e04367c0612b1808f" Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.918387 4958 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"053538ea22da2354519b9de6f2251dde0748f91028d0281e04367c0612b1808f"} err="failed to get container status \"053538ea22da2354519b9de6f2251dde0748f91028d0281e04367c0612b1808f\": rpc error: code = NotFound desc = could not find container \"053538ea22da2354519b9de6f2251dde0748f91028d0281e04367c0612b1808f\": container with ID starting with 053538ea22da2354519b9de6f2251dde0748f91028d0281e04367c0612b1808f not found: ID does not exist" Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.934548 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.934576 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:58:41 crc kubenswrapper[4958]: I1201 10:58:41.934588 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrdrh\" (UniqueName: \"kubernetes.io/projected/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff-kube-api-access-zrdrh\") on node \"crc\" DevicePath \"\"" Dec 01 10:58:42 crc kubenswrapper[4958]: I1201 10:58:42.157493 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hspl2"] Dec 01 10:58:42 crc kubenswrapper[4958]: I1201 10:58:42.171758 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hspl2"] Dec 01 10:58:42 crc kubenswrapper[4958]: I1201 10:58:42.421467 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zzzc4" Dec 01 10:58:43 crc kubenswrapper[4958]: I1201 10:58:43.812756 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94f9e2cf-2eeb-4540-8ba6-cc135580b1ff" path="/var/lib/kubelet/pods/94f9e2cf-2eeb-4540-8ba6-cc135580b1ff/volumes" Dec 01 10:58:44 crc kubenswrapper[4958]: I1201 10:58:44.416710 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zzzc4"] Dec 01 10:58:44 crc kubenswrapper[4958]: I1201 10:58:44.756938 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b8j4b"] Dec 01 10:58:44 crc kubenswrapper[4958]: I1201 10:58:44.757255 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b8j4b" podUID="a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8" containerName="registry-server" containerID="cri-o://1ada15a2e534223ffe15bca166fcb82aab8d0d2963e76d384f0ad86a24088bc6" gracePeriod=2 Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.343784 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b8j4b" Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.421928 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8-utilities\") pod \"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8\" (UID: \"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8\") " Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.422341 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8-catalog-content\") pod \"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8\" (UID: \"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8\") " Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.422434 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9v8w\" (UniqueName: \"kubernetes.io/projected/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8-kube-api-access-l9v8w\") pod \"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8\" (UID: \"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8\") " Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.422381 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8-utilities" (OuterVolumeSpecName: "utilities") pod "a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8" (UID: "a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.428804 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8-kube-api-access-l9v8w" (OuterVolumeSpecName: "kube-api-access-l9v8w") pod "a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8" (UID: "a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8"). InnerVolumeSpecName "kube-api-access-l9v8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.466887 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8" (UID: "a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.524758 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9v8w\" (UniqueName: \"kubernetes.io/projected/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8-kube-api-access-l9v8w\") on node \"crc\" DevicePath \"\"" Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.524795 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.524813 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.846697 4958 generic.go:334] "Generic (PLEG): container finished" podID="a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8" containerID="1ada15a2e534223ffe15bca166fcb82aab8d0d2963e76d384f0ad86a24088bc6" exitCode=0 Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.846758 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8j4b" event={"ID":"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8","Type":"ContainerDied","Data":"1ada15a2e534223ffe15bca166fcb82aab8d0d2963e76d384f0ad86a24088bc6"} Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.847085 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b8j4b" event={"ID":"a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8","Type":"ContainerDied","Data":"b5c93745c178894077d69b3186d47b8772925e7c2dce799147171b44295f70ca"} Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.847154 4958 scope.go:117] "RemoveContainer" containerID="1ada15a2e534223ffe15bca166fcb82aab8d0d2963e76d384f0ad86a24088bc6" Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.846791 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b8j4b" Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.880325 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b8j4b"] Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.880406 4958 scope.go:117] "RemoveContainer" containerID="a3a651038949630ce36d2d1bddac361ecc50e5b397ca663c127e87cf83f93b03" Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.886076 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b8j4b"] Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.906618 4958 scope.go:117] "RemoveContainer" containerID="8b142ccc517aa4ec0f0f1003dc66c48f8dab595ee9bb00a51e6e5442fcb0505a" Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.923889 4958 scope.go:117] "RemoveContainer" containerID="1ada15a2e534223ffe15bca166fcb82aab8d0d2963e76d384f0ad86a24088bc6" Dec 01 10:58:45 crc kubenswrapper[4958]: E1201 10:58:45.924390 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ada15a2e534223ffe15bca166fcb82aab8d0d2963e76d384f0ad86a24088bc6\": container with ID starting with 1ada15a2e534223ffe15bca166fcb82aab8d0d2963e76d384f0ad86a24088bc6 not found: ID does not exist" containerID="1ada15a2e534223ffe15bca166fcb82aab8d0d2963e76d384f0ad86a24088bc6" Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.924462 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ada15a2e534223ffe15bca166fcb82aab8d0d2963e76d384f0ad86a24088bc6"} err="failed to get container status \"1ada15a2e534223ffe15bca166fcb82aab8d0d2963e76d384f0ad86a24088bc6\": rpc error: code = NotFound desc = could not find container \"1ada15a2e534223ffe15bca166fcb82aab8d0d2963e76d384f0ad86a24088bc6\": container with ID starting with 1ada15a2e534223ffe15bca166fcb82aab8d0d2963e76d384f0ad86a24088bc6 not found: ID does not exist" Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.924503 4958 scope.go:117] "RemoveContainer" containerID="a3a651038949630ce36d2d1bddac361ecc50e5b397ca663c127e87cf83f93b03" Dec 01 10:58:45 crc kubenswrapper[4958]: E1201 10:58:45.924923 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3a651038949630ce36d2d1bddac361ecc50e5b397ca663c127e87cf83f93b03\": container with ID starting with a3a651038949630ce36d2d1bddac361ecc50e5b397ca663c127e87cf83f93b03 not found: ID does not exist" containerID="a3a651038949630ce36d2d1bddac361ecc50e5b397ca663c127e87cf83f93b03" Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.924957 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3a651038949630ce36d2d1bddac361ecc50e5b397ca663c127e87cf83f93b03"} err="failed to get container status \"a3a651038949630ce36d2d1bddac361ecc50e5b397ca663c127e87cf83f93b03\": rpc error: code = NotFound desc = could not find container \"a3a651038949630ce36d2d1bddac361ecc50e5b397ca663c127e87cf83f93b03\": container with ID starting with a3a651038949630ce36d2d1bddac361ecc50e5b397ca663c127e87cf83f93b03 not found: ID does not exist" Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.924982 4958 scope.go:117] "RemoveContainer" containerID="8b142ccc517aa4ec0f0f1003dc66c48f8dab595ee9bb00a51e6e5442fcb0505a" Dec 01 10:58:45 crc kubenswrapper[4958]: E1201 10:58:45.925231 4958 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8b142ccc517aa4ec0f0f1003dc66c48f8dab595ee9bb00a51e6e5442fcb0505a\": container with ID starting with 8b142ccc517aa4ec0f0f1003dc66c48f8dab595ee9bb00a51e6e5442fcb0505a not found: ID does not exist" containerID="8b142ccc517aa4ec0f0f1003dc66c48f8dab595ee9bb00a51e6e5442fcb0505a" Dec 01 10:58:45 crc kubenswrapper[4958]: I1201 10:58:45.925257 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b142ccc517aa4ec0f0f1003dc66c48f8dab595ee9bb00a51e6e5442fcb0505a"} err="failed to get container status \"8b142ccc517aa4ec0f0f1003dc66c48f8dab595ee9bb00a51e6e5442fcb0505a\": rpc error: code = NotFound desc = could not find container \"8b142ccc517aa4ec0f0f1003dc66c48f8dab595ee9bb00a51e6e5442fcb0505a\": container with ID starting with 8b142ccc517aa4ec0f0f1003dc66c48f8dab595ee9bb00a51e6e5442fcb0505a not found: ID does not exist" Dec 01 10:58:47 crc kubenswrapper[4958]: I1201 10:58:47.818068 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8" path="/var/lib/kubelet/pods/a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8/volumes" Dec 01 10:58:58 crc kubenswrapper[4958]: I1201 10:58:58.210418 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:58:58 crc kubenswrapper[4958]: I1201 10:58:58.211168 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:59:28 crc kubenswrapper[4958]: I1201 10:59:28.210425 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:59:28 crc kubenswrapper[4958]: I1201 10:59:28.211003 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:59:58 crc kubenswrapper[4958]: I1201 10:59:58.210569 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:59:58 crc kubenswrapper[4958]: I1201 10:59:58.211807 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:59:58 crc kubenswrapper[4958]: I1201 10:59:58.211967 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 10:59:58 crc kubenswrapper[4958]: I1201 10:59:58.213370 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:59:58 crc kubenswrapper[4958]: I1201 10:59:58.213522 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" gracePeriod=600 Dec 01 10:59:58 crc kubenswrapper[4958]: E1201 10:59:58.343301 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 10:59:58 crc kubenswrapper[4958]: I1201 10:59:58.524547 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" exitCode=0 Dec 01 10:59:58 crc kubenswrapper[4958]: I1201 10:59:58.524603 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd"} Dec 01 10:59:58 crc kubenswrapper[4958]: I1201 10:59:58.524872 4958 scope.go:117] "RemoveContainer" containerID="de8a121e644b9e2765cd1734598fd4207dfc7217fc68214d3cecf8201618b039" Dec 01 10:59:58 crc kubenswrapper[4958]: I1201 10:59:58.525426 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 10:59:58 crc kubenswrapper[4958]: E1201 10:59:58.525673 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.178806 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq"] Dec 01 11:00:00 crc kubenswrapper[4958]: E1201 11:00:00.179408 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94f9e2cf-2eeb-4540-8ba6-cc135580b1ff" containerName="extract-content" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.179440 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f9e2cf-2eeb-4540-8ba6-cc135580b1ff" containerName="extract-content" Dec 01 11:00:00 crc kubenswrapper[4958]: E1201 11:00:00.179479 4958 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="94f9e2cf-2eeb-4540-8ba6-cc135580b1ff" containerName="registry-server" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.179496 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f9e2cf-2eeb-4540-8ba6-cc135580b1ff" containerName="registry-server" Dec 01 11:00:00 crc kubenswrapper[4958]: E1201 11:00:00.179525 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94f9e2cf-2eeb-4540-8ba6-cc135580b1ff" containerName="extract-utilities" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.179543 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f9e2cf-2eeb-4540-8ba6-cc135580b1ff" containerName="extract-utilities" Dec 01 11:00:00 crc kubenswrapper[4958]: E1201 11:00:00.179587 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8" containerName="extract-utilities" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.179603 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8" containerName="extract-utilities" Dec 01 11:00:00 crc kubenswrapper[4958]: E1201 11:00:00.179646 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8" containerName="registry-server" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.179668 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8" containerName="registry-server" Dec 01 11:00:00 crc kubenswrapper[4958]: E1201 11:00:00.179691 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8" containerName="extract-content" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.179706 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8" containerName="extract-content" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.180099 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5f301b7-d1c9-4217-a1fe-50bc7f3f37e8" containerName="registry-server" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.180153 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="94f9e2cf-2eeb-4540-8ba6-cc135580b1ff" containerName="registry-server" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.182325 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.186551 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.187368 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.189890 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq"] Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.274265 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk594\" (UniqueName: \"kubernetes.io/projected/47a4de06-5133-4aa6-8fdd-5c6f803cac93-kube-api-access-gk594\") pod \"collect-profiles-29409780-csmkq\" (UID: \"47a4de06-5133-4aa6-8fdd-5c6f803cac93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.274355 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47a4de06-5133-4aa6-8fdd-5c6f803cac93-secret-volume\") pod \"collect-profiles-29409780-csmkq\" (UID: \"47a4de06-5133-4aa6-8fdd-5c6f803cac93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.274592 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47a4de06-5133-4aa6-8fdd-5c6f803cac93-config-volume\") pod \"collect-profiles-29409780-csmkq\" (UID: \"47a4de06-5133-4aa6-8fdd-5c6f803cac93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.375974 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47a4de06-5133-4aa6-8fdd-5c6f803cac93-config-volume\") pod \"collect-profiles-29409780-csmkq\" (UID: \"47a4de06-5133-4aa6-8fdd-5c6f803cac93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.376101 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk594\" (UniqueName: \"kubernetes.io/projected/47a4de06-5133-4aa6-8fdd-5c6f803cac93-kube-api-access-gk594\") pod \"collect-profiles-29409780-csmkq\" (UID: \"47a4de06-5133-4aa6-8fdd-5c6f803cac93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.376179 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47a4de06-5133-4aa6-8fdd-5c6f803cac93-secret-volume\") pod \"collect-profiles-29409780-csmkq\" (UID: \"47a4de06-5133-4aa6-8fdd-5c6f803cac93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.380473 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47a4de06-5133-4aa6-8fdd-5c6f803cac93-config-volume\") pod 
\"collect-profiles-29409780-csmkq\" (UID: \"47a4de06-5133-4aa6-8fdd-5c6f803cac93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.391657 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47a4de06-5133-4aa6-8fdd-5c6f803cac93-secret-volume\") pod \"collect-profiles-29409780-csmkq\" (UID: \"47a4de06-5133-4aa6-8fdd-5c6f803cac93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.414063 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk594\" (UniqueName: \"kubernetes.io/projected/47a4de06-5133-4aa6-8fdd-5c6f803cac93-kube-api-access-gk594\") pod \"collect-profiles-29409780-csmkq\" (UID: \"47a4de06-5133-4aa6-8fdd-5c6f803cac93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.512236 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq" Dec 01 11:00:00 crc kubenswrapper[4958]: I1201 11:00:00.776769 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq"] Dec 01 11:00:01 crc kubenswrapper[4958]: I1201 11:00:01.561772 4958 generic.go:334] "Generic (PLEG): container finished" podID="47a4de06-5133-4aa6-8fdd-5c6f803cac93" containerID="36953d4da1f3605db037fee336308e72a645085b1702ece2a073a1527b17abba" exitCode=0 Dec 01 11:00:01 crc kubenswrapper[4958]: I1201 11:00:01.561869 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq" event={"ID":"47a4de06-5133-4aa6-8fdd-5c6f803cac93","Type":"ContainerDied","Data":"36953d4da1f3605db037fee336308e72a645085b1702ece2a073a1527b17abba"} Dec 01 11:00:01 crc kubenswrapper[4958]: I1201 11:00:01.563451 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq" event={"ID":"47a4de06-5133-4aa6-8fdd-5c6f803cac93","Type":"ContainerStarted","Data":"9319e94d5ed529ea64923732274dbcd4dc30efafa4f0abaf2bfcf18e406bddd4"} Dec 01 11:00:02 crc kubenswrapper[4958]: I1201 11:00:02.939512 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq" Dec 01 11:00:03 crc kubenswrapper[4958]: I1201 11:00:03.015103 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47a4de06-5133-4aa6-8fdd-5c6f803cac93-config-volume\") pod \"47a4de06-5133-4aa6-8fdd-5c6f803cac93\" (UID: \"47a4de06-5133-4aa6-8fdd-5c6f803cac93\") " Dec 01 11:00:03 crc kubenswrapper[4958]: I1201 11:00:03.015258 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47a4de06-5133-4aa6-8fdd-5c6f803cac93-secret-volume\") pod \"47a4de06-5133-4aa6-8fdd-5c6f803cac93\" (UID: \"47a4de06-5133-4aa6-8fdd-5c6f803cac93\") " Dec 01 11:00:03 crc kubenswrapper[4958]: I1201 11:00:03.015280 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk594\" (UniqueName: \"kubernetes.io/projected/47a4de06-5133-4aa6-8fdd-5c6f803cac93-kube-api-access-gk594\") pod \"47a4de06-5133-4aa6-8fdd-5c6f803cac93\" (UID: \"47a4de06-5133-4aa6-8fdd-5c6f803cac93\") " Dec 01 11:00:03 crc kubenswrapper[4958]: I1201 11:00:03.016538 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47a4de06-5133-4aa6-8fdd-5c6f803cac93-config-volume" (OuterVolumeSpecName: "config-volume") pod "47a4de06-5133-4aa6-8fdd-5c6f803cac93" (UID: "47a4de06-5133-4aa6-8fdd-5c6f803cac93"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:00:03 crc kubenswrapper[4958]: I1201 11:00:03.021133 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a4de06-5133-4aa6-8fdd-5c6f803cac93-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "47a4de06-5133-4aa6-8fdd-5c6f803cac93" (UID: "47a4de06-5133-4aa6-8fdd-5c6f803cac93"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:00:03 crc kubenswrapper[4958]: I1201 11:00:03.022064 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47a4de06-5133-4aa6-8fdd-5c6f803cac93-kube-api-access-gk594" (OuterVolumeSpecName: "kube-api-access-gk594") pod "47a4de06-5133-4aa6-8fdd-5c6f803cac93" (UID: "47a4de06-5133-4aa6-8fdd-5c6f803cac93"). InnerVolumeSpecName "kube-api-access-gk594". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:00:03 crc kubenswrapper[4958]: I1201 11:00:03.116358 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47a4de06-5133-4aa6-8fdd-5c6f803cac93-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 11:00:03 crc kubenswrapper[4958]: I1201 11:00:03.116392 4958 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47a4de06-5133-4aa6-8fdd-5c6f803cac93-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 11:00:03 crc kubenswrapper[4958]: I1201 11:00:03.116402 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk594\" (UniqueName: \"kubernetes.io/projected/47a4de06-5133-4aa6-8fdd-5c6f803cac93-kube-api-access-gk594\") on node \"crc\" DevicePath \"\"" Dec 01 11:00:03 crc kubenswrapper[4958]: I1201 11:00:03.587503 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq" event={"ID":"47a4de06-5133-4aa6-8fdd-5c6f803cac93","Type":"ContainerDied","Data":"9319e94d5ed529ea64923732274dbcd4dc30efafa4f0abaf2bfcf18e406bddd4"} Dec 01 11:00:03 crc kubenswrapper[4958]: I1201 11:00:03.587569 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq" Dec 01 11:00:03 crc kubenswrapper[4958]: I1201 11:00:03.587573 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9319e94d5ed529ea64923732274dbcd4dc30efafa4f0abaf2bfcf18e406bddd4" Dec 01 11:00:04 crc kubenswrapper[4958]: I1201 11:00:04.038580 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp"] Dec 01 11:00:04 crc kubenswrapper[4958]: I1201 11:00:04.046154 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409735-rjwtp"] Dec 01 11:00:05 crc kubenswrapper[4958]: I1201 11:00:05.813960 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="195376b1-020b-42ee-8caa-6f9cefba2c55" path="/var/lib/kubelet/pods/195376b1-020b-42ee-8caa-6f9cefba2c55/volumes" Dec 01 11:00:10 crc kubenswrapper[4958]: I1201 11:00:10.798227 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:00:10 crc kubenswrapper[4958]: E1201 11:00:10.798927 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:00:21 crc kubenswrapper[4958]: I1201 11:00:21.845036 4958 scope.go:117] "RemoveContainer" containerID="a578f68bac9d99796ff49d82051d72bbdfee8c2f663053b97cd4ca98dda49375" Dec 01 11:00:22 crc kubenswrapper[4958]: I1201 11:00:22.797803 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:00:22 crc kubenswrapper[4958]: E1201 11:00:22.798724 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:00:34 crc kubenswrapper[4958]: I1201 11:00:34.798530 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:00:34 crc kubenswrapper[4958]: E1201 11:00:34.800410 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:00:48 crc kubenswrapper[4958]: I1201 11:00:48.798175 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:00:48 crc kubenswrapper[4958]: E1201 11:00:48.799352 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:01:01 crc kubenswrapper[4958]: I1201 11:01:01.798281 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:01:01 crc kubenswrapper[4958]: E1201 11:01:01.799483 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:01:13 crc kubenswrapper[4958]: I1201 11:01:13.801677 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:01:13 crc kubenswrapper[4958]: E1201 11:01:13.802353 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:01:28 crc kubenswrapper[4958]: I1201 11:01:28.797645 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:01:28 crc kubenswrapper[4958]: E1201 11:01:28.798448 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:01:43 crc kubenswrapper[4958]: I1201 11:01:43.803031 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:01:43 crc kubenswrapper[4958]: E1201 11:01:43.803719 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:01:55 crc kubenswrapper[4958]: I1201 11:01:55.797865 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:01:55 crc kubenswrapper[4958]: E1201 11:01:55.798875 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:02:04 crc kubenswrapper[4958]: I1201 11:02:04.744573 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c6khx"] Dec 01 11:02:04 crc kubenswrapper[4958]: E1201 11:02:04.745342 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47a4de06-5133-4aa6-8fdd-5c6f803cac93" containerName="collect-profiles" Dec 01 11:02:04 crc kubenswrapper[4958]: I1201 11:02:04.745354 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a4de06-5133-4aa6-8fdd-5c6f803cac93" containerName="collect-profiles" Dec 01 11:02:04 crc kubenswrapper[4958]: I1201 11:02:04.745492 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="47a4de06-5133-4aa6-8fdd-5c6f803cac93" containerName="collect-profiles" Dec 01 11:02:04 crc kubenswrapper[4958]: I1201 11:02:04.746563 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c6khx" Dec 01 11:02:04 crc kubenswrapper[4958]: I1201 11:02:04.765175 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c6khx"] Dec 01 11:02:04 crc kubenswrapper[4958]: I1201 11:02:04.875931 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btggt\" (UniqueName: \"kubernetes.io/projected/867dd6fb-7abb-49cd-8c70-123705dd751b-kube-api-access-btggt\") pod \"redhat-operators-c6khx\" (UID: \"867dd6fb-7abb-49cd-8c70-123705dd751b\") " pod="openshift-marketplace/redhat-operators-c6khx" Dec 01 11:02:04 crc kubenswrapper[4958]: I1201 11:02:04.876278 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/867dd6fb-7abb-49cd-8c70-123705dd751b-catalog-content\") pod \"redhat-operators-c6khx\" (UID: \"867dd6fb-7abb-49cd-8c70-123705dd751b\") " pod="openshift-marketplace/redhat-operators-c6khx" Dec 01 11:02:04 crc kubenswrapper[4958]: I1201 11:02:04.876443 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/867dd6fb-7abb-49cd-8c70-123705dd751b-utilities\") pod \"redhat-operators-c6khx\" (UID: \"867dd6fb-7abb-49cd-8c70-123705dd751b\") " pod="openshift-marketplace/redhat-operators-c6khx" Dec 01 11:02:04 crc kubenswrapper[4958]: I1201 11:02:04.977959 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/867dd6fb-7abb-49cd-8c70-123705dd751b-utilities\") pod \"redhat-operators-c6khx\" (UID: \"867dd6fb-7abb-49cd-8c70-123705dd751b\") " pod="openshift-marketplace/redhat-operators-c6khx" Dec 01 11:02:04 crc kubenswrapper[4958]: I1201 11:02:04.978096 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btggt\" (UniqueName: \"kubernetes.io/projected/867dd6fb-7abb-49cd-8c70-123705dd751b-kube-api-access-btggt\") pod \"redhat-operators-c6khx\" (UID: \"867dd6fb-7abb-49cd-8c70-123705dd751b\") " pod="openshift-marketplace/redhat-operators-c6khx" Dec 01 11:02:04 crc kubenswrapper[4958]: I1201 11:02:04.978134 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/867dd6fb-7abb-49cd-8c70-123705dd751b-catalog-content\") pod \"redhat-operators-c6khx\" (UID: \"867dd6fb-7abb-49cd-8c70-123705dd751b\") " pod="openshift-marketplace/redhat-operators-c6khx" Dec 01 11:02:04 crc kubenswrapper[4958]: I1201 11:02:04.978573 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/867dd6fb-7abb-49cd-8c70-123705dd751b-utilities\") pod \"redhat-operators-c6khx\" (UID: \"867dd6fb-7abb-49cd-8c70-123705dd751b\") " pod="openshift-marketplace/redhat-operators-c6khx" Dec 01 11:02:04 crc kubenswrapper[4958]: I1201 11:02:04.978724 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/867dd6fb-7abb-49cd-8c70-123705dd751b-catalog-content\") pod \"redhat-operators-c6khx\" (UID: \"867dd6fb-7abb-49cd-8c70-123705dd751b\") " pod="openshift-marketplace/redhat-operators-c6khx" Dec 01 11:02:05 crc kubenswrapper[4958]: I1201 11:02:05.002482 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-btggt\" (UniqueName: \"kubernetes.io/projected/867dd6fb-7abb-49cd-8c70-123705dd751b-kube-api-access-btggt\") pod \"redhat-operators-c6khx\" (UID: \"867dd6fb-7abb-49cd-8c70-123705dd751b\") " pod="openshift-marketplace/redhat-operators-c6khx" Dec 01 11:02:05 crc kubenswrapper[4958]: I1201 11:02:05.067157 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c6khx" Dec 01 11:02:05 crc kubenswrapper[4958]: I1201 11:02:05.435407 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c6khx"] Dec 01 11:02:05 crc kubenswrapper[4958]: I1201 11:02:05.716472 4958 generic.go:334] "Generic (PLEG): container finished" podID="867dd6fb-7abb-49cd-8c70-123705dd751b" containerID="c6f833e7044de04fd3aa32b537d06ae9b2d07f09453268bd6e08cfec5724422c" exitCode=0 Dec 01 11:02:05 crc kubenswrapper[4958]: I1201 11:02:05.716529 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6khx" event={"ID":"867dd6fb-7abb-49cd-8c70-123705dd751b","Type":"ContainerDied","Data":"c6f833e7044de04fd3aa32b537d06ae9b2d07f09453268bd6e08cfec5724422c"} Dec 01 11:02:05 crc kubenswrapper[4958]: I1201 11:02:05.716587 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6khx" event={"ID":"867dd6fb-7abb-49cd-8c70-123705dd751b","Type":"ContainerStarted","Data":"1cf74f274b13b394e8f700eba20b810f71bce48974417114e491250af4e1c935"} Dec 01 11:02:06 crc kubenswrapper[4958]: I1201 11:02:06.798021 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:02:06 crc kubenswrapper[4958]: E1201 11:02:06.798643 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:02:14 crc kubenswrapper[4958]: I1201 11:02:14.792381 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6khx" event={"ID":"867dd6fb-7abb-49cd-8c70-123705dd751b","Type":"ContainerStarted","Data":"75244ec86506a4f96e86835668757b14764c614a6aef8a17306cf6be4a30e7e0"} Dec 01 11:02:15 crc kubenswrapper[4958]: I1201 11:02:15.801199 4958 generic.go:334] "Generic (PLEG): container finished" podID="867dd6fb-7abb-49cd-8c70-123705dd751b" containerID="75244ec86506a4f96e86835668757b14764c614a6aef8a17306cf6be4a30e7e0" exitCode=0 Dec 01 11:02:15 crc kubenswrapper[4958]: I1201 11:02:15.809794 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6khx" event={"ID":"867dd6fb-7abb-49cd-8c70-123705dd751b","Type":"ContainerDied","Data":"75244ec86506a4f96e86835668757b14764c614a6aef8a17306cf6be4a30e7e0"} Dec 01 11:02:16 crc kubenswrapper[4958]: I1201 11:02:16.814944 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6khx" event={"ID":"867dd6fb-7abb-49cd-8c70-123705dd751b","Type":"ContainerStarted","Data":"fe0f444f6b406654c4a90d720df86e5f4a033dceaa80c068fdf8f9425fc2eb00"} Dec 01 11:02:16 crc kubenswrapper[4958]: I1201 11:02:16.846010 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-c6khx" podStartSLOduration=2.057040273 podStartE2EDuration="12.84598564s" podCreationTimestamp="2025-12-01 11:02:04 +0000 UTC" firstStartedPulling="2025-12-01 11:02:05.718214178 +0000 UTC m=+3773.227003215" lastFinishedPulling="2025-12-01 11:02:16.507159545 +0000 UTC m=+3784.015948582" observedRunningTime="2025-12-01 11:02:16.83497823 +0000 UTC m=+3784.343767267" watchObservedRunningTime="2025-12-01 11:02:16.84598564 +0000 UTC m=+3784.354774687" Dec 01 11:02:19 crc kubenswrapper[4958]: I1201 11:02:19.798032 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:02:19 crc kubenswrapper[4958]: E1201 11:02:19.798483 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:02:25 crc kubenswrapper[4958]: I1201 11:02:25.068125 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c6khx" Dec 01 11:02:25 crc kubenswrapper[4958]: I1201 11:02:25.068526 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c6khx" Dec 01 11:02:25 crc kubenswrapper[4958]: I1201 11:02:25.150624 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c6khx" Dec 01 11:02:25 crc kubenswrapper[4958]: I1201 11:02:25.927250 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c6khx" Dec 01 11:02:26 crc kubenswrapper[4958]: I1201 11:02:26.015891 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c6khx"] Dec 01 11:02:26 crc kubenswrapper[4958]: I1201 11:02:26.055909 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4f8qt"] Dec 01 11:02:26 crc kubenswrapper[4958]: I1201 11:02:26.056222 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4f8qt" podUID="590d4089-bce0-4fcf-8dec-74f7b96675ce" containerName="registry-server" containerID="cri-o://793ac3c60db3cdec3119f5eda9e48bc3ebf00734b7139e5434275e17240334ca" gracePeriod=2 Dec 01 11:02:27 crc kubenswrapper[4958]: I1201 11:02:27.890697 4958 generic.go:334] "Generic (PLEG): container finished" podID="590d4089-bce0-4fcf-8dec-74f7b96675ce" containerID="793ac3c60db3cdec3119f5eda9e48bc3ebf00734b7139e5434275e17240334ca" exitCode=0 Dec 01 11:02:27 crc kubenswrapper[4958]: I1201 11:02:27.890779 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4f8qt" event={"ID":"590d4089-bce0-4fcf-8dec-74f7b96675ce","Type":"ContainerDied","Data":"793ac3c60db3cdec3119f5eda9e48bc3ebf00734b7139e5434275e17240334ca"}
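
The "Observed pod startup duration" entry at 11:02:16.846 above encodes a simple relationship: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to subtract the image-pull window (lastFinishedPulling minus firstStartedPulling) from that figure, so pull time does not count against the startup SLO. A minimal standalone Go sketch (an illustration, not kubelet code) reproducing the arithmetic from the logged timestamps:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Go's default time.Time print format, which is what the log shows.
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        parse := func(s string) time.Time {
            t, err := time.Parse(layout, s)
            if err != nil {
                panic(err)
            }
            return t
        }
        created := parse("2025-12-01 11:02:04 +0000 UTC")
        firstPull := parse("2025-12-01 11:02:05.718214178 +0000 UTC")
        lastPull := parse("2025-12-01 11:02:16.507159545 +0000 UTC")
        running := parse("2025-12-01 11:02:16.84598564 +0000 UTC")

        e2e := running.Sub(created)          // 12.84598564s, the logged podStartE2EDuration
        slo := e2e - lastPull.Sub(firstPull) // 2.057040273s, the logged podStartSLOduration
        fmt.Println(e2e, slo)
    }

Run as-is, this prints 12.84598564s and 2.057040273s, matching the logged values; the m=+... suffixes are monotonic-clock readings and play no part in the subtraction.
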
Dec 01 11:02:28 crc kubenswrapper[4958]: I1201 11:02:28.982076 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4f8qt" Dec 01 11:02:29 crc kubenswrapper[4958]: I1201 11:02:29.045399 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/590d4089-bce0-4fcf-8dec-74f7b96675ce-catalog-content\") pod \"590d4089-bce0-4fcf-8dec-74f7b96675ce\" (UID: \"590d4089-bce0-4fcf-8dec-74f7b96675ce\") " Dec 01 11:02:29 crc kubenswrapper[4958]: I1201 11:02:29.045560 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9q4j\" (UniqueName: \"kubernetes.io/projected/590d4089-bce0-4fcf-8dec-74f7b96675ce-kube-api-access-p9q4j\") pod \"590d4089-bce0-4fcf-8dec-74f7b96675ce\" (UID: \"590d4089-bce0-4fcf-8dec-74f7b96675ce\") " Dec 01 11:02:29 crc kubenswrapper[4958]: I1201 11:02:29.045686 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/590d4089-bce0-4fcf-8dec-74f7b96675ce-utilities\") pod \"590d4089-bce0-4fcf-8dec-74f7b96675ce\" (UID: \"590d4089-bce0-4fcf-8dec-74f7b96675ce\") " Dec 01 11:02:29 crc kubenswrapper[4958]: I1201 11:02:29.046665 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/590d4089-bce0-4fcf-8dec-74f7b96675ce-utilities" (OuterVolumeSpecName: "utilities") pod "590d4089-bce0-4fcf-8dec-74f7b96675ce" (UID: "590d4089-bce0-4fcf-8dec-74f7b96675ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:02:29 crc kubenswrapper[4958]: I1201 11:02:29.053104 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/590d4089-bce0-4fcf-8dec-74f7b96675ce-kube-api-access-p9q4j" (OuterVolumeSpecName: "kube-api-access-p9q4j") pod "590d4089-bce0-4fcf-8dec-74f7b96675ce" (UID: "590d4089-bce0-4fcf-8dec-74f7b96675ce"). InnerVolumeSpecName "kube-api-access-p9q4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:02:29 crc kubenswrapper[4958]: I1201 11:02:29.148448 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9q4j\" (UniqueName: \"kubernetes.io/projected/590d4089-bce0-4fcf-8dec-74f7b96675ce-kube-api-access-p9q4j\") on node \"crc\" DevicePath \"\"" Dec 01 11:02:29 crc kubenswrapper[4958]: I1201 11:02:29.148484 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/590d4089-bce0-4fcf-8dec-74f7b96675ce-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:02:29 crc kubenswrapper[4958]: I1201 11:02:29.171482 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/590d4089-bce0-4fcf-8dec-74f7b96675ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "590d4089-bce0-4fcf-8dec-74f7b96675ce" (UID: "590d4089-bce0-4fcf-8dec-74f7b96675ce"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:02:29 crc kubenswrapper[4958]: I1201 11:02:29.250166 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/590d4089-bce0-4fcf-8dec-74f7b96675ce-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:02:29 crc kubenswrapper[4958]: I1201 11:02:29.916990 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4f8qt" event={"ID":"590d4089-bce0-4fcf-8dec-74f7b96675ce","Type":"ContainerDied","Data":"51c8736d00339e7a9621ed3cdf4c794b1f8f7575e9dcf71da589383e2fc3e3d0"} Dec 01 11:02:29 crc kubenswrapper[4958]: I1201 11:02:29.917045 4958 scope.go:117] "RemoveContainer" containerID="793ac3c60db3cdec3119f5eda9e48bc3ebf00734b7139e5434275e17240334ca" Dec 01 11:02:29 crc kubenswrapper[4958]: I1201 11:02:29.917170 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4f8qt" Dec 01 11:02:29 crc kubenswrapper[4958]: I1201 11:02:29.948028 4958 scope.go:117] "RemoveContainer" containerID="b2546978c9857b6f555f42f0b1c681a91522eceb55a613474c8abb4f0b8a9b7e" Dec 01 11:02:29 crc kubenswrapper[4958]: I1201 11:02:29.951952 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4f8qt"] Dec 01 11:02:29 crc kubenswrapper[4958]: I1201 11:02:29.966488 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4f8qt"] Dec 01 11:02:29 crc kubenswrapper[4958]: I1201 11:02:29.984182 4958 scope.go:117] "RemoveContainer" containerID="fcf8311b84539ff0cd04af0c55a2f50ad32a42f2ad7a789a1a8b45de84070fd6" Dec 01 11:02:31 crc kubenswrapper[4958]: I1201 11:02:31.808688 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="590d4089-bce0-4fcf-8dec-74f7b96675ce" path="/var/lib/kubelet/pods/590d4089-bce0-4fcf-8dec-74f7b96675ce/volumes" Dec 01 11:02:33 crc kubenswrapper[4958]: I1201 11:02:33.809477 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:02:33 crc kubenswrapper[4958]: E1201 11:02:33.809955 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:02:46 crc kubenswrapper[4958]: I1201 11:02:46.797727 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:02:46 crc kubenswrapper[4958]: E1201 11:02:46.798755 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:02:58 crc kubenswrapper[4958]: I1201 11:02:58.797790 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:02:58 crc kubenswrapper[4958]: E1201 11:02:58.798710 
4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:03:09 crc kubenswrapper[4958]: I1201 11:03:09.798529 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:03:09 crc kubenswrapper[4958]: E1201 11:03:09.799774 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:03:24 crc kubenswrapper[4958]: I1201 11:03:24.797682 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:03:24 crc kubenswrapper[4958]: E1201 11:03:24.798392 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:03:37 crc kubenswrapper[4958]: I1201 11:03:37.799498 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:03:37 crc kubenswrapper[4958]: E1201 11:03:37.800533 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:03:51 crc kubenswrapper[4958]: I1201 11:03:51.797544 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:03:51 crc kubenswrapper[4958]: E1201 11:03:51.798325 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:04:02 crc kubenswrapper[4958]: I1201 11:04:02.798610 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:04:02 crc kubenswrapper[4958]: E1201 11:04:02.799960 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:04:14 crc kubenswrapper[4958]: I1201 11:04:14.797483 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:04:14 crc kubenswrapper[4958]: E1201 11:04:14.798255 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:04:25 crc kubenswrapper[4958]: I1201 11:04:25.798183 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:04:25 crc kubenswrapper[4958]: E1201 11:04:25.800967 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:04:40 crc kubenswrapper[4958]: I1201 11:04:40.797220 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:04:40 crc kubenswrapper[4958]: E1201 11:04:40.798023 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:04:54 crc kubenswrapper[4958]: I1201 11:04:54.797628 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:04:54 crc kubenswrapper[4958]: E1201 11:04:54.799908 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:05:06 crc kubenswrapper[4958]: I1201 11:05:06.797001 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:05:07 crc kubenswrapper[4958]: I1201 11:05:07.456818 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"b275ae34858e3975d83bf5c812c712720b91e48e28922db3966f0b48e6cff094"}
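
The recurring "CrashLoopBackOff: back-off 5m0s" entries above are the kubelet's periodic sync loop declining to restart machine-config-daemon while a back-off window is in force; the container is finally started again at 11:05:07, roughly the five-minute cap after its previous failure. The usual shape of such a policy is a capped exponential back-off. A minimal standalone sketch (the 10s initial delay is an assumption; the 5m cap is taken from the log; this is an illustration, not the kubelet's implementation):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 10 * time.Second       // assumed initial restart delay
        const maxDelay = 5 * time.Minute // matches the "back-off 5m0s" in the log

        for restart := 1; restart <= 8; restart++ {
            fmt.Printf("restart %d: wait %v before trying again\n", restart, delay)
            delay *= 2 // double the wait after each failed restart...
            if delay > maxDelay {
                delay = maxDelay // ...until it pins at the cap
            }
        }
    }

The printed waits run 10s, 20s, 40s, 1m20s, 2m40s, then pin at 5m0s, the figure the log keeps reporting.
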
Dec 01 11:07:28 crc kubenswrapper[4958]: I1201 11:07:28.210295 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:07:28 crc kubenswrapper[4958]: I1201 11:07:28.210947 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:07:58 crc kubenswrapper[4958]: I1201 11:07:58.210724 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:07:58 crc kubenswrapper[4958]: I1201 11:07:58.211184 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:08:08 crc kubenswrapper[4958]: I1201 11:08:08.596103 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9n9x6"] Dec 01 11:08:08 crc kubenswrapper[4958]: E1201 11:08:08.597084 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590d4089-bce0-4fcf-8dec-74f7b96675ce" containerName="extract-utilities" Dec 01 11:08:08 crc kubenswrapper[4958]: I1201 11:08:08.597100 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="590d4089-bce0-4fcf-8dec-74f7b96675ce" containerName="extract-utilities" Dec 01 11:08:08 crc kubenswrapper[4958]: E1201 11:08:08.597129 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590d4089-bce0-4fcf-8dec-74f7b96675ce" containerName="registry-server" Dec 01 11:08:08 crc kubenswrapper[4958]: I1201 11:08:08.597138 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="590d4089-bce0-4fcf-8dec-74f7b96675ce" containerName="registry-server" Dec 01 11:08:08 crc kubenswrapper[4958]: E1201 11:08:08.597159 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590d4089-bce0-4fcf-8dec-74f7b96675ce" containerName="extract-content" Dec 01 11:08:08 crc kubenswrapper[4958]: I1201 11:08:08.597168 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="590d4089-bce0-4fcf-8dec-74f7b96675ce" containerName="extract-content" Dec 01 11:08:08 crc kubenswrapper[4958]: I1201 11:08:08.597365 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="590d4089-bce0-4fcf-8dec-74f7b96675ce" containerName="registry-server"
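
The Liveness probe failures at 11:07:28 and 11:07:58 above ("connect: connection refused" against http://127.0.0.1:8798/health) follow the standard HTTP-probe contract: a transport error or a status code outside 200-399 counts as a failure, and after repeated failures the kubelet kills and restarts the container, as the "failed liveness probe, will be restarted" entry at 11:08:28 further down shows. A minimal single-shot probe in the same spirit (a standalone sketch, not the kubelet's prober):

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // probeOnce performs one HTTP liveness check: any transport error (such as
    // the "connection refused" above) or a status outside 200-399 is a failure.
    func probeOnce(url string) error {
        client := &http.Client{Timeout: time.Second}
        resp, err := client.Get(url)
        if err != nil {
            return fmt.Errorf("probe failed: %w", err)
        }
        defer resp.Body.Close()
        if resp.StatusCode < 200 || resp.StatusCode >= 400 {
            return fmt.Errorf("probe failed: status %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        // Mirrors the endpoint in the log; with nothing listening this prints
        // a "connection refused" error, the same condition the kubelet saw.
        if err := probeOnce("http://127.0.0.1:8798/health"); err != nil {
            fmt.Println(err)
        }
    }
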
Dec 01 11:08:08 crc kubenswrapper[4958]: I1201 11:08:08.599412 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9n9x6" Dec 01 11:08:08 crc kubenswrapper[4958]: I1201 11:08:08.617140 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9n9x6"] Dec 01 11:08:08 crc kubenswrapper[4958]: I1201 11:08:08.657420 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09162296-56c6-4a80-9199-e720d77d54e1-catalog-content\") pod \"certified-operators-9n9x6\" (UID: \"09162296-56c6-4a80-9199-e720d77d54e1\") " pod="openshift-marketplace/certified-operators-9n9x6" Dec 01 11:08:08 crc kubenswrapper[4958]: I1201 11:08:08.657639 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8d2g\" (UniqueName: \"kubernetes.io/projected/09162296-56c6-4a80-9199-e720d77d54e1-kube-api-access-g8d2g\") pod \"certified-operators-9n9x6\" (UID: \"09162296-56c6-4a80-9199-e720d77d54e1\") " pod="openshift-marketplace/certified-operators-9n9x6" Dec 01 11:08:08 crc kubenswrapper[4958]: I1201 11:08:08.657759 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09162296-56c6-4a80-9199-e720d77d54e1-utilities\") pod \"certified-operators-9n9x6\" (UID: \"09162296-56c6-4a80-9199-e720d77d54e1\") " pod="openshift-marketplace/certified-operators-9n9x6" Dec 01 11:08:08 crc kubenswrapper[4958]: I1201 11:08:08.780210 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09162296-56c6-4a80-9199-e720d77d54e1-utilities\") pod \"certified-operators-9n9x6\" (UID: \"09162296-56c6-4a80-9199-e720d77d54e1\") " pod="openshift-marketplace/certified-operators-9n9x6" Dec 01 11:08:08 crc kubenswrapper[4958]: I1201 11:08:08.780279 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09162296-56c6-4a80-9199-e720d77d54e1-catalog-content\") pod \"certified-operators-9n9x6\" (UID: \"09162296-56c6-4a80-9199-e720d77d54e1\") " pod="openshift-marketplace/certified-operators-9n9x6" Dec 01 11:08:08 crc kubenswrapper[4958]: I1201 11:08:08.780381 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8d2g\" (UniqueName: \"kubernetes.io/projected/09162296-56c6-4a80-9199-e720d77d54e1-kube-api-access-g8d2g\") pod \"certified-operators-9n9x6\" (UID: \"09162296-56c6-4a80-9199-e720d77d54e1\") " pod="openshift-marketplace/certified-operators-9n9x6" Dec 01 11:08:08 crc kubenswrapper[4958]: I1201 11:08:08.780896 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09162296-56c6-4a80-9199-e720d77d54e1-utilities\") pod \"certified-operators-9n9x6\" (UID: \"09162296-56c6-4a80-9199-e720d77d54e1\") " pod="openshift-marketplace/certified-operators-9n9x6" Dec 01 11:08:08 crc kubenswrapper[4958]: I1201 11:08:08.781142 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09162296-56c6-4a80-9199-e720d77d54e1-catalog-content\") pod \"certified-operators-9n9x6\" (UID: \"09162296-56c6-4a80-9199-e720d77d54e1\") " pod="openshift-marketplace/certified-operators-9n9x6" Dec 01 11:08:08 crc kubenswrapper[4958]: I1201 11:08:08.963577 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-g8d2g\" (UniqueName: \"kubernetes.io/projected/09162296-56c6-4a80-9199-e720d77d54e1-kube-api-access-g8d2g\") pod \"certified-operators-9n9x6\" (UID: \"09162296-56c6-4a80-9199-e720d77d54e1\") " pod="openshift-marketplace/certified-operators-9n9x6" Dec 01 11:08:09 crc kubenswrapper[4958]: I1201 11:08:09.232436 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9n9x6" Dec 01 11:08:09 crc kubenswrapper[4958]: I1201 11:08:09.752868 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9n9x6"] Dec 01 11:08:10 crc kubenswrapper[4958]: I1201 11:08:10.242511 4958 generic.go:334] "Generic (PLEG): container finished" podID="09162296-56c6-4a80-9199-e720d77d54e1" containerID="c38e0041aabe769ae3d65c0e00ea0f4a32c85b3d82f55c1c22f62b847bc97c8c" exitCode=0 Dec 01 11:08:10 crc kubenswrapper[4958]: I1201 11:08:10.242577 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n9x6" event={"ID":"09162296-56c6-4a80-9199-e720d77d54e1","Type":"ContainerDied","Data":"c38e0041aabe769ae3d65c0e00ea0f4a32c85b3d82f55c1c22f62b847bc97c8c"} Dec 01 11:08:10 crc kubenswrapper[4958]: I1201 11:08:10.242608 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n9x6" event={"ID":"09162296-56c6-4a80-9199-e720d77d54e1","Type":"ContainerStarted","Data":"42f750689d72adf1e5bcfc1b77c7441e474b7545006d0f0c566d519682ffbf94"} Dec 01 11:08:10 crc kubenswrapper[4958]: I1201 11:08:10.245656 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 11:08:11 crc kubenswrapper[4958]: I1201 11:08:11.251288 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n9x6" event={"ID":"09162296-56c6-4a80-9199-e720d77d54e1","Type":"ContainerStarted","Data":"87e6a2fa6e4a9e95e206e877c77841a4e304aeb29cc4038ef7867422fa55d3b6"} Dec 01 11:08:12 crc kubenswrapper[4958]: I1201 11:08:12.263203 4958 generic.go:334] "Generic (PLEG): container finished" podID="09162296-56c6-4a80-9199-e720d77d54e1" containerID="87e6a2fa6e4a9e95e206e877c77841a4e304aeb29cc4038ef7867422fa55d3b6" exitCode=0 Dec 01 11:08:12 crc kubenswrapper[4958]: I1201 11:08:12.263261 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n9x6" event={"ID":"09162296-56c6-4a80-9199-e720d77d54e1","Type":"ContainerDied","Data":"87e6a2fa6e4a9e95e206e877c77841a4e304aeb29cc4038ef7867422fa55d3b6"} Dec 01 11:08:13 crc kubenswrapper[4958]: I1201 11:08:13.275574 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n9x6" event={"ID":"09162296-56c6-4a80-9199-e720d77d54e1","Type":"ContainerStarted","Data":"79d35714d64b875f6581e89bc7521dc7135733614d1a79da51bba16caa65714c"} Dec 01 11:08:13 crc kubenswrapper[4958]: I1201 11:08:13.304994 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9n9x6" podStartSLOduration=2.553536818 podStartE2EDuration="5.304964646s" podCreationTimestamp="2025-12-01 11:08:08 +0000 UTC" firstStartedPulling="2025-12-01 11:08:10.245312044 +0000 UTC m=+4137.754101071" lastFinishedPulling="2025-12-01 11:08:12.996739862 +0000 UTC m=+4140.505528899" observedRunningTime="2025-12-01 11:08:13.297779463 +0000 UTC m=+4140.806568500" watchObservedRunningTime="2025-12-01 
11:08:13.304964646 +0000 UTC m=+4140.813753683" Dec 01 11:08:19 crc kubenswrapper[4958]: I1201 11:08:19.232799 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9n9x6" Dec 01 11:08:19 crc kubenswrapper[4958]: I1201 11:08:19.233698 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9n9x6" Dec 01 11:08:19 crc kubenswrapper[4958]: I1201 11:08:19.292579 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9n9x6" Dec 01 11:08:19 crc kubenswrapper[4958]: I1201 11:08:19.367368 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9n9x6" Dec 01 11:08:19 crc kubenswrapper[4958]: I1201 11:08:19.540928 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9n9x6"] Dec 01 11:08:21 crc kubenswrapper[4958]: I1201 11:08:21.336259 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9n9x6" podUID="09162296-56c6-4a80-9199-e720d77d54e1" containerName="registry-server" containerID="cri-o://79d35714d64b875f6581e89bc7521dc7135733614d1a79da51bba16caa65714c" gracePeriod=2 Dec 01 11:08:21 crc kubenswrapper[4958]: I1201 11:08:21.926136 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9n9x6" Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.014409 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8d2g\" (UniqueName: \"kubernetes.io/projected/09162296-56c6-4a80-9199-e720d77d54e1-kube-api-access-g8d2g\") pod \"09162296-56c6-4a80-9199-e720d77d54e1\" (UID: \"09162296-56c6-4a80-9199-e720d77d54e1\") " Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.014486 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09162296-56c6-4a80-9199-e720d77d54e1-catalog-content\") pod \"09162296-56c6-4a80-9199-e720d77d54e1\" (UID: \"09162296-56c6-4a80-9199-e720d77d54e1\") " Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.014532 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09162296-56c6-4a80-9199-e720d77d54e1-utilities\") pod \"09162296-56c6-4a80-9199-e720d77d54e1\" (UID: \"09162296-56c6-4a80-9199-e720d77d54e1\") " Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.015708 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09162296-56c6-4a80-9199-e720d77d54e1-utilities" (OuterVolumeSpecName: "utilities") pod "09162296-56c6-4a80-9199-e720d77d54e1" (UID: "09162296-56c6-4a80-9199-e720d77d54e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.020193 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09162296-56c6-4a80-9199-e720d77d54e1-kube-api-access-g8d2g" (OuterVolumeSpecName: "kube-api-access-g8d2g") pod "09162296-56c6-4a80-9199-e720d77d54e1" (UID: "09162296-56c6-4a80-9199-e720d77d54e1"). InnerVolumeSpecName "kube-api-access-g8d2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.068718 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09162296-56c6-4a80-9199-e720d77d54e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09162296-56c6-4a80-9199-e720d77d54e1" (UID: "09162296-56c6-4a80-9199-e720d77d54e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.116646 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8d2g\" (UniqueName: \"kubernetes.io/projected/09162296-56c6-4a80-9199-e720d77d54e1-kube-api-access-g8d2g\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.116687 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09162296-56c6-4a80-9199-e720d77d54e1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.116698 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09162296-56c6-4a80-9199-e720d77d54e1-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.348909 4958 generic.go:334] "Generic (PLEG): container finished" podID="09162296-56c6-4a80-9199-e720d77d54e1" containerID="79d35714d64b875f6581e89bc7521dc7135733614d1a79da51bba16caa65714c" exitCode=0 Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.348972 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n9x6" event={"ID":"09162296-56c6-4a80-9199-e720d77d54e1","Type":"ContainerDied","Data":"79d35714d64b875f6581e89bc7521dc7135733614d1a79da51bba16caa65714c"} Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.349014 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9n9x6" event={"ID":"09162296-56c6-4a80-9199-e720d77d54e1","Type":"ContainerDied","Data":"42f750689d72adf1e5bcfc1b77c7441e474b7545006d0f0c566d519682ffbf94"} Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.349010 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9n9x6" Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.349060 4958 scope.go:117] "RemoveContainer" containerID="79d35714d64b875f6581e89bc7521dc7135733614d1a79da51bba16caa65714c" Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.374900 4958 scope.go:117] "RemoveContainer" containerID="87e6a2fa6e4a9e95e206e877c77841a4e304aeb29cc4038ef7867422fa55d3b6" Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.396113 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9n9x6"] Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.401307 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9n9x6"] Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.406875 4958 scope.go:117] "RemoveContainer" containerID="c38e0041aabe769ae3d65c0e00ea0f4a32c85b3d82f55c1c22f62b847bc97c8c" Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.431538 4958 scope.go:117] "RemoveContainer" containerID="79d35714d64b875f6581e89bc7521dc7135733614d1a79da51bba16caa65714c" Dec 01 11:08:22 crc kubenswrapper[4958]: E1201 11:08:22.432280 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79d35714d64b875f6581e89bc7521dc7135733614d1a79da51bba16caa65714c\": container with ID starting with 79d35714d64b875f6581e89bc7521dc7135733614d1a79da51bba16caa65714c not found: ID does not exist" containerID="79d35714d64b875f6581e89bc7521dc7135733614d1a79da51bba16caa65714c" Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.432399 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79d35714d64b875f6581e89bc7521dc7135733614d1a79da51bba16caa65714c"} err="failed to get container status \"79d35714d64b875f6581e89bc7521dc7135733614d1a79da51bba16caa65714c\": rpc error: code = NotFound desc = could not find container \"79d35714d64b875f6581e89bc7521dc7135733614d1a79da51bba16caa65714c\": container with ID starting with 79d35714d64b875f6581e89bc7521dc7135733614d1a79da51bba16caa65714c not found: ID does not exist" Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.432442 4958 scope.go:117] "RemoveContainer" containerID="87e6a2fa6e4a9e95e206e877c77841a4e304aeb29cc4038ef7867422fa55d3b6" Dec 01 11:08:22 crc kubenswrapper[4958]: E1201 11:08:22.432762 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87e6a2fa6e4a9e95e206e877c77841a4e304aeb29cc4038ef7867422fa55d3b6\": container with ID starting with 87e6a2fa6e4a9e95e206e877c77841a4e304aeb29cc4038ef7867422fa55d3b6 not found: ID does not exist" containerID="87e6a2fa6e4a9e95e206e877c77841a4e304aeb29cc4038ef7867422fa55d3b6" Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.432793 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87e6a2fa6e4a9e95e206e877c77841a4e304aeb29cc4038ef7867422fa55d3b6"} err="failed to get container status \"87e6a2fa6e4a9e95e206e877c77841a4e304aeb29cc4038ef7867422fa55d3b6\": rpc error: code = NotFound desc = could not find container \"87e6a2fa6e4a9e95e206e877c77841a4e304aeb29cc4038ef7867422fa55d3b6\": container with ID starting with 87e6a2fa6e4a9e95e206e877c77841a4e304aeb29cc4038ef7867422fa55d3b6 not found: ID does not exist" Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.432814 4958 scope.go:117] "RemoveContainer" 
containerID="c38e0041aabe769ae3d65c0e00ea0f4a32c85b3d82f55c1c22f62b847bc97c8c" Dec 01 11:08:22 crc kubenswrapper[4958]: E1201 11:08:22.433205 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c38e0041aabe769ae3d65c0e00ea0f4a32c85b3d82f55c1c22f62b847bc97c8c\": container with ID starting with c38e0041aabe769ae3d65c0e00ea0f4a32c85b3d82f55c1c22f62b847bc97c8c not found: ID does not exist" containerID="c38e0041aabe769ae3d65c0e00ea0f4a32c85b3d82f55c1c22f62b847bc97c8c" Dec 01 11:08:22 crc kubenswrapper[4958]: I1201 11:08:22.433242 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c38e0041aabe769ae3d65c0e00ea0f4a32c85b3d82f55c1c22f62b847bc97c8c"} err="failed to get container status \"c38e0041aabe769ae3d65c0e00ea0f4a32c85b3d82f55c1c22f62b847bc97c8c\": rpc error: code = NotFound desc = could not find container \"c38e0041aabe769ae3d65c0e00ea0f4a32c85b3d82f55c1c22f62b847bc97c8c\": container with ID starting with c38e0041aabe769ae3d65c0e00ea0f4a32c85b3d82f55c1c22f62b847bc97c8c not found: ID does not exist" Dec 01 11:08:23 crc kubenswrapper[4958]: I1201 11:08:23.144101 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ms8vn"] Dec 01 11:08:23 crc kubenswrapper[4958]: E1201 11:08:23.144542 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09162296-56c6-4a80-9199-e720d77d54e1" containerName="extract-content" Dec 01 11:08:23 crc kubenswrapper[4958]: I1201 11:08:23.144569 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="09162296-56c6-4a80-9199-e720d77d54e1" containerName="extract-content" Dec 01 11:08:23 crc kubenswrapper[4958]: E1201 11:08:23.144592 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09162296-56c6-4a80-9199-e720d77d54e1" containerName="registry-server" Dec 01 11:08:23 crc kubenswrapper[4958]: I1201 11:08:23.144605 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="09162296-56c6-4a80-9199-e720d77d54e1" containerName="registry-server" Dec 01 11:08:23 crc kubenswrapper[4958]: E1201 11:08:23.144662 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09162296-56c6-4a80-9199-e720d77d54e1" containerName="extract-utilities" Dec 01 11:08:23 crc kubenswrapper[4958]: I1201 11:08:23.144676 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="09162296-56c6-4a80-9199-e720d77d54e1" containerName="extract-utilities" Dec 01 11:08:23 crc kubenswrapper[4958]: I1201 11:08:23.144937 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="09162296-56c6-4a80-9199-e720d77d54e1" containerName="registry-server"
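
The "ContainerStatus from runtime service failed ... NotFound" errors at 11:08:22 above are benign: the kubelet asks the runtime about containers it has just removed, the runtime reports them gone, and cleanup proceeds (the pod's REMOVE and volume cleanup still complete). A minimal standalone sketch of that tolerate-NotFound pattern (hypothetical helper names; a stdlib sentinel error stands in for the gRPC NotFound status shown in the log):

    package main

    import (
        "errors"
        "fmt"
    )

    // errNotFound stands in for the runtime's "rpc error: code = NotFound" reply.
    var errNotFound = errors.New("could not find container: ID does not exist")

    // deleteContainer is a hypothetical stand-in for the CRI removal call; here
    // it always reports the container as missing, like the entries above.
    func deleteContainer(id string) error {
        return fmt.Errorf("%w: %s", errNotFound, id)
    }

    // removeIfPresent treats "not found" as success: the container is already
    // gone, so there is nothing left to clean up and no reason to retry.
    func removeIfPresent(id string) error {
        if err := deleteContainer(id); err != nil {
            if errors.Is(err, errNotFound) {
                fmt.Printf("container %s already removed, nothing to do\n", id)
                return nil
            }
            return err
        }
        return nil
    }

    func main() {
        _ = removeIfPresent("79d35714d64b")
    }
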
Dec 01 11:08:23 crc kubenswrapper[4958]: I1201 11:08:23.146681 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ms8vn" Dec 01 11:08:23 crc kubenswrapper[4958]: I1201 11:08:23.158942 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ms8vn"] Dec 01 11:08:23 crc kubenswrapper[4958]: I1201 11:08:23.232887 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6fp4\" (UniqueName: \"kubernetes.io/projected/4240fd1c-e556-4a75-95e8-a05e13b59f2d-kube-api-access-t6fp4\") pod \"community-operators-ms8vn\" (UID: \"4240fd1c-e556-4a75-95e8-a05e13b59f2d\") " pod="openshift-marketplace/community-operators-ms8vn" Dec 01 11:08:23 crc kubenswrapper[4958]: I1201 11:08:23.232953 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4240fd1c-e556-4a75-95e8-a05e13b59f2d-utilities\") pod \"community-operators-ms8vn\" (UID: \"4240fd1c-e556-4a75-95e8-a05e13b59f2d\") " pod="openshift-marketplace/community-operators-ms8vn" Dec 01 11:08:23 crc kubenswrapper[4958]: I1201 11:08:23.233043 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4240fd1c-e556-4a75-95e8-a05e13b59f2d-catalog-content\") pod \"community-operators-ms8vn\" (UID: \"4240fd1c-e556-4a75-95e8-a05e13b59f2d\") " pod="openshift-marketplace/community-operators-ms8vn" Dec 01 11:08:23 crc kubenswrapper[4958]: I1201 11:08:23.333860 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4240fd1c-e556-4a75-95e8-a05e13b59f2d-catalog-content\") pod \"community-operators-ms8vn\" (UID: \"4240fd1c-e556-4a75-95e8-a05e13b59f2d\") " pod="openshift-marketplace/community-operators-ms8vn" Dec 01 11:08:23 crc kubenswrapper[4958]: I1201 11:08:23.333925 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6fp4\" (UniqueName: \"kubernetes.io/projected/4240fd1c-e556-4a75-95e8-a05e13b59f2d-kube-api-access-t6fp4\") pod \"community-operators-ms8vn\" (UID: \"4240fd1c-e556-4a75-95e8-a05e13b59f2d\") " pod="openshift-marketplace/community-operators-ms8vn" Dec 01 11:08:23 crc kubenswrapper[4958]: I1201 11:08:23.333961 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4240fd1c-e556-4a75-95e8-a05e13b59f2d-utilities\") pod \"community-operators-ms8vn\" (UID: \"4240fd1c-e556-4a75-95e8-a05e13b59f2d\") " pod="openshift-marketplace/community-operators-ms8vn" Dec 01 11:08:23 crc kubenswrapper[4958]: I1201 11:08:23.334822 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4240fd1c-e556-4a75-95e8-a05e13b59f2d-utilities\") pod \"community-operators-ms8vn\" (UID: \"4240fd1c-e556-4a75-95e8-a05e13b59f2d\") " pod="openshift-marketplace/community-operators-ms8vn" Dec 01 11:08:23 crc kubenswrapper[4958]: I1201 11:08:23.334885 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4240fd1c-e556-4a75-95e8-a05e13b59f2d-catalog-content\") pod \"community-operators-ms8vn\" (UID: \"4240fd1c-e556-4a75-95e8-a05e13b59f2d\") " pod="openshift-marketplace/community-operators-ms8vn" Dec 01 11:08:23 crc kubenswrapper[4958]: I1201 11:08:23.563698 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-t6fp4\" (UniqueName: \"kubernetes.io/projected/4240fd1c-e556-4a75-95e8-a05e13b59f2d-kube-api-access-t6fp4\") pod \"community-operators-ms8vn\" (UID: \"4240fd1c-e556-4a75-95e8-a05e13b59f2d\") " pod="openshift-marketplace/community-operators-ms8vn" Dec 01 11:08:23 crc kubenswrapper[4958]: I1201 11:08:23.762903 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ms8vn" Dec 01 11:08:23 crc kubenswrapper[4958]: I1201 11:08:23.830355 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09162296-56c6-4a80-9199-e720d77d54e1" path="/var/lib/kubelet/pods/09162296-56c6-4a80-9199-e720d77d54e1/volumes" Dec 01 11:08:24 crc kubenswrapper[4958]: I1201 11:08:24.223187 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ms8vn"] Dec 01 11:08:24 crc kubenswrapper[4958]: I1201 11:08:24.368127 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms8vn" event={"ID":"4240fd1c-e556-4a75-95e8-a05e13b59f2d","Type":"ContainerStarted","Data":"a4144327f4a2ec69b0e37cf91ab56a855dfd7384d82824a55e585b9f18f0906e"} Dec 01 11:08:25 crc kubenswrapper[4958]: I1201 11:08:25.381796 4958 generic.go:334] "Generic (PLEG): container finished" podID="4240fd1c-e556-4a75-95e8-a05e13b59f2d" containerID="596928f71c1e148a09ecaa31e312faf5d54d2f9de23c48415dec9b58bff1f2cd" exitCode=0 Dec 01 11:08:25 crc kubenswrapper[4958]: I1201 11:08:25.381928 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms8vn" event={"ID":"4240fd1c-e556-4a75-95e8-a05e13b59f2d","Type":"ContainerDied","Data":"596928f71c1e148a09ecaa31e312faf5d54d2f9de23c48415dec9b58bff1f2cd"} Dec 01 11:08:26 crc kubenswrapper[4958]: I1201 11:08:26.396450 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms8vn" event={"ID":"4240fd1c-e556-4a75-95e8-a05e13b59f2d","Type":"ContainerStarted","Data":"a5b43e47b33665dffacb6810d846546eaca72f5e5f7f735656595967e163b8e2"} Dec 01 11:08:27 crc kubenswrapper[4958]: I1201 11:08:27.419604 4958 generic.go:334] "Generic (PLEG): container finished" podID="4240fd1c-e556-4a75-95e8-a05e13b59f2d" containerID="a5b43e47b33665dffacb6810d846546eaca72f5e5f7f735656595967e163b8e2" exitCode=0 Dec 01 11:08:27 crc kubenswrapper[4958]: I1201 11:08:27.419727 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms8vn" event={"ID":"4240fd1c-e556-4a75-95e8-a05e13b59f2d","Type":"ContainerDied","Data":"a5b43e47b33665dffacb6810d846546eaca72f5e5f7f735656595967e163b8e2"} Dec 01 11:08:28 crc kubenswrapper[4958]: I1201 11:08:28.210360 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:08:28 crc kubenswrapper[4958]: I1201 11:08:28.210821 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:08:28 crc kubenswrapper[4958]: I1201 11:08:28.211146 4958 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 11:08:28 crc kubenswrapper[4958]: I1201 11:08:28.212344 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b275ae34858e3975d83bf5c812c712720b91e48e28922db3966f0b48e6cff094"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 11:08:28 crc kubenswrapper[4958]: I1201 11:08:28.212690 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://b275ae34858e3975d83bf5c812c712720b91e48e28922db3966f0b48e6cff094" gracePeriod=600 Dec 01 11:08:28 crc kubenswrapper[4958]: I1201 11:08:28.433228 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms8vn" event={"ID":"4240fd1c-e556-4a75-95e8-a05e13b59f2d","Type":"ContainerStarted","Data":"d1973b7adfbe20726d0e2112f6444c80156aa97ce79b9adcefb6069aa63ac118"} Dec 01 11:08:28 crc kubenswrapper[4958]: I1201 11:08:28.439442 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="b275ae34858e3975d83bf5c812c712720b91e48e28922db3966f0b48e6cff094" exitCode=0 Dec 01 11:08:28 crc kubenswrapper[4958]: I1201 11:08:28.439493 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"b275ae34858e3975d83bf5c812c712720b91e48e28922db3966f0b48e6cff094"} Dec 01 11:08:28 crc kubenswrapper[4958]: I1201 11:08:28.439538 4958 scope.go:117] "RemoveContainer" containerID="e110eb7c274c3c72501edcd3479a0b21f0552b468e45fd61a6b713bd90d2edfd" Dec 01 11:08:28 crc kubenswrapper[4958]: I1201 11:08:28.451543 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ms8vn" podStartSLOduration=2.8675796350000002 podStartE2EDuration="5.451528685s" podCreationTimestamp="2025-12-01 11:08:23 +0000 UTC" firstStartedPulling="2025-12-01 11:08:25.384940217 +0000 UTC m=+4152.893729284" lastFinishedPulling="2025-12-01 11:08:27.968889287 +0000 UTC m=+4155.477678334" observedRunningTime="2025-12-01 11:08:28.450568258 +0000 UTC m=+4155.959357325" watchObservedRunningTime="2025-12-01 11:08:28.451528685 +0000 UTC m=+4155.960317722" Dec 01 11:08:29 crc kubenswrapper[4958]: I1201 11:08:29.464910 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"} Dec 01 11:08:33 crc kubenswrapper[4958]: I1201 11:08:33.763827 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ms8vn" Dec 01 11:08:33 crc kubenswrapper[4958]: I1201 11:08:33.764779 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ms8vn" Dec 01 11:08:33 crc kubenswrapper[4958]: I1201 11:08:33.851650 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-ms8vn" Dec 01 11:08:34 crc kubenswrapper[4958]: I1201 11:08:34.576896 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ms8vn" Dec 01 11:08:36 crc kubenswrapper[4958]: I1201 11:08:36.383200 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ms8vn"] Dec 01 11:08:36 crc kubenswrapper[4958]: I1201 11:08:36.522740 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ms8vn" podUID="4240fd1c-e556-4a75-95e8-a05e13b59f2d" containerName="registry-server" containerID="cri-o://d1973b7adfbe20726d0e2112f6444c80156aa97ce79b9adcefb6069aa63ac118" gracePeriod=2 Dec 01 11:08:36 crc kubenswrapper[4958]: I1201 11:08:36.965800 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ms8vn" Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.161948 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6fp4\" (UniqueName: \"kubernetes.io/projected/4240fd1c-e556-4a75-95e8-a05e13b59f2d-kube-api-access-t6fp4\") pod \"4240fd1c-e556-4a75-95e8-a05e13b59f2d\" (UID: \"4240fd1c-e556-4a75-95e8-a05e13b59f2d\") " Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.162023 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4240fd1c-e556-4a75-95e8-a05e13b59f2d-catalog-content\") pod \"4240fd1c-e556-4a75-95e8-a05e13b59f2d\" (UID: \"4240fd1c-e556-4a75-95e8-a05e13b59f2d\") " Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.162171 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4240fd1c-e556-4a75-95e8-a05e13b59f2d-utilities\") pod \"4240fd1c-e556-4a75-95e8-a05e13b59f2d\" (UID: \"4240fd1c-e556-4a75-95e8-a05e13b59f2d\") " Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.163054 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4240fd1c-e556-4a75-95e8-a05e13b59f2d-utilities" (OuterVolumeSpecName: "utilities") pod "4240fd1c-e556-4a75-95e8-a05e13b59f2d" (UID: "4240fd1c-e556-4a75-95e8-a05e13b59f2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.199047 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4240fd1c-e556-4a75-95e8-a05e13b59f2d-kube-api-access-t6fp4" (OuterVolumeSpecName: "kube-api-access-t6fp4") pod "4240fd1c-e556-4a75-95e8-a05e13b59f2d" (UID: "4240fd1c-e556-4a75-95e8-a05e13b59f2d"). InnerVolumeSpecName "kube-api-access-t6fp4". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.263269 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4240fd1c-e556-4a75-95e8-a05e13b59f2d-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.263521 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6fp4\" (UniqueName: \"kubernetes.io/projected/4240fd1c-e556-4a75-95e8-a05e13b59f2d-kube-api-access-t6fp4\") on node \"crc\" DevicePath \"\""
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.281390 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4240fd1c-e556-4a75-95e8-a05e13b59f2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4240fd1c-e556-4a75-95e8-a05e13b59f2d" (UID: "4240fd1c-e556-4a75-95e8-a05e13b59f2d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.365471 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4240fd1c-e556-4a75-95e8-a05e13b59f2d-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.531738 4958 generic.go:334] "Generic (PLEG): container finished" podID="4240fd1c-e556-4a75-95e8-a05e13b59f2d" containerID="d1973b7adfbe20726d0e2112f6444c80156aa97ce79b9adcefb6069aa63ac118" exitCode=0
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.531783 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms8vn" event={"ID":"4240fd1c-e556-4a75-95e8-a05e13b59f2d","Type":"ContainerDied","Data":"d1973b7adfbe20726d0e2112f6444c80156aa97ce79b9adcefb6069aa63ac118"}
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.531811 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ms8vn" event={"ID":"4240fd1c-e556-4a75-95e8-a05e13b59f2d","Type":"ContainerDied","Data":"a4144327f4a2ec69b0e37cf91ab56a855dfd7384d82824a55e585b9f18f0906e"}
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.531865 4958 scope.go:117] "RemoveContainer" containerID="d1973b7adfbe20726d0e2112f6444c80156aa97ce79b9adcefb6069aa63ac118"
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.531949 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ms8vn"
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.564159 4958 scope.go:117] "RemoveContainer" containerID="a5b43e47b33665dffacb6810d846546eaca72f5e5f7f735656595967e163b8e2"
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.578552 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ms8vn"]
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.598354 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ms8vn"]
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.633731 4958 scope.go:117] "RemoveContainer" containerID="596928f71c1e148a09ecaa31e312faf5d54d2f9de23c48415dec9b58bff1f2cd"
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.655255 4958 scope.go:117] "RemoveContainer" containerID="d1973b7adfbe20726d0e2112f6444c80156aa97ce79b9adcefb6069aa63ac118"
Dec 01 11:08:37 crc kubenswrapper[4958]: E1201 11:08:37.655660 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1973b7adfbe20726d0e2112f6444c80156aa97ce79b9adcefb6069aa63ac118\": container with ID starting with d1973b7adfbe20726d0e2112f6444c80156aa97ce79b9adcefb6069aa63ac118 not found: ID does not exist" containerID="d1973b7adfbe20726d0e2112f6444c80156aa97ce79b9adcefb6069aa63ac118"
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.655694 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1973b7adfbe20726d0e2112f6444c80156aa97ce79b9adcefb6069aa63ac118"} err="failed to get container status \"d1973b7adfbe20726d0e2112f6444c80156aa97ce79b9adcefb6069aa63ac118\": rpc error: code = NotFound desc = could not find container \"d1973b7adfbe20726d0e2112f6444c80156aa97ce79b9adcefb6069aa63ac118\": container with ID starting with d1973b7adfbe20726d0e2112f6444c80156aa97ce79b9adcefb6069aa63ac118 not found: ID does not exist"
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.655724 4958 scope.go:117] "RemoveContainer" containerID="a5b43e47b33665dffacb6810d846546eaca72f5e5f7f735656595967e163b8e2"
Dec 01 11:08:37 crc kubenswrapper[4958]: E1201 11:08:37.656204 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5b43e47b33665dffacb6810d846546eaca72f5e5f7f735656595967e163b8e2\": container with ID starting with a5b43e47b33665dffacb6810d846546eaca72f5e5f7f735656595967e163b8e2 not found: ID does not exist" containerID="a5b43e47b33665dffacb6810d846546eaca72f5e5f7f735656595967e163b8e2"
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.656430 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b43e47b33665dffacb6810d846546eaca72f5e5f7f735656595967e163b8e2"} err="failed to get container status \"a5b43e47b33665dffacb6810d846546eaca72f5e5f7f735656595967e163b8e2\": rpc error: code = NotFound desc = could not find container \"a5b43e47b33665dffacb6810d846546eaca72f5e5f7f735656595967e163b8e2\": container with ID starting with a5b43e47b33665dffacb6810d846546eaca72f5e5f7f735656595967e163b8e2 not found: ID does not exist"
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.656616 4958 scope.go:117] "RemoveContainer" containerID="596928f71c1e148a09ecaa31e312faf5d54d2f9de23c48415dec9b58bff1f2cd"
Dec 01 11:08:37 crc kubenswrapper[4958]: E1201 11:08:37.657224 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"596928f71c1e148a09ecaa31e312faf5d54d2f9de23c48415dec9b58bff1f2cd\": container with ID starting with 596928f71c1e148a09ecaa31e312faf5d54d2f9de23c48415dec9b58bff1f2cd not found: ID does not exist" containerID="596928f71c1e148a09ecaa31e312faf5d54d2f9de23c48415dec9b58bff1f2cd"
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.657261 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596928f71c1e148a09ecaa31e312faf5d54d2f9de23c48415dec9b58bff1f2cd"} err="failed to get container status \"596928f71c1e148a09ecaa31e312faf5d54d2f9de23c48415dec9b58bff1f2cd\": rpc error: code = NotFound desc = could not find container \"596928f71c1e148a09ecaa31e312faf5d54d2f9de23c48415dec9b58bff1f2cd\": container with ID starting with 596928f71c1e148a09ecaa31e312faf5d54d2f9de23c48415dec9b58bff1f2cd not found: ID does not exist"
Dec 01 11:08:37 crc kubenswrapper[4958]: I1201 11:08:37.810797 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4240fd1c-e556-4a75-95e8-a05e13b59f2d" path="/var/lib/kubelet/pods/4240fd1c-e556-4a75-95e8-a05e13b59f2d/volumes"
Dec 01 11:09:37 crc kubenswrapper[4958]: I1201 11:09:37.215775 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vdrvd"]
Dec 01 11:09:37 crc kubenswrapper[4958]: E1201 11:09:37.217053 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4240fd1c-e556-4a75-95e8-a05e13b59f2d" containerName="extract-content"
Dec 01 11:09:37 crc kubenswrapper[4958]: I1201 11:09:37.217077 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="4240fd1c-e556-4a75-95e8-a05e13b59f2d" containerName="extract-content"
Dec 01 11:09:37 crc kubenswrapper[4958]: E1201 11:09:37.217102 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4240fd1c-e556-4a75-95e8-a05e13b59f2d" containerName="registry-server"
Dec 01 11:09:37 crc kubenswrapper[4958]: I1201 11:09:37.217113 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="4240fd1c-e556-4a75-95e8-a05e13b59f2d" containerName="registry-server"
Dec 01 11:09:37 crc kubenswrapper[4958]: E1201 11:09:37.217138 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4240fd1c-e556-4a75-95e8-a05e13b59f2d" containerName="extract-utilities"
Dec 01 11:09:37 crc kubenswrapper[4958]: I1201 11:09:37.217149 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="4240fd1c-e556-4a75-95e8-a05e13b59f2d" containerName="extract-utilities"
Dec 01 11:09:37 crc kubenswrapper[4958]: I1201 11:09:37.217403 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="4240fd1c-e556-4a75-95e8-a05e13b59f2d" containerName="registry-server"
Dec 01 11:09:37 crc kubenswrapper[4958]: I1201 11:09:37.287203 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdrvd"
Dec 01 11:09:37 crc kubenswrapper[4958]: I1201 11:09:37.297025 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdrvd"]
Dec 01 11:09:37 crc kubenswrapper[4958]: I1201 11:09:37.399699 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/280569db-7e8b-43ac-acca-7481ff294121-catalog-content\") pod \"redhat-marketplace-vdrvd\" (UID: \"280569db-7e8b-43ac-acca-7481ff294121\") " pod="openshift-marketplace/redhat-marketplace-vdrvd"
Dec 01 11:09:37 crc kubenswrapper[4958]: I1201 11:09:37.399746 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgrd8\" (UniqueName: \"kubernetes.io/projected/280569db-7e8b-43ac-acca-7481ff294121-kube-api-access-zgrd8\") pod \"redhat-marketplace-vdrvd\" (UID: \"280569db-7e8b-43ac-acca-7481ff294121\") " pod="openshift-marketplace/redhat-marketplace-vdrvd"
Dec 01 11:09:37 crc kubenswrapper[4958]: I1201 11:09:37.399774 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/280569db-7e8b-43ac-acca-7481ff294121-utilities\") pod \"redhat-marketplace-vdrvd\" (UID: \"280569db-7e8b-43ac-acca-7481ff294121\") " pod="openshift-marketplace/redhat-marketplace-vdrvd"
Dec 01 11:09:37 crc kubenswrapper[4958]: I1201 11:09:37.502076 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/280569db-7e8b-43ac-acca-7481ff294121-catalog-content\") pod \"redhat-marketplace-vdrvd\" (UID: \"280569db-7e8b-43ac-acca-7481ff294121\") " pod="openshift-marketplace/redhat-marketplace-vdrvd"
Dec 01 11:09:37 crc kubenswrapper[4958]: I1201 11:09:37.502175 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgrd8\" (UniqueName: \"kubernetes.io/projected/280569db-7e8b-43ac-acca-7481ff294121-kube-api-access-zgrd8\") pod \"redhat-marketplace-vdrvd\" (UID: \"280569db-7e8b-43ac-acca-7481ff294121\") " pod="openshift-marketplace/redhat-marketplace-vdrvd"
Dec 01 11:09:37 crc kubenswrapper[4958]: I1201 11:09:37.502244 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/280569db-7e8b-43ac-acca-7481ff294121-utilities\") pod \"redhat-marketplace-vdrvd\" (UID: \"280569db-7e8b-43ac-acca-7481ff294121\") " pod="openshift-marketplace/redhat-marketplace-vdrvd"
Dec 01 11:09:37 crc kubenswrapper[4958]: I1201 11:09:37.502769 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/280569db-7e8b-43ac-acca-7481ff294121-catalog-content\") pod \"redhat-marketplace-vdrvd\" (UID: \"280569db-7e8b-43ac-acca-7481ff294121\") " pod="openshift-marketplace/redhat-marketplace-vdrvd"
Dec 01 11:09:37 crc kubenswrapper[4958]: I1201 11:09:37.502813 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/280569db-7e8b-43ac-acca-7481ff294121-utilities\") pod \"redhat-marketplace-vdrvd\" (UID: \"280569db-7e8b-43ac-acca-7481ff294121\") " pod="openshift-marketplace/redhat-marketplace-vdrvd"
Dec 01 11:09:37 crc kubenswrapper[4958]: I1201 11:09:37.523141 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgrd8\" (UniqueName: \"kubernetes.io/projected/280569db-7e8b-43ac-acca-7481ff294121-kube-api-access-zgrd8\") pod \"redhat-marketplace-vdrvd\" (UID: \"280569db-7e8b-43ac-acca-7481ff294121\") " pod="openshift-marketplace/redhat-marketplace-vdrvd"
Dec 01 11:09:37 crc kubenswrapper[4958]: I1201 11:09:37.625510 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdrvd"
Dec 01 11:09:38 crc kubenswrapper[4958]: I1201 11:09:38.088948 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdrvd"]
Dec 01 11:09:38 crc kubenswrapper[4958]: I1201 11:09:38.157583 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdrvd" event={"ID":"280569db-7e8b-43ac-acca-7481ff294121","Type":"ContainerStarted","Data":"cde6276baf02d50e949be6755c0dc214da311e3202741d1418b18f215ba4a194"}
Dec 01 11:09:39 crc kubenswrapper[4958]: I1201 11:09:39.169740 4958 generic.go:334] "Generic (PLEG): container finished" podID="280569db-7e8b-43ac-acca-7481ff294121" containerID="ae876ab804ff56da83a4fd4d19159c9cac5b20bee0939fa076fade4d592afc01" exitCode=0
Dec 01 11:09:39 crc kubenswrapper[4958]: I1201 11:09:39.169792 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdrvd" event={"ID":"280569db-7e8b-43ac-acca-7481ff294121","Type":"ContainerDied","Data":"ae876ab804ff56da83a4fd4d19159c9cac5b20bee0939fa076fade4d592afc01"}
Dec 01 11:09:40 crc kubenswrapper[4958]: I1201 11:09:40.180805 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdrvd" event={"ID":"280569db-7e8b-43ac-acca-7481ff294121","Type":"ContainerStarted","Data":"6173dbfc0bef5413e624f6c481f8c5a94ff4647226a26407b8af8a431033cb23"}
Dec 01 11:09:41 crc kubenswrapper[4958]: I1201 11:09:41.189759 4958 generic.go:334] "Generic (PLEG): container finished" podID="280569db-7e8b-43ac-acca-7481ff294121" containerID="6173dbfc0bef5413e624f6c481f8c5a94ff4647226a26407b8af8a431033cb23" exitCode=0
Dec 01 11:09:41 crc kubenswrapper[4958]: I1201 11:09:41.189818 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdrvd" event={"ID":"280569db-7e8b-43ac-acca-7481ff294121","Type":"ContainerDied","Data":"6173dbfc0bef5413e624f6c481f8c5a94ff4647226a26407b8af8a431033cb23"}
Dec 01 11:09:42 crc kubenswrapper[4958]: I1201 11:09:42.202082 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdrvd" event={"ID":"280569db-7e8b-43ac-acca-7481ff294121","Type":"ContainerStarted","Data":"1883ab5ba589b9b6e20fd9dc13331aead23610623fad4633bb95f397079ff45e"}
Dec 01 11:09:42 crc kubenswrapper[4958]: I1201 11:09:42.231072 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vdrvd" podStartSLOduration=2.7616804630000003 podStartE2EDuration="5.231049716s" podCreationTimestamp="2025-12-01 11:09:37 +0000 UTC" firstStartedPulling="2025-12-01 11:09:39.17270352 +0000 UTC m=+4226.681492567" lastFinishedPulling="2025-12-01 11:09:41.642072773 +0000 UTC m=+4229.150861820" observedRunningTime="2025-12-01 11:09:42.227800444 +0000 UTC m=+4229.736589531" watchObservedRunningTime="2025-12-01 11:09:42.231049716 +0000 UTC m=+4229.739838763"
Dec 01 11:09:47 crc kubenswrapper[4958]: I1201 11:09:47.625667 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vdrvd"
Dec 01 11:09:47 crc kubenswrapper[4958]: I1201 11:09:47.626214 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vdrvd"
Dec 01 11:09:47 crc kubenswrapper[4958]: I1201 11:09:47.673274 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vdrvd"
Dec 01 11:09:48 crc kubenswrapper[4958]: I1201 11:09:48.362079 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vdrvd"
Dec 01 11:09:48 crc kubenswrapper[4958]: I1201 11:09:48.489245 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdrvd"]
Dec 01 11:09:50 crc kubenswrapper[4958]: I1201 11:09:50.314623 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vdrvd" podUID="280569db-7e8b-43ac-acca-7481ff294121" containerName="registry-server" containerID="cri-o://1883ab5ba589b9b6e20fd9dc13331aead23610623fad4633bb95f397079ff45e" gracePeriod=2
Dec 01 11:09:50 crc kubenswrapper[4958]: I1201 11:09:50.757528 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdrvd"
Dec 01 11:09:50 crc kubenswrapper[4958]: I1201 11:09:50.943721 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/280569db-7e8b-43ac-acca-7481ff294121-utilities\") pod \"280569db-7e8b-43ac-acca-7481ff294121\" (UID: \"280569db-7e8b-43ac-acca-7481ff294121\") "
Dec 01 11:09:50 crc kubenswrapper[4958]: I1201 11:09:50.943900 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgrd8\" (UniqueName: \"kubernetes.io/projected/280569db-7e8b-43ac-acca-7481ff294121-kube-api-access-zgrd8\") pod \"280569db-7e8b-43ac-acca-7481ff294121\" (UID: \"280569db-7e8b-43ac-acca-7481ff294121\") "
Dec 01 11:09:50 crc kubenswrapper[4958]: I1201 11:09:50.943971 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/280569db-7e8b-43ac-acca-7481ff294121-catalog-content\") pod \"280569db-7e8b-43ac-acca-7481ff294121\" (UID: \"280569db-7e8b-43ac-acca-7481ff294121\") "
Dec 01 11:09:50 crc kubenswrapper[4958]: I1201 11:09:50.946073 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/280569db-7e8b-43ac-acca-7481ff294121-utilities" (OuterVolumeSpecName: "utilities") pod "280569db-7e8b-43ac-acca-7481ff294121" (UID: "280569db-7e8b-43ac-acca-7481ff294121"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:09:50 crc kubenswrapper[4958]: I1201 11:09:50.950358 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/280569db-7e8b-43ac-acca-7481ff294121-kube-api-access-zgrd8" (OuterVolumeSpecName: "kube-api-access-zgrd8") pod "280569db-7e8b-43ac-acca-7481ff294121" (UID: "280569db-7e8b-43ac-acca-7481ff294121"). InnerVolumeSpecName "kube-api-access-zgrd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:09:50 crc kubenswrapper[4958]: I1201 11:09:50.965291 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/280569db-7e8b-43ac-acca-7481ff294121-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "280569db-7e8b-43ac-acca-7481ff294121" (UID: "280569db-7e8b-43ac-acca-7481ff294121"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.045341 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/280569db-7e8b-43ac-acca-7481ff294121-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.045383 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgrd8\" (UniqueName: \"kubernetes.io/projected/280569db-7e8b-43ac-acca-7481ff294121-kube-api-access-zgrd8\") on node \"crc\" DevicePath \"\""
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.045398 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/280569db-7e8b-43ac-acca-7481ff294121-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.326805 4958 generic.go:334] "Generic (PLEG): container finished" podID="280569db-7e8b-43ac-acca-7481ff294121" containerID="1883ab5ba589b9b6e20fd9dc13331aead23610623fad4633bb95f397079ff45e" exitCode=0
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.326859 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdrvd" event={"ID":"280569db-7e8b-43ac-acca-7481ff294121","Type":"ContainerDied","Data":"1883ab5ba589b9b6e20fd9dc13331aead23610623fad4633bb95f397079ff45e"}
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.326908 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdrvd" event={"ID":"280569db-7e8b-43ac-acca-7481ff294121","Type":"ContainerDied","Data":"cde6276baf02d50e949be6755c0dc214da311e3202741d1418b18f215ba4a194"}
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.326965 4958 scope.go:117] "RemoveContainer" containerID="1883ab5ba589b9b6e20fd9dc13331aead23610623fad4633bb95f397079ff45e"
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.326984 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdrvd"
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.354078 4958 scope.go:117] "RemoveContainer" containerID="6173dbfc0bef5413e624f6c481f8c5a94ff4647226a26407b8af8a431033cb23"
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.379975 4958 scope.go:117] "RemoveContainer" containerID="ae876ab804ff56da83a4fd4d19159c9cac5b20bee0939fa076fade4d592afc01"
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.382409 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdrvd"]
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.391814 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdrvd"]
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.414468 4958 scope.go:117] "RemoveContainer" containerID="1883ab5ba589b9b6e20fd9dc13331aead23610623fad4633bb95f397079ff45e"
Dec 01 11:09:51 crc kubenswrapper[4958]: E1201 11:09:51.415098 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1883ab5ba589b9b6e20fd9dc13331aead23610623fad4633bb95f397079ff45e\": container with ID starting with 1883ab5ba589b9b6e20fd9dc13331aead23610623fad4633bb95f397079ff45e not found: ID does not exist" containerID="1883ab5ba589b9b6e20fd9dc13331aead23610623fad4633bb95f397079ff45e"
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.415190 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1883ab5ba589b9b6e20fd9dc13331aead23610623fad4633bb95f397079ff45e"} err="failed to get container status \"1883ab5ba589b9b6e20fd9dc13331aead23610623fad4633bb95f397079ff45e\": rpc error: code = NotFound desc = could not find container \"1883ab5ba589b9b6e20fd9dc13331aead23610623fad4633bb95f397079ff45e\": container with ID starting with 1883ab5ba589b9b6e20fd9dc13331aead23610623fad4633bb95f397079ff45e not found: ID does not exist"
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.415221 4958 scope.go:117] "RemoveContainer" containerID="6173dbfc0bef5413e624f6c481f8c5a94ff4647226a26407b8af8a431033cb23"
Dec 01 11:09:51 crc kubenswrapper[4958]: E1201 11:09:51.415587 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6173dbfc0bef5413e624f6c481f8c5a94ff4647226a26407b8af8a431033cb23\": container with ID starting with 6173dbfc0bef5413e624f6c481f8c5a94ff4647226a26407b8af8a431033cb23 not found: ID does not exist" containerID="6173dbfc0bef5413e624f6c481f8c5a94ff4647226a26407b8af8a431033cb23"
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.415624 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6173dbfc0bef5413e624f6c481f8c5a94ff4647226a26407b8af8a431033cb23"} err="failed to get container status \"6173dbfc0bef5413e624f6c481f8c5a94ff4647226a26407b8af8a431033cb23\": rpc error: code = NotFound desc = could not find container \"6173dbfc0bef5413e624f6c481f8c5a94ff4647226a26407b8af8a431033cb23\": container with ID starting with 6173dbfc0bef5413e624f6c481f8c5a94ff4647226a26407b8af8a431033cb23 not found: ID does not exist"
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.415643 4958 scope.go:117] "RemoveContainer" containerID="ae876ab804ff56da83a4fd4d19159c9cac5b20bee0939fa076fade4d592afc01"
Dec 01 11:09:51 crc kubenswrapper[4958]: E1201 11:09:51.415981 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae876ab804ff56da83a4fd4d19159c9cac5b20bee0939fa076fade4d592afc01\": container with ID starting with ae876ab804ff56da83a4fd4d19159c9cac5b20bee0939fa076fade4d592afc01 not found: ID does not exist" containerID="ae876ab804ff56da83a4fd4d19159c9cac5b20bee0939fa076fade4d592afc01"
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.416038 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae876ab804ff56da83a4fd4d19159c9cac5b20bee0939fa076fade4d592afc01"} err="failed to get container status \"ae876ab804ff56da83a4fd4d19159c9cac5b20bee0939fa076fade4d592afc01\": rpc error: code = NotFound desc = could not find container \"ae876ab804ff56da83a4fd4d19159c9cac5b20bee0939fa076fade4d592afc01\": container with ID starting with ae876ab804ff56da83a4fd4d19159c9cac5b20bee0939fa076fade4d592afc01 not found: ID does not exist"
Dec 01 11:09:51 crc kubenswrapper[4958]: I1201 11:09:51.811818 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="280569db-7e8b-43ac-acca-7481ff294121" path="/var/lib/kubelet/pods/280569db-7e8b-43ac-acca-7481ff294121/volumes"
Dec 01 11:10:28 crc kubenswrapper[4958]: I1201 11:10:28.211247 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 11:10:28 crc kubenswrapper[4958]: I1201 11:10:28.212044 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 11:10:58 crc kubenswrapper[4958]: I1201 11:10:58.210715 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 11:10:58 crc kubenswrapper[4958]: I1201 11:10:58.211397 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 11:11:28 crc kubenswrapper[4958]: I1201 11:11:28.210214 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 11:11:28 crc kubenswrapper[4958]: I1201 11:11:28.210771 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 11:11:28 crc kubenswrapper[4958]: I1201 11:11:28.210931 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7"
Dec 01 11:11:28 crc kubenswrapper[4958]: I1201 11:11:28.211620 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 11:11:28 crc kubenswrapper[4958]: I1201 11:11:28.211685 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3" gracePeriod=600
Dec 01 11:11:28 crc kubenswrapper[4958]: E1201 11:11:28.372711 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:11:28 crc kubenswrapper[4958]: I1201 11:11:28.483487 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3" exitCode=0
Dec 01 11:11:28 crc kubenswrapper[4958]: I1201 11:11:28.483505 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"}
Dec 01 11:11:28 crc kubenswrapper[4958]: I1201 11:11:28.483597 4958 scope.go:117] "RemoveContainer" containerID="b275ae34858e3975d83bf5c812c712720b91e48e28922db3966f0b48e6cff094"
Dec 01 11:11:28 crc kubenswrapper[4958]: I1201 11:11:28.484550 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:11:28 crc kubenswrapper[4958]: E1201 11:11:28.485113 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:11:39 crc kubenswrapper[4958]: I1201 11:11:39.797577 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:11:39 crc kubenswrapper[4958]: E1201 11:11:39.798981 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:11:50 crc kubenswrapper[4958]: I1201 11:11:50.797979 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:11:50 crc kubenswrapper[4958]: E1201 11:11:50.798778 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:12:05 crc kubenswrapper[4958]: I1201 11:12:05.797738 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:12:05 crc kubenswrapper[4958]: E1201 11:12:05.798702 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:12:20 crc kubenswrapper[4958]: I1201 11:12:20.797560 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:12:20 crc kubenswrapper[4958]: E1201 11:12:20.798825 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:12:31 crc kubenswrapper[4958]: I1201 11:12:31.798089 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:12:31 crc kubenswrapper[4958]: E1201 11:12:31.799168 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:12:44 crc kubenswrapper[4958]: I1201 11:12:44.798279 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:12:44 crc kubenswrapper[4958]: E1201 11:12:44.799190 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:12:58 crc kubenswrapper[4958]: I1201 11:12:58.797502 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:12:58 crc kubenswrapper[4958]: E1201 11:12:58.798242 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:13:11 crc kubenswrapper[4958]: I1201 11:13:11.797448 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:13:11 crc kubenswrapper[4958]: E1201 11:13:11.798458 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:13:22 crc kubenswrapper[4958]: I1201 11:13:22.798743 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:13:22 crc kubenswrapper[4958]: E1201 11:13:22.800537 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:13:36 crc kubenswrapper[4958]: I1201 11:13:36.797912 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:13:36 crc kubenswrapper[4958]: E1201 11:13:36.799824 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:13:50 crc kubenswrapper[4958]: I1201 11:13:50.798180 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:13:50 crc kubenswrapper[4958]: E1201 11:13:50.799237 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:14:01 crc kubenswrapper[4958]: I1201 11:14:01.797473 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:14:01 crc kubenswrapper[4958]: E1201 11:14:01.798442 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:14:12 crc kubenswrapper[4958]: I1201 11:14:12.798752 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:14:12 crc kubenswrapper[4958]: E1201 11:14:12.799776 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:14:23 crc kubenswrapper[4958]: I1201 11:14:23.808694 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:14:23 crc kubenswrapper[4958]: E1201 11:14:23.809790 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:14:36 crc kubenswrapper[4958]: I1201 11:14:36.798273 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:14:36 crc kubenswrapper[4958]: E1201 11:14:36.799303 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:14:49 crc kubenswrapper[4958]: I1201 11:14:49.797651 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:14:49 crc kubenswrapper[4958]: E1201 11:14:49.798619 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.228352 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw"]
Dec 01 11:15:00 crc kubenswrapper[4958]: E1201 11:15:00.229910 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="280569db-7e8b-43ac-acca-7481ff294121" containerName="extract-utilities"
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.229933 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="280569db-7e8b-43ac-acca-7481ff294121" containerName="extract-utilities"
Dec 01 11:15:00 crc kubenswrapper[4958]: E1201 11:15:00.230008 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="280569db-7e8b-43ac-acca-7481ff294121" containerName="registry-server"
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.230021 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="280569db-7e8b-43ac-acca-7481ff294121" containerName="registry-server"
Dec 01 11:15:00 crc kubenswrapper[4958]: E1201 11:15:00.230050 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="280569db-7e8b-43ac-acca-7481ff294121" containerName="extract-content"
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.230063 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="280569db-7e8b-43ac-acca-7481ff294121" containerName="extract-content"
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.230576 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="280569db-7e8b-43ac-acca-7481ff294121" containerName="registry-server"
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.231914 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw"
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.238369 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.238373 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.248141 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw"]
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.351963 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/033ea980-682d-4f50-911d-25b850724299-secret-volume\") pod \"collect-profiles-29409795-b44gw\" (UID: \"033ea980-682d-4f50-911d-25b850724299\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw"
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.352090 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbktw\" (UniqueName: \"kubernetes.io/projected/033ea980-682d-4f50-911d-25b850724299-kube-api-access-nbktw\") pod \"collect-profiles-29409795-b44gw\" (UID: \"033ea980-682d-4f50-911d-25b850724299\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw"
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.352226 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/033ea980-682d-4f50-911d-25b850724299-config-volume\") pod \"collect-profiles-29409795-b44gw\" (UID: \"033ea980-682d-4f50-911d-25b850724299\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw"
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.453719 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/033ea980-682d-4f50-911d-25b850724299-config-volume\") pod \"collect-profiles-29409795-b44gw\" (UID: \"033ea980-682d-4f50-911d-25b850724299\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw"
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.453900 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/033ea980-682d-4f50-911d-25b850724299-secret-volume\") pod \"collect-profiles-29409795-b44gw\" (UID: \"033ea980-682d-4f50-911d-25b850724299\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw"
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.454003 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbktw\" (UniqueName: \"kubernetes.io/projected/033ea980-682d-4f50-911d-25b850724299-kube-api-access-nbktw\") pod \"collect-profiles-29409795-b44gw\" (UID: \"033ea980-682d-4f50-911d-25b850724299\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw"
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.456680 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/033ea980-682d-4f50-911d-25b850724299-config-volume\") pod \"collect-profiles-29409795-b44gw\" (UID: \"033ea980-682d-4f50-911d-25b850724299\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw"
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.467317 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/033ea980-682d-4f50-911d-25b850724299-secret-volume\") pod \"collect-profiles-29409795-b44gw\" (UID: \"033ea980-682d-4f50-911d-25b850724299\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw"
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.472620 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbktw\" (UniqueName: \"kubernetes.io/projected/033ea980-682d-4f50-911d-25b850724299-kube-api-access-nbktw\") pod \"collect-profiles-29409795-b44gw\" (UID: \"033ea980-682d-4f50-911d-25b850724299\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw"
Dec 01 11:15:00 crc kubenswrapper[4958]: I1201 11:15:00.575437 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw"
Dec 01 11:15:01 crc kubenswrapper[4958]: I1201 11:15:01.022732 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw"]
Dec 01 11:15:01 crc kubenswrapper[4958]: I1201 11:15:01.608494 4958 generic.go:334] "Generic (PLEG): container finished" podID="033ea980-682d-4f50-911d-25b850724299" containerID="4e28f6fcc54f78af81403bfbb65944317ce64821b656b453531b90863cbfa1a7" exitCode=0
Dec 01 11:15:01 crc kubenswrapper[4958]: I1201 11:15:01.608576 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw" event={"ID":"033ea980-682d-4f50-911d-25b850724299","Type":"ContainerDied","Data":"4e28f6fcc54f78af81403bfbb65944317ce64821b656b453531b90863cbfa1a7"}
Dec 01 11:15:01 crc kubenswrapper[4958]: I1201 11:15:01.608808 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw" event={"ID":"033ea980-682d-4f50-911d-25b850724299","Type":"ContainerStarted","Data":"82c02d94eabaff3eb0b555867a8659dda07762c0a63d6bccb2ab45e9fa6958cc"}
Dec 01 11:15:02 crc kubenswrapper[4958]: I1201 11:15:02.928214 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw"
Dec 01 11:15:03 crc kubenswrapper[4958]: I1201 11:15:03.093276 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/033ea980-682d-4f50-911d-25b850724299-secret-volume\") pod \"033ea980-682d-4f50-911d-25b850724299\" (UID: \"033ea980-682d-4f50-911d-25b850724299\") "
Dec 01 11:15:03 crc kubenswrapper[4958]: I1201 11:15:03.093351 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbktw\" (UniqueName: \"kubernetes.io/projected/033ea980-682d-4f50-911d-25b850724299-kube-api-access-nbktw\") pod \"033ea980-682d-4f50-911d-25b850724299\" (UID: \"033ea980-682d-4f50-911d-25b850724299\") "
Dec 01 11:15:03 crc kubenswrapper[4958]: I1201 11:15:03.093420 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/033ea980-682d-4f50-911d-25b850724299-config-volume\") pod \"033ea980-682d-4f50-911d-25b850724299\" (UID: \"033ea980-682d-4f50-911d-25b850724299\") "
Dec 01 11:15:03 crc kubenswrapper[4958]: I1201 11:15:03.094678 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/033ea980-682d-4f50-911d-25b850724299-config-volume" (OuterVolumeSpecName: "config-volume") pod "033ea980-682d-4f50-911d-25b850724299" (UID: "033ea980-682d-4f50-911d-25b850724299"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:15:03 crc kubenswrapper[4958]: I1201 11:15:03.100303 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/033ea980-682d-4f50-911d-25b850724299-kube-api-access-nbktw" (OuterVolumeSpecName: "kube-api-access-nbktw") pod "033ea980-682d-4f50-911d-25b850724299" (UID: "033ea980-682d-4f50-911d-25b850724299"). InnerVolumeSpecName "kube-api-access-nbktw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:15:03 crc kubenswrapper[4958]: I1201 11:15:03.101315 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033ea980-682d-4f50-911d-25b850724299-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "033ea980-682d-4f50-911d-25b850724299" (UID: "033ea980-682d-4f50-911d-25b850724299"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:15:03 crc kubenswrapper[4958]: I1201 11:15:03.194798 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbktw\" (UniqueName: \"kubernetes.io/projected/033ea980-682d-4f50-911d-25b850724299-kube-api-access-nbktw\") on node \"crc\" DevicePath \"\""
Dec 01 11:15:03 crc kubenswrapper[4958]: I1201 11:15:03.195067 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/033ea980-682d-4f50-911d-25b850724299-config-volume\") on node \"crc\" DevicePath \"\""
Dec 01 11:15:03 crc kubenswrapper[4958]: I1201 11:15:03.195169 4958 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/033ea980-682d-4f50-911d-25b850724299-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 01 11:15:03 crc kubenswrapper[4958]: I1201 11:15:03.633313 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw" event={"ID":"033ea980-682d-4f50-911d-25b850724299","Type":"ContainerDied","Data":"82c02d94eabaff3eb0b555867a8659dda07762c0a63d6bccb2ab45e9fa6958cc"}
Dec 01 11:15:03 crc kubenswrapper[4958]: I1201 11:15:03.633390 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw"
Dec 01 11:15:03 crc kubenswrapper[4958]: I1201 11:15:03.633432 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82c02d94eabaff3eb0b555867a8659dda07762c0a63d6bccb2ab45e9fa6958cc"
Dec 01 11:15:04 crc kubenswrapper[4958]: I1201 11:15:04.026582 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt"]
Dec 01 11:15:04 crc kubenswrapper[4958]: I1201 11:15:04.035402 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-gbtnt"]
Dec 01 11:15:04 crc kubenswrapper[4958]: I1201 11:15:04.798180 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:15:04 crc kubenswrapper[4958]: E1201 11:15:04.798501 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:15:05 crc kubenswrapper[4958]: I1201 11:15:05.816163 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec14779d-958a-438c-95e8-721e994a1d6d" path="/var/lib/kubelet/pods/ec14779d-958a-438c-95e8-721e994a1d6d/volumes"
Dec 01 11:15:15 crc kubenswrapper[4958]: I1201 11:15:15.797412 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:15:15 crc kubenswrapper[4958]: E1201 11:15:15.798174 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:15:22 crc kubenswrapper[4958]: I1201 11:15:22.223938 4958 scope.go:117] "RemoveContainer" containerID="dbb51fdf4dcf5aa0e37ed33343a6ed85a8a48df098be737a2509423bcc49c5d2"
Dec 01 11:15:30 crc kubenswrapper[4958]: I1201 11:15:30.797900 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:15:30 crc kubenswrapper[4958]: E1201 11:15:30.799028 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:15:41 crc kubenswrapper[4958]: I1201 11:15:41.797719 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:15:41 crc kubenswrapper[4958]: E1201 11:15:41.799053 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:15:55 crc kubenswrapper[4958]: I1201 11:15:55.798248 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:15:55 crc kubenswrapper[4958]: E1201 11:15:55.799341 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:16:10 crc kubenswrapper[4958]: I1201 11:16:10.797399 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:16:10 crc kubenswrapper[4958]: E1201 11:16:10.798182 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:16:15 crc kubenswrapper[4958]: I1201 11:16:15.813228 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lms7l"]
Dec 01 11:16:15 crc kubenswrapper[4958]: E1201 11:16:15.814227 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033ea980-682d-4f50-911d-25b850724299" containerName="collect-profiles"
Dec 01 11:16:15 crc kubenswrapper[4958]: I1201 11:16:15.814247 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="033ea980-682d-4f50-911d-25b850724299" containerName="collect-profiles"
Dec 01 11:16:15 crc kubenswrapper[4958]: I1201 11:16:15.814494 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="033ea980-682d-4f50-911d-25b850724299" containerName="collect-profiles"
Dec 01 11:16:15 crc kubenswrapper[4958]: I1201 11:16:15.815926 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lms7l"
Dec 01 11:16:15 crc kubenswrapper[4958]: I1201 11:16:15.845783 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lms7l"]
Dec 01 11:16:15 crc kubenswrapper[4958]: I1201 11:16:15.965226 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d714101e-0469-4677-a54c-ef27f8c21227-catalog-content\") pod \"redhat-operators-lms7l\" (UID: \"d714101e-0469-4677-a54c-ef27f8c21227\") " pod="openshift-marketplace/redhat-operators-lms7l"
Dec 01 11:16:15 crc kubenswrapper[4958]: I1201 11:16:15.965506 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85vw4\" (UniqueName: \"kubernetes.io/projected/d714101e-0469-4677-a54c-ef27f8c21227-kube-api-access-85vw4\") pod \"redhat-operators-lms7l\" (UID: \"d714101e-0469-4677-a54c-ef27f8c21227\") " pod="openshift-marketplace/redhat-operators-lms7l"
Dec 01 11:16:15 crc kubenswrapper[4958]: I1201 11:16:15.965552 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d714101e-0469-4677-a54c-ef27f8c21227-utilities\") pod \"redhat-operators-lms7l\" (UID: \"d714101e-0469-4677-a54c-ef27f8c21227\") " pod="openshift-marketplace/redhat-operators-lms7l"
Dec 01 11:16:16 crc kubenswrapper[4958]: I1201 11:16:16.067126 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d714101e-0469-4677-a54c-ef27f8c21227-utilities\") pod \"redhat-operators-lms7l\" (UID: \"d714101e-0469-4677-a54c-ef27f8c21227\") " pod="openshift-marketplace/redhat-operators-lms7l"
Dec 01 11:16:16 crc kubenswrapper[4958]: I1201 11:16:16.067264 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d714101e-0469-4677-a54c-ef27f8c21227-catalog-content\") pod \"redhat-operators-lms7l\" (UID: \"d714101e-0469-4677-a54c-ef27f8c21227\") " pod="openshift-marketplace/redhat-operators-lms7l"
Dec 01 11:16:16 crc kubenswrapper[4958]: I1201 11:16:16.067359 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85vw4\" (UniqueName: \"kubernetes.io/projected/d714101e-0469-4677-a54c-ef27f8c21227-kube-api-access-85vw4\") pod \"redhat-operators-lms7l\" (UID: \"d714101e-0469-4677-a54c-ef27f8c21227\") " pod="openshift-marketplace/redhat-operators-lms7l"
Dec 01 11:16:16 crc kubenswrapper[4958]: I1201 11:16:16.068258 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d714101e-0469-4677-a54c-ef27f8c21227-utilities\") pod \"redhat-operators-lms7l\" (UID: \"d714101e-0469-4677-a54c-ef27f8c21227\") " pod="openshift-marketplace/redhat-operators-lms7l"
Dec 01 11:16:16 crc kubenswrapper[4958]: I1201 11:16:16.068314 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d714101e-0469-4677-a54c-ef27f8c21227-catalog-content\") pod \"redhat-operators-lms7l\" (UID: \"d714101e-0469-4677-a54c-ef27f8c21227\") " pod="openshift-marketplace/redhat-operators-lms7l"
Dec 01 11:16:16 crc kubenswrapper[4958]: I1201 11:16:16.090220 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85vw4\" (UniqueName: \"kubernetes.io/projected/d714101e-0469-4677-a54c-ef27f8c21227-kube-api-access-85vw4\") pod \"redhat-operators-lms7l\" (UID: \"d714101e-0469-4677-a54c-ef27f8c21227\") " pod="openshift-marketplace/redhat-operators-lms7l"
Dec 01 11:16:16 crc kubenswrapper[4958]: I1201 11:16:16.150415 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lms7l"
Dec 01 11:16:16 crc kubenswrapper[4958]: I1201 11:16:16.624265 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lms7l"]
Dec 01 11:16:17 crc kubenswrapper[4958]: I1201 11:16:17.284366 4958 generic.go:334] "Generic (PLEG): container finished" podID="d714101e-0469-4677-a54c-ef27f8c21227" containerID="b9d33ef23820851cf19323699ab25221bfba07f64f8795321fc2de4935d1288b" exitCode=0
Dec 01 11:16:17 crc kubenswrapper[4958]: I1201 11:16:17.284706 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lms7l" event={"ID":"d714101e-0469-4677-a54c-ef27f8c21227","Type":"ContainerDied","Data":"b9d33ef23820851cf19323699ab25221bfba07f64f8795321fc2de4935d1288b"}
Dec 01 11:16:17 crc kubenswrapper[4958]: I1201 11:16:17.284737 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lms7l" event={"ID":"d714101e-0469-4677-a54c-ef27f8c21227","Type":"ContainerStarted","Data":"e6e0a0ddf937ee1107cdb8b907b842dbef1044da9f013195df0e2d7dd77dfb61"}
Dec 01 11:16:17 crc kubenswrapper[4958]: I1201 11:16:17.286429 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 11:16:19 crc kubenswrapper[4958]: I1201 11:16:19.317116 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lms7l" event={"ID":"d714101e-0469-4677-a54c-ef27f8c21227","Type":"ContainerStarted","Data":"9199b19da0fe4c4ef4711200f49ccf8798d0c5af25fb9834c8321e30b712fffc"}
Dec 01 11:16:20 crc kubenswrapper[4958]: I1201 11:16:20.331595 4958 generic.go:334] "Generic (PLEG): container finished" podID="d714101e-0469-4677-a54c-ef27f8c21227" containerID="9199b19da0fe4c4ef4711200f49ccf8798d0c5af25fb9834c8321e30b712fffc" exitCode=0
Dec 01 11:16:20 crc kubenswrapper[4958]: I1201 11:16:20.331652 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lms7l" event={"ID":"d714101e-0469-4677-a54c-ef27f8c21227","Type":"ContainerDied","Data":"9199b19da0fe4c4ef4711200f49ccf8798d0c5af25fb9834c8321e30b712fffc"}
Dec 01 11:16:22 crc kubenswrapper[4958]: I1201 11:16:22.373470 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lms7l" event={"ID":"d714101e-0469-4677-a54c-ef27f8c21227","Type":"ContainerStarted","Data":"06b948d97f2ad55cb70c7ca6686ef979279f18f2a43cfa614a327c6eb7a22d83"}
Dec 01 11:16:22 crc kubenswrapper[4958]: I1201 11:16:22.410078 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lms7l" podStartSLOduration=3.391747477 podStartE2EDuration="7.410025068s" podCreationTimestamp="2025-12-01 11:16:15 +0000 UTC" firstStartedPulling="2025-12-01 11:16:17.28622518 +0000 UTC m=+4624.795014207" lastFinishedPulling="2025-12-01 11:16:21.304502771 +0000 UTC m=+4628.813291798" observedRunningTime="2025-12-01 11:16:22.40015742 +0000 UTC m=+4629.908946537" watchObservedRunningTime="2025-12-01 11:16:22.410025068 +0000 UTC m=+4629.918814115"
Dec 01 11:16:23 crc kubenswrapper[4958]: I1201 11:16:23.804175 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:16:23 crc kubenswrapper[4958]: E1201 11:16:23.804728 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:16:26 crc kubenswrapper[4958]: I1201 11:16:26.150875 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lms7l"
Dec 01 11:16:26 crc kubenswrapper[4958]: I1201 11:16:26.151251 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lms7l"
Dec 01 11:16:27 crc kubenswrapper[4958]: I1201 11:16:27.230465 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lms7l" podUID="d714101e-0469-4677-a54c-ef27f8c21227" containerName="registry-server" probeResult="failure" output=<
Dec 01 11:16:27 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s
Dec 01 11:16:27 crc kubenswrapper[4958]: >
Dec 01 11:16:36 crc kubenswrapper[4958]: I1201 11:16:36.208016 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lms7l"
Dec 01 11:16:36 crc kubenswrapper[4958]: I1201 11:16:36.266982 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lms7l"
Dec 01 11:16:36 crc kubenswrapper[4958]: I1201 11:16:36.457124 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lms7l"]
Dec 01 11:16:36 crc kubenswrapper[4958]: I1201 11:16:36.798041 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3"
Dec 01 11:16:37 crc kubenswrapper[4958]: I1201 11:16:37.523225 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"01cdf88f930e439a7789701e601c7f4fd6042b8650b40106465e84317ace78b8"}
Dec 01 11:16:37 crc kubenswrapper[4958]: I1201 11:16:37.523404 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lms7l" podUID="d714101e-0469-4677-a54c-ef27f8c21227" containerName="registry-server" containerID="cri-o://06b948d97f2ad55cb70c7ca6686ef979279f18f2a43cfa614a327c6eb7a22d83" gracePeriod=2
Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.038876 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lms7l"
Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.233913 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85vw4\" (UniqueName: \"kubernetes.io/projected/d714101e-0469-4677-a54c-ef27f8c21227-kube-api-access-85vw4\") pod \"d714101e-0469-4677-a54c-ef27f8c21227\" (UID: \"d714101e-0469-4677-a54c-ef27f8c21227\") "
Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.234028 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d714101e-0469-4677-a54c-ef27f8c21227-utilities\") pod \"d714101e-0469-4677-a54c-ef27f8c21227\" (UID: \"d714101e-0469-4677-a54c-ef27f8c21227\") "
Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.234111 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d714101e-0469-4677-a54c-ef27f8c21227-catalog-content\") pod \"d714101e-0469-4677-a54c-ef27f8c21227\" (UID: \"d714101e-0469-4677-a54c-ef27f8c21227\") "
Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.236174 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d714101e-0469-4677-a54c-ef27f8c21227-utilities" (OuterVolumeSpecName: "utilities") pod "d714101e-0469-4677-a54c-ef27f8c21227" (UID: "d714101e-0469-4677-a54c-ef27f8c21227"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.244696 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d714101e-0469-4677-a54c-ef27f8c21227-kube-api-access-85vw4" (OuterVolumeSpecName: "kube-api-access-85vw4") pod "d714101e-0469-4677-a54c-ef27f8c21227" (UID: "d714101e-0469-4677-a54c-ef27f8c21227"). InnerVolumeSpecName "kube-api-access-85vw4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.335999 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85vw4\" (UniqueName: \"kubernetes.io/projected/d714101e-0469-4677-a54c-ef27f8c21227-kube-api-access-85vw4\") on node \"crc\" DevicePath \"\""
Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.336350 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d714101e-0469-4677-a54c-ef27f8c21227-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.355540 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d714101e-0469-4677-a54c-ef27f8c21227-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d714101e-0469-4677-a54c-ef27f8c21227" (UID: "d714101e-0469-4677-a54c-ef27f8c21227"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.437811 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d714101e-0469-4677-a54c-ef27f8c21227-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.537141 4958 generic.go:334] "Generic (PLEG): container finished" podID="d714101e-0469-4677-a54c-ef27f8c21227" containerID="06b948d97f2ad55cb70c7ca6686ef979279f18f2a43cfa614a327c6eb7a22d83" exitCode=0 Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.537205 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lms7l" event={"ID":"d714101e-0469-4677-a54c-ef27f8c21227","Type":"ContainerDied","Data":"06b948d97f2ad55cb70c7ca6686ef979279f18f2a43cfa614a327c6eb7a22d83"} Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.537283 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lms7l" event={"ID":"d714101e-0469-4677-a54c-ef27f8c21227","Type":"ContainerDied","Data":"e6e0a0ddf937ee1107cdb8b907b842dbef1044da9f013195df0e2d7dd77dfb61"} Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.537299 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lms7l" Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.537323 4958 scope.go:117] "RemoveContainer" containerID="06b948d97f2ad55cb70c7ca6686ef979279f18f2a43cfa614a327c6eb7a22d83" Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.571022 4958 scope.go:117] "RemoveContainer" containerID="9199b19da0fe4c4ef4711200f49ccf8798d0c5af25fb9834c8321e30b712fffc" Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.595463 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lms7l"] Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.602371 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lms7l"] Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.616670 4958 scope.go:117] "RemoveContainer" containerID="b9d33ef23820851cf19323699ab25221bfba07f64f8795321fc2de4935d1288b" Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.655826 4958 scope.go:117] "RemoveContainer" containerID="06b948d97f2ad55cb70c7ca6686ef979279f18f2a43cfa614a327c6eb7a22d83" Dec 01 11:16:38 crc kubenswrapper[4958]: E1201 11:16:38.656542 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06b948d97f2ad55cb70c7ca6686ef979279f18f2a43cfa614a327c6eb7a22d83\": container with ID starting with 06b948d97f2ad55cb70c7ca6686ef979279f18f2a43cfa614a327c6eb7a22d83 not found: ID does not exist" containerID="06b948d97f2ad55cb70c7ca6686ef979279f18f2a43cfa614a327c6eb7a22d83" Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.656585 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06b948d97f2ad55cb70c7ca6686ef979279f18f2a43cfa614a327c6eb7a22d83"} err="failed to get container status \"06b948d97f2ad55cb70c7ca6686ef979279f18f2a43cfa614a327c6eb7a22d83\": rpc error: code = NotFound desc = could not find container \"06b948d97f2ad55cb70c7ca6686ef979279f18f2a43cfa614a327c6eb7a22d83\": container with ID starting with 06b948d97f2ad55cb70c7ca6686ef979279f18f2a43cfa614a327c6eb7a22d83 not found: ID does not exist" Dec 01 11:16:38 crc 
kubenswrapper[4958]: I1201 11:16:38.656606 4958 scope.go:117] "RemoveContainer" containerID="9199b19da0fe4c4ef4711200f49ccf8798d0c5af25fb9834c8321e30b712fffc" Dec 01 11:16:38 crc kubenswrapper[4958]: E1201 11:16:38.657128 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9199b19da0fe4c4ef4711200f49ccf8798d0c5af25fb9834c8321e30b712fffc\": container with ID starting with 9199b19da0fe4c4ef4711200f49ccf8798d0c5af25fb9834c8321e30b712fffc not found: ID does not exist" containerID="9199b19da0fe4c4ef4711200f49ccf8798d0c5af25fb9834c8321e30b712fffc" Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.657181 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9199b19da0fe4c4ef4711200f49ccf8798d0c5af25fb9834c8321e30b712fffc"} err="failed to get container status \"9199b19da0fe4c4ef4711200f49ccf8798d0c5af25fb9834c8321e30b712fffc\": rpc error: code = NotFound desc = could not find container \"9199b19da0fe4c4ef4711200f49ccf8798d0c5af25fb9834c8321e30b712fffc\": container with ID starting with 9199b19da0fe4c4ef4711200f49ccf8798d0c5af25fb9834c8321e30b712fffc not found: ID does not exist" Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.657216 4958 scope.go:117] "RemoveContainer" containerID="b9d33ef23820851cf19323699ab25221bfba07f64f8795321fc2de4935d1288b" Dec 01 11:16:38 crc kubenswrapper[4958]: E1201 11:16:38.657625 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9d33ef23820851cf19323699ab25221bfba07f64f8795321fc2de4935d1288b\": container with ID starting with b9d33ef23820851cf19323699ab25221bfba07f64f8795321fc2de4935d1288b not found: ID does not exist" containerID="b9d33ef23820851cf19323699ab25221bfba07f64f8795321fc2de4935d1288b" Dec 01 11:16:38 crc kubenswrapper[4958]: I1201 11:16:38.657655 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9d33ef23820851cf19323699ab25221bfba07f64f8795321fc2de4935d1288b"} err="failed to get container status \"b9d33ef23820851cf19323699ab25221bfba07f64f8795321fc2de4935d1288b\": rpc error: code = NotFound desc = could not find container \"b9d33ef23820851cf19323699ab25221bfba07f64f8795321fc2de4935d1288b\": container with ID starting with b9d33ef23820851cf19323699ab25221bfba07f64f8795321fc2de4935d1288b not found: ID does not exist" Dec 01 11:16:39 crc kubenswrapper[4958]: I1201 11:16:39.811535 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d714101e-0469-4677-a54c-ef27f8c21227" path="/var/lib/kubelet/pods/d714101e-0469-4677-a54c-ef27f8c21227/volumes" Dec 01 11:18:58 crc kubenswrapper[4958]: I1201 11:18:58.210373 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:18:58 crc kubenswrapper[4958]: I1201 11:18:58.210961 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:19:01 crc kubenswrapper[4958]: I1201 11:19:01.458484 4958 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/certified-operators-jspx5"] Dec 01 11:19:01 crc kubenswrapper[4958]: E1201 11:19:01.459133 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d714101e-0469-4677-a54c-ef27f8c21227" containerName="extract-utilities" Dec 01 11:19:01 crc kubenswrapper[4958]: I1201 11:19:01.459150 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d714101e-0469-4677-a54c-ef27f8c21227" containerName="extract-utilities" Dec 01 11:19:01 crc kubenswrapper[4958]: E1201 11:19:01.459169 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d714101e-0469-4677-a54c-ef27f8c21227" containerName="registry-server" Dec 01 11:19:01 crc kubenswrapper[4958]: I1201 11:19:01.459175 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d714101e-0469-4677-a54c-ef27f8c21227" containerName="registry-server" Dec 01 11:19:01 crc kubenswrapper[4958]: E1201 11:19:01.459197 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d714101e-0469-4677-a54c-ef27f8c21227" containerName="extract-content" Dec 01 11:19:01 crc kubenswrapper[4958]: I1201 11:19:01.459202 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d714101e-0469-4677-a54c-ef27f8c21227" containerName="extract-content" Dec 01 11:19:01 crc kubenswrapper[4958]: I1201 11:19:01.459353 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d714101e-0469-4677-a54c-ef27f8c21227" containerName="registry-server" Dec 01 11:19:01 crc kubenswrapper[4958]: I1201 11:19:01.460453 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jspx5" Dec 01 11:19:01 crc kubenswrapper[4958]: I1201 11:19:01.474614 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jspx5"] Dec 01 11:19:01 crc kubenswrapper[4958]: I1201 11:19:01.633784 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28961a04-18a1-43d4-bebb-365a533c8f47-catalog-content\") pod \"certified-operators-jspx5\" (UID: \"28961a04-18a1-43d4-bebb-365a533c8f47\") " pod="openshift-marketplace/certified-operators-jspx5" Dec 01 11:19:01 crc kubenswrapper[4958]: I1201 11:19:01.633832 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2cw8\" (UniqueName: \"kubernetes.io/projected/28961a04-18a1-43d4-bebb-365a533c8f47-kube-api-access-k2cw8\") pod \"certified-operators-jspx5\" (UID: \"28961a04-18a1-43d4-bebb-365a533c8f47\") " pod="openshift-marketplace/certified-operators-jspx5" Dec 01 11:19:01 crc kubenswrapper[4958]: I1201 11:19:01.633874 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28961a04-18a1-43d4-bebb-365a533c8f47-utilities\") pod \"certified-operators-jspx5\" (UID: \"28961a04-18a1-43d4-bebb-365a533c8f47\") " pod="openshift-marketplace/certified-operators-jspx5" Dec 01 11:19:01 crc kubenswrapper[4958]: I1201 11:19:01.735623 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28961a04-18a1-43d4-bebb-365a533c8f47-catalog-content\") pod \"certified-operators-jspx5\" (UID: \"28961a04-18a1-43d4-bebb-365a533c8f47\") " pod="openshift-marketplace/certified-operators-jspx5" Dec 01 11:19:01 crc kubenswrapper[4958]: I1201 11:19:01.735712 4958 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2cw8\" (UniqueName: \"kubernetes.io/projected/28961a04-18a1-43d4-bebb-365a533c8f47-kube-api-access-k2cw8\") pod \"certified-operators-jspx5\" (UID: \"28961a04-18a1-43d4-bebb-365a533c8f47\") " pod="openshift-marketplace/certified-operators-jspx5" Dec 01 11:19:01 crc kubenswrapper[4958]: I1201 11:19:01.735764 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28961a04-18a1-43d4-bebb-365a533c8f47-utilities\") pod \"certified-operators-jspx5\" (UID: \"28961a04-18a1-43d4-bebb-365a533c8f47\") " pod="openshift-marketplace/certified-operators-jspx5" Dec 01 11:19:01 crc kubenswrapper[4958]: I1201 11:19:01.736405 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28961a04-18a1-43d4-bebb-365a533c8f47-catalog-content\") pod \"certified-operators-jspx5\" (UID: \"28961a04-18a1-43d4-bebb-365a533c8f47\") " pod="openshift-marketplace/certified-operators-jspx5" Dec 01 11:19:01 crc kubenswrapper[4958]: I1201 11:19:01.736646 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28961a04-18a1-43d4-bebb-365a533c8f47-utilities\") pod \"certified-operators-jspx5\" (UID: \"28961a04-18a1-43d4-bebb-365a533c8f47\") " pod="openshift-marketplace/certified-operators-jspx5" Dec 01 11:19:01 crc kubenswrapper[4958]: I1201 11:19:01.756083 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2cw8\" (UniqueName: \"kubernetes.io/projected/28961a04-18a1-43d4-bebb-365a533c8f47-kube-api-access-k2cw8\") pod \"certified-operators-jspx5\" (UID: \"28961a04-18a1-43d4-bebb-365a533c8f47\") " pod="openshift-marketplace/certified-operators-jspx5" Dec 01 11:19:01 crc kubenswrapper[4958]: I1201 11:19:01.787698 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jspx5" Dec 01 11:19:02 crc kubenswrapper[4958]: I1201 11:19:02.342584 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jspx5"] Dec 01 11:19:02 crc kubenswrapper[4958]: W1201 11:19:02.349390 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28961a04_18a1_43d4_bebb_365a533c8f47.slice/crio-993fa4c7c82ed1a3e0db1a6eb4299196ad7ba6feeed5019296d85ebbe4e13e3f WatchSource:0}: Error finding container 993fa4c7c82ed1a3e0db1a6eb4299196ad7ba6feeed5019296d85ebbe4e13e3f: Status 404 returned error can't find the container with id 993fa4c7c82ed1a3e0db1a6eb4299196ad7ba6feeed5019296d85ebbe4e13e3f Dec 01 11:19:03 crc kubenswrapper[4958]: I1201 11:19:03.050018 4958 generic.go:334] "Generic (PLEG): container finished" podID="28961a04-18a1-43d4-bebb-365a533c8f47" containerID="b31021e223c0b51057752f37c7c4254d8e8596e17d92de47a3d14ea814319de8" exitCode=0 Dec 01 11:19:03 crc kubenswrapper[4958]: I1201 11:19:03.050095 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jspx5" event={"ID":"28961a04-18a1-43d4-bebb-365a533c8f47","Type":"ContainerDied","Data":"b31021e223c0b51057752f37c7c4254d8e8596e17d92de47a3d14ea814319de8"} Dec 01 11:19:03 crc kubenswrapper[4958]: I1201 11:19:03.050143 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jspx5" event={"ID":"28961a04-18a1-43d4-bebb-365a533c8f47","Type":"ContainerStarted","Data":"993fa4c7c82ed1a3e0db1a6eb4299196ad7ba6feeed5019296d85ebbe4e13e3f"} Dec 01 11:19:05 crc kubenswrapper[4958]: I1201 11:19:05.077911 4958 generic.go:334] "Generic (PLEG): container finished" podID="28961a04-18a1-43d4-bebb-365a533c8f47" containerID="27ee5b1f88b045c9f44a365a956f4e550458792bf97dfdc6e9666391155cfda6" exitCode=0 Dec 01 11:19:05 crc kubenswrapper[4958]: I1201 11:19:05.078343 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jspx5" event={"ID":"28961a04-18a1-43d4-bebb-365a533c8f47","Type":"ContainerDied","Data":"27ee5b1f88b045c9f44a365a956f4e550458792bf97dfdc6e9666391155cfda6"} Dec 01 11:19:06 crc kubenswrapper[4958]: I1201 11:19:06.089188 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jspx5" event={"ID":"28961a04-18a1-43d4-bebb-365a533c8f47","Type":"ContainerStarted","Data":"c2b882390d2c921012c2cb0f89c53d05d5f8ac139dda9f0ae144b0700946874c"} Dec 01 11:19:06 crc kubenswrapper[4958]: I1201 11:19:06.121062 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jspx5" podStartSLOduration=2.655055746 podStartE2EDuration="5.121035342s" podCreationTimestamp="2025-12-01 11:19:01 +0000 UTC" firstStartedPulling="2025-12-01 11:19:03.053149328 +0000 UTC m=+4790.561938395" lastFinishedPulling="2025-12-01 11:19:05.519128914 +0000 UTC m=+4793.027917991" observedRunningTime="2025-12-01 11:19:06.111082982 +0000 UTC m=+4793.619872029" watchObservedRunningTime="2025-12-01 11:19:06.121035342 +0000 UTC m=+4793.629824369" Dec 01 11:19:11 crc kubenswrapper[4958]: I1201 11:19:11.788302 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jspx5" Dec 01 11:19:11 crc kubenswrapper[4958]: I1201 11:19:11.789092 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-jspx5" Dec 01 11:19:11 crc kubenswrapper[4958]: I1201 11:19:11.854677 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jspx5" Dec 01 11:19:12 crc kubenswrapper[4958]: I1201 11:19:12.194700 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jspx5" Dec 01 11:19:12 crc kubenswrapper[4958]: I1201 11:19:12.246453 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jspx5"] Dec 01 11:19:14 crc kubenswrapper[4958]: I1201 11:19:14.153520 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jspx5" podUID="28961a04-18a1-43d4-bebb-365a533c8f47" containerName="registry-server" containerID="cri-o://c2b882390d2c921012c2cb0f89c53d05d5f8ac139dda9f0ae144b0700946874c" gracePeriod=2 Dec 01 11:19:14 crc kubenswrapper[4958]: I1201 11:19:14.561873 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jspx5" Dec 01 11:19:14 crc kubenswrapper[4958]: I1201 11:19:14.754837 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28961a04-18a1-43d4-bebb-365a533c8f47-utilities\") pod \"28961a04-18a1-43d4-bebb-365a533c8f47\" (UID: \"28961a04-18a1-43d4-bebb-365a533c8f47\") " Dec 01 11:19:14 crc kubenswrapper[4958]: I1201 11:19:14.754962 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2cw8\" (UniqueName: \"kubernetes.io/projected/28961a04-18a1-43d4-bebb-365a533c8f47-kube-api-access-k2cw8\") pod \"28961a04-18a1-43d4-bebb-365a533c8f47\" (UID: \"28961a04-18a1-43d4-bebb-365a533c8f47\") " Dec 01 11:19:14 crc kubenswrapper[4958]: I1201 11:19:14.755099 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28961a04-18a1-43d4-bebb-365a533c8f47-catalog-content\") pod \"28961a04-18a1-43d4-bebb-365a533c8f47\" (UID: \"28961a04-18a1-43d4-bebb-365a533c8f47\") " Dec 01 11:19:14 crc kubenswrapper[4958]: I1201 11:19:14.757342 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28961a04-18a1-43d4-bebb-365a533c8f47-utilities" (OuterVolumeSpecName: "utilities") pod "28961a04-18a1-43d4-bebb-365a533c8f47" (UID: "28961a04-18a1-43d4-bebb-365a533c8f47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:19:14 crc kubenswrapper[4958]: I1201 11:19:14.761804 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28961a04-18a1-43d4-bebb-365a533c8f47-kube-api-access-k2cw8" (OuterVolumeSpecName: "kube-api-access-k2cw8") pod "28961a04-18a1-43d4-bebb-365a533c8f47" (UID: "28961a04-18a1-43d4-bebb-365a533c8f47"). InnerVolumeSpecName "kube-api-access-k2cw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:19:14 crc kubenswrapper[4958]: I1201 11:19:14.857401 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28961a04-18a1-43d4-bebb-365a533c8f47-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:19:14 crc kubenswrapper[4958]: I1201 11:19:14.857435 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2cw8\" (UniqueName: \"kubernetes.io/projected/28961a04-18a1-43d4-bebb-365a533c8f47-kube-api-access-k2cw8\") on node \"crc\" DevicePath \"\"" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.163772 4958 generic.go:334] "Generic (PLEG): container finished" podID="28961a04-18a1-43d4-bebb-365a533c8f47" containerID="c2b882390d2c921012c2cb0f89c53d05d5f8ac139dda9f0ae144b0700946874c" exitCode=0 Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.163834 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jspx5" event={"ID":"28961a04-18a1-43d4-bebb-365a533c8f47","Type":"ContainerDied","Data":"c2b882390d2c921012c2cb0f89c53d05d5f8ac139dda9f0ae144b0700946874c"} Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.164177 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jspx5" event={"ID":"28961a04-18a1-43d4-bebb-365a533c8f47","Type":"ContainerDied","Data":"993fa4c7c82ed1a3e0db1a6eb4299196ad7ba6feeed5019296d85ebbe4e13e3f"} Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.164199 4958 scope.go:117] "RemoveContainer" containerID="c2b882390d2c921012c2cb0f89c53d05d5f8ac139dda9f0ae144b0700946874c" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.163871 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jspx5" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.169071 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28961a04-18a1-43d4-bebb-365a533c8f47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28961a04-18a1-43d4-bebb-365a533c8f47" (UID: "28961a04-18a1-43d4-bebb-365a533c8f47"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.190546 4958 scope.go:117] "RemoveContainer" containerID="27ee5b1f88b045c9f44a365a956f4e550458792bf97dfdc6e9666391155cfda6" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.210948 4958 scope.go:117] "RemoveContainer" containerID="b31021e223c0b51057752f37c7c4254d8e8596e17d92de47a3d14ea814319de8" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.239058 4958 scope.go:117] "RemoveContainer" containerID="c2b882390d2c921012c2cb0f89c53d05d5f8ac139dda9f0ae144b0700946874c" Dec 01 11:19:15 crc kubenswrapper[4958]: E1201 11:19:15.239647 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2b882390d2c921012c2cb0f89c53d05d5f8ac139dda9f0ae144b0700946874c\": container with ID starting with c2b882390d2c921012c2cb0f89c53d05d5f8ac139dda9f0ae144b0700946874c not found: ID does not exist" containerID="c2b882390d2c921012c2cb0f89c53d05d5f8ac139dda9f0ae144b0700946874c" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.239699 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2b882390d2c921012c2cb0f89c53d05d5f8ac139dda9f0ae144b0700946874c"} err="failed to get container status \"c2b882390d2c921012c2cb0f89c53d05d5f8ac139dda9f0ae144b0700946874c\": rpc error: code = NotFound desc = could not find container \"c2b882390d2c921012c2cb0f89c53d05d5f8ac139dda9f0ae144b0700946874c\": container with ID starting with c2b882390d2c921012c2cb0f89c53d05d5f8ac139dda9f0ae144b0700946874c not found: ID does not exist" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.239732 4958 scope.go:117] "RemoveContainer" containerID="27ee5b1f88b045c9f44a365a956f4e550458792bf97dfdc6e9666391155cfda6" Dec 01 11:19:15 crc kubenswrapper[4958]: E1201 11:19:15.240240 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27ee5b1f88b045c9f44a365a956f4e550458792bf97dfdc6e9666391155cfda6\": container with ID starting with 27ee5b1f88b045c9f44a365a956f4e550458792bf97dfdc6e9666391155cfda6 not found: ID does not exist" containerID="27ee5b1f88b045c9f44a365a956f4e550458792bf97dfdc6e9666391155cfda6" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.240276 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ee5b1f88b045c9f44a365a956f4e550458792bf97dfdc6e9666391155cfda6"} err="failed to get container status \"27ee5b1f88b045c9f44a365a956f4e550458792bf97dfdc6e9666391155cfda6\": rpc error: code = NotFound desc = could not find container \"27ee5b1f88b045c9f44a365a956f4e550458792bf97dfdc6e9666391155cfda6\": container with ID starting with 27ee5b1f88b045c9f44a365a956f4e550458792bf97dfdc6e9666391155cfda6 not found: ID does not exist" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.240300 4958 scope.go:117] "RemoveContainer" containerID="b31021e223c0b51057752f37c7c4254d8e8596e17d92de47a3d14ea814319de8" Dec 01 11:19:15 crc kubenswrapper[4958]: E1201 11:19:15.240603 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b31021e223c0b51057752f37c7c4254d8e8596e17d92de47a3d14ea814319de8\": container with ID starting with b31021e223c0b51057752f37c7c4254d8e8596e17d92de47a3d14ea814319de8 not found: ID does not exist" containerID="b31021e223c0b51057752f37c7c4254d8e8596e17d92de47a3d14ea814319de8" Dec 01 11:19:15 crc 
kubenswrapper[4958]: I1201 11:19:15.240630 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31021e223c0b51057752f37c7c4254d8e8596e17d92de47a3d14ea814319de8"} err="failed to get container status \"b31021e223c0b51057752f37c7c4254d8e8596e17d92de47a3d14ea814319de8\": rpc error: code = NotFound desc = could not find container \"b31021e223c0b51057752f37c7c4254d8e8596e17d92de47a3d14ea814319de8\": container with ID starting with b31021e223c0b51057752f37c7c4254d8e8596e17d92de47a3d14ea814319de8 not found: ID does not exist" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.262631 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28961a04-18a1-43d4-bebb-365a533c8f47-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.518809 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jspx5"] Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.529834 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rl65m"] Dec 01 11:19:15 crc kubenswrapper[4958]: E1201 11:19:15.531102 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28961a04-18a1-43d4-bebb-365a533c8f47" containerName="registry-server" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.531340 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="28961a04-18a1-43d4-bebb-365a533c8f47" containerName="registry-server" Dec 01 11:19:15 crc kubenswrapper[4958]: E1201 11:19:15.531546 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28961a04-18a1-43d4-bebb-365a533c8f47" containerName="extract-utilities" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.531787 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="28961a04-18a1-43d4-bebb-365a533c8f47" containerName="extract-utilities" Dec 01 11:19:15 crc kubenswrapper[4958]: E1201 11:19:15.532090 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28961a04-18a1-43d4-bebb-365a533c8f47" containerName="extract-content" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.532247 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="28961a04-18a1-43d4-bebb-365a533c8f47" containerName="extract-content" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.532689 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="28961a04-18a1-43d4-bebb-365a533c8f47" containerName="registry-server" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.535172 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rl65m" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.539271 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jspx5"] Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.547012 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rl65m"] Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.670939 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8516900-3230-4139-8c82-0c9e1e7e4942-catalog-content\") pod \"community-operators-rl65m\" (UID: \"c8516900-3230-4139-8c82-0c9e1e7e4942\") " pod="openshift-marketplace/community-operators-rl65m" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.671005 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8516900-3230-4139-8c82-0c9e1e7e4942-utilities\") pod \"community-operators-rl65m\" (UID: \"c8516900-3230-4139-8c82-0c9e1e7e4942\") " pod="openshift-marketplace/community-operators-rl65m" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.671035 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2n2d\" (UniqueName: \"kubernetes.io/projected/c8516900-3230-4139-8c82-0c9e1e7e4942-kube-api-access-n2n2d\") pod \"community-operators-rl65m\" (UID: \"c8516900-3230-4139-8c82-0c9e1e7e4942\") " pod="openshift-marketplace/community-operators-rl65m" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.772466 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8516900-3230-4139-8c82-0c9e1e7e4942-catalog-content\") pod \"community-operators-rl65m\" (UID: \"c8516900-3230-4139-8c82-0c9e1e7e4942\") " pod="openshift-marketplace/community-operators-rl65m" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.772586 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8516900-3230-4139-8c82-0c9e1e7e4942-utilities\") pod \"community-operators-rl65m\" (UID: \"c8516900-3230-4139-8c82-0c9e1e7e4942\") " pod="openshift-marketplace/community-operators-rl65m" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.772624 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2n2d\" (UniqueName: \"kubernetes.io/projected/c8516900-3230-4139-8c82-0c9e1e7e4942-kube-api-access-n2n2d\") pod \"community-operators-rl65m\" (UID: \"c8516900-3230-4139-8c82-0c9e1e7e4942\") " pod="openshift-marketplace/community-operators-rl65m" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.773065 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8516900-3230-4139-8c82-0c9e1e7e4942-catalog-content\") pod \"community-operators-rl65m\" (UID: \"c8516900-3230-4139-8c82-0c9e1e7e4942\") " pod="openshift-marketplace/community-operators-rl65m" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.773186 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8516900-3230-4139-8c82-0c9e1e7e4942-utilities\") pod \"community-operators-rl65m\" (UID: 
\"c8516900-3230-4139-8c82-0c9e1e7e4942\") " pod="openshift-marketplace/community-operators-rl65m" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.791235 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2n2d\" (UniqueName: \"kubernetes.io/projected/c8516900-3230-4139-8c82-0c9e1e7e4942-kube-api-access-n2n2d\") pod \"community-operators-rl65m\" (UID: \"c8516900-3230-4139-8c82-0c9e1e7e4942\") " pod="openshift-marketplace/community-operators-rl65m" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.807929 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28961a04-18a1-43d4-bebb-365a533c8f47" path="/var/lib/kubelet/pods/28961a04-18a1-43d4-bebb-365a533c8f47/volumes" Dec 01 11:19:15 crc kubenswrapper[4958]: I1201 11:19:15.869592 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rl65m" Dec 01 11:19:16 crc kubenswrapper[4958]: I1201 11:19:16.398200 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rl65m"] Dec 01 11:19:16 crc kubenswrapper[4958]: W1201 11:19:16.407592 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8516900_3230_4139_8c82_0c9e1e7e4942.slice/crio-5d076514aadd7fd5112a3245e0d14f9ca87e5f76c20caae5264b66b04c93a8a8 WatchSource:0}: Error finding container 5d076514aadd7fd5112a3245e0d14f9ca87e5f76c20caae5264b66b04c93a8a8: Status 404 returned error can't find the container with id 5d076514aadd7fd5112a3245e0d14f9ca87e5f76c20caae5264b66b04c93a8a8 Dec 01 11:19:17 crc kubenswrapper[4958]: I1201 11:19:17.185921 4958 generic.go:334] "Generic (PLEG): container finished" podID="c8516900-3230-4139-8c82-0c9e1e7e4942" containerID="70a0c0622301d5bef648b9bce9274d82d56335ac131dc5a4ea5186895b7cedf4" exitCode=0 Dec 01 11:19:17 crc kubenswrapper[4958]: I1201 11:19:17.185981 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rl65m" event={"ID":"c8516900-3230-4139-8c82-0c9e1e7e4942","Type":"ContainerDied","Data":"70a0c0622301d5bef648b9bce9274d82d56335ac131dc5a4ea5186895b7cedf4"} Dec 01 11:19:17 crc kubenswrapper[4958]: I1201 11:19:17.186018 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rl65m" event={"ID":"c8516900-3230-4139-8c82-0c9e1e7e4942","Type":"ContainerStarted","Data":"5d076514aadd7fd5112a3245e0d14f9ca87e5f76c20caae5264b66b04c93a8a8"} Dec 01 11:19:18 crc kubenswrapper[4958]: I1201 11:19:18.215190 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rl65m" event={"ID":"c8516900-3230-4139-8c82-0c9e1e7e4942","Type":"ContainerStarted","Data":"8ffddfce18fb186d88f4faee8b72fdf4ee7a2e719d505d624fa3501777106036"} Dec 01 11:19:19 crc kubenswrapper[4958]: I1201 11:19:19.224108 4958 generic.go:334] "Generic (PLEG): container finished" podID="c8516900-3230-4139-8c82-0c9e1e7e4942" containerID="8ffddfce18fb186d88f4faee8b72fdf4ee7a2e719d505d624fa3501777106036" exitCode=0 Dec 01 11:19:19 crc kubenswrapper[4958]: I1201 11:19:19.224201 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rl65m" event={"ID":"c8516900-3230-4139-8c82-0c9e1e7e4942","Type":"ContainerDied","Data":"8ffddfce18fb186d88f4faee8b72fdf4ee7a2e719d505d624fa3501777106036"} Dec 01 11:19:20 crc kubenswrapper[4958]: I1201 11:19:20.239539 4958 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-rl65m" event={"ID":"c8516900-3230-4139-8c82-0c9e1e7e4942","Type":"ContainerStarted","Data":"b4b40a5b3119f4ace032ce3ee48c6cec0accd5e209455782f63444a2d9216f60"} Dec 01 11:19:20 crc kubenswrapper[4958]: I1201 11:19:20.262287 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rl65m" podStartSLOduration=2.707878859 podStartE2EDuration="5.262253336s" podCreationTimestamp="2025-12-01 11:19:15 +0000 UTC" firstStartedPulling="2025-12-01 11:19:17.188590479 +0000 UTC m=+4804.697379606" lastFinishedPulling="2025-12-01 11:19:19.742965036 +0000 UTC m=+4807.251754083" observedRunningTime="2025-12-01 11:19:20.25990206 +0000 UTC m=+4807.768691167" watchObservedRunningTime="2025-12-01 11:19:20.262253336 +0000 UTC m=+4807.771042443" Dec 01 11:19:25 crc kubenswrapper[4958]: I1201 11:19:25.870665 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rl65m" Dec 01 11:19:25 crc kubenswrapper[4958]: I1201 11:19:25.870940 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rl65m" Dec 01 11:19:25 crc kubenswrapper[4958]: I1201 11:19:25.930618 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rl65m" Dec 01 11:19:26 crc kubenswrapper[4958]: I1201 11:19:26.355807 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rl65m" Dec 01 11:19:26 crc kubenswrapper[4958]: I1201 11:19:26.422567 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rl65m"] Dec 01 11:19:28 crc kubenswrapper[4958]: I1201 11:19:28.210669 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:19:28 crc kubenswrapper[4958]: I1201 11:19:28.210757 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:19:28 crc kubenswrapper[4958]: I1201 11:19:28.314931 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rl65m" podUID="c8516900-3230-4139-8c82-0c9e1e7e4942" containerName="registry-server" containerID="cri-o://b4b40a5b3119f4ace032ce3ee48c6cec0accd5e209455782f63444a2d9216f60" gracePeriod=2 Dec 01 11:19:29 crc kubenswrapper[4958]: I1201 11:19:29.325969 4958 generic.go:334] "Generic (PLEG): container finished" podID="c8516900-3230-4139-8c82-0c9e1e7e4942" containerID="b4b40a5b3119f4ace032ce3ee48c6cec0accd5e209455782f63444a2d9216f60" exitCode=0 Dec 01 11:19:29 crc kubenswrapper[4958]: I1201 11:19:29.326180 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rl65m" event={"ID":"c8516900-3230-4139-8c82-0c9e1e7e4942","Type":"ContainerDied","Data":"b4b40a5b3119f4ace032ce3ee48c6cec0accd5e209455782f63444a2d9216f60"} Dec 01 11:19:29 crc kubenswrapper[4958]: I1201 11:19:29.326541 
4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rl65m" event={"ID":"c8516900-3230-4139-8c82-0c9e1e7e4942","Type":"ContainerDied","Data":"5d076514aadd7fd5112a3245e0d14f9ca87e5f76c20caae5264b66b04c93a8a8"} Dec 01 11:19:29 crc kubenswrapper[4958]: I1201 11:19:29.326569 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d076514aadd7fd5112a3245e0d14f9ca87e5f76c20caae5264b66b04c93a8a8" Dec 01 11:19:29 crc kubenswrapper[4958]: I1201 11:19:29.372770 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rl65m" Dec 01 11:19:29 crc kubenswrapper[4958]: I1201 11:19:29.443706 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2n2d\" (UniqueName: \"kubernetes.io/projected/c8516900-3230-4139-8c82-0c9e1e7e4942-kube-api-access-n2n2d\") pod \"c8516900-3230-4139-8c82-0c9e1e7e4942\" (UID: \"c8516900-3230-4139-8c82-0c9e1e7e4942\") " Dec 01 11:19:29 crc kubenswrapper[4958]: I1201 11:19:29.443879 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8516900-3230-4139-8c82-0c9e1e7e4942-catalog-content\") pod \"c8516900-3230-4139-8c82-0c9e1e7e4942\" (UID: \"c8516900-3230-4139-8c82-0c9e1e7e4942\") " Dec 01 11:19:29 crc kubenswrapper[4958]: I1201 11:19:29.445176 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8516900-3230-4139-8c82-0c9e1e7e4942-utilities\") pod \"c8516900-3230-4139-8c82-0c9e1e7e4942\" (UID: \"c8516900-3230-4139-8c82-0c9e1e7e4942\") " Dec 01 11:19:29 crc kubenswrapper[4958]: I1201 11:19:29.446334 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8516900-3230-4139-8c82-0c9e1e7e4942-utilities" (OuterVolumeSpecName: "utilities") pod "c8516900-3230-4139-8c82-0c9e1e7e4942" (UID: "c8516900-3230-4139-8c82-0c9e1e7e4942"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:19:29 crc kubenswrapper[4958]: I1201 11:19:29.453334 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8516900-3230-4139-8c82-0c9e1e7e4942-kube-api-access-n2n2d" (OuterVolumeSpecName: "kube-api-access-n2n2d") pod "c8516900-3230-4139-8c82-0c9e1e7e4942" (UID: "c8516900-3230-4139-8c82-0c9e1e7e4942"). InnerVolumeSpecName "kube-api-access-n2n2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:19:29 crc kubenswrapper[4958]: I1201 11:19:29.509702 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8516900-3230-4139-8c82-0c9e1e7e4942-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8516900-3230-4139-8c82-0c9e1e7e4942" (UID: "c8516900-3230-4139-8c82-0c9e1e7e4942"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:19:29 crc kubenswrapper[4958]: I1201 11:19:29.546163 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8516900-3230-4139-8c82-0c9e1e7e4942-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:19:29 crc kubenswrapper[4958]: I1201 11:19:29.546196 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8516900-3230-4139-8c82-0c9e1e7e4942-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:19:29 crc kubenswrapper[4958]: I1201 11:19:29.546206 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2n2d\" (UniqueName: \"kubernetes.io/projected/c8516900-3230-4139-8c82-0c9e1e7e4942-kube-api-access-n2n2d\") on node \"crc\" DevicePath \"\"" Dec 01 11:19:30 crc kubenswrapper[4958]: I1201 11:19:30.337394 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rl65m" Dec 01 11:19:30 crc kubenswrapper[4958]: I1201 11:19:30.377733 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rl65m"] Dec 01 11:19:30 crc kubenswrapper[4958]: I1201 11:19:30.394481 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rl65m"] Dec 01 11:19:31 crc kubenswrapper[4958]: I1201 11:19:31.814127 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8516900-3230-4139-8c82-0c9e1e7e4942" path="/var/lib/kubelet/pods/c8516900-3230-4139-8c82-0c9e1e7e4942/volumes" Dec 01 11:19:58 crc kubenswrapper[4958]: I1201 11:19:58.210662 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:19:58 crc kubenswrapper[4958]: I1201 11:19:58.212309 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:19:58 crc kubenswrapper[4958]: I1201 11:19:58.212451 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 11:19:58 crc kubenswrapper[4958]: I1201 11:19:58.213241 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"01cdf88f930e439a7789701e601c7f4fd6042b8650b40106465e84317ace78b8"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 11:19:58 crc kubenswrapper[4958]: I1201 11:19:58.213413 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://01cdf88f930e439a7789701e601c7f4fd6042b8650b40106465e84317ace78b8" gracePeriod=600 Dec 01 11:19:58 crc kubenswrapper[4958]: I1201 11:19:58.596408 4958 generic.go:334] "Generic (PLEG): container 
finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="01cdf88f930e439a7789701e601c7f4fd6042b8650b40106465e84317ace78b8" exitCode=0 Dec 01 11:19:58 crc kubenswrapper[4958]: I1201 11:19:58.596465 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"01cdf88f930e439a7789701e601c7f4fd6042b8650b40106465e84317ace78b8"} Dec 01 11:19:58 crc kubenswrapper[4958]: I1201 11:19:58.596774 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d"} Dec 01 11:19:58 crc kubenswrapper[4958]: I1201 11:19:58.596803 4958 scope.go:117] "RemoveContainer" containerID="2ebf330e2212d356b44bf603c10712926e37894f2a7038a06ff86a778999dfa3" Dec 01 11:20:55 crc kubenswrapper[4958]: I1201 11:20:55.814049 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q2grr"] Dec 01 11:20:55 crc kubenswrapper[4958]: E1201 11:20:55.815441 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8516900-3230-4139-8c82-0c9e1e7e4942" containerName="registry-server" Dec 01 11:20:55 crc kubenswrapper[4958]: I1201 11:20:55.815475 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8516900-3230-4139-8c82-0c9e1e7e4942" containerName="registry-server" Dec 01 11:20:55 crc kubenswrapper[4958]: E1201 11:20:55.815503 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8516900-3230-4139-8c82-0c9e1e7e4942" containerName="extract-content" Dec 01 11:20:55 crc kubenswrapper[4958]: I1201 11:20:55.815520 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8516900-3230-4139-8c82-0c9e1e7e4942" containerName="extract-content" Dec 01 11:20:55 crc kubenswrapper[4958]: E1201 11:20:55.815572 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8516900-3230-4139-8c82-0c9e1e7e4942" containerName="extract-utilities" Dec 01 11:20:55 crc kubenswrapper[4958]: I1201 11:20:55.815590 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8516900-3230-4139-8c82-0c9e1e7e4942" containerName="extract-utilities" Dec 01 11:20:55 crc kubenswrapper[4958]: I1201 11:20:55.816075 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8516900-3230-4139-8c82-0c9e1e7e4942" containerName="registry-server" Dec 01 11:20:55 crc kubenswrapper[4958]: I1201 11:20:55.827150 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2grr"] Dec 01 11:20:55 crc kubenswrapper[4958]: I1201 11:20:55.827294 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2grr" Dec 01 11:20:55 crc kubenswrapper[4958]: I1201 11:20:55.968822 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa-catalog-content\") pod \"redhat-marketplace-q2grr\" (UID: \"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa\") " pod="openshift-marketplace/redhat-marketplace-q2grr" Dec 01 11:20:55 crc kubenswrapper[4958]: I1201 11:20:55.968965 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa-utilities\") pod \"redhat-marketplace-q2grr\" (UID: \"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa\") " pod="openshift-marketplace/redhat-marketplace-q2grr" Dec 01 11:20:55 crc kubenswrapper[4958]: I1201 11:20:55.969007 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz8m2\" (UniqueName: \"kubernetes.io/projected/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa-kube-api-access-zz8m2\") pod \"redhat-marketplace-q2grr\" (UID: \"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa\") " pod="openshift-marketplace/redhat-marketplace-q2grr" Dec 01 11:20:56 crc kubenswrapper[4958]: I1201 11:20:56.070592 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa-catalog-content\") pod \"redhat-marketplace-q2grr\" (UID: \"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa\") " pod="openshift-marketplace/redhat-marketplace-q2grr" Dec 01 11:20:56 crc kubenswrapper[4958]: I1201 11:20:56.070691 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa-utilities\") pod \"redhat-marketplace-q2grr\" (UID: \"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa\") " pod="openshift-marketplace/redhat-marketplace-q2grr" Dec 01 11:20:56 crc kubenswrapper[4958]: I1201 11:20:56.070716 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz8m2\" (UniqueName: \"kubernetes.io/projected/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa-kube-api-access-zz8m2\") pod \"redhat-marketplace-q2grr\" (UID: \"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa\") " pod="openshift-marketplace/redhat-marketplace-q2grr" Dec 01 11:20:56 crc kubenswrapper[4958]: I1201 11:20:56.071564 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa-utilities\") pod \"redhat-marketplace-q2grr\" (UID: \"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa\") " pod="openshift-marketplace/redhat-marketplace-q2grr" Dec 01 11:20:56 crc kubenswrapper[4958]: I1201 11:20:56.071598 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa-catalog-content\") pod \"redhat-marketplace-q2grr\" (UID: \"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa\") " pod="openshift-marketplace/redhat-marketplace-q2grr" Dec 01 11:20:56 crc kubenswrapper[4958]: I1201 11:20:56.093779 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz8m2\" (UniqueName: \"kubernetes.io/projected/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa-kube-api-access-zz8m2\") pod 
\"redhat-marketplace-q2grr\" (UID: \"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa\") " pod="openshift-marketplace/redhat-marketplace-q2grr" Dec 01 11:20:56 crc kubenswrapper[4958]: I1201 11:20:56.165125 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2grr" Dec 01 11:20:56 crc kubenswrapper[4958]: I1201 11:20:56.679351 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2grr"] Dec 01 11:20:57 crc kubenswrapper[4958]: I1201 11:20:57.200885 4958 generic.go:334] "Generic (PLEG): container finished" podID="6dc4b73c-d49d-4fa6-bf5e-887f39c237aa" containerID="5dbfeb1c2ffc5741fc579a866dfd9d6ca1939497941eb517359555aa4bb4b826" exitCode=0 Dec 01 11:20:57 crc kubenswrapper[4958]: I1201 11:20:57.200990 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2grr" event={"ID":"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa","Type":"ContainerDied","Data":"5dbfeb1c2ffc5741fc579a866dfd9d6ca1939497941eb517359555aa4bb4b826"} Dec 01 11:20:57 crc kubenswrapper[4958]: I1201 11:20:57.201259 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2grr" event={"ID":"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa","Type":"ContainerStarted","Data":"6c0202d9a478cb14200a40b4113d18d737a8e605694b3d8ab142577872c0096f"} Dec 01 11:20:58 crc kubenswrapper[4958]: I1201 11:20:58.209706 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2grr" event={"ID":"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa","Type":"ContainerStarted","Data":"4cfafb4cb0c41c46f472d61f1f575ed16069cea73ed5b69ff7c0d175d54691a6"} Dec 01 11:20:59 crc kubenswrapper[4958]: I1201 11:20:59.223276 4958 generic.go:334] "Generic (PLEG): container finished" podID="6dc4b73c-d49d-4fa6-bf5e-887f39c237aa" containerID="4cfafb4cb0c41c46f472d61f1f575ed16069cea73ed5b69ff7c0d175d54691a6" exitCode=0 Dec 01 11:20:59 crc kubenswrapper[4958]: I1201 11:20:59.223415 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2grr" event={"ID":"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa","Type":"ContainerDied","Data":"4cfafb4cb0c41c46f472d61f1f575ed16069cea73ed5b69ff7c0d175d54691a6"} Dec 01 11:21:00 crc kubenswrapper[4958]: I1201 11:21:00.237629 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2grr" event={"ID":"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa","Type":"ContainerStarted","Data":"0b248dcc7361cab912d350f59288ecba0137fe318220178ed592deb3f9d770f0"} Dec 01 11:21:00 crc kubenswrapper[4958]: I1201 11:21:00.266641 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q2grr" podStartSLOduration=2.713413305 podStartE2EDuration="5.266621159s" podCreationTimestamp="2025-12-01 11:20:55 +0000 UTC" firstStartedPulling="2025-12-01 11:20:57.203004095 +0000 UTC m=+4904.711793132" lastFinishedPulling="2025-12-01 11:20:59.756211949 +0000 UTC m=+4907.265000986" observedRunningTime="2025-12-01 11:21:00.264791178 +0000 UTC m=+4907.773580215" watchObservedRunningTime="2025-12-01 11:21:00.266621159 +0000 UTC m=+4907.775410196" Dec 01 11:21:06 crc kubenswrapper[4958]: I1201 11:21:06.166377 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q2grr" Dec 01 11:21:06 crc kubenswrapper[4958]: I1201 11:21:06.167193 4958 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q2grr" Dec 01 11:21:06 crc kubenswrapper[4958]: I1201 11:21:06.244914 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q2grr" Dec 01 11:21:06 crc kubenswrapper[4958]: I1201 11:21:06.380611 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q2grr" Dec 01 11:21:06 crc kubenswrapper[4958]: I1201 11:21:06.503966 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2grr"] Dec 01 11:21:08 crc kubenswrapper[4958]: I1201 11:21:08.318062 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q2grr" podUID="6dc4b73c-d49d-4fa6-bf5e-887f39c237aa" containerName="registry-server" containerID="cri-o://0b248dcc7361cab912d350f59288ecba0137fe318220178ed592deb3f9d770f0" gracePeriod=2 Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.154192 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2grr" Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.326767 4958 generic.go:334] "Generic (PLEG): container finished" podID="6dc4b73c-d49d-4fa6-bf5e-887f39c237aa" containerID="0b248dcc7361cab912d350f59288ecba0137fe318220178ed592deb3f9d770f0" exitCode=0 Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.326822 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2grr" event={"ID":"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa","Type":"ContainerDied","Data":"0b248dcc7361cab912d350f59288ecba0137fe318220178ed592deb3f9d770f0"} Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.326888 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2grr" Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.326915 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2grr" event={"ID":"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa","Type":"ContainerDied","Data":"6c0202d9a478cb14200a40b4113d18d737a8e605694b3d8ab142577872c0096f"} Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.326962 4958 scope.go:117] "RemoveContainer" containerID="0b248dcc7361cab912d350f59288ecba0137fe318220178ed592deb3f9d770f0" Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.339665 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa-catalog-content\") pod \"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa\" (UID: \"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa\") " Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.339736 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa-utilities\") pod \"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa\" (UID: \"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa\") " Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.339810 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz8m2\" (UniqueName: \"kubernetes.io/projected/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa-kube-api-access-zz8m2\") pod \"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa\" (UID: \"6dc4b73c-d49d-4fa6-bf5e-887f39c237aa\") " Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.340686 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa-utilities" (OuterVolumeSpecName: "utilities") pod "6dc4b73c-d49d-4fa6-bf5e-887f39c237aa" (UID: "6dc4b73c-d49d-4fa6-bf5e-887f39c237aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.347199 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa-kube-api-access-zz8m2" (OuterVolumeSpecName: "kube-api-access-zz8m2") pod "6dc4b73c-d49d-4fa6-bf5e-887f39c237aa" (UID: "6dc4b73c-d49d-4fa6-bf5e-887f39c237aa"). InnerVolumeSpecName "kube-api-access-zz8m2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.347376 4958 scope.go:117] "RemoveContainer" containerID="4cfafb4cb0c41c46f472d61f1f575ed16069cea73ed5b69ff7c0d175d54691a6" Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.363839 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6dc4b73c-d49d-4fa6-bf5e-887f39c237aa" (UID: "6dc4b73c-d49d-4fa6-bf5e-887f39c237aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.388160 4958 scope.go:117] "RemoveContainer" containerID="5dbfeb1c2ffc5741fc579a866dfd9d6ca1939497941eb517359555aa4bb4b826" Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.403269 4958 scope.go:117] "RemoveContainer" containerID="0b248dcc7361cab912d350f59288ecba0137fe318220178ed592deb3f9d770f0" Dec 01 11:21:09 crc kubenswrapper[4958]: E1201 11:21:09.403794 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b248dcc7361cab912d350f59288ecba0137fe318220178ed592deb3f9d770f0\": container with ID starting with 0b248dcc7361cab912d350f59288ecba0137fe318220178ed592deb3f9d770f0 not found: ID does not exist" containerID="0b248dcc7361cab912d350f59288ecba0137fe318220178ed592deb3f9d770f0" Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.403871 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b248dcc7361cab912d350f59288ecba0137fe318220178ed592deb3f9d770f0"} err="failed to get container status \"0b248dcc7361cab912d350f59288ecba0137fe318220178ed592deb3f9d770f0\": rpc error: code = NotFound desc = could not find container \"0b248dcc7361cab912d350f59288ecba0137fe318220178ed592deb3f9d770f0\": container with ID starting with 0b248dcc7361cab912d350f59288ecba0137fe318220178ed592deb3f9d770f0 not found: ID does not exist" Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.403902 4958 scope.go:117] "RemoveContainer" containerID="4cfafb4cb0c41c46f472d61f1f575ed16069cea73ed5b69ff7c0d175d54691a6" Dec 01 11:21:09 crc kubenswrapper[4958]: E1201 11:21:09.404260 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cfafb4cb0c41c46f472d61f1f575ed16069cea73ed5b69ff7c0d175d54691a6\": container with ID starting with 4cfafb4cb0c41c46f472d61f1f575ed16069cea73ed5b69ff7c0d175d54691a6 not found: ID does not exist" containerID="4cfafb4cb0c41c46f472d61f1f575ed16069cea73ed5b69ff7c0d175d54691a6" Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.404347 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cfafb4cb0c41c46f472d61f1f575ed16069cea73ed5b69ff7c0d175d54691a6"} err="failed to get container status \"4cfafb4cb0c41c46f472d61f1f575ed16069cea73ed5b69ff7c0d175d54691a6\": rpc error: code = NotFound desc = could not find container \"4cfafb4cb0c41c46f472d61f1f575ed16069cea73ed5b69ff7c0d175d54691a6\": container with ID starting with 4cfafb4cb0c41c46f472d61f1f575ed16069cea73ed5b69ff7c0d175d54691a6 not found: ID does not exist" Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.404409 4958 scope.go:117] "RemoveContainer" containerID="5dbfeb1c2ffc5741fc579a866dfd9d6ca1939497941eb517359555aa4bb4b826" Dec 01 11:21:09 crc kubenswrapper[4958]: E1201 11:21:09.404896 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dbfeb1c2ffc5741fc579a866dfd9d6ca1939497941eb517359555aa4bb4b826\": container with ID starting with 5dbfeb1c2ffc5741fc579a866dfd9d6ca1939497941eb517359555aa4bb4b826 not found: ID does not exist" containerID="5dbfeb1c2ffc5741fc579a866dfd9d6ca1939497941eb517359555aa4bb4b826" Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.404947 4958 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5dbfeb1c2ffc5741fc579a866dfd9d6ca1939497941eb517359555aa4bb4b826"} err="failed to get container status \"5dbfeb1c2ffc5741fc579a866dfd9d6ca1939497941eb517359555aa4bb4b826\": rpc error: code = NotFound desc = could not find container \"5dbfeb1c2ffc5741fc579a866dfd9d6ca1939497941eb517359555aa4bb4b826\": container with ID starting with 5dbfeb1c2ffc5741fc579a866dfd9d6ca1939497941eb517359555aa4bb4b826 not found: ID does not exist" Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.442223 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.442258 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.442271 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz8m2\" (UniqueName: \"kubernetes.io/projected/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa-kube-api-access-zz8m2\") on node \"crc\" DevicePath \"\"" Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.675534 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2grr"] Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.691079 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2grr"] Dec 01 11:21:09 crc kubenswrapper[4958]: I1201 11:21:09.815113 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dc4b73c-d49d-4fa6-bf5e-887f39c237aa" path="/var/lib/kubelet/pods/6dc4b73c-d49d-4fa6-bf5e-887f39c237aa/volumes" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.021316 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-v7749"] Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.029550 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-v7749"] Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.169414 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-zbzg5"] Dec 01 11:21:25 crc kubenswrapper[4958]: E1201 11:21:25.169801 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc4b73c-d49d-4fa6-bf5e-887f39c237aa" containerName="registry-server" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.169817 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc4b73c-d49d-4fa6-bf5e-887f39c237aa" containerName="registry-server" Dec 01 11:21:25 crc kubenswrapper[4958]: E1201 11:21:25.169838 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc4b73c-d49d-4fa6-bf5e-887f39c237aa" containerName="extract-content" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.169869 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc4b73c-d49d-4fa6-bf5e-887f39c237aa" containerName="extract-content" Dec 01 11:21:25 crc kubenswrapper[4958]: E1201 11:21:25.169887 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc4b73c-d49d-4fa6-bf5e-887f39c237aa" containerName="extract-utilities" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.169897 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc4b73c-d49d-4fa6-bf5e-887f39c237aa" containerName="extract-utilities" Dec 01 11:21:25 
crc kubenswrapper[4958]: I1201 11:21:25.170086 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dc4b73c-d49d-4fa6-bf5e-887f39c237aa" containerName="registry-server" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.170688 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-zbzg5" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.173768 4958 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-mnz7h" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.174187 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.174744 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.175280 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.193768 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-zbzg5"] Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.237705 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prnln\" (UniqueName: \"kubernetes.io/projected/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3-kube-api-access-prnln\") pod \"crc-storage-crc-zbzg5\" (UID: \"0937161d-0d32-4acf-b1a0-ad3a27b7c3b3\") " pod="crc-storage/crc-storage-crc-zbzg5" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.238063 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3-crc-storage\") pod \"crc-storage-crc-zbzg5\" (UID: \"0937161d-0d32-4acf-b1a0-ad3a27b7c3b3\") " pod="crc-storage/crc-storage-crc-zbzg5" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.238200 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3-node-mnt\") pod \"crc-storage-crc-zbzg5\" (UID: \"0937161d-0d32-4acf-b1a0-ad3a27b7c3b3\") " pod="crc-storage/crc-storage-crc-zbzg5" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.340642 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3-crc-storage\") pod \"crc-storage-crc-zbzg5\" (UID: \"0937161d-0d32-4acf-b1a0-ad3a27b7c3b3\") " pod="crc-storage/crc-storage-crc-zbzg5" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.340746 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3-node-mnt\") pod \"crc-storage-crc-zbzg5\" (UID: \"0937161d-0d32-4acf-b1a0-ad3a27b7c3b3\") " pod="crc-storage/crc-storage-crc-zbzg5" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.340940 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prnln\" (UniqueName: \"kubernetes.io/projected/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3-kube-api-access-prnln\") pod \"crc-storage-crc-zbzg5\" (UID: \"0937161d-0d32-4acf-b1a0-ad3a27b7c3b3\") " pod="crc-storage/crc-storage-crc-zbzg5" Dec 01 11:21:25 crc kubenswrapper[4958]: 
I1201 11:21:25.341603 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3-node-mnt\") pod \"crc-storage-crc-zbzg5\" (UID: \"0937161d-0d32-4acf-b1a0-ad3a27b7c3b3\") " pod="crc-storage/crc-storage-crc-zbzg5" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.342318 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3-crc-storage\") pod \"crc-storage-crc-zbzg5\" (UID: \"0937161d-0d32-4acf-b1a0-ad3a27b7c3b3\") " pod="crc-storage/crc-storage-crc-zbzg5" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.379709 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prnln\" (UniqueName: \"kubernetes.io/projected/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3-kube-api-access-prnln\") pod \"crc-storage-crc-zbzg5\" (UID: \"0937161d-0d32-4acf-b1a0-ad3a27b7c3b3\") " pod="crc-storage/crc-storage-crc-zbzg5" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.498546 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-zbzg5" Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.776791 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-zbzg5"] Dec 01 11:21:25 crc kubenswrapper[4958]: W1201 11:21:25.787105 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0937161d_0d32_4acf_b1a0_ad3a27b7c3b3.slice/crio-7c2974dafa38c68d8c707326837f8fe62cab01d07e6f268814d35534654f4201 WatchSource:0}: Error finding container 7c2974dafa38c68d8c707326837f8fe62cab01d07e6f268814d35534654f4201: Status 404 returned error can't find the container with id 7c2974dafa38c68d8c707326837f8fe62cab01d07e6f268814d35534654f4201 Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.791753 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 11:21:25 crc kubenswrapper[4958]: I1201 11:21:25.812912 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24715a95-e6a7-4208-8bff-92a10e50cb9e" path="/var/lib/kubelet/pods/24715a95-e6a7-4208-8bff-92a10e50cb9e/volumes" Dec 01 11:21:26 crc kubenswrapper[4958]: I1201 11:21:26.516682 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zbzg5" event={"ID":"0937161d-0d32-4acf-b1a0-ad3a27b7c3b3","Type":"ContainerStarted","Data":"7c2974dafa38c68d8c707326837f8fe62cab01d07e6f268814d35534654f4201"} Dec 01 11:21:27 crc kubenswrapper[4958]: I1201 11:21:27.530681 4958 generic.go:334] "Generic (PLEG): container finished" podID="0937161d-0d32-4acf-b1a0-ad3a27b7c3b3" containerID="5b35fcff2b22cb1f7ad58cfd6cedf9ed7735ac640d145cb976d5e5ff265e4c82" exitCode=0 Dec 01 11:21:27 crc kubenswrapper[4958]: I1201 11:21:27.530790 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zbzg5" event={"ID":"0937161d-0d32-4acf-b1a0-ad3a27b7c3b3","Type":"ContainerDied","Data":"5b35fcff2b22cb1f7ad58cfd6cedf9ed7735ac640d145cb976d5e5ff265e4c82"} Dec 01 11:21:28 crc kubenswrapper[4958]: I1201 11:21:28.874487 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-zbzg5" Dec 01 11:21:28 crc kubenswrapper[4958]: I1201 11:21:28.996453 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3-crc-storage\") pod \"0937161d-0d32-4acf-b1a0-ad3a27b7c3b3\" (UID: \"0937161d-0d32-4acf-b1a0-ad3a27b7c3b3\") " Dec 01 11:21:28 crc kubenswrapper[4958]: I1201 11:21:28.996522 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prnln\" (UniqueName: \"kubernetes.io/projected/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3-kube-api-access-prnln\") pod \"0937161d-0d32-4acf-b1a0-ad3a27b7c3b3\" (UID: \"0937161d-0d32-4acf-b1a0-ad3a27b7c3b3\") " Dec 01 11:21:28 crc kubenswrapper[4958]: I1201 11:21:28.996606 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3-node-mnt\") pod \"0937161d-0d32-4acf-b1a0-ad3a27b7c3b3\" (UID: \"0937161d-0d32-4acf-b1a0-ad3a27b7c3b3\") " Dec 01 11:21:28 crc kubenswrapper[4958]: I1201 11:21:28.996813 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "0937161d-0d32-4acf-b1a0-ad3a27b7c3b3" (UID: "0937161d-0d32-4acf-b1a0-ad3a27b7c3b3"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 11:21:28 crc kubenswrapper[4958]: I1201 11:21:28.996982 4958 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 01 11:21:29 crc kubenswrapper[4958]: I1201 11:21:29.006072 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3-kube-api-access-prnln" (OuterVolumeSpecName: "kube-api-access-prnln") pod "0937161d-0d32-4acf-b1a0-ad3a27b7c3b3" (UID: "0937161d-0d32-4acf-b1a0-ad3a27b7c3b3"). InnerVolumeSpecName "kube-api-access-prnln". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:21:29 crc kubenswrapper[4958]: I1201 11:21:29.030797 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "0937161d-0d32-4acf-b1a0-ad3a27b7c3b3" (UID: "0937161d-0d32-4acf-b1a0-ad3a27b7c3b3"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:21:29 crc kubenswrapper[4958]: I1201 11:21:29.098138 4958 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 01 11:21:29 crc kubenswrapper[4958]: I1201 11:21:29.098174 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prnln\" (UniqueName: \"kubernetes.io/projected/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3-kube-api-access-prnln\") on node \"crc\" DevicePath \"\"" Dec 01 11:21:29 crc kubenswrapper[4958]: I1201 11:21:29.555186 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zbzg5" event={"ID":"0937161d-0d32-4acf-b1a0-ad3a27b7c3b3","Type":"ContainerDied","Data":"7c2974dafa38c68d8c707326837f8fe62cab01d07e6f268814d35534654f4201"} Dec 01 11:21:29 crc kubenswrapper[4958]: I1201 11:21:29.555271 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c2974dafa38c68d8c707326837f8fe62cab01d07e6f268814d35534654f4201" Dec 01 11:21:29 crc kubenswrapper[4958]: I1201 11:21:29.555355 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-zbzg5" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.111738 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-zbzg5"] Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.125764 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-zbzg5"] Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.256242 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-nj5vt"] Dec 01 11:21:31 crc kubenswrapper[4958]: E1201 11:21:31.256986 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0937161d-0d32-4acf-b1a0-ad3a27b7c3b3" containerName="storage" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.257015 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="0937161d-0d32-4acf-b1a0-ad3a27b7c3b3" containerName="storage" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.257205 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="0937161d-0d32-4acf-b1a0-ad3a27b7c3b3" containerName="storage" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.257941 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-nj5vt" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.261336 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.263634 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.263707 4958 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-mnz7h" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.264219 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.278748 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-nj5vt"] Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.436332 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qp8d\" (UniqueName: \"kubernetes.io/projected/042b44a9-160d-4a7d-84c9-64931ad3363f-kube-api-access-6qp8d\") pod \"crc-storage-crc-nj5vt\" (UID: \"042b44a9-160d-4a7d-84c9-64931ad3363f\") " pod="crc-storage/crc-storage-crc-nj5vt" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.436967 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/042b44a9-160d-4a7d-84c9-64931ad3363f-crc-storage\") pod \"crc-storage-crc-nj5vt\" (UID: \"042b44a9-160d-4a7d-84c9-64931ad3363f\") " pod="crc-storage/crc-storage-crc-nj5vt" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.437239 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/042b44a9-160d-4a7d-84c9-64931ad3363f-node-mnt\") pod \"crc-storage-crc-nj5vt\" (UID: \"042b44a9-160d-4a7d-84c9-64931ad3363f\") " pod="crc-storage/crc-storage-crc-nj5vt" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.538288 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/042b44a9-160d-4a7d-84c9-64931ad3363f-crc-storage\") pod \"crc-storage-crc-nj5vt\" (UID: \"042b44a9-160d-4a7d-84c9-64931ad3363f\") " pod="crc-storage/crc-storage-crc-nj5vt" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.539352 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/042b44a9-160d-4a7d-84c9-64931ad3363f-node-mnt\") pod \"crc-storage-crc-nj5vt\" (UID: \"042b44a9-160d-4a7d-84c9-64931ad3363f\") " pod="crc-storage/crc-storage-crc-nj5vt" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.539446 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/042b44a9-160d-4a7d-84c9-64931ad3363f-crc-storage\") pod \"crc-storage-crc-nj5vt\" (UID: \"042b44a9-160d-4a7d-84c9-64931ad3363f\") " pod="crc-storage/crc-storage-crc-nj5vt" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.539637 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qp8d\" (UniqueName: \"kubernetes.io/projected/042b44a9-160d-4a7d-84c9-64931ad3363f-kube-api-access-6qp8d\") pod \"crc-storage-crc-nj5vt\" (UID: \"042b44a9-160d-4a7d-84c9-64931ad3363f\") " 
pod="crc-storage/crc-storage-crc-nj5vt" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.539664 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/042b44a9-160d-4a7d-84c9-64931ad3363f-node-mnt\") pod \"crc-storage-crc-nj5vt\" (UID: \"042b44a9-160d-4a7d-84c9-64931ad3363f\") " pod="crc-storage/crc-storage-crc-nj5vt" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.574018 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qp8d\" (UniqueName: \"kubernetes.io/projected/042b44a9-160d-4a7d-84c9-64931ad3363f-kube-api-access-6qp8d\") pod \"crc-storage-crc-nj5vt\" (UID: \"042b44a9-160d-4a7d-84c9-64931ad3363f\") " pod="crc-storage/crc-storage-crc-nj5vt" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.590642 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-nj5vt" Dec 01 11:21:31 crc kubenswrapper[4958]: I1201 11:21:31.807877 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0937161d-0d32-4acf-b1a0-ad3a27b7c3b3" path="/var/lib/kubelet/pods/0937161d-0d32-4acf-b1a0-ad3a27b7c3b3/volumes" Dec 01 11:21:32 crc kubenswrapper[4958]: I1201 11:21:32.099066 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-nj5vt"] Dec 01 11:21:32 crc kubenswrapper[4958]: I1201 11:21:32.589240 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-nj5vt" event={"ID":"042b44a9-160d-4a7d-84c9-64931ad3363f","Type":"ContainerStarted","Data":"81938053a43213ba9cea0efe0ceff2ab9e6bfa62e642f3b33b9bfc33af46753c"} Dec 01 11:21:33 crc kubenswrapper[4958]: I1201 11:21:33.604964 4958 generic.go:334] "Generic (PLEG): container finished" podID="042b44a9-160d-4a7d-84c9-64931ad3363f" containerID="61a79d97e4b56cbb64ddd33b9f87be81c4e09d358fdc2c5fd15bf9c67c887e40" exitCode=0 Dec 01 11:21:33 crc kubenswrapper[4958]: I1201 11:21:33.605297 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-nj5vt" event={"ID":"042b44a9-160d-4a7d-84c9-64931ad3363f","Type":"ContainerDied","Data":"61a79d97e4b56cbb64ddd33b9f87be81c4e09d358fdc2c5fd15bf9c67c887e40"} Dec 01 11:21:35 crc kubenswrapper[4958]: I1201 11:21:35.553005 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-nj5vt" Dec 01 11:21:35 crc kubenswrapper[4958]: I1201 11:21:35.633725 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-nj5vt" event={"ID":"042b44a9-160d-4a7d-84c9-64931ad3363f","Type":"ContainerDied","Data":"81938053a43213ba9cea0efe0ceff2ab9e6bfa62e642f3b33b9bfc33af46753c"} Dec 01 11:21:35 crc kubenswrapper[4958]: I1201 11:21:35.633787 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81938053a43213ba9cea0efe0ceff2ab9e6bfa62e642f3b33b9bfc33af46753c" Dec 01 11:21:35 crc kubenswrapper[4958]: I1201 11:21:35.633942 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-nj5vt" Dec 01 11:21:35 crc kubenswrapper[4958]: I1201 11:21:35.710043 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qp8d\" (UniqueName: \"kubernetes.io/projected/042b44a9-160d-4a7d-84c9-64931ad3363f-kube-api-access-6qp8d\") pod \"042b44a9-160d-4a7d-84c9-64931ad3363f\" (UID: \"042b44a9-160d-4a7d-84c9-64931ad3363f\") " Dec 01 11:21:35 crc kubenswrapper[4958]: I1201 11:21:35.710134 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/042b44a9-160d-4a7d-84c9-64931ad3363f-crc-storage\") pod \"042b44a9-160d-4a7d-84c9-64931ad3363f\" (UID: \"042b44a9-160d-4a7d-84c9-64931ad3363f\") " Dec 01 11:21:35 crc kubenswrapper[4958]: I1201 11:21:35.710276 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/042b44a9-160d-4a7d-84c9-64931ad3363f-node-mnt\") pod \"042b44a9-160d-4a7d-84c9-64931ad3363f\" (UID: \"042b44a9-160d-4a7d-84c9-64931ad3363f\") " Dec 01 11:21:35 crc kubenswrapper[4958]: I1201 11:21:35.710577 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/042b44a9-160d-4a7d-84c9-64931ad3363f-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "042b44a9-160d-4a7d-84c9-64931ad3363f" (UID: "042b44a9-160d-4a7d-84c9-64931ad3363f"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 11:21:35 crc kubenswrapper[4958]: I1201 11:21:35.716639 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/042b44a9-160d-4a7d-84c9-64931ad3363f-kube-api-access-6qp8d" (OuterVolumeSpecName: "kube-api-access-6qp8d") pod "042b44a9-160d-4a7d-84c9-64931ad3363f" (UID: "042b44a9-160d-4a7d-84c9-64931ad3363f"). InnerVolumeSpecName "kube-api-access-6qp8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:21:35 crc kubenswrapper[4958]: I1201 11:21:35.735148 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/042b44a9-160d-4a7d-84c9-64931ad3363f-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "042b44a9-160d-4a7d-84c9-64931ad3363f" (UID: "042b44a9-160d-4a7d-84c9-64931ad3363f"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:21:35 crc kubenswrapper[4958]: I1201 11:21:35.811528 4958 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/042b44a9-160d-4a7d-84c9-64931ad3363f-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 01 11:21:35 crc kubenswrapper[4958]: I1201 11:21:35.811580 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qp8d\" (UniqueName: \"kubernetes.io/projected/042b44a9-160d-4a7d-84c9-64931ad3363f-kube-api-access-6qp8d\") on node \"crc\" DevicePath \"\"" Dec 01 11:21:35 crc kubenswrapper[4958]: I1201 11:21:35.811603 4958 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/042b44a9-160d-4a7d-84c9-64931ad3363f-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 01 11:21:58 crc kubenswrapper[4958]: I1201 11:21:58.211318 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:21:58 crc kubenswrapper[4958]: I1201 11:21:58.211976 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:22:22 crc kubenswrapper[4958]: I1201 11:22:22.516905 4958 scope.go:117] "RemoveContainer" containerID="f2e9ee82e9f79caa3babbdd1502de2961277e032f8f0ecb08906a21621608a8e" Dec 01 11:22:28 crc kubenswrapper[4958]: I1201 11:22:28.257097 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:22:28 crc kubenswrapper[4958]: I1201 11:22:28.260506 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:22:58 crc kubenswrapper[4958]: I1201 11:22:58.210425 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:22:58 crc kubenswrapper[4958]: I1201 11:22:58.212439 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:22:58 crc kubenswrapper[4958]: I1201 11:22:58.212595 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 11:22:58 crc kubenswrapper[4958]: I1201 11:22:58.213461 4958 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 11:22:58 crc kubenswrapper[4958]: I1201 11:22:58.213631 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d" gracePeriod=600 Dec 01 11:22:58 crc kubenswrapper[4958]: E1201 11:22:58.367565 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:22:58 crc kubenswrapper[4958]: I1201 11:22:58.584945 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d" exitCode=0 Dec 01 11:22:58 crc kubenswrapper[4958]: I1201 11:22:58.585092 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d"} Dec 01 11:22:58 crc kubenswrapper[4958]: I1201 11:22:58.585262 4958 scope.go:117] "RemoveContainer" containerID="01cdf88f930e439a7789701e601c7f4fd6042b8650b40106465e84317ace78b8" Dec 01 11:22:58 crc kubenswrapper[4958]: I1201 11:22:58.585967 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d" Dec 01 11:22:58 crc kubenswrapper[4958]: E1201 11:22:58.586260 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:23:10 crc kubenswrapper[4958]: I1201 11:23:10.798035 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d" Dec 01 11:23:10 crc kubenswrapper[4958]: E1201 11:23:10.798982 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:23:21 crc kubenswrapper[4958]: I1201 11:23:21.797941 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d" Dec 01 
11:23:21 crc kubenswrapper[4958]: E1201 11:23:21.799311 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:23:33 crc kubenswrapper[4958]: I1201 11:23:33.802519 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d" Dec 01 11:23:33 crc kubenswrapper[4958]: E1201 11:23:33.805462 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:23:46 crc kubenswrapper[4958]: I1201 11:23:46.797479 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d" Dec 01 11:23:46 crc kubenswrapper[4958]: E1201 11:23:46.798176 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:23:57 crc kubenswrapper[4958]: I1201 11:23:57.797255 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d" Dec 01 11:23:57 crc kubenswrapper[4958]: E1201 11:23:57.798345 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:24:08 crc kubenswrapper[4958]: I1201 11:24:08.798486 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d" Dec 01 11:24:08 crc kubenswrapper[4958]: E1201 11:24:08.799723 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:24:22 crc kubenswrapper[4958]: I1201 11:24:22.798515 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d" Dec 01 11:24:22 crc kubenswrapper[4958]: E1201 11:24:22.799630 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:24:37 crc kubenswrapper[4958]: I1201 11:24:37.797739 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d" Dec 01 11:24:37 crc kubenswrapper[4958]: E1201 11:24:37.798977 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:24:52 crc kubenswrapper[4958]: I1201 11:24:52.797950 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d" Dec 01 11:24:52 crc kubenswrapper[4958]: E1201 11:24:52.799176 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.258419 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-h4j7j"] Dec 01 11:25:02 crc kubenswrapper[4958]: E1201 11:25:02.259377 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="042b44a9-160d-4a7d-84c9-64931ad3363f" containerName="storage" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.259398 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="042b44a9-160d-4a7d-84c9-64931ad3363f" containerName="storage" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.259600 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="042b44a9-160d-4a7d-84c9-64931ad3363f" containerName="storage" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.263374 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.281427 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.282407 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.282513 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-9c2gr" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.283275 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.283793 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.296779 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-h4j7j"] Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.466243 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01594c3a-61db-4578-b3f8-5282a9c490ba-config\") pod \"dnsmasq-dns-5d7b5456f5-h4j7j\" (UID: \"01594c3a-61db-4578-b3f8-5282a9c490ba\") " pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.466365 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbdjt\" (UniqueName: \"kubernetes.io/projected/01594c3a-61db-4578-b3f8-5282a9c490ba-kube-api-access-wbdjt\") pod \"dnsmasq-dns-5d7b5456f5-h4j7j\" (UID: \"01594c3a-61db-4578-b3f8-5282a9c490ba\") " pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.466439 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01594c3a-61db-4578-b3f8-5282a9c490ba-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-h4j7j\" (UID: \"01594c3a-61db-4578-b3f8-5282a9c490ba\") " pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.518022 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-b2hn6"] Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.519225 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.543019 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-b2hn6"] Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.567996 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01594c3a-61db-4578-b3f8-5282a9c490ba-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-h4j7j\" (UID: \"01594c3a-61db-4578-b3f8-5282a9c490ba\") " pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.568063 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01594c3a-61db-4578-b3f8-5282a9c490ba-config\") pod \"dnsmasq-dns-5d7b5456f5-h4j7j\" (UID: \"01594c3a-61db-4578-b3f8-5282a9c490ba\") " pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.568110 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbdjt\" (UniqueName: \"kubernetes.io/projected/01594c3a-61db-4578-b3f8-5282a9c490ba-kube-api-access-wbdjt\") pod \"dnsmasq-dns-5d7b5456f5-h4j7j\" (UID: \"01594c3a-61db-4578-b3f8-5282a9c490ba\") " pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.569357 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01594c3a-61db-4578-b3f8-5282a9c490ba-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-h4j7j\" (UID: \"01594c3a-61db-4578-b3f8-5282a9c490ba\") " pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.570042 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01594c3a-61db-4578-b3f8-5282a9c490ba-config\") pod \"dnsmasq-dns-5d7b5456f5-h4j7j\" (UID: \"01594c3a-61db-4578-b3f8-5282a9c490ba\") " pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.598861 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbdjt\" (UniqueName: \"kubernetes.io/projected/01594c3a-61db-4578-b3f8-5282a9c490ba-kube-api-access-wbdjt\") pod \"dnsmasq-dns-5d7b5456f5-h4j7j\" (UID: \"01594c3a-61db-4578-b3f8-5282a9c490ba\") " pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.669098 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bab268c-f167-4733-adb1-38948a8833f2-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-b2hn6\" (UID: \"1bab268c-f167-4733-adb1-38948a8833f2\") " pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.669168 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6q64\" (UniqueName: \"kubernetes.io/projected/1bab268c-f167-4733-adb1-38948a8833f2-kube-api-access-z6q64\") pod \"dnsmasq-dns-98ddfc8f-b2hn6\" (UID: \"1bab268c-f167-4733-adb1-38948a8833f2\") " pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.669203 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bab268c-f167-4733-adb1-38948a8833f2-config\") pod \"dnsmasq-dns-98ddfc8f-b2hn6\" (UID: \"1bab268c-f167-4733-adb1-38948a8833f2\") " pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.770795 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bab268c-f167-4733-adb1-38948a8833f2-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-b2hn6\" (UID: \"1bab268c-f167-4733-adb1-38948a8833f2\") " pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.770927 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6q64\" (UniqueName: \"kubernetes.io/projected/1bab268c-f167-4733-adb1-38948a8833f2-kube-api-access-z6q64\") pod \"dnsmasq-dns-98ddfc8f-b2hn6\" (UID: \"1bab268c-f167-4733-adb1-38948a8833f2\") " pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.770967 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bab268c-f167-4733-adb1-38948a8833f2-config\") pod \"dnsmasq-dns-98ddfc8f-b2hn6\" (UID: \"1bab268c-f167-4733-adb1-38948a8833f2\") " pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.771881 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bab268c-f167-4733-adb1-38948a8833f2-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-b2hn6\" (UID: \"1bab268c-f167-4733-adb1-38948a8833f2\") " pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.772086 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bab268c-f167-4733-adb1-38948a8833f2-config\") pod \"dnsmasq-dns-98ddfc8f-b2hn6\" (UID: \"1bab268c-f167-4733-adb1-38948a8833f2\") " pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.788298 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6q64\" (UniqueName: \"kubernetes.io/projected/1bab268c-f167-4733-adb1-38948a8833f2-kube-api-access-z6q64\") pod \"dnsmasq-dns-98ddfc8f-b2hn6\" (UID: \"1bab268c-f167-4733-adb1-38948a8833f2\") " pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.836222 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" Dec 01 11:25:02 crc kubenswrapper[4958]: I1201 11:25:02.895518 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.204047 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-h4j7j"] Dec 01 11:25:03 crc kubenswrapper[4958]: W1201 11:25:03.209142 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01594c3a_61db_4578_b3f8_5282a9c490ba.slice/crio-ff063d3b587441e5f233f0ad5e1a3e6dbef6ef52ef4205a7770a5a28cd900dd4 WatchSource:0}: Error finding container ff063d3b587441e5f233f0ad5e1a3e6dbef6ef52ef4205a7770a5a28cd900dd4: Status 404 returned error can't find the container with id ff063d3b587441e5f233f0ad5e1a3e6dbef6ef52ef4205a7770a5a28cd900dd4 Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.311241 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-b2hn6"] Dec 01 11:25:03 crc kubenswrapper[4958]: W1201 11:25:03.327045 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bab268c_f167_4733_adb1_38948a8833f2.slice/crio-b0f9e2f1687f26363631d0785bc43cfe8fb45a38b671f9efa66f2b99cbb7f9fa WatchSource:0}: Error finding container b0f9e2f1687f26363631d0785bc43cfe8fb45a38b671f9efa66f2b99cbb7f9fa: Status 404 returned error can't find the container with id b0f9e2f1687f26363631d0785bc43cfe8fb45a38b671f9efa66f2b99cbb7f9fa Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.405826 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.408034 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.414160 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.414405 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.414434 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.414434 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.415942 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-wxr94" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.437539 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.590226 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.590307 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.590343 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.590370 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.590399 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z7db\" (UniqueName: \"kubernetes.io/projected/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-kube-api-access-7z7db\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.590421 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.590446 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.590471 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.590500 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.691692 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.692089 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 
11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.692142 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.692185 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.692211 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.692237 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.692270 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.692287 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.692301 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z7db\" (UniqueName: \"kubernetes.io/projected/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-kube-api-access-7z7db\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.692432 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.692932 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.693304 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.693533 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.712891 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.724230 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z7db\" (UniqueName: \"kubernetes.io/projected/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-kube-api-access-7z7db\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.724625 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.725200 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.726176 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.727488 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.729610 4958 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.729668 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/232d4b043d54b34e4f8229ea02114265740b3a89b60b1d1c21d659c1a226522f/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.730494 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.730820 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.731013 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7pbhp" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.731184 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.731442 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.741918 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.797681 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\") pod \"rabbitmq-server-0\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") " pod="openstack/rabbitmq-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.894932 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ead29d35-e6ff-421e-82a8-832a0ae919eb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.894994 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqbcc\" (UniqueName: \"kubernetes.io/projected/ead29d35-e6ff-421e-82a8-832a0ae919eb-kube-api-access-wqbcc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.895048 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ead29d35-e6ff-421e-82a8-832a0ae919eb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.895063 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ead29d35-e6ff-421e-82a8-832a0ae919eb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.895166 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ead29d35-e6ff-421e-82a8-832a0ae919eb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.895302 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ead29d35-e6ff-421e-82a8-832a0ae919eb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.895413 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.895438 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ead29d35-e6ff-421e-82a8-832a0ae919eb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.895468 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ead29d35-e6ff-421e-82a8-832a0ae919eb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.996719 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ead29d35-e6ff-421e-82a8-832a0ae919eb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.996771 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ead29d35-e6ff-421e-82a8-832a0ae919eb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.996792 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.996816 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ead29d35-e6ff-421e-82a8-832a0ae919eb-pod-info\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.996867 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ead29d35-e6ff-421e-82a8-832a0ae919eb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.996915 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ead29d35-e6ff-421e-82a8-832a0ae919eb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.996945 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqbcc\" (UniqueName: \"kubernetes.io/projected/ead29d35-e6ff-421e-82a8-832a0ae919eb-kube-api-access-wqbcc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.996970 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ead29d35-e6ff-421e-82a8-832a0ae919eb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.996984 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ead29d35-e6ff-421e-82a8-832a0ae919eb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.998552 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ead29d35-e6ff-421e-82a8-832a0ae919eb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.998687 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ead29d35-e6ff-421e-82a8-832a0ae919eb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.999123 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ead29d35-e6ff-421e-82a8-832a0ae919eb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:03 crc kubenswrapper[4958]: I1201 11:25:03.999439 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ead29d35-e6ff-421e-82a8-832a0ae919eb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:04 crc 
kubenswrapper[4958]: I1201 11:25:04.001182 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ead29d35-e6ff-421e-82a8-832a0ae919eb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.002130 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ead29d35-e6ff-421e-82a8-832a0ae919eb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.015114 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqbcc\" (UniqueName: \"kubernetes.io/projected/ead29d35-e6ff-421e-82a8-832a0ae919eb-kube-api-access-wqbcc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.015128 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ead29d35-e6ff-421e-82a8-832a0ae919eb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.015310 4958 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.015339 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/64422f1485a4f12edd0674d161f7225abd2c33822b0e4e49cbffe0d3f821b510/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.044668 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.075015 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\") pod \"rabbitmq-cell1-server-0\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.097705 4958 generic.go:334] "Generic (PLEG): container finished" podID="01594c3a-61db-4578-b3f8-5282a9c490ba" containerID="94de3c08804648c6950b7cf33c5e45a596ff3d408256ebd0ba90c816c901d189" exitCode=0 Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.097958 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" event={"ID":"01594c3a-61db-4578-b3f8-5282a9c490ba","Type":"ContainerDied","Data":"94de3c08804648c6950b7cf33c5e45a596ff3d408256ebd0ba90c816c901d189"} Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.098010 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" event={"ID":"01594c3a-61db-4578-b3f8-5282a9c490ba","Type":"ContainerStarted","Data":"ff063d3b587441e5f233f0ad5e1a3e6dbef6ef52ef4205a7770a5a28cd900dd4"} Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.100479 4958 generic.go:334] "Generic (PLEG): container finished" podID="1bab268c-f167-4733-adb1-38948a8833f2" containerID="5070d33e31ccace04779d4ed0b972efa418d7cd7a4333dbf8acf5ab15867b0de" exitCode=0 Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.100527 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" event={"ID":"1bab268c-f167-4733-adb1-38948a8833f2","Type":"ContainerDied","Data":"5070d33e31ccace04779d4ed0b972efa418d7cd7a4333dbf8acf5ab15867b0de"} Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.100558 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" event={"ID":"1bab268c-f167-4733-adb1-38948a8833f2","Type":"ContainerStarted","Data":"b0f9e2f1687f26363631d0785bc43cfe8fb45a38b671f9efa66f2b99cbb7f9fa"} Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.100566 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.532328 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 11:25:04 crc kubenswrapper[4958]: W1201 11:25:04.539922 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dbed77a_cf9a_442d_bb6a_e7de1fad7ea6.slice/crio-6b9e7c8969d9c33e3ea6d439dfbbcd884b78c09b9fd0c56d9c52641749b7e4a6 WatchSource:0}: Error finding container 6b9e7c8969d9c33e3ea6d439dfbbcd884b78c09b9fd0c56d9c52641749b7e4a6: Status 404 returned error can't find the container with id 6b9e7c8969d9c33e3ea6d439dfbbcd884b78c09b9fd0c56d9c52641749b7e4a6 Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.645479 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.840255 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.850082 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.853857 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.856750 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.858530 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.859975 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.860025 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.860131 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-7tbc2" Dec 01 11:25:04 crc kubenswrapper[4958]: I1201 11:25:04.869043 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.014446 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05601e59-50a7-45b6-a41d-d872a023490b-kolla-config\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.014511 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05601e59-50a7-45b6-a41d-d872a023490b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.014541 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzf64\" (UniqueName: \"kubernetes.io/projected/05601e59-50a7-45b6-a41d-d872a023490b-kube-api-access-qzf64\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.014573 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/05601e59-50a7-45b6-a41d-d872a023490b-secrets\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.014601 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/05601e59-50a7-45b6-a41d-d872a023490b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.014758 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/05601e59-50a7-45b6-a41d-d872a023490b-config-data-default\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " 
pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.014905 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/05601e59-50a7-45b6-a41d-d872a023490b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.014993 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-02a4db85-926a-4f7e-81b6-0e9b5e50ad2f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02a4db85-926a-4f7e-81b6-0e9b5e50ad2f\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.015050 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05601e59-50a7-45b6-a41d-d872a023490b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.113197 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6","Type":"ContainerStarted","Data":"6b9e7c8969d9c33e3ea6d439dfbbcd884b78c09b9fd0c56d9c52641749b7e4a6"} Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.116883 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05601e59-50a7-45b6-a41d-d872a023490b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.116967 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzf64\" (UniqueName: \"kubernetes.io/projected/05601e59-50a7-45b6-a41d-d872a023490b-kube-api-access-qzf64\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.117013 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/05601e59-50a7-45b6-a41d-d872a023490b-secrets\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.117049 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/05601e59-50a7-45b6-a41d-d872a023490b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.117092 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/05601e59-50a7-45b6-a41d-d872a023490b-config-data-default\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.117128 4958 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/05601e59-50a7-45b6-a41d-d872a023490b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.117170 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-02a4db85-926a-4f7e-81b6-0e9b5e50ad2f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02a4db85-926a-4f7e-81b6-0e9b5e50ad2f\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.117203 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05601e59-50a7-45b6-a41d-d872a023490b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.117265 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05601e59-50a7-45b6-a41d-d872a023490b-kolla-config\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.122102 4958 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.122140 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-02a4db85-926a-4f7e-81b6-0e9b5e50ad2f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02a4db85-926a-4f7e-81b6-0e9b5e50ad2f\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c3b01979794400da5d24fc52bb7b03f7d754b7c8eac350c05b5724340c9adbf7/globalmount\"" pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.124416 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" event={"ID":"01594c3a-61db-4578-b3f8-5282a9c490ba","Type":"ContainerStarted","Data":"f78802e1067be3ba718cfdc4afce34a3237ca0012f41657c418ae9a46d874433"} Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.124522 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.125346 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05601e59-50a7-45b6-a41d-d872a023490b-kolla-config\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.126421 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ead29d35-e6ff-421e-82a8-832a0ae919eb","Type":"ContainerStarted","Data":"fd41f32554111ec9f5c2cb9bd0251aa0c7399a9efbeae2dd8c068a0995527ed2"} Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.126948 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/05601e59-50a7-45b6-a41d-d872a023490b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.128180 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05601e59-50a7-45b6-a41d-d872a023490b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.131440 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/05601e59-50a7-45b6-a41d-d872a023490b-config-data-default\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.133953 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" event={"ID":"1bab268c-f167-4733-adb1-38948a8833f2","Type":"ContainerStarted","Data":"dcd847e6fa5518f9168d33b682c67109ea3051280bbddd07cac73daf83c24336"} Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.135001 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.163696 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/05601e59-50a7-45b6-a41d-d872a023490b-secrets\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.164257 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/05601e59-50a7-45b6-a41d-d872a023490b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.166145 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05601e59-50a7-45b6-a41d-d872a023490b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.166390 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzf64\" (UniqueName: \"kubernetes.io/projected/05601e59-50a7-45b6-a41d-d872a023490b-kube-api-access-qzf64\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.166932 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" podStartSLOduration=3.16691249 podStartE2EDuration="3.16691249s" podCreationTimestamp="2025-12-01 11:25:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:25:05.148305804 +0000 UTC m=+5152.657094841" watchObservedRunningTime="2025-12-01 11:25:05.16691249 +0000 UTC m=+5152.675701527" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.167615 4958 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" podStartSLOduration=3.16761161 podStartE2EDuration="3.16761161s" podCreationTimestamp="2025-12-01 11:25:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:25:05.166095647 +0000 UTC m=+5152.674884684" watchObservedRunningTime="2025-12-01 11:25:05.16761161 +0000 UTC m=+5152.676400647" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.216426 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.217486 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.219491 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.221514 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-krf2m" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.231535 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.285504 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-02a4db85-926a-4f7e-81b6-0e9b5e50ad2f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02a4db85-926a-4f7e-81b6-0e9b5e50ad2f\") pod \"openstack-galera-0\" (UID: \"05601e59-50a7-45b6-a41d-d872a023490b\") " pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.320424 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a058a864-a8ee-43b1-8662-688c39309094-kolla-config\") pod \"memcached-0\" (UID: \"a058a864-a8ee-43b1-8662-688c39309094\") " pod="openstack/memcached-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.320490 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a058a864-a8ee-43b1-8662-688c39309094-config-data\") pod \"memcached-0\" (UID: \"a058a864-a8ee-43b1-8662-688c39309094\") " pod="openstack/memcached-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.320618 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhswm\" (UniqueName: \"kubernetes.io/projected/a058a864-a8ee-43b1-8662-688c39309094-kube-api-access-zhswm\") pod \"memcached-0\" (UID: \"a058a864-a8ee-43b1-8662-688c39309094\") " pod="openstack/memcached-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.422398 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhswm\" (UniqueName: \"kubernetes.io/projected/a058a864-a8ee-43b1-8662-688c39309094-kube-api-access-zhswm\") pod \"memcached-0\" (UID: \"a058a864-a8ee-43b1-8662-688c39309094\") " pod="openstack/memcached-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.422521 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a058a864-a8ee-43b1-8662-688c39309094-kolla-config\") pod \"memcached-0\" (UID: \"a058a864-a8ee-43b1-8662-688c39309094\") " pod="openstack/memcached-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 
11:25:05.422564 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a058a864-a8ee-43b1-8662-688c39309094-config-data\") pod \"memcached-0\" (UID: \"a058a864-a8ee-43b1-8662-688c39309094\") " pod="openstack/memcached-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.423569 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a058a864-a8ee-43b1-8662-688c39309094-kolla-config\") pod \"memcached-0\" (UID: \"a058a864-a8ee-43b1-8662-688c39309094\") " pod="openstack/memcached-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.423876 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a058a864-a8ee-43b1-8662-688c39309094-config-data\") pod \"memcached-0\" (UID: \"a058a864-a8ee-43b1-8662-688c39309094\") " pod="openstack/memcached-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.440673 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhswm\" (UniqueName: \"kubernetes.io/projected/a058a864-a8ee-43b1-8662-688c39309094-kube-api-access-zhswm\") pod \"memcached-0\" (UID: \"a058a864-a8ee-43b1-8662-688c39309094\") " pod="openstack/memcached-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.561772 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 11:25:05 crc kubenswrapper[4958]: I1201 11:25:05.564495 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.020664 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 11:25:06 crc kubenswrapper[4958]: W1201 11:25:06.023601 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05601e59_50a7_45b6_a41d_d872a023490b.slice/crio-9bb2bec0daaf53925697382fd79f5a93d8b503ef197ba428279ff3a06522e981 WatchSource:0}: Error finding container 9bb2bec0daaf53925697382fd79f5a93d8b503ef197ba428279ff3a06522e981: Status 404 returned error can't find the container with id 9bb2bec0daaf53925697382fd79f5a93d8b503ef197ba428279ff3a06522e981 Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.089350 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.144092 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"05601e59-50a7-45b6-a41d-d872a023490b","Type":"ContainerStarted","Data":"9bb2bec0daaf53925697382fd79f5a93d8b503ef197ba428279ff3a06522e981"} Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.188006 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.190213 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.192515 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-rcgnj"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.192594 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.192994 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.193110 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.202627 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.337271 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/78ca54d7-1a20-4f31-9f23-a1febc785c3d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.337530 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/78ca54d7-1a20-4f31-9f23-a1febc785c3d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.337554 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/78ca54d7-1a20-4f31-9f23-a1febc785c3d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.337594 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e2c3ee7c-8c51-427b-a060-cec0a9808446\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2c3ee7c-8c51-427b-a060-cec0a9808446\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.337619 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78ca54d7-1a20-4f31-9f23-a1febc785c3d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.337690 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/78ca54d7-1a20-4f31-9f23-a1febc785c3d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.337747 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49wg8\" (UniqueName: \"kubernetes.io/projected/78ca54d7-1a20-4f31-9f23-a1febc785c3d-kube-api-access-49wg8\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.337766 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/78ca54d7-1a20-4f31-9f23-a1febc785c3d-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.337791 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78ca54d7-1a20-4f31-9f23-a1febc785c3d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.438667 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e2c3ee7c-8c51-427b-a060-cec0a9808446\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2c3ee7c-8c51-427b-a060-cec0a9808446\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.438718 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78ca54d7-1a20-4f31-9f23-a1febc785c3d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.438755 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/78ca54d7-1a20-4f31-9f23-a1febc785c3d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.438813 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49wg8\" (UniqueName: \"kubernetes.io/projected/78ca54d7-1a20-4f31-9f23-a1febc785c3d-kube-api-access-49wg8\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.438833 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/78ca54d7-1a20-4f31-9f23-a1febc785c3d-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.438874 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78ca54d7-1a20-4f31-9f23-a1febc785c3d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.438897 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/78ca54d7-1a20-4f31-9f23-a1febc785c3d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.438913 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/78ca54d7-1a20-4f31-9f23-a1febc785c3d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.438935 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/78ca54d7-1a20-4f31-9f23-a1febc785c3d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.440075 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/78ca54d7-1a20-4f31-9f23-a1febc785c3d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.440734 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78ca54d7-1a20-4f31-9f23-a1febc785c3d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.441815 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/78ca54d7-1a20-4f31-9f23-a1febc785c3d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.442378 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/78ca54d7-1a20-4f31-9f23-a1febc785c3d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.447305 4958 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.447353 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e2c3ee7c-8c51-427b-a060-cec0a9808446\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2c3ee7c-8c51-427b-a060-cec0a9808446\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3e543f7e79882d0983255a4b2ea1ea13eb8e5f1b5277ac3606fc4211984b7d91/globalmount\"" pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.864910 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/78ca54d7-1a20-4f31-9f23-a1febc785c3d-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.864981 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78ca54d7-1a20-4f31-9f23-a1febc785c3d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.865464 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/78ca54d7-1a20-4f31-9f23-a1febc785c3d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.866310 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49wg8\" (UniqueName: \"kubernetes.io/projected/78ca54d7-1a20-4f31-9f23-a1febc785c3d-kube-api-access-49wg8\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: W1201 11:25:06.875687 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda058a864_a8ee_43b1_8662_688c39309094.slice/crio-f03dfa0943cf8d950048833906cf806502eef665b57cd63f93bc61b769fa1c7e WatchSource:0}: Error finding container f03dfa0943cf8d950048833906cf806502eef665b57cd63f93bc61b769fa1c7e: Status 404 returned error can't find the container with id f03dfa0943cf8d950048833906cf806502eef665b57cd63f93bc61b769fa1c7e
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.904836 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e2c3ee7c-8c51-427b-a060-cec0a9808446\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e2c3ee7c-8c51-427b-a060-cec0a9808446\") pod \"openstack-cell1-galera-0\" (UID: \"78ca54d7-1a20-4f31-9f23-a1febc785c3d\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:06 crc kubenswrapper[4958]: I1201 11:25:06.932508 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:07 crc kubenswrapper[4958]: I1201 11:25:07.218369 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"05601e59-50a7-45b6-a41d-d872a023490b","Type":"ContainerStarted","Data":"598d9623056c7ec649e3bf3bd7c7989e538b759b78631a80aa6bd18eef23e67a"}
Dec 01 11:25:07 crc kubenswrapper[4958]: I1201 11:25:07.253988 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a058a864-a8ee-43b1-8662-688c39309094","Type":"ContainerStarted","Data":"f03dfa0943cf8d950048833906cf806502eef665b57cd63f93bc61b769fa1c7e"}
Dec 01 11:25:07 crc kubenswrapper[4958]: I1201 11:25:07.444056 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 01 11:25:07 crc kubenswrapper[4958]: I1201 11:25:07.797653 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d"
Dec 01 11:25:07 crc kubenswrapper[4958]: E1201 11:25:07.797951 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:25:08 crc kubenswrapper[4958]: I1201 11:25:08.259530 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ead29d35-e6ff-421e-82a8-832a0ae919eb","Type":"ContainerStarted","Data":"819780a1bea231de28bb8ac9b76c549f4ba6c5d701f719a75f4caea4ecf9f2a0"}
Dec 01 11:25:08 crc kubenswrapper[4958]: I1201 11:25:08.261485 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"78ca54d7-1a20-4f31-9f23-a1febc785c3d","Type":"ContainerStarted","Data":"e3d7823f6b888e159187633b99a961bcb4a8c95d6b11ecf0d9e52da8c7a89738"}
Dec 01 11:25:08 crc kubenswrapper[4958]: I1201 11:25:08.261532 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"78ca54d7-1a20-4f31-9f23-a1febc785c3d","Type":"ContainerStarted","Data":"40891e8f5084683826fe47ab2693c85ea0437c4766c47166b66c7cc155bc092b"}
Dec 01 11:25:08 crc kubenswrapper[4958]: I1201 11:25:08.263288 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6","Type":"ContainerStarted","Data":"7172a37260f1d46537247717c600b3237a0478558fbbb1b5e16d822a6d70f303"}
Dec 01 11:25:08 crc kubenswrapper[4958]: I1201 11:25:08.266998 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a058a864-a8ee-43b1-8662-688c39309094","Type":"ContainerStarted","Data":"e1fba6f0e5aada85e0be0fb3356ba5a702caeaf8f209830d9d387ea6a4380d7f"}
Dec 01 11:25:08 crc kubenswrapper[4958]: I1201 11:25:08.267048 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Dec 01 11:25:08 crc kubenswrapper[4958]: I1201 11:25:08.387976 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.3879546019999998 podStartE2EDuration="3.387954602s" podCreationTimestamp="2025-12-01 11:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:25:08.385516673 +0000 UTC m=+5155.894305720" watchObservedRunningTime="2025-12-01 11:25:08.387954602 +0000 UTC m=+5155.896743649"
Dec 01 11:25:11 crc kubenswrapper[4958]: I1201 11:25:11.295463 4958 generic.go:334] "Generic (PLEG): container finished" podID="78ca54d7-1a20-4f31-9f23-a1febc785c3d" containerID="e3d7823f6b888e159187633b99a961bcb4a8c95d6b11ecf0d9e52da8c7a89738" exitCode=0
Dec 01 11:25:11 crc kubenswrapper[4958]: I1201 11:25:11.295547 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"78ca54d7-1a20-4f31-9f23-a1febc785c3d","Type":"ContainerDied","Data":"e3d7823f6b888e159187633b99a961bcb4a8c95d6b11ecf0d9e52da8c7a89738"}
Dec 01 11:25:11 crc kubenswrapper[4958]: I1201 11:25:11.298650 4958 generic.go:334] "Generic (PLEG): container finished" podID="05601e59-50a7-45b6-a41d-d872a023490b" containerID="598d9623056c7ec649e3bf3bd7c7989e538b759b78631a80aa6bd18eef23e67a" exitCode=0
Dec 01 11:25:11 crc kubenswrapper[4958]: I1201 11:25:11.298682 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"05601e59-50a7-45b6-a41d-d872a023490b","Type":"ContainerDied","Data":"598d9623056c7ec649e3bf3bd7c7989e538b759b78631a80aa6bd18eef23e67a"}
Dec 01 11:25:12 crc kubenswrapper[4958]: I1201 11:25:12.308026 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"05601e59-50a7-45b6-a41d-d872a023490b","Type":"ContainerStarted","Data":"6be701fb3c405c2876160ee82859b3e21609078cf6a9b66f8ca72f30d3aee182"}
Dec 01 11:25:12 crc kubenswrapper[4958]: I1201 11:25:12.310350 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"78ca54d7-1a20-4f31-9f23-a1febc785c3d","Type":"ContainerStarted","Data":"5a9983284e1bd5165554d979f43929669a7047c05a6539c50b760920fb178afa"}
Dec 01 11:25:12 crc kubenswrapper[4958]: I1201 11:25:12.340002 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.339976011 podStartE2EDuration="9.339976011s" podCreationTimestamp="2025-12-01 11:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:25:12.338937312 +0000 UTC m=+5159.847726359" watchObservedRunningTime="2025-12-01 11:25:12.339976011 +0000 UTC m=+5159.848765068"
Dec 01 11:25:12 crc kubenswrapper[4958]: I1201 11:25:12.376670 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.37664626 podStartE2EDuration="7.37664626s" podCreationTimestamp="2025-12-01 11:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:25:12.366980536 +0000 UTC m=+5159.875769583" watchObservedRunningTime="2025-12-01 11:25:12.37664626 +0000 UTC m=+5159.885435317"
Dec 01 11:25:12 crc kubenswrapper[4958]: I1201 11:25:12.838414 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6"
Dec 01 11:25:12 crc kubenswrapper[4958]: I1201 11:25:12.897208 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j"
Dec 01 11:25:12 crc kubenswrapper[4958]: I1201 11:25:12.916870 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-h4j7j"]
Dec 01 11:25:13 crc kubenswrapper[4958]: I1201 11:25:13.320145 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" podUID="01594c3a-61db-4578-b3f8-5282a9c490ba" containerName="dnsmasq-dns" containerID="cri-o://f78802e1067be3ba718cfdc4afce34a3237ca0012f41657c418ae9a46d874433" gracePeriod=10
Dec 01 11:25:13 crc kubenswrapper[4958]: I1201 11:25:13.777165 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j"
Dec 01 11:25:13 crc kubenswrapper[4958]: I1201 11:25:13.846560 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01594c3a-61db-4578-b3f8-5282a9c490ba-dns-svc\") pod \"01594c3a-61db-4578-b3f8-5282a9c490ba\" (UID: \"01594c3a-61db-4578-b3f8-5282a9c490ba\") "
Dec 01 11:25:13 crc kubenswrapper[4958]: I1201 11:25:13.846629 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbdjt\" (UniqueName: \"kubernetes.io/projected/01594c3a-61db-4578-b3f8-5282a9c490ba-kube-api-access-wbdjt\") pod \"01594c3a-61db-4578-b3f8-5282a9c490ba\" (UID: \"01594c3a-61db-4578-b3f8-5282a9c490ba\") "
Dec 01 11:25:13 crc kubenswrapper[4958]: I1201 11:25:13.846734 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01594c3a-61db-4578-b3f8-5282a9c490ba-config\") pod \"01594c3a-61db-4578-b3f8-5282a9c490ba\" (UID: \"01594c3a-61db-4578-b3f8-5282a9c490ba\") "
Dec 01 11:25:13 crc kubenswrapper[4958]: I1201 11:25:13.852043 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01594c3a-61db-4578-b3f8-5282a9c490ba-kube-api-access-wbdjt" (OuterVolumeSpecName: "kube-api-access-wbdjt") pod "01594c3a-61db-4578-b3f8-5282a9c490ba" (UID: "01594c3a-61db-4578-b3f8-5282a9c490ba"). InnerVolumeSpecName "kube-api-access-wbdjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:25:13 crc kubenswrapper[4958]: I1201 11:25:13.883537 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01594c3a-61db-4578-b3f8-5282a9c490ba-config" (OuterVolumeSpecName: "config") pod "01594c3a-61db-4578-b3f8-5282a9c490ba" (UID: "01594c3a-61db-4578-b3f8-5282a9c490ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:25:13 crc kubenswrapper[4958]: I1201 11:25:13.897781 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01594c3a-61db-4578-b3f8-5282a9c490ba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "01594c3a-61db-4578-b3f8-5282a9c490ba" (UID: "01594c3a-61db-4578-b3f8-5282a9c490ba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:25:13 crc kubenswrapper[4958]: I1201 11:25:13.949362 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01594c3a-61db-4578-b3f8-5282a9c490ba-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 01 11:25:13 crc kubenswrapper[4958]: I1201 11:25:13.949598 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbdjt\" (UniqueName: \"kubernetes.io/projected/01594c3a-61db-4578-b3f8-5282a9c490ba-kube-api-access-wbdjt\") on node \"crc\" DevicePath \"\""
Dec 01 11:25:13 crc kubenswrapper[4958]: I1201 11:25:13.949694 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01594c3a-61db-4578-b3f8-5282a9c490ba-config\") on node \"crc\" DevicePath \"\""
Dec 01 11:25:14 crc kubenswrapper[4958]: I1201 11:25:14.333335 4958 generic.go:334] "Generic (PLEG): container finished" podID="01594c3a-61db-4578-b3f8-5282a9c490ba" containerID="f78802e1067be3ba718cfdc4afce34a3237ca0012f41657c418ae9a46d874433" exitCode=0
Dec 01 11:25:14 crc kubenswrapper[4958]: I1201 11:25:14.333455 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j"
Dec 01 11:25:14 crc kubenswrapper[4958]: I1201 11:25:14.333494 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" event={"ID":"01594c3a-61db-4578-b3f8-5282a9c490ba","Type":"ContainerDied","Data":"f78802e1067be3ba718cfdc4afce34a3237ca0012f41657c418ae9a46d874433"}
Dec 01 11:25:14 crc kubenswrapper[4958]: I1201 11:25:14.335558 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-h4j7j" event={"ID":"01594c3a-61db-4578-b3f8-5282a9c490ba","Type":"ContainerDied","Data":"ff063d3b587441e5f233f0ad5e1a3e6dbef6ef52ef4205a7770a5a28cd900dd4"}
Dec 01 11:25:14 crc kubenswrapper[4958]: I1201 11:25:14.335591 4958 scope.go:117] "RemoveContainer" containerID="f78802e1067be3ba718cfdc4afce34a3237ca0012f41657c418ae9a46d874433"
Dec 01 11:25:14 crc kubenswrapper[4958]: I1201 11:25:14.373734 4958 scope.go:117] "RemoveContainer" containerID="94de3c08804648c6950b7cf33c5e45a596ff3d408256ebd0ba90c816c901d189"
Dec 01 11:25:14 crc kubenswrapper[4958]: I1201 11:25:14.385693 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-h4j7j"]
Dec 01 11:25:14 crc kubenswrapper[4958]: I1201 11:25:14.397894 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-h4j7j"]
Dec 01 11:25:14 crc kubenswrapper[4958]: I1201 11:25:14.409459 4958 scope.go:117] "RemoveContainer" containerID="f78802e1067be3ba718cfdc4afce34a3237ca0012f41657c418ae9a46d874433"
Dec 01 11:25:14 crc kubenswrapper[4958]: E1201 11:25:14.411919 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f78802e1067be3ba718cfdc4afce34a3237ca0012f41657c418ae9a46d874433\": container with ID starting with f78802e1067be3ba718cfdc4afce34a3237ca0012f41657c418ae9a46d874433 not found: ID does not exist" containerID="f78802e1067be3ba718cfdc4afce34a3237ca0012f41657c418ae9a46d874433"
Dec 01 11:25:14 crc kubenswrapper[4958]: I1201 11:25:14.412019 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f78802e1067be3ba718cfdc4afce34a3237ca0012f41657c418ae9a46d874433"} err="failed to get container status \"f78802e1067be3ba718cfdc4afce34a3237ca0012f41657c418ae9a46d874433\": rpc error: code = NotFound desc = could not find container \"f78802e1067be3ba718cfdc4afce34a3237ca0012f41657c418ae9a46d874433\": container with ID starting with f78802e1067be3ba718cfdc4afce34a3237ca0012f41657c418ae9a46d874433 not found: ID does not exist"
Dec 01 11:25:14 crc kubenswrapper[4958]: I1201 11:25:14.412070 4958 scope.go:117] "RemoveContainer" containerID="94de3c08804648c6950b7cf33c5e45a596ff3d408256ebd0ba90c816c901d189"
Dec 01 11:25:14 crc kubenswrapper[4958]: E1201 11:25:14.412870 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94de3c08804648c6950b7cf33c5e45a596ff3d408256ebd0ba90c816c901d189\": container with ID starting with 94de3c08804648c6950b7cf33c5e45a596ff3d408256ebd0ba90c816c901d189 not found: ID does not exist" containerID="94de3c08804648c6950b7cf33c5e45a596ff3d408256ebd0ba90c816c901d189"
Dec 01 11:25:14 crc kubenswrapper[4958]: I1201 11:25:14.412937 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94de3c08804648c6950b7cf33c5e45a596ff3d408256ebd0ba90c816c901d189"} err="failed to get container status \"94de3c08804648c6950b7cf33c5e45a596ff3d408256ebd0ba90c816c901d189\": rpc error: code = NotFound desc = could not find container \"94de3c08804648c6950b7cf33c5e45a596ff3d408256ebd0ba90c816c901d189\": container with ID starting with 94de3c08804648c6950b7cf33c5e45a596ff3d408256ebd0ba90c816c901d189 not found: ID does not exist"
Dec 01 11:25:15 crc kubenswrapper[4958]: I1201 11:25:15.562781 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Dec 01 11:25:15 crc kubenswrapper[4958]: I1201 11:25:15.563196 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Dec 01 11:25:15 crc kubenswrapper[4958]: I1201 11:25:15.566385 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Dec 01 11:25:15 crc kubenswrapper[4958]: I1201 11:25:15.809352 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01594c3a-61db-4578-b3f8-5282a9c490ba" path="/var/lib/kubelet/pods/01594c3a-61db-4578-b3f8-5282a9c490ba/volumes"
Dec 01 11:25:16 crc kubenswrapper[4958]: I1201 11:25:16.933300 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:16 crc kubenswrapper[4958]: I1201 11:25:16.933361 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:17 crc kubenswrapper[4958]: I1201 11:25:17.649473 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Dec 01 11:25:17 crc kubenswrapper[4958]: I1201 11:25:17.738124 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Dec 01 11:25:19 crc kubenswrapper[4958]: I1201 11:25:19.010783 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:19 crc kubenswrapper[4958]: I1201 11:25:19.066665 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Dec 01 11:25:19 crc kubenswrapper[4958]: I1201 11:25:19.798249 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d"
Dec 01 11:25:19 crc kubenswrapper[4958]: E1201 11:25:19.798986 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:25:22 crc kubenswrapper[4958]: I1201 11:25:22.648558 4958 scope.go:117] "RemoveContainer" containerID="70a0c0622301d5bef648b9bce9274d82d56335ac131dc5a4ea5186895b7cedf4"
Dec 01 11:25:22 crc kubenswrapper[4958]: I1201 11:25:22.682808 4958 scope.go:117] "RemoveContainer" containerID="b4b40a5b3119f4ace032ce3ee48c6cec0accd5e209455782f63444a2d9216f60"
Dec 01 11:25:22 crc kubenswrapper[4958]: I1201 11:25:22.732479 4958 scope.go:117] "RemoveContainer" containerID="8ffddfce18fb186d88f4faee8b72fdf4ee7a2e719d505d624fa3501777106036"
Dec 01 11:25:33 crc kubenswrapper[4958]: I1201 11:25:33.803659 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d"
Dec 01 11:25:33 crc kubenswrapper[4958]: E1201 11:25:33.804599 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:25:40 crc kubenswrapper[4958]: I1201 11:25:40.544653 4958 generic.go:334] "Generic (PLEG): container finished" podID="1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" containerID="7172a37260f1d46537247717c600b3237a0478558fbbb1b5e16d822a6d70f303" exitCode=0
Dec 01 11:25:40 crc kubenswrapper[4958]: I1201 11:25:40.544790 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6","Type":"ContainerDied","Data":"7172a37260f1d46537247717c600b3237a0478558fbbb1b5e16d822a6d70f303"}
Dec 01 11:25:40 crc kubenswrapper[4958]: I1201 11:25:40.547224 4958 generic.go:334] "Generic (PLEG): container finished" podID="ead29d35-e6ff-421e-82a8-832a0ae919eb" containerID="819780a1bea231de28bb8ac9b76c549f4ba6c5d701f719a75f4caea4ecf9f2a0" exitCode=0
Dec 01 11:25:40 crc kubenswrapper[4958]: I1201 11:25:40.547239 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ead29d35-e6ff-421e-82a8-832a0ae919eb","Type":"ContainerDied","Data":"819780a1bea231de28bb8ac9b76c549f4ba6c5d701f719a75f4caea4ecf9f2a0"}
Dec 01 11:25:41 crc kubenswrapper[4958]: I1201 11:25:41.560616 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6","Type":"ContainerStarted","Data":"74d79ad2c101a829440803cbf9720f74bccede44444c8d490e08b8bab068cdcc"}
Dec 01 11:25:41 crc kubenswrapper[4958]: I1201 11:25:41.562377 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Dec 01 11:25:41 crc kubenswrapper[4958]: I1201 11:25:41.565556 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ead29d35-e6ff-421e-82a8-832a0ae919eb","Type":"ContainerStarted","Data":"27e3d539686a882a9e3b88897d6e0d1587e129958145c8cf8fa971e8730045c0"}
Dec 01 11:25:41 crc kubenswrapper[4958]: I1201 11:25:41.566184 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Dec 01 11:25:41 crc kubenswrapper[4958]: I1201 11:25:41.620454 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.620437368 podStartE2EDuration="39.620437368s" podCreationTimestamp="2025-12-01 11:25:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:25:41.618572425 +0000 UTC m=+5189.127361472" watchObservedRunningTime="2025-12-01 11:25:41.620437368 +0000 UTC m=+5189.129226405"
Dec 01 11:25:41 crc kubenswrapper[4958]: I1201 11:25:41.621780 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.621770385 podStartE2EDuration="39.621770385s" podCreationTimestamp="2025-12-01 11:25:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:25:41.588645747 +0000 UTC m=+5189.097434874" watchObservedRunningTime="2025-12-01 11:25:41.621770385 +0000 UTC m=+5189.130559422"
Dec 01 11:25:46 crc kubenswrapper[4958]: I1201 11:25:46.798145 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d"
Dec 01 11:25:46 crc kubenswrapper[4958]: E1201 11:25:46.799263 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:25:54 crc kubenswrapper[4958]: I1201 11:25:54.049089 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Dec 01 11:25:54 crc kubenswrapper[4958]: I1201 11:25:54.104294 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Dec 01 11:25:59 crc kubenswrapper[4958]: I1201 11:25:59.727906 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-pm49z"]
Dec 01 11:25:59 crc kubenswrapper[4958]: E1201 11:25:59.730203 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01594c3a-61db-4578-b3f8-5282a9c490ba" containerName="dnsmasq-dns"
Dec 01 11:25:59 crc kubenswrapper[4958]: I1201 11:25:59.730443 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="01594c3a-61db-4578-b3f8-5282a9c490ba" containerName="dnsmasq-dns"
Dec 01 11:25:59 crc kubenswrapper[4958]: E1201 11:25:59.730545 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01594c3a-61db-4578-b3f8-5282a9c490ba" containerName="init"
Dec 01 11:25:59 crc kubenswrapper[4958]: I1201 11:25:59.730612 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="01594c3a-61db-4578-b3f8-5282a9c490ba" containerName="init"
Dec 01 11:25:59 crc kubenswrapper[4958]: I1201 11:25:59.730891 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="01594c3a-61db-4578-b3f8-5282a9c490ba" containerName="dnsmasq-dns"
Dec 01 11:25:59 crc kubenswrapper[4958]: I1201 11:25:59.732185 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z"
Dec 01 11:25:59 crc kubenswrapper[4958]: I1201 11:25:59.735984 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-pm49z"]
Dec 01 11:25:59 crc kubenswrapper[4958]: I1201 11:25:59.864686 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgq6m\" (UniqueName: \"kubernetes.io/projected/1c4a10a2-79f1-40f5-9061-0c0d8626a150-kube-api-access-jgq6m\") pod \"dnsmasq-dns-5b7946d7b9-pm49z\" (UID: \"1c4a10a2-79f1-40f5-9061-0c0d8626a150\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z"
Dec 01 11:25:59 crc kubenswrapper[4958]: I1201 11:25:59.864782 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c4a10a2-79f1-40f5-9061-0c0d8626a150-config\") pod \"dnsmasq-dns-5b7946d7b9-pm49z\" (UID: \"1c4a10a2-79f1-40f5-9061-0c0d8626a150\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z"
Dec 01 11:25:59 crc kubenswrapper[4958]: I1201 11:25:59.864972 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c4a10a2-79f1-40f5-9061-0c0d8626a150-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-pm49z\" (UID: \"1c4a10a2-79f1-40f5-9061-0c0d8626a150\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z"
Dec 01 11:25:59 crc kubenswrapper[4958]: I1201 11:25:59.966935 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c4a10a2-79f1-40f5-9061-0c0d8626a150-config\") pod \"dnsmasq-dns-5b7946d7b9-pm49z\" (UID: \"1c4a10a2-79f1-40f5-9061-0c0d8626a150\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z"
Dec 01 11:25:59 crc kubenswrapper[4958]: I1201 11:25:59.966997 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c4a10a2-79f1-40f5-9061-0c0d8626a150-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-pm49z\" (UID: \"1c4a10a2-79f1-40f5-9061-0c0d8626a150\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z"
Dec 01 11:25:59 crc kubenswrapper[4958]: I1201 11:25:59.967151 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgq6m\" (UniqueName: \"kubernetes.io/projected/1c4a10a2-79f1-40f5-9061-0c0d8626a150-kube-api-access-jgq6m\") pod \"dnsmasq-dns-5b7946d7b9-pm49z\" (UID: \"1c4a10a2-79f1-40f5-9061-0c0d8626a150\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z"
Dec 01 11:25:59 crc kubenswrapper[4958]: I1201 11:25:59.968251 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c4a10a2-79f1-40f5-9061-0c0d8626a150-config\") pod \"dnsmasq-dns-5b7946d7b9-pm49z\" (UID: \"1c4a10a2-79f1-40f5-9061-0c0d8626a150\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z"
Dec 01 11:25:59 crc kubenswrapper[4958]: I1201 11:25:59.968288 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c4a10a2-79f1-40f5-9061-0c0d8626a150-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-pm49z\" (UID: \"1c4a10a2-79f1-40f5-9061-0c0d8626a150\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z"
Dec 01 11:25:59 crc kubenswrapper[4958]: I1201 11:25:59.995019 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgq6m\" (UniqueName: \"kubernetes.io/projected/1c4a10a2-79f1-40f5-9061-0c0d8626a150-kube-api-access-jgq6m\") pod \"dnsmasq-dns-5b7946d7b9-pm49z\" (UID: \"1c4a10a2-79f1-40f5-9061-0c0d8626a150\") " pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z"
Dec 01 11:26:00 crc kubenswrapper[4958]: I1201 11:26:00.054508 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z"
Dec 01 11:26:00 crc kubenswrapper[4958]: I1201 11:26:00.401225 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 01 11:26:00 crc kubenswrapper[4958]: I1201 11:26:00.487912 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-pm49z"]
Dec 01 11:26:00 crc kubenswrapper[4958]: I1201 11:26:00.797485 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d"
Dec 01 11:26:00 crc kubenswrapper[4958]: E1201 11:26:00.797943 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:26:01 crc kubenswrapper[4958]: I1201 11:26:01.269756 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 01 11:26:01 crc kubenswrapper[4958]: I1201 11:26:01.749258 4958 generic.go:334] "Generic (PLEG): container finished" podID="1c4a10a2-79f1-40f5-9061-0c0d8626a150" containerID="da77a2be8453461a4e3955f3670ef87a6cbd8fee93e8cccf261f766b31c957b2" exitCode=0
Dec 01 11:26:01 crc kubenswrapper[4958]: I1201 11:26:01.749308 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z" event={"ID":"1c4a10a2-79f1-40f5-9061-0c0d8626a150","Type":"ContainerDied","Data":"da77a2be8453461a4e3955f3670ef87a6cbd8fee93e8cccf261f766b31c957b2"}
Dec 01 11:26:01 crc kubenswrapper[4958]: I1201 11:26:01.749346 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z" event={"ID":"1c4a10a2-79f1-40f5-9061-0c0d8626a150","Type":"ContainerStarted","Data":"a9dbc4e4d00f537f935cd9eadd8d528422d5a4555a2b6023985043c7e4793682"}
Dec 01 11:26:02 crc kubenswrapper[4958]: I1201 11:26:02.661571 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" containerName="rabbitmq" containerID="cri-o://74d79ad2c101a829440803cbf9720f74bccede44444c8d490e08b8bab068cdcc" gracePeriod=604798
Dec 01 11:26:02 crc kubenswrapper[4958]: I1201 11:26:02.767227 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z" event={"ID":"1c4a10a2-79f1-40f5-9061-0c0d8626a150","Type":"ContainerStarted","Data":"e0a76210329b60b9b44cfb4873ba156cecc26ae8030b934f643d8f34e9621ab7"}
Dec 01 11:26:02 crc kubenswrapper[4958]: I1201 11:26:02.767608 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z"
Dec 01 11:26:02 crc kubenswrapper[4958]: I1201 11:26:02.798944 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z" podStartSLOduration=3.798820794 podStartE2EDuration="3.798820794s" podCreationTimestamp="2025-12-01 11:25:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:26:02.793066031 +0000 UTC m=+5210.301855068" watchObservedRunningTime="2025-12-01 11:26:02.798820794 +0000 UTC m=+5210.307609851"
Dec 01 11:26:03 crc kubenswrapper[4958]: I1201 11:26:03.222009 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="ead29d35-e6ff-421e-82a8-832a0ae919eb" containerName="rabbitmq" containerID="cri-o://27e3d539686a882a9e3b88897d6e0d1587e129958145c8cf8fa971e8730045c0" gracePeriod=604799
Dec 01 11:26:04 crc kubenswrapper[4958]: I1201 11:26:04.046068 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.237:5672: connect: connection refused"
Dec 01 11:26:04 crc kubenswrapper[4958]: I1201 11:26:04.101936 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="ead29d35-e6ff-421e-82a8-832a0ae919eb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.238:5672: connect: connection refused"
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.381085 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.499233 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-rabbitmq-confd\") pod \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.499578 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z7db\" (UniqueName: \"kubernetes.io/projected/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-kube-api-access-7z7db\") pod \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.499634 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-server-conf\") pod \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.499659 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-pod-info\") pod \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.499690 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-rabbitmq-erlang-cookie\") pod \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.499727 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-rabbitmq-plugins\") pod \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.499970 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\") pod \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.500022 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-plugins-conf\") pod \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.500056 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-erlang-cookie-secret\") pod \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\" (UID: \"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6\") "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.501093 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" (UID: "1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.501731 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" (UID: "1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.501937 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" (UID: "1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.515346 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-pod-info" (OuterVolumeSpecName: "pod-info") pod "1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" (UID: "1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.515367 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-kube-api-access-7z7db" (OuterVolumeSpecName: "kube-api-access-7z7db") pod "1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" (UID: "1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6"). InnerVolumeSpecName "kube-api-access-7z7db". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.519213 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" (UID: "1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.533982 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1430cf56-800c-45ee-ad70-952e5ef338d2" (OuterVolumeSpecName: "persistence") pod "1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" (UID: "1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6"). InnerVolumeSpecName "pvc-1430cf56-800c-45ee-ad70-952e5ef338d2". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.555708 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-server-conf" (OuterVolumeSpecName: "server-conf") pod "1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" (UID: "1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.601311 4958 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-plugins-conf\") on node \"crc\" DevicePath \"\""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.601345 4958 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.601384 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z7db\" (UniqueName: \"kubernetes.io/projected/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-kube-api-access-7z7db\") on node \"crc\" DevicePath \"\""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.601395 4958 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-server-conf\") on node \"crc\" DevicePath \"\""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.601404 4958 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-pod-info\") on node \"crc\" DevicePath \"\""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.601414 4958 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.601422 4958 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.601449 4958 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\") on node \"crc\" "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.602626 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" (UID: "1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.618276 4958 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.618438 4958 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-1430cf56-800c-45ee-ad70-952e5ef338d2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1430cf56-800c-45ee-ad70-952e5ef338d2") on node "crc"
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.703299 4958 reconciler_common.go:293] "Volume detached for volume \"pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\") on node \"crc\" DevicePath \"\""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.703345 4958 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.796691 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.840881 4958 generic.go:334] "Generic (PLEG): container finished" podID="ead29d35-e6ff-421e-82a8-832a0ae919eb" containerID="27e3d539686a882a9e3b88897d6e0d1587e129958145c8cf8fa971e8730045c0" exitCode=0
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.840949 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ead29d35-e6ff-421e-82a8-832a0ae919eb","Type":"ContainerDied","Data":"27e3d539686a882a9e3b88897d6e0d1587e129958145c8cf8fa971e8730045c0"}
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.840981 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ead29d35-e6ff-421e-82a8-832a0ae919eb","Type":"ContainerDied","Data":"fd41f32554111ec9f5c2cb9bd0251aa0c7399a9efbeae2dd8c068a0995527ed2"}
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.841001 4958 scope.go:117] "RemoveContainer" containerID="27e3d539686a882a9e3b88897d6e0d1587e129958145c8cf8fa971e8730045c0"
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.841135 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.850886 4958 generic.go:334] "Generic (PLEG): container finished" podID="1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" containerID="74d79ad2c101a829440803cbf9720f74bccede44444c8d490e08b8bab068cdcc" exitCode=0
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.850966 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6","Type":"ContainerDied","Data":"74d79ad2c101a829440803cbf9720f74bccede44444c8d490e08b8bab068cdcc"}
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.851011 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6","Type":"ContainerDied","Data":"6b9e7c8969d9c33e3ea6d439dfbbcd884b78c09b9fd0c56d9c52641749b7e4a6"}
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.851024 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.870145 4958 scope.go:117] "RemoveContainer" containerID="819780a1bea231de28bb8ac9b76c549f4ba6c5d701f719a75f4caea4ecf9f2a0"
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.905085 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ead29d35-e6ff-421e-82a8-832a0ae919eb-rabbitmq-plugins\") pod \"ead29d35-e6ff-421e-82a8-832a0ae919eb\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.905307 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\") pod \"ead29d35-e6ff-421e-82a8-832a0ae919eb\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.905393 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ead29d35-e6ff-421e-82a8-832a0ae919eb-rabbitmq-erlang-cookie\") pod \"ead29d35-e6ff-421e-82a8-832a0ae919eb\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.905421 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqbcc\" (UniqueName: \"kubernetes.io/projected/ead29d35-e6ff-421e-82a8-832a0ae919eb-kube-api-access-wqbcc\") pod \"ead29d35-e6ff-421e-82a8-832a0ae919eb\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.905466 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ead29d35-e6ff-421e-82a8-832a0ae919eb-plugins-conf\") pod \"ead29d35-e6ff-421e-82a8-832a0ae919eb\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.905487 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ead29d35-e6ff-421e-82a8-832a0ae919eb-pod-info\") pod \"ead29d35-e6ff-421e-82a8-832a0ae919eb\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.905537 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ead29d35-e6ff-421e-82a8-832a0ae919eb-erlang-cookie-secret\") pod \"ead29d35-e6ff-421e-82a8-832a0ae919eb\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.905563 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ead29d35-e6ff-421e-82a8-832a0ae919eb-server-conf\") pod \"ead29d35-e6ff-421e-82a8-832a0ae919eb\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.905615 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ead29d35-e6ff-421e-82a8-832a0ae919eb-rabbitmq-confd\") pod \"ead29d35-e6ff-421e-82a8-832a0ae919eb\" (UID: \"ead29d35-e6ff-421e-82a8-832a0ae919eb\") "
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.906126 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ead29d35-e6ff-421e-82a8-832a0ae919eb-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ead29d35-e6ff-421e-82a8-832a0ae919eb" (UID: "ead29d35-e6ff-421e-82a8-832a0ae919eb"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.911364 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ead29d35-e6ff-421e-82a8-832a0ae919eb-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ead29d35-e6ff-421e-82a8-832a0ae919eb" (UID: "ead29d35-e6ff-421e-82a8-832a0ae919eb"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.914544 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ead29d35-e6ff-421e-82a8-832a0ae919eb-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ead29d35-e6ff-421e-82a8-832a0ae919eb" (UID: "ead29d35-e6ff-421e-82a8-832a0ae919eb"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.915328 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ead29d35-e6ff-421e-82a8-832a0ae919eb-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ead29d35-e6ff-421e-82a8-832a0ae919eb" (UID: "ead29d35-e6ff-421e-82a8-832a0ae919eb"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.915987 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ead29d35-e6ff-421e-82a8-832a0ae919eb-pod-info" (OuterVolumeSpecName: "pod-info") pod "ead29d35-e6ff-421e-82a8-832a0ae919eb" (UID: "ead29d35-e6ff-421e-82a8-832a0ae919eb"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.930963 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.935167 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ead29d35-e6ff-421e-82a8-832a0ae919eb-kube-api-access-wqbcc" (OuterVolumeSpecName: "kube-api-access-wqbcc") pod "ead29d35-e6ff-421e-82a8-832a0ae919eb" (UID: "ead29d35-e6ff-421e-82a8-832a0ae919eb"). InnerVolumeSpecName "kube-api-access-wqbcc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.945748 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ead29d35-e6ff-421e-82a8-832a0ae919eb-server-conf" (OuterVolumeSpecName: "server-conf") pod "ead29d35-e6ff-421e-82a8-832a0ae919eb" (UID: "ead29d35-e6ff-421e-82a8-832a0ae919eb"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.946145 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631" (OuterVolumeSpecName: "persistence") pod "ead29d35-e6ff-421e-82a8-832a0ae919eb" (UID: "ead29d35-e6ff-421e-82a8-832a0ae919eb"). InnerVolumeSpecName "pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.957297 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.962121 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 01 11:26:09 crc kubenswrapper[4958]: E1201 11:26:09.962547 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" containerName="rabbitmq"
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.962564 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" containerName="rabbitmq"
Dec 01 11:26:09 crc kubenswrapper[4958]: E1201 11:26:09.962586 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ead29d35-e6ff-421e-82a8-832a0ae919eb" containerName="rabbitmq"
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.962593 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ead29d35-e6ff-421e-82a8-832a0ae919eb" containerName="rabbitmq"
Dec 01 11:26:09 crc kubenswrapper[4958]: E1201 11:26:09.962606 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" containerName="setup-container"
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.962612 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" containerName="setup-container"
Dec 01 11:26:09 crc kubenswrapper[4958]: E1201 11:26:09.962623 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ead29d35-e6ff-421e-82a8-832a0ae919eb" containerName="setup-container"
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.962628 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ead29d35-e6ff-421e-82a8-832a0ae919eb" containerName="setup-container"
Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.962759 4958 memory_manager.go:354] "RemoveStaleState
removing state" podUID="1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" containerName="rabbitmq" Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.962775 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ead29d35-e6ff-421e-82a8-832a0ae919eb" containerName="rabbitmq" Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.963650 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.966725 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.966938 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-wxr94" Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.967092 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.967225 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.967367 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.978491 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.992317 4958 scope.go:117] "RemoveContainer" containerID="27e3d539686a882a9e3b88897d6e0d1587e129958145c8cf8fa971e8730045c0" Dec 01 11:26:09 crc kubenswrapper[4958]: E1201 11:26:09.992759 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27e3d539686a882a9e3b88897d6e0d1587e129958145c8cf8fa971e8730045c0\": container with ID starting with 27e3d539686a882a9e3b88897d6e0d1587e129958145c8cf8fa971e8730045c0 not found: ID does not exist" containerID="27e3d539686a882a9e3b88897d6e0d1587e129958145c8cf8fa971e8730045c0" Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.992907 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e3d539686a882a9e3b88897d6e0d1587e129958145c8cf8fa971e8730045c0"} err="failed to get container status \"27e3d539686a882a9e3b88897d6e0d1587e129958145c8cf8fa971e8730045c0\": rpc error: code = NotFound desc = could not find container \"27e3d539686a882a9e3b88897d6e0d1587e129958145c8cf8fa971e8730045c0\": container with ID starting with 27e3d539686a882a9e3b88897d6e0d1587e129958145c8cf8fa971e8730045c0 not found: ID does not exist" Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.993044 4958 scope.go:117] "RemoveContainer" containerID="819780a1bea231de28bb8ac9b76c549f4ba6c5d701f719a75f4caea4ecf9f2a0" Dec 01 11:26:09 crc kubenswrapper[4958]: E1201 11:26:09.993406 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"819780a1bea231de28bb8ac9b76c549f4ba6c5d701f719a75f4caea4ecf9f2a0\": container with ID starting with 819780a1bea231de28bb8ac9b76c549f4ba6c5d701f719a75f4caea4ecf9f2a0 not found: ID does not exist" containerID="819780a1bea231de28bb8ac9b76c549f4ba6c5d701f719a75f4caea4ecf9f2a0" Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.993540 4958 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"819780a1bea231de28bb8ac9b76c549f4ba6c5d701f719a75f4caea4ecf9f2a0"} err="failed to get container status \"819780a1bea231de28bb8ac9b76c549f4ba6c5d701f719a75f4caea4ecf9f2a0\": rpc error: code = NotFound desc = could not find container \"819780a1bea231de28bb8ac9b76c549f4ba6c5d701f719a75f4caea4ecf9f2a0\": container with ID starting with 819780a1bea231de28bb8ac9b76c549f4ba6c5d701f719a75f4caea4ecf9f2a0 not found: ID does not exist" Dec 01 11:26:09 crc kubenswrapper[4958]: I1201 11:26:09.993646 4958 scope.go:117] "RemoveContainer" containerID="74d79ad2c101a829440803cbf9720f74bccede44444c8d490e08b8bab068cdcc" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.007794 4958 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ead29d35-e6ff-421e-82a8-832a0ae919eb-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.007933 4958 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ead29d35-e6ff-421e-82a8-832a0ae919eb-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.007947 4958 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ead29d35-e6ff-421e-82a8-832a0ae919eb-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.007959 4958 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ead29d35-e6ff-421e-82a8-832a0ae919eb-server-conf\") on node \"crc\" DevicePath \"\"" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.007991 4958 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ead29d35-e6ff-421e-82a8-832a0ae919eb-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.008068 4958 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\") on node \"crc\" " Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.008089 4958 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ead29d35-e6ff-421e-82a8-832a0ae919eb-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.008103 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqbcc\" (UniqueName: \"kubernetes.io/projected/ead29d35-e6ff-421e-82a8-832a0ae919eb-kube-api-access-wqbcc\") on node \"crc\" DevicePath \"\"" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.035261 4958 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.035414 4958 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631") on node "crc" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.042270 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ead29d35-e6ff-421e-82a8-832a0ae919eb-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ead29d35-e6ff-421e-82a8-832a0ae919eb" (UID: "ead29d35-e6ff-421e-82a8-832a0ae919eb"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.058096 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.059425 4958 scope.go:117] "RemoveContainer" containerID="7172a37260f1d46537247717c600b3237a0478558fbbb1b5e16d822a6d70f303" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.094376 4958 scope.go:117] "RemoveContainer" containerID="74d79ad2c101a829440803cbf9720f74bccede44444c8d490e08b8bab068cdcc" Dec 01 11:26:10 crc kubenswrapper[4958]: E1201 11:26:10.094796 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74d79ad2c101a829440803cbf9720f74bccede44444c8d490e08b8bab068cdcc\": container with ID starting with 74d79ad2c101a829440803cbf9720f74bccede44444c8d490e08b8bab068cdcc not found: ID does not exist" containerID="74d79ad2c101a829440803cbf9720f74bccede44444c8d490e08b8bab068cdcc" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.094850 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74d79ad2c101a829440803cbf9720f74bccede44444c8d490e08b8bab068cdcc"} err="failed to get container status \"74d79ad2c101a829440803cbf9720f74bccede44444c8d490e08b8bab068cdcc\": rpc error: code = NotFound desc = could not find container \"74d79ad2c101a829440803cbf9720f74bccede44444c8d490e08b8bab068cdcc\": container with ID starting with 74d79ad2c101a829440803cbf9720f74bccede44444c8d490e08b8bab068cdcc not found: ID does not exist" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.094872 4958 scope.go:117] "RemoveContainer" containerID="7172a37260f1d46537247717c600b3237a0478558fbbb1b5e16d822a6d70f303" Dec 01 11:26:10 crc kubenswrapper[4958]: E1201 11:26:10.095366 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7172a37260f1d46537247717c600b3237a0478558fbbb1b5e16d822a6d70f303\": container with ID starting with 7172a37260f1d46537247717c600b3237a0478558fbbb1b5e16d822a6d70f303 not found: ID does not exist" containerID="7172a37260f1d46537247717c600b3237a0478558fbbb1b5e16d822a6d70f303" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.095400 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7172a37260f1d46537247717c600b3237a0478558fbbb1b5e16d822a6d70f303"} err="failed to get container status \"7172a37260f1d46537247717c600b3237a0478558fbbb1b5e16d822a6d70f303\": rpc error: code = NotFound desc = could not find container \"7172a37260f1d46537247717c600b3237a0478558fbbb1b5e16d822a6d70f303\": container with ID starting with 7172a37260f1d46537247717c600b3237a0478558fbbb1b5e16d822a6d70f303 not found: 
ID does not exist" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.114902 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d74180b-48bb-4694-bcb0-4ca03cf9d396-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.114975 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.115021 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d74180b-48bb-4694-bcb0-4ca03cf9d396-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.115046 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d74180b-48bb-4694-bcb0-4ca03cf9d396-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.115075 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d74180b-48bb-4694-bcb0-4ca03cf9d396-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.115110 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d74180b-48bb-4694-bcb0-4ca03cf9d396-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.115138 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f28lt\" (UniqueName: \"kubernetes.io/projected/6d74180b-48bb-4694-bcb0-4ca03cf9d396-kube-api-access-f28lt\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.115172 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d74180b-48bb-4694-bcb0-4ca03cf9d396-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.115249 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d74180b-48bb-4694-bcb0-4ca03cf9d396-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " 
pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.115305 4958 reconciler_common.go:293] "Volume detached for volume \"pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\") on node \"crc\" DevicePath \"\"" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.115321 4958 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ead29d35-e6ff-421e-82a8-832a0ae919eb-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.116273 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-b2hn6"] Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.116464 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" podUID="1bab268c-f167-4733-adb1-38948a8833f2" containerName="dnsmasq-dns" containerID="cri-o://dcd847e6fa5518f9168d33b682c67109ea3051280bbddd07cac73daf83c24336" gracePeriod=10 Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.214636 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.216527 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d74180b-48bb-4694-bcb0-4ca03cf9d396-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.216577 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d74180b-48bb-4694-bcb0-4ca03cf9d396-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.216612 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d74180b-48bb-4694-bcb0-4ca03cf9d396-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.216638 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f28lt\" (UniqueName: \"kubernetes.io/projected/6d74180b-48bb-4694-bcb0-4ca03cf9d396-kube-api-access-f28lt\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.216672 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d74180b-48bb-4694-bcb0-4ca03cf9d396-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.216753 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d74180b-48bb-4694-bcb0-4ca03cf9d396-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc 
kubenswrapper[4958]: I1201 11:26:10.216802 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d74180b-48bb-4694-bcb0-4ca03cf9d396-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.216863 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.216905 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d74180b-48bb-4694-bcb0-4ca03cf9d396-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.217136 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d74180b-48bb-4694-bcb0-4ca03cf9d396-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.217756 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d74180b-48bb-4694-bcb0-4ca03cf9d396-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.218156 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d74180b-48bb-4694-bcb0-4ca03cf9d396-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.218348 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d74180b-48bb-4694-bcb0-4ca03cf9d396-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.223263 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.223442 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d74180b-48bb-4694-bcb0-4ca03cf9d396-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.223474 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d74180b-48bb-4694-bcb0-4ca03cf9d396-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.224834 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d74180b-48bb-4694-bcb0-4ca03cf9d396-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.225116 4958 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.225237 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/232d4b043d54b34e4f8229ea02114265740b3a89b60b1d1c21d659c1a226522f/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.242627 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f28lt\" (UniqueName: \"kubernetes.io/projected/6d74180b-48bb-4694-bcb0-4ca03cf9d396-kube-api-access-f28lt\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.249789 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.256019 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.260205 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7pbhp" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.260422 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.260528 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.260632 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.260725 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.269641 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.280037 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1430cf56-800c-45ee-ad70-952e5ef338d2\") pod \"rabbitmq-server-0\" (UID: \"6d74180b-48bb-4694-bcb0-4ca03cf9d396\") " pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.319031 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngfdt\" (UniqueName: \"kubernetes.io/projected/6a935748-46fe-4b48-b29e-6ba5adb44822-kube-api-access-ngfdt\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.319322 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a935748-46fe-4b48-b29e-6ba5adb44822-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.319382 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a935748-46fe-4b48-b29e-6ba5adb44822-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.319410 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a935748-46fe-4b48-b29e-6ba5adb44822-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.319430 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a935748-46fe-4b48-b29e-6ba5adb44822-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.319562 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.319967 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a935748-46fe-4b48-b29e-6ba5adb44822-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.320035 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a935748-46fe-4b48-b29e-6ba5adb44822-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.320310 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a935748-46fe-4b48-b29e-6ba5adb44822-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.344529 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.423627 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a935748-46fe-4b48-b29e-6ba5adb44822-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.423722 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngfdt\" (UniqueName: \"kubernetes.io/projected/6a935748-46fe-4b48-b29e-6ba5adb44822-kube-api-access-ngfdt\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.423755 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a935748-46fe-4b48-b29e-6ba5adb44822-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.423779 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a935748-46fe-4b48-b29e-6ba5adb44822-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.423802 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a935748-46fe-4b48-b29e-6ba5adb44822-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.423819 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a935748-46fe-4b48-b29e-6ba5adb44822-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.423854 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.423878 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a935748-46fe-4b48-b29e-6ba5adb44822-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.423901 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a935748-46fe-4b48-b29e-6ba5adb44822-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 
crc kubenswrapper[4958]: I1201 11:26:10.424466 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a935748-46fe-4b48-b29e-6ba5adb44822-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.425426 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a935748-46fe-4b48-b29e-6ba5adb44822-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.425651 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a935748-46fe-4b48-b29e-6ba5adb44822-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.426435 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a935748-46fe-4b48-b29e-6ba5adb44822-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.432655 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a935748-46fe-4b48-b29e-6ba5adb44822-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.445576 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a935748-46fe-4b48-b29e-6ba5adb44822-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.457032 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngfdt\" (UniqueName: \"kubernetes.io/projected/6a935748-46fe-4b48-b29e-6ba5adb44822-kube-api-access-ngfdt\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.460426 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a935748-46fe-4b48-b29e-6ba5adb44822-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.545658 4958 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.545713 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/64422f1485a4f12edd0674d161f7225abd2c33822b0e4e49cbffe0d3f821b510/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.659554 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.659555 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca0aa112-ede8-4fd8-8193-3ac542b44631\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a935748-46fe-4b48-b29e-6ba5adb44822\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.728964 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bab268c-f167-4733-adb1-38948a8833f2-config\") pod \"1bab268c-f167-4733-adb1-38948a8833f2\" (UID: \"1bab268c-f167-4733-adb1-38948a8833f2\") " Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.729053 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6q64\" (UniqueName: \"kubernetes.io/projected/1bab268c-f167-4733-adb1-38948a8833f2-kube-api-access-z6q64\") pod \"1bab268c-f167-4733-adb1-38948a8833f2\" (UID: \"1bab268c-f167-4733-adb1-38948a8833f2\") " Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.729165 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bab268c-f167-4733-adb1-38948a8833f2-dns-svc\") pod \"1bab268c-f167-4733-adb1-38948a8833f2\" (UID: \"1bab268c-f167-4733-adb1-38948a8833f2\") " Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.733615 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bab268c-f167-4733-adb1-38948a8833f2-kube-api-access-z6q64" (OuterVolumeSpecName: "kube-api-access-z6q64") pod "1bab268c-f167-4733-adb1-38948a8833f2" (UID: "1bab268c-f167-4733-adb1-38948a8833f2"). InnerVolumeSpecName "kube-api-access-z6q64". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.765577 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bab268c-f167-4733-adb1-38948a8833f2-config" (OuterVolumeSpecName: "config") pod "1bab268c-f167-4733-adb1-38948a8833f2" (UID: "1bab268c-f167-4733-adb1-38948a8833f2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.770141 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bab268c-f167-4733-adb1-38948a8833f2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1bab268c-f167-4733-adb1-38948a8833f2" (UID: "1bab268c-f167-4733-adb1-38948a8833f2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.831310 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bab268c-f167-4733-adb1-38948a8833f2-config\") on node \"crc\" DevicePath \"\"" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.831342 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6q64\" (UniqueName: \"kubernetes.io/projected/1bab268c-f167-4733-adb1-38948a8833f2-kube-api-access-z6q64\") on node \"crc\" DevicePath \"\"" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.831352 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bab268c-f167-4733-adb1-38948a8833f2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.862539 4958 generic.go:334] "Generic (PLEG): container finished" podID="1bab268c-f167-4733-adb1-38948a8833f2" containerID="dcd847e6fa5518f9168d33b682c67109ea3051280bbddd07cac73daf83c24336" exitCode=0 Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.862661 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" event={"ID":"1bab268c-f167-4733-adb1-38948a8833f2","Type":"ContainerDied","Data":"dcd847e6fa5518f9168d33b682c67109ea3051280bbddd07cac73daf83c24336"} Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.862712 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" event={"ID":"1bab268c-f167-4733-adb1-38948a8833f2","Type":"ContainerDied","Data":"b0f9e2f1687f26363631d0785bc43cfe8fb45a38b671f9efa66f2b99cbb7f9fa"} Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.862743 4958 scope.go:117] "RemoveContainer" containerID="dcd847e6fa5518f9168d33b682c67109ea3051280bbddd07cac73daf83c24336" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.862665 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-b2hn6" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.877216 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.896654 4958 scope.go:117] "RemoveContainer" containerID="5070d33e31ccace04779d4ed0b972efa418d7cd7a4333dbf8acf5ab15867b0de" Dec 01 11:26:10 crc kubenswrapper[4958]: I1201 11:26:10.966402 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 11:26:10 crc kubenswrapper[4958]: W1201 11:26:10.977869 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d74180b_48bb_4694_bcb0_4ca03cf9d396.slice/crio-47c849b12ff64764f82fd7f9813b1bd07ab5d350a5b30662e38df65c13528e1f WatchSource:0}: Error finding container 47c849b12ff64764f82fd7f9813b1bd07ab5d350a5b30662e38df65c13528e1f: Status 404 returned error can't find the container with id 47c849b12ff64764f82fd7f9813b1bd07ab5d350a5b30662e38df65c13528e1f Dec 01 11:26:11 crc kubenswrapper[4958]: I1201 11:26:11.096995 4958 scope.go:117] "RemoveContainer" containerID="dcd847e6fa5518f9168d33b682c67109ea3051280bbddd07cac73daf83c24336" Dec 01 11:26:11 crc kubenswrapper[4958]: E1201 11:26:11.097590 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcd847e6fa5518f9168d33b682c67109ea3051280bbddd07cac73daf83c24336\": container with ID starting with dcd847e6fa5518f9168d33b682c67109ea3051280bbddd07cac73daf83c24336 not found: ID does not exist" containerID="dcd847e6fa5518f9168d33b682c67109ea3051280bbddd07cac73daf83c24336" Dec 01 11:26:11 crc kubenswrapper[4958]: I1201 11:26:11.097641 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd847e6fa5518f9168d33b682c67109ea3051280bbddd07cac73daf83c24336"} err="failed to get container status \"dcd847e6fa5518f9168d33b682c67109ea3051280bbddd07cac73daf83c24336\": rpc error: code = NotFound desc = could not find container \"dcd847e6fa5518f9168d33b682c67109ea3051280bbddd07cac73daf83c24336\": container with ID starting with dcd847e6fa5518f9168d33b682c67109ea3051280bbddd07cac73daf83c24336 not found: ID does not exist" Dec 01 11:26:11 crc kubenswrapper[4958]: I1201 11:26:11.097680 4958 scope.go:117] "RemoveContainer" containerID="5070d33e31ccace04779d4ed0b972efa418d7cd7a4333dbf8acf5ab15867b0de" Dec 01 11:26:11 crc kubenswrapper[4958]: E1201 11:26:11.098073 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5070d33e31ccace04779d4ed0b972efa418d7cd7a4333dbf8acf5ab15867b0de\": container with ID starting with 5070d33e31ccace04779d4ed0b972efa418d7cd7a4333dbf8acf5ab15867b0de not found: ID does not exist" containerID="5070d33e31ccace04779d4ed0b972efa418d7cd7a4333dbf8acf5ab15867b0de" Dec 01 11:26:11 crc kubenswrapper[4958]: I1201 11:26:11.098118 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5070d33e31ccace04779d4ed0b972efa418d7cd7a4333dbf8acf5ab15867b0de"} err="failed to get container status \"5070d33e31ccace04779d4ed0b972efa418d7cd7a4333dbf8acf5ab15867b0de\": rpc error: code = NotFound desc = could not find container \"5070d33e31ccace04779d4ed0b972efa418d7cd7a4333dbf8acf5ab15867b0de\": container with ID starting with 5070d33e31ccace04779d4ed0b972efa418d7cd7a4333dbf8acf5ab15867b0de not found: ID does not exist" Dec 01 11:26:11 crc kubenswrapper[4958]: I1201 11:26:11.114175 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-98ddfc8f-b2hn6"] Dec 01 11:26:11 crc kubenswrapper[4958]: I1201 11:26:11.129792 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-b2hn6"] Dec 01 11:26:11 crc kubenswrapper[4958]: I1201 11:26:11.430146 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 11:26:11 crc kubenswrapper[4958]: I1201 11:26:11.807172 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bab268c-f167-4733-adb1-38948a8833f2" path="/var/lib/kubelet/pods/1bab268c-f167-4733-adb1-38948a8833f2/volumes" Dec 01 11:26:11 crc kubenswrapper[4958]: I1201 11:26:11.808471 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6" path="/var/lib/kubelet/pods/1dbed77a-cf9a-442d-bb6a-e7de1fad7ea6/volumes" Dec 01 11:26:11 crc kubenswrapper[4958]: I1201 11:26:11.810532 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ead29d35-e6ff-421e-82a8-832a0ae919eb" path="/var/lib/kubelet/pods/ead29d35-e6ff-421e-82a8-832a0ae919eb/volumes" Dec 01 11:26:11 crc kubenswrapper[4958]: I1201 11:26:11.874895 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6a935748-46fe-4b48-b29e-6ba5adb44822","Type":"ContainerStarted","Data":"d20b007c1deeca2a41ec5d38992240cadbf606f80d2983bce4e05d52ba6ecb75"} Dec 01 11:26:11 crc kubenswrapper[4958]: I1201 11:26:11.876907 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d74180b-48bb-4694-bcb0-4ca03cf9d396","Type":"ContainerStarted","Data":"47c849b12ff64764f82fd7f9813b1bd07ab5d350a5b30662e38df65c13528e1f"} Dec 01 11:26:12 crc kubenswrapper[4958]: I1201 11:26:12.798763 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d" Dec 01 11:26:12 crc kubenswrapper[4958]: E1201 11:26:12.801785 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:26:12 crc kubenswrapper[4958]: I1201 11:26:12.888320 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6a935748-46fe-4b48-b29e-6ba5adb44822","Type":"ContainerStarted","Data":"3232db821f8a737d186733cf2dd62c0e7f03782b7617f0dc4ff18c72eb1c448e"} Dec 01 11:26:12 crc kubenswrapper[4958]: I1201 11:26:12.891746 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d74180b-48bb-4694-bcb0-4ca03cf9d396","Type":"ContainerStarted","Data":"c1134f87cb7e8aa637dc0c9ceace1e263e46b438da3e487688d02583459e3800"} Dec 01 11:26:23 crc kubenswrapper[4958]: I1201 11:26:23.805009 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d" Dec 01 11:26:23 crc kubenswrapper[4958]: E1201 11:26:23.805829 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:26:36 crc kubenswrapper[4958]: I1201 11:26:36.797713 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d"
Dec 01 11:26:36 crc kubenswrapper[4958]: E1201 11:26:36.799467 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:26:46 crc kubenswrapper[4958]: I1201 11:26:46.226527 4958 generic.go:334] "Generic (PLEG): container finished" podID="6d74180b-48bb-4694-bcb0-4ca03cf9d396" containerID="c1134f87cb7e8aa637dc0c9ceace1e263e46b438da3e487688d02583459e3800" exitCode=0
Dec 01 11:26:46 crc kubenswrapper[4958]: I1201 11:26:46.226607 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d74180b-48bb-4694-bcb0-4ca03cf9d396","Type":"ContainerDied","Data":"c1134f87cb7e8aa637dc0c9ceace1e263e46b438da3e487688d02583459e3800"}
Dec 01 11:26:46 crc kubenswrapper[4958]: I1201 11:26:46.231365 4958 generic.go:334] "Generic (PLEG): container finished" podID="6a935748-46fe-4b48-b29e-6ba5adb44822" containerID="3232db821f8a737d186733cf2dd62c0e7f03782b7617f0dc4ff18c72eb1c448e" exitCode=0
Dec 01 11:26:46 crc kubenswrapper[4958]: I1201 11:26:46.231436 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6a935748-46fe-4b48-b29e-6ba5adb44822","Type":"ContainerDied","Data":"3232db821f8a737d186733cf2dd62c0e7f03782b7617f0dc4ff18c72eb1c448e"}
Dec 01 11:26:47 crc kubenswrapper[4958]: I1201 11:26:47.242017 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d74180b-48bb-4694-bcb0-4ca03cf9d396","Type":"ContainerStarted","Data":"957a5d10d048b4c83541818193bcd96b39661ee8fa75c5872a9c6e58d6de5038"}
Dec 01 11:26:47 crc kubenswrapper[4958]: I1201 11:26:47.242825 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Dec 01 11:26:47 crc kubenswrapper[4958]: I1201 11:26:47.244330 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6a935748-46fe-4b48-b29e-6ba5adb44822","Type":"ContainerStarted","Data":"b25e5b5f0f70446267b67c61bc4c620c19cb2e0a0114ebbbc1b6a5049fabf8c5"}
Dec 01 11:26:47 crc kubenswrapper[4958]: I1201 11:26:47.244641 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Dec 01 11:26:47 crc kubenswrapper[4958]: I1201 11:26:47.272298 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.272266849 podStartE2EDuration="38.272266849s" podCreationTimestamp="2025-12-01 11:26:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:26:47.265662682 +0000 UTC m=+5254.774451719" watchObservedRunningTime="2025-12-01 11:26:47.272266849 +0000 UTC m=+5254.781055896"
Dec 01 11:26:49 crc kubenswrapper[4958]: I1201 11:26:49.797721 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d"
Dec 01 11:26:49 crc kubenswrapper[4958]: E1201 11:26:49.798651 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:27:00 crc kubenswrapper[4958]: I1201 11:27:00.349193 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Dec 01 11:27:00 crc kubenswrapper[4958]: I1201 11:27:00.401059 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.401012873 podStartE2EDuration="50.401012873s" podCreationTimestamp="2025-12-01 11:26:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:26:47.311184701 +0000 UTC m=+5254.819973748" watchObservedRunningTime="2025-12-01 11:27:00.401012873 +0000 UTC m=+5267.909801920"
Dec 01 11:27:00 crc kubenswrapper[4958]: I1201 11:27:00.881026 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Dec 01 11:27:01 crc kubenswrapper[4958]: I1201 11:27:01.797784 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d"
Dec 01 11:27:01 crc kubenswrapper[4958]: E1201 11:27:01.799356 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:27:02 crc kubenswrapper[4958]: I1201 11:27:02.901302 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sh2qg"]
Dec 01 11:27:02 crc kubenswrapper[4958]: E1201 11:27:02.902010 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bab268c-f167-4733-adb1-38948a8833f2" containerName="init"
Dec 01 11:27:02 crc kubenswrapper[4958]: I1201 11:27:02.902027 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bab268c-f167-4733-adb1-38948a8833f2" containerName="init"
Dec 01 11:27:02 crc kubenswrapper[4958]: E1201 11:27:02.902057 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bab268c-f167-4733-adb1-38948a8833f2" containerName="dnsmasq-dns"
Dec 01 11:27:02 crc kubenswrapper[4958]: I1201 11:27:02.902069 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bab268c-f167-4733-adb1-38948a8833f2" containerName="dnsmasq-dns"
Dec 01 11:27:02 crc kubenswrapper[4958]: I1201 11:27:02.902279 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bab268c-f167-4733-adb1-38948a8833f2" containerName="dnsmasq-dns"
Dec 01 11:27:02 crc kubenswrapper[4958]: I1201 11:27:02.903784 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sh2qg"
Dec 01 11:27:02 crc kubenswrapper[4958]: I1201 11:27:02.923441 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sh2qg"]
Dec 01 11:27:03 crc kubenswrapper[4958]: I1201 11:27:03.038326 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbzdg\" (UniqueName: \"kubernetes.io/projected/c9643935-3071-466f-9e2e-34c6c65bddfe-kube-api-access-cbzdg\") pod \"redhat-operators-sh2qg\" (UID: \"c9643935-3071-466f-9e2e-34c6c65bddfe\") " pod="openshift-marketplace/redhat-operators-sh2qg"
Dec 01 11:27:03 crc kubenswrapper[4958]: I1201 11:27:03.038521 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9643935-3071-466f-9e2e-34c6c65bddfe-utilities\") pod \"redhat-operators-sh2qg\" (UID: \"c9643935-3071-466f-9e2e-34c6c65bddfe\") " pod="openshift-marketplace/redhat-operators-sh2qg"
Dec 01 11:27:03 crc kubenswrapper[4958]: I1201 11:27:03.038628 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9643935-3071-466f-9e2e-34c6c65bddfe-catalog-content\") pod \"redhat-operators-sh2qg\" (UID: \"c9643935-3071-466f-9e2e-34c6c65bddfe\") " pod="openshift-marketplace/redhat-operators-sh2qg"
Dec 01 11:27:03 crc kubenswrapper[4958]: I1201 11:27:03.140594 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9643935-3071-466f-9e2e-34c6c65bddfe-utilities\") pod \"redhat-operators-sh2qg\" (UID: \"c9643935-3071-466f-9e2e-34c6c65bddfe\") " pod="openshift-marketplace/redhat-operators-sh2qg"
Dec 01 11:27:03 crc kubenswrapper[4958]: I1201 11:27:03.140676 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9643935-3071-466f-9e2e-34c6c65bddfe-catalog-content\") pod \"redhat-operators-sh2qg\" (UID: \"c9643935-3071-466f-9e2e-34c6c65bddfe\") " pod="openshift-marketplace/redhat-operators-sh2qg"
Dec 01 11:27:03 crc kubenswrapper[4958]: I1201 11:27:03.140750 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbzdg\" (UniqueName: \"kubernetes.io/projected/c9643935-3071-466f-9e2e-34c6c65bddfe-kube-api-access-cbzdg\") pod \"redhat-operators-sh2qg\" (UID: \"c9643935-3071-466f-9e2e-34c6c65bddfe\") " pod="openshift-marketplace/redhat-operators-sh2qg"
Dec 01 11:27:03 crc kubenswrapper[4958]: I1201 11:27:03.141373 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9643935-3071-466f-9e2e-34c6c65bddfe-utilities\") pod \"redhat-operators-sh2qg\" (UID: \"c9643935-3071-466f-9e2e-34c6c65bddfe\") " pod="openshift-marketplace/redhat-operators-sh2qg"
Dec 01 11:27:03 crc kubenswrapper[4958]: I1201 11:27:03.141418 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9643935-3071-466f-9e2e-34c6c65bddfe-catalog-content\") pod \"redhat-operators-sh2qg\" (UID: \"c9643935-3071-466f-9e2e-34c6c65bddfe\") " pod="openshift-marketplace/redhat-operators-sh2qg"
Dec 01 11:27:03 crc kubenswrapper[4958]: I1201 11:27:03.166527 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbzdg\" (UniqueName: \"kubernetes.io/projected/c9643935-3071-466f-9e2e-34c6c65bddfe-kube-api-access-cbzdg\") pod \"redhat-operators-sh2qg\" (UID: \"c9643935-3071-466f-9e2e-34c6c65bddfe\") " pod="openshift-marketplace/redhat-operators-sh2qg"
Dec 01 11:27:03 crc kubenswrapper[4958]: I1201 11:27:03.239036 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sh2qg"
Dec 01 11:27:03 crc kubenswrapper[4958]: I1201 11:27:03.492932 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sh2qg"]
Dec 01 11:27:04 crc kubenswrapper[4958]: I1201 11:27:04.407041 4958 generic.go:334] "Generic (PLEG): container finished" podID="c9643935-3071-466f-9e2e-34c6c65bddfe" containerID="43efc916bc8daf23e0b05137328013deb91e609d7c50130495985ed79cf64203" exitCode=0
Dec 01 11:27:04 crc kubenswrapper[4958]: I1201 11:27:04.407176 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh2qg" event={"ID":"c9643935-3071-466f-9e2e-34c6c65bddfe","Type":"ContainerDied","Data":"43efc916bc8daf23e0b05137328013deb91e609d7c50130495985ed79cf64203"}
Dec 01 11:27:04 crc kubenswrapper[4958]: I1201 11:27:04.407488 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh2qg" event={"ID":"c9643935-3071-466f-9e2e-34c6c65bddfe","Type":"ContainerStarted","Data":"286b12396b2ba19f3d3c2f8941deeab53051d1153a8964c5ec1fa0cdbd0a3aff"}
Dec 01 11:27:04 crc kubenswrapper[4958]: I1201 11:27:04.410105 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 11:27:05 crc kubenswrapper[4958]: I1201 11:27:05.421554 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh2qg" event={"ID":"c9643935-3071-466f-9e2e-34c6c65bddfe","Type":"ContainerStarted","Data":"e7eab481bb771da940fabbe48494da592b31b445b3848f78d608d8368e6c2370"}
Dec 01 11:27:06 crc kubenswrapper[4958]: I1201 11:27:06.438392 4958 generic.go:334] "Generic (PLEG): container finished" podID="c9643935-3071-466f-9e2e-34c6c65bddfe" containerID="e7eab481bb771da940fabbe48494da592b31b445b3848f78d608d8368e6c2370" exitCode=0
Dec 01 11:27:06 crc kubenswrapper[4958]: I1201 11:27:06.438461 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh2qg" event={"ID":"c9643935-3071-466f-9e2e-34c6c65bddfe","Type":"ContainerDied","Data":"e7eab481bb771da940fabbe48494da592b31b445b3848f78d608d8368e6c2370"}
Dec 01 11:27:07 crc kubenswrapper[4958]: I1201 11:27:07.455541 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh2qg" event={"ID":"c9643935-3071-466f-9e2e-34c6c65bddfe","Type":"ContainerStarted","Data":"997dd7d0b28c4735873d2be903978539381dcf0743d9f57e20f265deb3599341"}
Dec 01 11:27:07 crc kubenswrapper[4958]: I1201 11:27:07.492210 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sh2qg" podStartSLOduration=2.969904778 podStartE2EDuration="5.492176402s" podCreationTimestamp="2025-12-01 11:27:02 +0000 UTC" firstStartedPulling="2025-12-01 11:27:04.409424799 +0000 UTC m=+5271.918213876" lastFinishedPulling="2025-12-01 11:27:06.931696433 +0000 UTC m=+5274.440485500" observedRunningTime="2025-12-01 11:27:07.481765837 +0000 UTC m=+5274.990554914" watchObservedRunningTime="2025-12-01 11:27:07.492176402 +0000 UTC m=+5275.000965479"
Dec 01 11:27:13 crc kubenswrapper[4958]: I1201 11:27:13.240050 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sh2qg"
Dec 01 11:27:13 crc kubenswrapper[4958]: I1201 11:27:13.240790 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sh2qg"
Dec 01 11:27:13 crc kubenswrapper[4958]: I1201 11:27:13.315240 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sh2qg"
Dec 01 11:27:13 crc kubenswrapper[4958]: I1201 11:27:13.659359 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sh2qg"
Dec 01 11:27:13 crc kubenswrapper[4958]: I1201 11:27:13.709937 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sh2qg"]
Dec 01 11:27:13 crc kubenswrapper[4958]: I1201 11:27:13.763874 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"]
Dec 01 11:27:13 crc kubenswrapper[4958]: I1201 11:27:13.764888 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default"
Dec 01 11:27:13 crc kubenswrapper[4958]: I1201 11:27:13.767590 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-r6bqk"
Dec 01 11:27:13 crc kubenswrapper[4958]: I1201 11:27:13.769828 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"]
Dec 01 11:27:13 crc kubenswrapper[4958]: I1201 11:27:13.785233 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f92lb\" (UniqueName: \"kubernetes.io/projected/389ef713-1caa-439e-bd17-64c9649ec803-kube-api-access-f92lb\") pod \"mariadb-client-1-default\" (UID: \"389ef713-1caa-439e-bd17-64c9649ec803\") " pod="openstack/mariadb-client-1-default"
Dec 01 11:27:13 crc kubenswrapper[4958]: I1201 11:27:13.803016 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d"
Dec 01 11:27:13 crc kubenswrapper[4958]: E1201 11:27:13.803338 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:27:13 crc kubenswrapper[4958]: I1201 11:27:13.886547 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f92lb\" (UniqueName: \"kubernetes.io/projected/389ef713-1caa-439e-bd17-64c9649ec803-kube-api-access-f92lb\") pod \"mariadb-client-1-default\" (UID: \"389ef713-1caa-439e-bd17-64c9649ec803\") " pod="openstack/mariadb-client-1-default"
Dec 01 11:27:13 crc kubenswrapper[4958]: I1201 11:27:13.912281 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f92lb\" (UniqueName: \"kubernetes.io/projected/389ef713-1caa-439e-bd17-64c9649ec803-kube-api-access-f92lb\") pod \"mariadb-client-1-default\" (UID: \"389ef713-1caa-439e-bd17-64c9649ec803\") " pod="openstack/mariadb-client-1-default"
Dec 01 11:27:14 crc kubenswrapper[4958]: I1201 11:27:14.095321 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-r6bqk"
Dec 01 11:27:14 crc kubenswrapper[4958]: I1201 11:27:14.102866 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default"
Dec 01 11:27:14 crc kubenswrapper[4958]: I1201 11:27:14.777814 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"]
Dec 01 11:27:14 crc kubenswrapper[4958]: W1201 11:27:14.784530 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod389ef713_1caa_439e_bd17_64c9649ec803.slice/crio-90d55ef6cbd07646a6f909df8b48572d878970a82e825121e3f5d0349ec50e8e WatchSource:0}: Error finding container 90d55ef6cbd07646a6f909df8b48572d878970a82e825121e3f5d0349ec50e8e: Status 404 returned error can't find the container with id 90d55ef6cbd07646a6f909df8b48572d878970a82e825121e3f5d0349ec50e8e
Dec 01 11:27:15 crc kubenswrapper[4958]: I1201 11:27:15.539386 4958 generic.go:334] "Generic (PLEG): container finished" podID="389ef713-1caa-439e-bd17-64c9649ec803" containerID="278e2d5adfc41e35aea2a997b38492f0d38081915dd7cf85526ef3b49063c666" exitCode=0
Dec 01 11:27:15 crc kubenswrapper[4958]: I1201 11:27:15.539503 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"389ef713-1caa-439e-bd17-64c9649ec803","Type":"ContainerDied","Data":"278e2d5adfc41e35aea2a997b38492f0d38081915dd7cf85526ef3b49063c666"}
Dec 01 11:27:15 crc kubenswrapper[4958]: I1201 11:27:15.539899 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"389ef713-1caa-439e-bd17-64c9649ec803","Type":"ContainerStarted","Data":"90d55ef6cbd07646a6f909df8b48572d878970a82e825121e3f5d0349ec50e8e"}
Dec 01 11:27:15 crc kubenswrapper[4958]: I1201 11:27:15.540081 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sh2qg" podUID="c9643935-3071-466f-9e2e-34c6c65bddfe" containerName="registry-server" containerID="cri-o://997dd7d0b28c4735873d2be903978539381dcf0743d9f57e20f265deb3599341" gracePeriod=2
Dec 01 11:27:16 crc kubenswrapper[4958]: I1201 11:27:16.984387 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default"
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.144541 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f92lb\" (UniqueName: \"kubernetes.io/projected/389ef713-1caa-439e-bd17-64c9649ec803-kube-api-access-f92lb\") pod \"389ef713-1caa-439e-bd17-64c9649ec803\" (UID: \"389ef713-1caa-439e-bd17-64c9649ec803\") "
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.156235 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/389ef713-1caa-439e-bd17-64c9649ec803-kube-api-access-f92lb" (OuterVolumeSpecName: "kube-api-access-f92lb") pod "389ef713-1caa-439e-bd17-64c9649ec803" (UID: "389ef713-1caa-439e-bd17-64c9649ec803"). InnerVolumeSpecName "kube-api-access-f92lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.248010 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f92lb\" (UniqueName: \"kubernetes.io/projected/389ef713-1caa-439e-bd17-64c9649ec803-kube-api-access-f92lb\") on node \"crc\" DevicePath \"\""
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.395858 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_389ef713-1caa-439e-bd17-64c9649ec803/mariadb-client-1-default/0.log"
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.429290 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"]
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.437099 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"]
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.563180 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90d55ef6cbd07646a6f909df8b48572d878970a82e825121e3f5d0349ec50e8e"
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.563304 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default"
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.568104 4958 generic.go:334] "Generic (PLEG): container finished" podID="c9643935-3071-466f-9e2e-34c6c65bddfe" containerID="997dd7d0b28c4735873d2be903978539381dcf0743d9f57e20f265deb3599341" exitCode=0
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.568173 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh2qg" event={"ID":"c9643935-3071-466f-9e2e-34c6c65bddfe","Type":"ContainerDied","Data":"997dd7d0b28c4735873d2be903978539381dcf0743d9f57e20f265deb3599341"}
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.809187 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="389ef713-1caa-439e-bd17-64c9649ec803" path="/var/lib/kubelet/pods/389ef713-1caa-439e-bd17-64c9649ec803/volumes"
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.857132 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"]
Dec 01 11:27:17 crc kubenswrapper[4958]: E1201 11:27:17.857959 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389ef713-1caa-439e-bd17-64c9649ec803" containerName="mariadb-client-1-default"
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.857990 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="389ef713-1caa-439e-bd17-64c9649ec803" containerName="mariadb-client-1-default"
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.858368 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="389ef713-1caa-439e-bd17-64c9649ec803" containerName="mariadb-client-1-default"
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.863442 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.867299 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-r6bqk"
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.868442 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"]
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.956655 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sh2qg"
Dec 01 11:27:17 crc kubenswrapper[4958]: I1201 11:27:17.960924 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gq6m\" (UniqueName: \"kubernetes.io/projected/dee639bc-3701-4977-9d04-1a7c4d1176b3-kube-api-access-7gq6m\") pod \"mariadb-client-2-default\" (UID: \"dee639bc-3701-4977-9d04-1a7c4d1176b3\") " pod="openstack/mariadb-client-2-default"
Dec 01 11:27:18 crc kubenswrapper[4958]: I1201 11:27:18.061931 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbzdg\" (UniqueName: \"kubernetes.io/projected/c9643935-3071-466f-9e2e-34c6c65bddfe-kube-api-access-cbzdg\") pod \"c9643935-3071-466f-9e2e-34c6c65bddfe\" (UID: \"c9643935-3071-466f-9e2e-34c6c65bddfe\") "
Dec 01 11:27:18 crc kubenswrapper[4958]: I1201 11:27:18.062015 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9643935-3071-466f-9e2e-34c6c65bddfe-utilities\") pod \"c9643935-3071-466f-9e2e-34c6c65bddfe\" (UID: \"c9643935-3071-466f-9e2e-34c6c65bddfe\") "
Dec 01 11:27:18 crc kubenswrapper[4958]: I1201 11:27:18.062149 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9643935-3071-466f-9e2e-34c6c65bddfe-catalog-content\") pod \"c9643935-3071-466f-9e2e-34c6c65bddfe\" (UID: \"c9643935-3071-466f-9e2e-34c6c65bddfe\") "
Dec 01 11:27:18 crc kubenswrapper[4958]: I1201 11:27:18.062559 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gq6m\" (UniqueName: \"kubernetes.io/projected/dee639bc-3701-4977-9d04-1a7c4d1176b3-kube-api-access-7gq6m\") pod \"mariadb-client-2-default\" (UID: \"dee639bc-3701-4977-9d04-1a7c4d1176b3\") " pod="openstack/mariadb-client-2-default"
Dec 01 11:27:18 crc kubenswrapper[4958]: I1201 11:27:18.063936 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9643935-3071-466f-9e2e-34c6c65bddfe-utilities" (OuterVolumeSpecName: "utilities") pod "c9643935-3071-466f-9e2e-34c6c65bddfe" (UID: "c9643935-3071-466f-9e2e-34c6c65bddfe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:27:18 crc kubenswrapper[4958]: I1201 11:27:18.070023 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9643935-3071-466f-9e2e-34c6c65bddfe-kube-api-access-cbzdg" (OuterVolumeSpecName: "kube-api-access-cbzdg") pod "c9643935-3071-466f-9e2e-34c6c65bddfe" (UID: "c9643935-3071-466f-9e2e-34c6c65bddfe"). InnerVolumeSpecName "kube-api-access-cbzdg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:27:18 crc kubenswrapper[4958]: I1201 11:27:18.078367 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gq6m\" (UniqueName: \"kubernetes.io/projected/dee639bc-3701-4977-9d04-1a7c4d1176b3-kube-api-access-7gq6m\") pod \"mariadb-client-2-default\" (UID: \"dee639bc-3701-4977-9d04-1a7c4d1176b3\") " pod="openstack/mariadb-client-2-default"
Dec 01 11:27:18 crc kubenswrapper[4958]: I1201 11:27:18.163686 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbzdg\" (UniqueName: \"kubernetes.io/projected/c9643935-3071-466f-9e2e-34c6c65bddfe-kube-api-access-cbzdg\") on node \"crc\" DevicePath \"\""
Dec 01 11:27:18 crc kubenswrapper[4958]: I1201 11:27:18.163717 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9643935-3071-466f-9e2e-34c6c65bddfe-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 11:27:18 crc kubenswrapper[4958]: I1201 11:27:18.209094 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9643935-3071-466f-9e2e-34c6c65bddfe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9643935-3071-466f-9e2e-34c6c65bddfe" (UID: "c9643935-3071-466f-9e2e-34c6c65bddfe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:27:18 crc kubenswrapper[4958]: I1201 11:27:18.254428 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Dec 01 11:27:18 crc kubenswrapper[4958]: I1201 11:27:18.264694 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9643935-3071-466f-9e2e-34c6c65bddfe-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 11:27:19 crc kubenswrapper[4958]: I1201 11:27:18.579609 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh2qg" event={"ID":"c9643935-3071-466f-9e2e-34c6c65bddfe","Type":"ContainerDied","Data":"286b12396b2ba19f3d3c2f8941deeab53051d1153a8964c5ec1fa0cdbd0a3aff"}
Dec 01 11:27:19 crc kubenswrapper[4958]: I1201 11:27:18.579660 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sh2qg"
Dec 01 11:27:19 crc kubenswrapper[4958]: I1201 11:27:18.579968 4958 scope.go:117] "RemoveContainer" containerID="997dd7d0b28c4735873d2be903978539381dcf0743d9f57e20f265deb3599341"
Dec 01 11:27:19 crc kubenswrapper[4958]: I1201 11:27:18.609058 4958 scope.go:117] "RemoveContainer" containerID="e7eab481bb771da940fabbe48494da592b31b445b3848f78d608d8368e6c2370"
Dec 01 11:27:19 crc kubenswrapper[4958]: I1201 11:27:18.630994 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sh2qg"]
Dec 01 11:27:19 crc kubenswrapper[4958]: I1201 11:27:18.640901 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sh2qg"]
Dec 01 11:27:19 crc kubenswrapper[4958]: I1201 11:27:18.644912 4958 scope.go:117] "RemoveContainer" containerID="43efc916bc8daf23e0b05137328013deb91e609d7c50130495985ed79cf64203"
Dec 01 11:27:19 crc kubenswrapper[4958]: I1201 11:27:19.815007 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9643935-3071-466f-9e2e-34c6c65bddfe" path="/var/lib/kubelet/pods/c9643935-3071-466f-9e2e-34c6c65bddfe/volumes"
Dec 01 11:27:19 crc kubenswrapper[4958]: I1201 11:27:19.836376 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"]
Dec 01 11:27:19 crc kubenswrapper[4958]: W1201 11:27:19.841604 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddee639bc_3701_4977_9d04_1a7c4d1176b3.slice/crio-1474b131212082a2b257463496fe437308758171381cc5db54115d42f5e23c6b WatchSource:0}: Error finding container 1474b131212082a2b257463496fe437308758171381cc5db54115d42f5e23c6b: Status 404 returned error can't find the container with id 1474b131212082a2b257463496fe437308758171381cc5db54115d42f5e23c6b
Dec 01 11:27:20 crc kubenswrapper[4958]: I1201 11:27:20.607966 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"dee639bc-3701-4977-9d04-1a7c4d1176b3","Type":"ContainerStarted","Data":"9e735b1d505d442db7fae37e4dc51b628aa167567ff830189b6cc0641f3acb9f"}
Dec 01 11:27:20 crc kubenswrapper[4958]: I1201 11:27:20.608446 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"dee639bc-3701-4977-9d04-1a7c4d1176b3","Type":"ContainerStarted","Data":"1474b131212082a2b257463496fe437308758171381cc5db54115d42f5e23c6b"}
Dec 01 11:27:20 crc kubenswrapper[4958]: I1201 11:27:20.633150 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-2-default" podStartSLOduration=3.6331212219999998 podStartE2EDuration="3.633121222s" podCreationTimestamp="2025-12-01 11:27:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:27:20.62919291 +0000 UTC m=+5288.137981947" watchObservedRunningTime="2025-12-01 11:27:20.633121222 +0000 UTC m=+5288.141910299"
Dec 01 11:27:21 crc kubenswrapper[4958]: I1201 11:27:21.618031 4958 generic.go:334] "Generic (PLEG): container finished" podID="dee639bc-3701-4977-9d04-1a7c4d1176b3" containerID="9e735b1d505d442db7fae37e4dc51b628aa167567ff830189b6cc0641f3acb9f" exitCode=1
Dec 01 11:27:21 crc kubenswrapper[4958]: I1201 11:27:21.618115 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"dee639bc-3701-4977-9d04-1a7c4d1176b3","Type":"ContainerDied","Data":"9e735b1d505d442db7fae37e4dc51b628aa167567ff830189b6cc0641f3acb9f"}
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.072712 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.123753 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"]
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.131806 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"]
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.160684 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gq6m\" (UniqueName: \"kubernetes.io/projected/dee639bc-3701-4977-9d04-1a7c4d1176b3-kube-api-access-7gq6m\") pod \"dee639bc-3701-4977-9d04-1a7c4d1176b3\" (UID: \"dee639bc-3701-4977-9d04-1a7c4d1176b3\") "
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.167027 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dee639bc-3701-4977-9d04-1a7c4d1176b3-kube-api-access-7gq6m" (OuterVolumeSpecName: "kube-api-access-7gq6m") pod "dee639bc-3701-4977-9d04-1a7c4d1176b3" (UID: "dee639bc-3701-4977-9d04-1a7c4d1176b3"). InnerVolumeSpecName "kube-api-access-7gq6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.262391 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gq6m\" (UniqueName: \"kubernetes.io/projected/dee639bc-3701-4977-9d04-1a7c4d1176b3-kube-api-access-7gq6m\") on node \"crc\" DevicePath \"\""
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.554751 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"]
Dec 01 11:27:23 crc kubenswrapper[4958]: E1201 11:27:23.555817 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee639bc-3701-4977-9d04-1a7c4d1176b3" containerName="mariadb-client-2-default"
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.556035 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee639bc-3701-4977-9d04-1a7c4d1176b3" containerName="mariadb-client-2-default"
Dec 01 11:27:23 crc kubenswrapper[4958]: E1201 11:27:23.556243 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9643935-3071-466f-9e2e-34c6c65bddfe" containerName="registry-server"
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.556387 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9643935-3071-466f-9e2e-34c6c65bddfe" containerName="registry-server"
Dec 01 11:27:23 crc kubenswrapper[4958]: E1201 11:27:23.556577 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9643935-3071-466f-9e2e-34c6c65bddfe" containerName="extract-utilities"
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.556719 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9643935-3071-466f-9e2e-34c6c65bddfe" containerName="extract-utilities"
Dec 01 11:27:23 crc kubenswrapper[4958]: E1201 11:27:23.556908 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9643935-3071-466f-9e2e-34c6c65bddfe" containerName="extract-content"
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.557060 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9643935-3071-466f-9e2e-34c6c65bddfe" containerName="extract-content"
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.557604 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9643935-3071-466f-9e2e-34c6c65bddfe" containerName="registry-server"
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.557754 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="dee639bc-3701-4977-9d04-1a7c4d1176b3" containerName="mariadb-client-2-default"
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.559307 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.565329 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"]
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.639743 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1474b131212082a2b257463496fe437308758171381cc5db54115d42f5e23c6b"
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.640068 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.669092 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls7f8\" (UniqueName: \"kubernetes.io/projected/2c9d3651-0f2d-4c54-8f6c-56f324c3cfff-kube-api-access-ls7f8\") pod \"mariadb-client-1\" (UID: \"2c9d3651-0f2d-4c54-8f6c-56f324c3cfff\") " pod="openstack/mariadb-client-1"
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.770862 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls7f8\" (UniqueName: \"kubernetes.io/projected/2c9d3651-0f2d-4c54-8f6c-56f324c3cfff-kube-api-access-ls7f8\") pod \"mariadb-client-1\" (UID: \"2c9d3651-0f2d-4c54-8f6c-56f324c3cfff\") " pod="openstack/mariadb-client-1"
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.791997 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls7f8\" (UniqueName: \"kubernetes.io/projected/2c9d3651-0f2d-4c54-8f6c-56f324c3cfff-kube-api-access-ls7f8\") pod \"mariadb-client-1\" (UID: \"2c9d3651-0f2d-4c54-8f6c-56f324c3cfff\") " pod="openstack/mariadb-client-1"
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.809408 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dee639bc-3701-4977-9d04-1a7c4d1176b3" path="/var/lib/kubelet/pods/dee639bc-3701-4977-9d04-1a7c4d1176b3/volumes"
Dec 01 11:27:23 crc kubenswrapper[4958]: I1201 11:27:23.892503 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Dec 01 11:27:24 crc kubenswrapper[4958]: I1201 11:27:24.268512 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"]
Dec 01 11:27:24 crc kubenswrapper[4958]: W1201 11:27:24.269220 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c9d3651_0f2d_4c54_8f6c_56f324c3cfff.slice/crio-839c5f177915a51c5f71368b0b159af597523ed191f831b0a5ccb7782da33a0f WatchSource:0}: Error finding container 839c5f177915a51c5f71368b0b159af597523ed191f831b0a5ccb7782da33a0f: Status 404 returned error can't find the container with id 839c5f177915a51c5f71368b0b159af597523ed191f831b0a5ccb7782da33a0f
Dec 01 11:27:24 crc kubenswrapper[4958]: I1201 11:27:24.649017 4958 generic.go:334] "Generic (PLEG): container finished" podID="2c9d3651-0f2d-4c54-8f6c-56f324c3cfff" containerID="41ce6b8f69be673c83334ebd5433d62b818c62f481264232c1638a350c339a6d" exitCode=0
Dec 01 11:27:24 crc kubenswrapper[4958]: I1201 11:27:24.649078 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"2c9d3651-0f2d-4c54-8f6c-56f324c3cfff","Type":"ContainerDied","Data":"41ce6b8f69be673c83334ebd5433d62b818c62f481264232c1638a350c339a6d"}
Dec 01 11:27:24 crc kubenswrapper[4958]: I1201 11:27:24.649374 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"2c9d3651-0f2d-4c54-8f6c-56f324c3cfff","Type":"ContainerStarted","Data":"839c5f177915a51c5f71368b0b159af597523ed191f831b0a5ccb7782da33a0f"}
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.008111 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.038304 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_2c9d3651-0f2d-4c54-8f6c-56f324c3cfff/mariadb-client-1/0.log"
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.079724 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"]
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.086891 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"]
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.108796 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls7f8\" (UniqueName: \"kubernetes.io/projected/2c9d3651-0f2d-4c54-8f6c-56f324c3cfff-kube-api-access-ls7f8\") pod \"2c9d3651-0f2d-4c54-8f6c-56f324c3cfff\" (UID: \"2c9d3651-0f2d-4c54-8f6c-56f324c3cfff\") "
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.117105 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c9d3651-0f2d-4c54-8f6c-56f324c3cfff-kube-api-access-ls7f8" (OuterVolumeSpecName: "kube-api-access-ls7f8") pod "2c9d3651-0f2d-4c54-8f6c-56f324c3cfff" (UID: "2c9d3651-0f2d-4c54-8f6c-56f324c3cfff"). InnerVolumeSpecName "kube-api-access-ls7f8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.211385 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls7f8\" (UniqueName: \"kubernetes.io/projected/2c9d3651-0f2d-4c54-8f6c-56f324c3cfff-kube-api-access-ls7f8\") on node \"crc\" DevicePath \"\""
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.550326 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"]
Dec 01 11:27:26 crc kubenswrapper[4958]: E1201 11:27:26.550892 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c9d3651-0f2d-4c54-8f6c-56f324c3cfff" containerName="mariadb-client-1"
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.550926 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c9d3651-0f2d-4c54-8f6c-56f324c3cfff" containerName="mariadb-client-1"
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.551191 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c9d3651-0f2d-4c54-8f6c-56f324c3cfff" containerName="mariadb-client-1"
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.552109 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default"
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.566961 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"]
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.619358 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4mgs\" (UniqueName: \"kubernetes.io/projected/88f53889-c80b-42f8-abda-2d6fc4d30e18-kube-api-access-v4mgs\") pod \"mariadb-client-4-default\" (UID: \"88f53889-c80b-42f8-abda-2d6fc4d30e18\") " pod="openstack/mariadb-client-4-default"
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.671066 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="839c5f177915a51c5f71368b0b159af597523ed191f831b0a5ccb7782da33a0f"
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.671136 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.721882 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4mgs\" (UniqueName: \"kubernetes.io/projected/88f53889-c80b-42f8-abda-2d6fc4d30e18-kube-api-access-v4mgs\") pod \"mariadb-client-4-default\" (UID: \"88f53889-c80b-42f8-abda-2d6fc4d30e18\") " pod="openstack/mariadb-client-4-default"
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.761353 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4mgs\" (UniqueName: \"kubernetes.io/projected/88f53889-c80b-42f8-abda-2d6fc4d30e18-kube-api-access-v4mgs\") pod \"mariadb-client-4-default\" (UID: \"88f53889-c80b-42f8-abda-2d6fc4d30e18\") " pod="openstack/mariadb-client-4-default"
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.797837 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d"
Dec 01 11:27:26 crc kubenswrapper[4958]: E1201 11:27:26.798315 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:27:26 crc kubenswrapper[4958]: I1201 11:27:26.883746 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default"
Dec 01 11:27:27 crc kubenswrapper[4958]: I1201 11:27:27.266334 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"]
Dec 01 11:27:27 crc kubenswrapper[4958]: I1201 11:27:27.684983 4958 generic.go:334] "Generic (PLEG): container finished" podID="88f53889-c80b-42f8-abda-2d6fc4d30e18" containerID="57fa0953e730b318253e6ea0dbed922e8cfabca9448c2737109a69fc38c63272" exitCode=0
Dec 01 11:27:27 crc kubenswrapper[4958]: I1201 11:27:27.685103 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"88f53889-c80b-42f8-abda-2d6fc4d30e18","Type":"ContainerDied","Data":"57fa0953e730b318253e6ea0dbed922e8cfabca9448c2737109a69fc38c63272"}
Dec 01 11:27:27 crc kubenswrapper[4958]: I1201 11:27:27.685295 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"88f53889-c80b-42f8-abda-2d6fc4d30e18","Type":"ContainerStarted","Data":"258ca69399b12d472dccb28cd718e92a0d7b5f92e0417a5eb186921bc9db70ec"}
Dec 01 11:27:27 crc kubenswrapper[4958]: I1201 11:27:27.816099 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c9d3651-0f2d-4c54-8f6c-56f324c3cfff" path="/var/lib/kubelet/pods/2c9d3651-0f2d-4c54-8f6c-56f324c3cfff/volumes"
Dec 01 11:27:29 crc kubenswrapper[4958]: I1201 11:27:29.258825 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default"
Dec 01 11:27:29 crc kubenswrapper[4958]: I1201 11:27:29.279303 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_88f53889-c80b-42f8-abda-2d6fc4d30e18/mariadb-client-4-default/0.log"
Dec 01 11:27:29 crc kubenswrapper[4958]: I1201 11:27:29.308391 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"]
Dec 01 11:27:29 crc kubenswrapper[4958]: I1201 11:27:29.316602 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"]
Dec 01 11:27:29 crc kubenswrapper[4958]: I1201 11:27:29.370834 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4mgs\" (UniqueName: \"kubernetes.io/projected/88f53889-c80b-42f8-abda-2d6fc4d30e18-kube-api-access-v4mgs\") pod \"88f53889-c80b-42f8-abda-2d6fc4d30e18\" (UID: \"88f53889-c80b-42f8-abda-2d6fc4d30e18\") "
Dec 01 11:27:29 crc kubenswrapper[4958]: I1201 11:27:29.376978 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88f53889-c80b-42f8-abda-2d6fc4d30e18-kube-api-access-v4mgs" (OuterVolumeSpecName: "kube-api-access-v4mgs") pod "88f53889-c80b-42f8-abda-2d6fc4d30e18" (UID: "88f53889-c80b-42f8-abda-2d6fc4d30e18"). InnerVolumeSpecName "kube-api-access-v4mgs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:27:29 crc kubenswrapper[4958]: I1201 11:27:29.472831 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4mgs\" (UniqueName: \"kubernetes.io/projected/88f53889-c80b-42f8-abda-2d6fc4d30e18-kube-api-access-v4mgs\") on node \"crc\" DevicePath \"\""
Dec 01 11:27:29 crc kubenswrapper[4958]: I1201 11:27:29.719458 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="258ca69399b12d472dccb28cd718e92a0d7b5f92e0417a5eb186921bc9db70ec"
Dec 01 11:27:29 crc kubenswrapper[4958]: I1201 11:27:29.719490 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default"
Dec 01 11:27:29 crc kubenswrapper[4958]: I1201 11:27:29.814931 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88f53889-c80b-42f8-abda-2d6fc4d30e18" path="/var/lib/kubelet/pods/88f53889-c80b-42f8-abda-2d6fc4d30e18/volumes"
Dec 01 11:27:33 crc kubenswrapper[4958]: I1201 11:27:33.104710 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"]
Dec 01 11:27:33 crc kubenswrapper[4958]: E1201 11:27:33.106071 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88f53889-c80b-42f8-abda-2d6fc4d30e18" containerName="mariadb-client-4-default"
Dec 01 11:27:33 crc kubenswrapper[4958]: I1201 11:27:33.106103 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f53889-c80b-42f8-abda-2d6fc4d30e18" containerName="mariadb-client-4-default"
Dec 01 11:27:33 crc kubenswrapper[4958]: I1201 11:27:33.106515 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="88f53889-c80b-42f8-abda-2d6fc4d30e18" containerName="mariadb-client-4-default"
Dec 01 11:27:33 crc kubenswrapper[4958]: I1201 11:27:33.107623 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default"
Dec 01 11:27:33 crc kubenswrapper[4958]: I1201 11:27:33.111345 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-r6bqk"
Dec 01 11:27:33 crc kubenswrapper[4958]: I1201 11:27:33.119911 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"]
Dec 01 11:27:33 crc kubenswrapper[4958]: I1201 11:27:33.145207 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s8vt\" (UniqueName: \"kubernetes.io/projected/82f9a687-30c0-457d-8b7e-d404bae15865-kube-api-access-5s8vt\") pod \"mariadb-client-5-default\" (UID: \"82f9a687-30c0-457d-8b7e-d404bae15865\") " pod="openstack/mariadb-client-5-default"
Dec 01 11:27:33 crc kubenswrapper[4958]: I1201 11:27:33.246550 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s8vt\" (UniqueName: \"kubernetes.io/projected/82f9a687-30c0-457d-8b7e-d404bae15865-kube-api-access-5s8vt\") pod \"mariadb-client-5-default\" (UID: \"82f9a687-30c0-457d-8b7e-d404bae15865\") " pod="openstack/mariadb-client-5-default"
Dec 01 11:27:33 crc kubenswrapper[4958]: I1201 11:27:33.270805 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s8vt\" (UniqueName: \"kubernetes.io/projected/82f9a687-30c0-457d-8b7e-d404bae15865-kube-api-access-5s8vt\") pod \"mariadb-client-5-default\" (UID: \"82f9a687-30c0-457d-8b7e-d404bae15865\") " pod="openstack/mariadb-client-5-default"
Dec 01 11:27:33 crc kubenswrapper[4958]: I1201 11:27:33.441523 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default"
Dec 01 11:27:34 crc kubenswrapper[4958]: I1201 11:27:34.055475 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"]
Dec 01 11:27:34 crc kubenswrapper[4958]: I1201 11:27:34.769947 4958 generic.go:334] "Generic (PLEG): container finished" podID="82f9a687-30c0-457d-8b7e-d404bae15865" containerID="ffa06db896bf51dc3b6eff02ad8700cf3855ea2c79d24d6d8a7c22f7d3d00603" exitCode=0
Dec 01 11:27:34 crc kubenswrapper[4958]: I1201 11:27:34.770045 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"82f9a687-30c0-457d-8b7e-d404bae15865","Type":"ContainerDied","Data":"ffa06db896bf51dc3b6eff02ad8700cf3855ea2c79d24d6d8a7c22f7d3d00603"}
Dec 01 11:27:34 crc kubenswrapper[4958]: I1201 11:27:34.770453 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"82f9a687-30c0-457d-8b7e-d404bae15865","Type":"ContainerStarted","Data":"942d9b2365afe390f2d38d29f37925f7b2de8164923aa03c69c74bf501080cba"}
Dec 01 11:27:36 crc kubenswrapper[4958]: I1201 11:27:36.289889 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default"
Dec 01 11:27:36 crc kubenswrapper[4958]: I1201 11:27:36.305764 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s8vt\" (UniqueName: \"kubernetes.io/projected/82f9a687-30c0-457d-8b7e-d404bae15865-kube-api-access-5s8vt\") pod \"82f9a687-30c0-457d-8b7e-d404bae15865\" (UID: \"82f9a687-30c0-457d-8b7e-d404bae15865\") "
Dec 01 11:27:36 crc kubenswrapper[4958]: I1201 11:27:36.312562 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_82f9a687-30c0-457d-8b7e-d404bae15865/mariadb-client-5-default/0.log"
Dec 01 11:27:36 crc kubenswrapper[4958]: I1201 11:27:36.316496 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82f9a687-30c0-457d-8b7e-d404bae15865-kube-api-access-5s8vt" (OuterVolumeSpecName: "kube-api-access-5s8vt") pod "82f9a687-30c0-457d-8b7e-d404bae15865" (UID: "82f9a687-30c0-457d-8b7e-d404bae15865"). InnerVolumeSpecName "kube-api-access-5s8vt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:27:36 crc kubenswrapper[4958]: I1201 11:27:36.395015 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"]
Dec 01 11:27:36 crc kubenswrapper[4958]: I1201 11:27:36.403822 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"]
Dec 01 11:27:36 crc kubenswrapper[4958]: I1201 11:27:36.408760 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s8vt\" (UniqueName: \"kubernetes.io/projected/82f9a687-30c0-457d-8b7e-d404bae15865-kube-api-access-5s8vt\") on node \"crc\" DevicePath \"\""
Dec 01 11:27:36 crc kubenswrapper[4958]: I1201 11:27:36.507093 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"]
Dec 01 11:27:36 crc kubenswrapper[4958]: E1201 11:27:36.507609 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82f9a687-30c0-457d-8b7e-d404bae15865" containerName="mariadb-client-5-default"
Dec 01 11:27:36 crc kubenswrapper[4958]: I1201 11:27:36.507633 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="82f9a687-30c0-457d-8b7e-d404bae15865" containerName="mariadb-client-5-default"
Dec 01 11:27:36 crc kubenswrapper[4958]: I1201 11:27:36.507996 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="82f9a687-30c0-457d-8b7e-d404bae15865" containerName="mariadb-client-5-default"
Dec 01 11:27:36 crc kubenswrapper[4958]: I1201 11:27:36.509113 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default"
Dec 01 11:27:36 crc kubenswrapper[4958]: I1201 11:27:36.515610 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"]
Dec 01 11:27:36 crc kubenswrapper[4958]: I1201 11:27:36.612394 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvh54\" (UniqueName: \"kubernetes.io/projected/fd89878b-aa73-4899-896b-6556c8930d7d-kube-api-access-gvh54\") pod \"mariadb-client-6-default\" (UID: \"fd89878b-aa73-4899-896b-6556c8930d7d\") " pod="openstack/mariadb-client-6-default"
Dec 01 11:27:36 crc kubenswrapper[4958]: I1201 11:27:36.714457 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvh54\" (UniqueName: \"kubernetes.io/projected/fd89878b-aa73-4899-896b-6556c8930d7d-kube-api-access-gvh54\") pod \"mariadb-client-6-default\" (UID: \"fd89878b-aa73-4899-896b-6556c8930d7d\") " pod="openstack/mariadb-client-6-default"
Dec 01 11:27:36 crc kubenswrapper[4958]: I1201 11:27:36.747161 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvh54\" (UniqueName: \"kubernetes.io/projected/fd89878b-aa73-4899-896b-6556c8930d7d-kube-api-access-gvh54\") pod \"mariadb-client-6-default\" (UID: \"fd89878b-aa73-4899-896b-6556c8930d7d\") " pod="openstack/mariadb-client-6-default"
Dec 01 11:27:36 crc kubenswrapper[4958]: I1201 11:27:36.797994 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="942d9b2365afe390f2d38d29f37925f7b2de8164923aa03c69c74bf501080cba"
Dec 01 11:27:36 crc kubenswrapper[4958]: I1201 11:27:36.798107 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default"
Dec 01 11:27:36 crc kubenswrapper[4958]: I1201 11:27:36.847460 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default"
Dec 01 11:27:37 crc kubenswrapper[4958]: I1201 11:27:37.205291 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"]
Dec 01 11:27:37 crc kubenswrapper[4958]: W1201 11:27:37.206162 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd89878b_aa73_4899_896b_6556c8930d7d.slice/crio-053909b7bb171199011b3202f328c8f2a835b374a994586033c40154fc4d8272 WatchSource:0}: Error finding container 053909b7bb171199011b3202f328c8f2a835b374a994586033c40154fc4d8272: Status 404 returned error can't find the container with id 053909b7bb171199011b3202f328c8f2a835b374a994586033c40154fc4d8272
Dec 01 11:27:37 crc kubenswrapper[4958]: I1201 11:27:37.815054 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82f9a687-30c0-457d-8b7e-d404bae15865" path="/var/lib/kubelet/pods/82f9a687-30c0-457d-8b7e-d404bae15865/volumes"
Dec 01 11:27:37 crc kubenswrapper[4958]: I1201 11:27:37.816373 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"fd89878b-aa73-4899-896b-6556c8930d7d","Type":"ContainerStarted","Data":"e730031dbfd93660790bbe7b721be5f143e63e92d9567c9ca62a92d16284f732"}
Dec 01 11:27:37 crc kubenswrapper[4958]: I1201 11:27:37.816433 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"fd89878b-aa73-4899-896b-6556c8930d7d","Type":"ContainerStarted","Data":"053909b7bb171199011b3202f328c8f2a835b374a994586033c40154fc4d8272"}
Dec 01 11:27:37 crc kubenswrapper[4958]: I1201 11:27:37.834762 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=1.834737668 podStartE2EDuration="1.834737668s" podCreationTimestamp="2025-12-01 11:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:27:37.826184755 +0000 UTC m=+5305.334973802" watchObservedRunningTime="2025-12-01 11:27:37.834737668 +0000 UTC m=+5305.343526755"
Dec 01 11:27:37 crc kubenswrapper[4958]: I1201 11:27:37.892202 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-6-default_fd89878b-aa73-4899-896b-6556c8930d7d/mariadb-client-6-default/0.log"
Dec 01 11:27:38 crc kubenswrapper[4958]: I1201 11:27:38.825636 4958 generic.go:334] "Generic (PLEG): container finished" podID="fd89878b-aa73-4899-896b-6556c8930d7d" containerID="e730031dbfd93660790bbe7b721be5f143e63e92d9567c9ca62a92d16284f732" exitCode=1
Dec 01 11:27:38 crc kubenswrapper[4958]: I1201 11:27:38.825788 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"fd89878b-aa73-4899-896b-6556c8930d7d","Type":"ContainerDied","Data":"e730031dbfd93660790bbe7b721be5f143e63e92d9567c9ca62a92d16284f732"}
Dec 01 11:27:39 crc kubenswrapper[4958]: I1201 11:27:39.799438 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d"
Dec 01 11:27:39 crc kubenswrapper[4958]: E1201 11:27:39.800280 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:27:40 crc kubenswrapper[4958]: I1201 11:27:40.294881 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default"
Dec 01 11:27:40 crc kubenswrapper[4958]: I1201 11:27:40.329342 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"]
Dec 01 11:27:40 crc kubenswrapper[4958]: I1201 11:27:40.334504 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"]
Dec 01 11:27:40 crc kubenswrapper[4958]: I1201 11:27:40.374666 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvh54\" (UniqueName: \"kubernetes.io/projected/fd89878b-aa73-4899-896b-6556c8930d7d-kube-api-access-gvh54\") pod \"fd89878b-aa73-4899-896b-6556c8930d7d\" (UID: \"fd89878b-aa73-4899-896b-6556c8930d7d\") "
Dec 01 11:27:40 crc kubenswrapper[4958]: I1201 11:27:40.379503 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd89878b-aa73-4899-896b-6556c8930d7d-kube-api-access-gvh54" (OuterVolumeSpecName: "kube-api-access-gvh54") pod "fd89878b-aa73-4899-896b-6556c8930d7d" (UID: "fd89878b-aa73-4899-896b-6556c8930d7d"). InnerVolumeSpecName "kube-api-access-gvh54". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:27:40 crc kubenswrapper[4958]: I1201 11:27:40.475895 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvh54\" (UniqueName: \"kubernetes.io/projected/fd89878b-aa73-4899-896b-6556c8930d7d-kube-api-access-gvh54\") on node \"crc\" DevicePath \"\""
Dec 01 11:27:40 crc kubenswrapper[4958]: I1201 11:27:40.477556 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"]
Dec 01 11:27:40 crc kubenswrapper[4958]: E1201 11:27:40.478106 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd89878b-aa73-4899-896b-6556c8930d7d" containerName="mariadb-client-6-default"
Dec 01 11:27:40 crc kubenswrapper[4958]: I1201 11:27:40.478132 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd89878b-aa73-4899-896b-6556c8930d7d" containerName="mariadb-client-6-default"
Dec 01 11:27:40 crc kubenswrapper[4958]: I1201 11:27:40.478314 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd89878b-aa73-4899-896b-6556c8930d7d" containerName="mariadb-client-6-default"
Dec 01 11:27:40 crc kubenswrapper[4958]: I1201 11:27:40.479008 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default"
Dec 01 11:27:40 crc kubenswrapper[4958]: I1201 11:27:40.486836 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"]
Dec 01 11:27:40 crc kubenswrapper[4958]: I1201 11:27:40.579917 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv965\" (UniqueName: \"kubernetes.io/projected/23d80c8a-c502-4de0-9b1e-b4b034aee404-kube-api-access-dv965\") pod \"mariadb-client-7-default\" (UID: \"23d80c8a-c502-4de0-9b1e-b4b034aee404\") " pod="openstack/mariadb-client-7-default"
Dec 01 11:27:40 crc kubenswrapper[4958]: I1201 11:27:40.681311 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv965\" (UniqueName: \"kubernetes.io/projected/23d80c8a-c502-4de0-9b1e-b4b034aee404-kube-api-access-dv965\") pod \"mariadb-client-7-default\" (UID: \"23d80c8a-c502-4de0-9b1e-b4b034aee404\") " pod="openstack/mariadb-client-7-default"
Dec 01 11:27:40 crc kubenswrapper[4958]: I1201 11:27:40.711235 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv965\" (UniqueName: \"kubernetes.io/projected/23d80c8a-c502-4de0-9b1e-b4b034aee404-kube-api-access-dv965\") pod \"mariadb-client-7-default\" (UID: \"23d80c8a-c502-4de0-9b1e-b4b034aee404\") " pod="openstack/mariadb-client-7-default"
Dec 01 11:27:40 crc kubenswrapper[4958]: I1201 11:27:40.806176 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default"
Dec 01 11:27:40 crc kubenswrapper[4958]: I1201 11:27:40.849864 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="053909b7bb171199011b3202f328c8f2a835b374a994586033c40154fc4d8272"
Dec 01 11:27:40 crc kubenswrapper[4958]: I1201 11:27:40.849917 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default"
Dec 01 11:27:41 crc kubenswrapper[4958]: I1201 11:27:41.204286 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"]
Dec 01 11:27:41 crc kubenswrapper[4958]: W1201 11:27:41.210085 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23d80c8a_c502_4de0_9b1e_b4b034aee404.slice/crio-17d5375105e9d9ff712252a6ad872987623e1072267e915b01c79de057ffe430 WatchSource:0}: Error finding container 17d5375105e9d9ff712252a6ad872987623e1072267e915b01c79de057ffe430: Status 404 returned error can't find the container with id 17d5375105e9d9ff712252a6ad872987623e1072267e915b01c79de057ffe430
Dec 01 11:27:41 crc kubenswrapper[4958]: I1201 11:27:41.816591 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd89878b-aa73-4899-896b-6556c8930d7d" path="/var/lib/kubelet/pods/fd89878b-aa73-4899-896b-6556c8930d7d/volumes"
Dec 01 11:27:41 crc kubenswrapper[4958]: I1201 11:27:41.865311 4958 generic.go:334] "Generic (PLEG): container finished" podID="23d80c8a-c502-4de0-9b1e-b4b034aee404" containerID="b681aae2ef26c42ab99e0f3e3147d260a09a85f2efdad5b2ccb80ec6ac517faa" exitCode=0
Dec 01 11:27:41 crc kubenswrapper[4958]: I1201 11:27:41.865390 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"23d80c8a-c502-4de0-9b1e-b4b034aee404","Type":"ContainerDied","Data":"b681aae2ef26c42ab99e0f3e3147d260a09a85f2efdad5b2ccb80ec6ac517faa"}
Dec 01 11:27:41 crc kubenswrapper[4958]: I1201 11:27:41.865444 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"23d80c8a-c502-4de0-9b1e-b4b034aee404","Type":"ContainerStarted","Data":"17d5375105e9d9ff712252a6ad872987623e1072267e915b01c79de057ffe430"}
Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.252429 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default"
Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.269644 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_23d80c8a-c502-4de0-9b1e-b4b034aee404/mariadb-client-7-default/0.log"
Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.309884 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"]
Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.324186 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"]
Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.429480 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv965\" (UniqueName: \"kubernetes.io/projected/23d80c8a-c502-4de0-9b1e-b4b034aee404-kube-api-access-dv965\") pod \"23d80c8a-c502-4de0-9b1e-b4b034aee404\" (UID: \"23d80c8a-c502-4de0-9b1e-b4b034aee404\") "
Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.434737 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23d80c8a-c502-4de0-9b1e-b4b034aee404-kube-api-access-dv965" (OuterVolumeSpecName: "kube-api-access-dv965") pod "23d80c8a-c502-4de0-9b1e-b4b034aee404" (UID: "23d80c8a-c502-4de0-9b1e-b4b034aee404"). InnerVolumeSpecName "kube-api-access-dv965".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.476502 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Dec 01 11:27:43 crc kubenswrapper[4958]: E1201 11:27:43.476812 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23d80c8a-c502-4de0-9b1e-b4b034aee404" containerName="mariadb-client-7-default" Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.476828 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d80c8a-c502-4de0-9b1e-b4b034aee404" containerName="mariadb-client-7-default" Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.477034 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="23d80c8a-c502-4de0-9b1e-b4b034aee404" containerName="mariadb-client-7-default" Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.477624 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.487964 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.531361 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv965\" (UniqueName: \"kubernetes.io/projected/23d80c8a-c502-4de0-9b1e-b4b034aee404-kube-api-access-dv965\") on node \"crc\" DevicePath \"\"" Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.632810 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2p9q\" (UniqueName: \"kubernetes.io/projected/8882ce29-28ab-41c2-9a9f-f34336f65b45-kube-api-access-d2p9q\") pod \"mariadb-client-2\" (UID: \"8882ce29-28ab-41c2-9a9f-f34336f65b45\") " pod="openstack/mariadb-client-2" Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.734185 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2p9q\" (UniqueName: \"kubernetes.io/projected/8882ce29-28ab-41c2-9a9f-f34336f65b45-kube-api-access-d2p9q\") pod \"mariadb-client-2\" (UID: \"8882ce29-28ab-41c2-9a9f-f34336f65b45\") " pod="openstack/mariadb-client-2" Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.752665 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2p9q\" (UniqueName: \"kubernetes.io/projected/8882ce29-28ab-41c2-9a9f-f34336f65b45-kube-api-access-d2p9q\") pod \"mariadb-client-2\" (UID: \"8882ce29-28ab-41c2-9a9f-f34336f65b45\") " pod="openstack/mariadb-client-2" Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.811059 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23d80c8a-c502-4de0-9b1e-b4b034aee404" path="/var/lib/kubelet/pods/23d80c8a-c502-4de0-9b1e-b4b034aee404/volumes" Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.814405 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.922623 4958 scope.go:117] "RemoveContainer" containerID="b681aae2ef26c42ab99e0f3e3147d260a09a85f2efdad5b2ccb80ec6ac517faa" Dec 01 11:27:43 crc kubenswrapper[4958]: I1201 11:27:43.922721 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default"
Dec 01 11:27:44 crc kubenswrapper[4958]: I1201 11:27:44.399762 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"]
Dec 01 11:27:44 crc kubenswrapper[4958]: W1201 11:27:44.404625 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8882ce29_28ab_41c2_9a9f_f34336f65b45.slice/crio-bef5db42e1b29211ff532a462a873fe5432922b671e225aa1ba1659a08f56fc2 WatchSource:0}: Error finding container bef5db42e1b29211ff532a462a873fe5432922b671e225aa1ba1659a08f56fc2: Status 404 returned error can't find the container with id bef5db42e1b29211ff532a462a873fe5432922b671e225aa1ba1659a08f56fc2
Dec 01 11:27:44 crc kubenswrapper[4958]: I1201 11:27:44.936303 4958 generic.go:334] "Generic (PLEG): container finished" podID="8882ce29-28ab-41c2-9a9f-f34336f65b45" containerID="567c43ead8823c632514df8ad47fe4240adb6595cf8348f208335e3f5a183de8" exitCode=0
Dec 01 11:27:44 crc kubenswrapper[4958]: I1201 11:27:44.936416 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"8882ce29-28ab-41c2-9a9f-f34336f65b45","Type":"ContainerDied","Data":"567c43ead8823c632514df8ad47fe4240adb6595cf8348f208335e3f5a183de8"}
Dec 01 11:27:44 crc kubenswrapper[4958]: I1201 11:27:44.936596 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"8882ce29-28ab-41c2-9a9f-f34336f65b45","Type":"ContainerStarted","Data":"bef5db42e1b29211ff532a462a873fe5432922b671e225aa1ba1659a08f56fc2"}
Dec 01 11:27:46 crc kubenswrapper[4958]: I1201 11:27:46.366443 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2"
Dec 01 11:27:46 crc kubenswrapper[4958]: I1201 11:27:46.381775 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2p9q\" (UniqueName: \"kubernetes.io/projected/8882ce29-28ab-41c2-9a9f-f34336f65b45-kube-api-access-d2p9q\") pod \"8882ce29-28ab-41c2-9a9f-f34336f65b45\" (UID: \"8882ce29-28ab-41c2-9a9f-f34336f65b45\") "
Dec 01 11:27:46 crc kubenswrapper[4958]: I1201 11:27:46.387530 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_8882ce29-28ab-41c2-9a9f-f34336f65b45/mariadb-client-2/0.log"
Dec 01 11:27:46 crc kubenswrapper[4958]: I1201 11:27:46.388434 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8882ce29-28ab-41c2-9a9f-f34336f65b45-kube-api-access-d2p9q" (OuterVolumeSpecName: "kube-api-access-d2p9q") pod "8882ce29-28ab-41c2-9a9f-f34336f65b45" (UID: "8882ce29-28ab-41c2-9a9f-f34336f65b45"). InnerVolumeSpecName "kube-api-access-d2p9q". PluginName "kubernetes.io/projected", VolumeGidValue ""
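Entries like the UnmountVolume.TearDown one just above come from kubelet's volume reconciler, which walks every volume of a deleted pod through unmount-request, teardown, and final detach ("Volume detached for volume ... DevicePath \"\""), mirrored on creation by VerifyControllerAttachedVolume and MountVolume.SetUp. A sketch that tags those stages in this log (the markers and the volume_events helper are illustrative, not kubelet API; note that mount and detach entries log the short volume name while TearDown logs the full plugin path):

```python
import re

# Map reconciler / operation_generator message markers to lifecycle stages.
STEPS = [
    ("operationExecutor.VerifyControllerAttachedVolume started", "attach-verified"),
    ("operationExecutor.MountVolume started", "mount-requested"),
    ("MountVolume.SetUp succeeded", "mounted"),
    ("operationExecutor.UnmountVolume started", "unmount-requested"),
    ("UnmountVolume.TearDown succeeded", "torn-down"),
    ("Volume detached for volume", "detached"),
]
# First quoted name after 'volume ' -- handles escaped (\") and plain (") quoting.
VOL_RE = re.compile(r'volume \\?"(?P<name>[^"\\]+)')

def volume_events(lines):
    for line in lines:
        for marker, stage in STEPS:
            if marker in line:
                m = VOL_RE.search(line)
                if m:
                    yield stage, m.group("name")
                break
```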
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:27:46 crc kubenswrapper[4958]: I1201 11:27:46.419096 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Dec 01 11:27:46 crc kubenswrapper[4958]: I1201 11:27:46.424495 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Dec 01 11:27:46 crc kubenswrapper[4958]: I1201 11:27:46.483320 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2p9q\" (UniqueName: \"kubernetes.io/projected/8882ce29-28ab-41c2-9a9f-f34336f65b45-kube-api-access-d2p9q\") on node \"crc\" DevicePath \"\"" Dec 01 11:27:46 crc kubenswrapper[4958]: I1201 11:27:46.959568 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bef5db42e1b29211ff532a462a873fe5432922b671e225aa1ba1659a08f56fc2" Dec 01 11:27:46 crc kubenswrapper[4958]: I1201 11:27:46.959666 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 01 11:27:47 crc kubenswrapper[4958]: I1201 11:27:47.814825 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8882ce29-28ab-41c2-9a9f-f34336f65b45" path="/var/lib/kubelet/pods/8882ce29-28ab-41c2-9a9f-f34336f65b45/volumes" Dec 01 11:27:54 crc kubenswrapper[4958]: I1201 11:27:54.798338 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d" Dec 01 11:27:54 crc kubenswrapper[4958]: E1201 11:27:54.799507 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:28:06 crc kubenswrapper[4958]: I1201 11:28:06.808426 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d" Dec 01 11:28:07 crc kubenswrapper[4958]: I1201 11:28:07.245695 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"991fdb4be35530b97fb80defdced702e6d810a5e7b47e601ff99ede2a4cc30ae"} Dec 01 11:28:22 crc kubenswrapper[4958]: I1201 11:28:22.919436 4958 scope.go:117] "RemoveContainer" containerID="5b35fcff2b22cb1f7ad58cfd6cedf9ed7735ac640d145cb976d5e5ff265e4c82" Dec 01 11:29:36 crc kubenswrapper[4958]: I1201 11:29:36.648222 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mjlt4"] Dec 01 11:29:36 crc kubenswrapper[4958]: E1201 11:29:36.650527 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8882ce29-28ab-41c2-9a9f-f34336f65b45" containerName="mariadb-client-2" Dec 01 11:29:36 crc kubenswrapper[4958]: I1201 11:29:36.650556 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="8882ce29-28ab-41c2-9a9f-f34336f65b45" containerName="mariadb-client-2" Dec 01 11:29:36 crc kubenswrapper[4958]: I1201 11:29:36.650896 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="8882ce29-28ab-41c2-9a9f-f34336f65b45" containerName="mariadb-client-2" Dec 01 11:29:36 crc kubenswrapper[4958]: I1201 11:29:36.653069 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mjlt4" Dec 01 11:29:36 crc kubenswrapper[4958]: I1201 11:29:36.660672 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mjlt4"] Dec 01 11:29:36 crc kubenswrapper[4958]: I1201 11:29:36.710179 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd-catalog-content\") pod \"community-operators-mjlt4\" (UID: \"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd\") " pod="openshift-marketplace/community-operators-mjlt4" Dec 01 11:29:36 crc kubenswrapper[4958]: I1201 11:29:36.710345 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbjl4\" (UniqueName: \"kubernetes.io/projected/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd-kube-api-access-mbjl4\") pod \"community-operators-mjlt4\" (UID: \"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd\") " pod="openshift-marketplace/community-operators-mjlt4" Dec 01 11:29:36 crc kubenswrapper[4958]: I1201 11:29:36.710437 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd-utilities\") pod \"community-operators-mjlt4\" (UID: \"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd\") " pod="openshift-marketplace/community-operators-mjlt4" Dec 01 11:29:36 crc kubenswrapper[4958]: I1201 11:29:36.811527 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd-catalog-content\") pod \"community-operators-mjlt4\" (UID: \"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd\") " pod="openshift-marketplace/community-operators-mjlt4" Dec 01 11:29:36 crc kubenswrapper[4958]: I1201 11:29:36.811677 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbjl4\" (UniqueName: \"kubernetes.io/projected/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd-kube-api-access-mbjl4\") pod \"community-operators-mjlt4\" (UID: \"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd\") " pod="openshift-marketplace/community-operators-mjlt4" Dec 01 11:29:36 crc kubenswrapper[4958]: I1201 11:29:36.811719 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd-utilities\") pod \"community-operators-mjlt4\" (UID: \"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd\") " pod="openshift-marketplace/community-operators-mjlt4" Dec 01 11:29:36 crc kubenswrapper[4958]: I1201 11:29:36.812305 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd-utilities\") pod \"community-operators-mjlt4\" (UID: \"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd\") " pod="openshift-marketplace/community-operators-mjlt4" Dec 01 11:29:36 crc kubenswrapper[4958]: I1201 11:29:36.812643 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd-catalog-content\") pod \"community-operators-mjlt4\" (UID: \"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd\") " pod="openshift-marketplace/community-operators-mjlt4" Dec 01 11:29:36 crc kubenswrapper[4958]: I1201 11:29:36.837716 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mbjl4\" (UniqueName: \"kubernetes.io/projected/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd-kube-api-access-mbjl4\") pod \"community-operators-mjlt4\" (UID: \"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd\") " pod="openshift-marketplace/community-operators-mjlt4" Dec 01 11:29:36 crc kubenswrapper[4958]: I1201 11:29:36.979308 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mjlt4" Dec 01 11:29:37 crc kubenswrapper[4958]: I1201 11:29:37.289127 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mjlt4"] Dec 01 11:29:38 crc kubenswrapper[4958]: I1201 11:29:38.223354 4958 generic.go:334] "Generic (PLEG): container finished" podID="55cbf1c9-3f29-4e78-9f0f-21af30ff21fd" containerID="45f60775e9f4e4a69a5b58a91069da13088e15f986c5c6e851a9a798dbf73f5b" exitCode=0 Dec 01 11:29:38 crc kubenswrapper[4958]: I1201 11:29:38.223397 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjlt4" event={"ID":"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd","Type":"ContainerDied","Data":"45f60775e9f4e4a69a5b58a91069da13088e15f986c5c6e851a9a798dbf73f5b"} Dec 01 11:29:38 crc kubenswrapper[4958]: I1201 11:29:38.224358 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjlt4" event={"ID":"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd","Type":"ContainerStarted","Data":"6a9d1c5d6a1a82a79e1140850eec4a417d24e15e6c558eabfa8e1a2a9e14c9b5"} Dec 01 11:29:40 crc kubenswrapper[4958]: I1201 11:29:40.246230 4958 generic.go:334] "Generic (PLEG): container finished" podID="55cbf1c9-3f29-4e78-9f0f-21af30ff21fd" containerID="78c8b263e7a54997e27f9d88ccd9850a8b8660e97605b501272f50edcc189920" exitCode=0 Dec 01 11:29:40 crc kubenswrapper[4958]: I1201 11:29:40.246328 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjlt4" event={"ID":"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd","Type":"ContainerDied","Data":"78c8b263e7a54997e27f9d88ccd9850a8b8660e97605b501272f50edcc189920"} Dec 01 11:29:41 crc kubenswrapper[4958]: I1201 11:29:41.270169 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjlt4" event={"ID":"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd","Type":"ContainerStarted","Data":"04e644279bed74d3993a917be6e8a4e2c4414810d16e8a6a4a57adfe3e078cd8"} Dec 01 11:29:41 crc kubenswrapper[4958]: I1201 11:29:41.298604 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mjlt4" podStartSLOduration=2.833740378 podStartE2EDuration="5.298585537s" podCreationTimestamp="2025-12-01 11:29:36 +0000 UTC" firstStartedPulling="2025-12-01 11:29:38.226139386 +0000 UTC m=+5425.734928463" lastFinishedPulling="2025-12-01 11:29:40.690984565 +0000 UTC m=+5428.199773622" observedRunningTime="2025-12-01 11:29:41.290134368 +0000 UTC m=+5428.798923425" watchObservedRunningTime="2025-12-01 11:29:41.298585537 +0000 UTC m=+5428.807374574" Dec 01 11:29:46 crc kubenswrapper[4958]: I1201 11:29:46.979712 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mjlt4" Dec 01 11:29:46 crc kubenswrapper[4958]: I1201 11:29:46.980507 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mjlt4" Dec 01 11:29:47 crc kubenswrapper[4958]: I1201 11:29:47.056058 4958 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mjlt4" Dec 01 11:29:47 crc kubenswrapper[4958]: I1201 11:29:47.377873 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mjlt4" Dec 01 11:29:47 crc kubenswrapper[4958]: I1201 11:29:47.437750 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mjlt4"] Dec 01 11:29:49 crc kubenswrapper[4958]: I1201 11:29:49.346666 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mjlt4" podUID="55cbf1c9-3f29-4e78-9f0f-21af30ff21fd" containerName="registry-server" containerID="cri-o://04e644279bed74d3993a917be6e8a4e2c4414810d16e8a6a4a57adfe3e078cd8" gracePeriod=2 Dec 01 11:29:50 crc kubenswrapper[4958]: I1201 11:29:50.368563 4958 generic.go:334] "Generic (PLEG): container finished" podID="55cbf1c9-3f29-4e78-9f0f-21af30ff21fd" containerID="04e644279bed74d3993a917be6e8a4e2c4414810d16e8a6a4a57adfe3e078cd8" exitCode=0 Dec 01 11:29:50 crc kubenswrapper[4958]: I1201 11:29:50.368898 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjlt4" event={"ID":"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd","Type":"ContainerDied","Data":"04e644279bed74d3993a917be6e8a4e2c4414810d16e8a6a4a57adfe3e078cd8"} Dec 01 11:29:50 crc kubenswrapper[4958]: I1201 11:29:50.499168 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mjlt4" Dec 01 11:29:50 crc kubenswrapper[4958]: I1201 11:29:50.660988 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd-catalog-content\") pod \"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd\" (UID: \"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd\") " Dec 01 11:29:50 crc kubenswrapper[4958]: I1201 11:29:50.661065 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd-utilities\") pod \"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd\" (UID: \"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd\") " Dec 01 11:29:50 crc kubenswrapper[4958]: I1201 11:29:50.661336 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbjl4\" (UniqueName: \"kubernetes.io/projected/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd-kube-api-access-mbjl4\") pod \"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd\" (UID: \"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd\") " Dec 01 11:29:50 crc kubenswrapper[4958]: I1201 11:29:50.662524 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd-utilities" (OuterVolumeSpecName: "utilities") pod "55cbf1c9-3f29-4e78-9f0f-21af30ff21fd" (UID: "55cbf1c9-3f29-4e78-9f0f-21af30ff21fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:29:50 crc kubenswrapper[4958]: I1201 11:29:50.670406 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd-kube-api-access-mbjl4" (OuterVolumeSpecName: "kube-api-access-mbjl4") pod "55cbf1c9-3f29-4e78-9f0f-21af30ff21fd" (UID: "55cbf1c9-3f29-4e78-9f0f-21af30ff21fd"). InnerVolumeSpecName "kube-api-access-mbjl4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:29:50 crc kubenswrapper[4958]: I1201 11:29:50.761307 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55cbf1c9-3f29-4e78-9f0f-21af30ff21fd" (UID: "55cbf1c9-3f29-4e78-9f0f-21af30ff21fd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:29:50 crc kubenswrapper[4958]: I1201 11:29:50.764209 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbjl4\" (UniqueName: \"kubernetes.io/projected/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd-kube-api-access-mbjl4\") on node \"crc\" DevicePath \"\"" Dec 01 11:29:50 crc kubenswrapper[4958]: I1201 11:29:50.764282 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:29:50 crc kubenswrapper[4958]: I1201 11:29:50.764312 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:29:51 crc kubenswrapper[4958]: I1201 11:29:51.379422 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjlt4" event={"ID":"55cbf1c9-3f29-4e78-9f0f-21af30ff21fd","Type":"ContainerDied","Data":"6a9d1c5d6a1a82a79e1140850eec4a417d24e15e6c558eabfa8e1a2a9e14c9b5"} Dec 01 11:29:51 crc kubenswrapper[4958]: I1201 11:29:51.379879 4958 scope.go:117] "RemoveContainer" containerID="04e644279bed74d3993a917be6e8a4e2c4414810d16e8a6a4a57adfe3e078cd8" Dec 01 11:29:51 crc kubenswrapper[4958]: I1201 11:29:51.379648 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mjlt4" Dec 01 11:29:51 crc kubenswrapper[4958]: I1201 11:29:51.416551 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mjlt4"] Dec 01 11:29:51 crc kubenswrapper[4958]: I1201 11:29:51.419174 4958 scope.go:117] "RemoveContainer" containerID="78c8b263e7a54997e27f9d88ccd9850a8b8660e97605b501272f50edcc189920" Dec 01 11:29:51 crc kubenswrapper[4958]: I1201 11:29:51.424912 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mjlt4"] Dec 01 11:29:51 crc kubenswrapper[4958]: I1201 11:29:51.456259 4958 scope.go:117] "RemoveContainer" containerID="45f60775e9f4e4a69a5b58a91069da13088e15f986c5c6e851a9a798dbf73f5b" Dec 01 11:29:51 crc kubenswrapper[4958]: I1201 11:29:51.815023 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55cbf1c9-3f29-4e78-9f0f-21af30ff21fd" path="/var/lib/kubelet/pods/55cbf1c9-3f29-4e78-9f0f-21af30ff21fd/volumes" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.169180 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf"] Dec 01 11:30:00 crc kubenswrapper[4958]: E1201 11:30:00.170183 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cbf1c9-3f29-4e78-9f0f-21af30ff21fd" containerName="extract-content" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.170200 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cbf1c9-3f29-4e78-9f0f-21af30ff21fd" containerName="extract-content" Dec 01 11:30:00 crc kubenswrapper[4958]: E1201 11:30:00.170217 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cbf1c9-3f29-4e78-9f0f-21af30ff21fd" containerName="registry-server" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.170222 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cbf1c9-3f29-4e78-9f0f-21af30ff21fd" containerName="registry-server" Dec 01 11:30:00 crc kubenswrapper[4958]: E1201 11:30:00.170233 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cbf1c9-3f29-4e78-9f0f-21af30ff21fd" containerName="extract-utilities" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.170240 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cbf1c9-3f29-4e78-9f0f-21af30ff21fd" containerName="extract-utilities" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.170399 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="55cbf1c9-3f29-4e78-9f0f-21af30ff21fd" containerName="registry-server" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.171136 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.173881 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.175044 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.180489 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf"] Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.247175 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19ca6d04-613b-4a76-bcfd-bf3e86d635ce-secret-volume\") pod \"collect-profiles-29409810-l7nsf\" (UID: \"19ca6d04-613b-4a76-bcfd-bf3e86d635ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.247356 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19ca6d04-613b-4a76-bcfd-bf3e86d635ce-config-volume\") pod \"collect-profiles-29409810-l7nsf\" (UID: \"19ca6d04-613b-4a76-bcfd-bf3e86d635ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.247406 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkl2q\" (UniqueName: \"kubernetes.io/projected/19ca6d04-613b-4a76-bcfd-bf3e86d635ce-kube-api-access-mkl2q\") pod \"collect-profiles-29409810-l7nsf\" (UID: \"19ca6d04-613b-4a76-bcfd-bf3e86d635ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.349277 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19ca6d04-613b-4a76-bcfd-bf3e86d635ce-config-volume\") pod \"collect-profiles-29409810-l7nsf\" (UID: \"19ca6d04-613b-4a76-bcfd-bf3e86d635ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.349375 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkl2q\" (UniqueName: \"kubernetes.io/projected/19ca6d04-613b-4a76-bcfd-bf3e86d635ce-kube-api-access-mkl2q\") pod \"collect-profiles-29409810-l7nsf\" (UID: \"19ca6d04-613b-4a76-bcfd-bf3e86d635ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.349508 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19ca6d04-613b-4a76-bcfd-bf3e86d635ce-secret-volume\") pod \"collect-profiles-29409810-l7nsf\" (UID: \"19ca6d04-613b-4a76-bcfd-bf3e86d635ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.350160 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19ca6d04-613b-4a76-bcfd-bf3e86d635ce-config-volume\") pod 
\"collect-profiles-29409810-l7nsf\" (UID: \"19ca6d04-613b-4a76-bcfd-bf3e86d635ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.357480 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19ca6d04-613b-4a76-bcfd-bf3e86d635ce-secret-volume\") pod \"collect-profiles-29409810-l7nsf\" (UID: \"19ca6d04-613b-4a76-bcfd-bf3e86d635ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.412971 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkl2q\" (UniqueName: \"kubernetes.io/projected/19ca6d04-613b-4a76-bcfd-bf3e86d635ce-kube-api-access-mkl2q\") pod \"collect-profiles-29409810-l7nsf\" (UID: \"19ca6d04-613b-4a76-bcfd-bf3e86d635ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.497751 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf" Dec 01 11:30:00 crc kubenswrapper[4958]: I1201 11:30:00.726447 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf"] Dec 01 11:30:01 crc kubenswrapper[4958]: I1201 11:30:01.481000 4958 generic.go:334] "Generic (PLEG): container finished" podID="19ca6d04-613b-4a76-bcfd-bf3e86d635ce" containerID="32e8cb3d01a8b2bec2d396a42c7b2f3f6c8db35b895ce9ad838796bcc321a8f9" exitCode=0 Dec 01 11:30:01 crc kubenswrapper[4958]: I1201 11:30:01.481077 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf" event={"ID":"19ca6d04-613b-4a76-bcfd-bf3e86d635ce","Type":"ContainerDied","Data":"32e8cb3d01a8b2bec2d396a42c7b2f3f6c8db35b895ce9ad838796bcc321a8f9"} Dec 01 11:30:01 crc kubenswrapper[4958]: I1201 11:30:01.481358 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf" event={"ID":"19ca6d04-613b-4a76-bcfd-bf3e86d635ce","Type":"ContainerStarted","Data":"2b21058e7ec80f794da5e8d137780482ce7b05ac37f66797b9ee14b0de376a26"} Dec 01 11:30:02 crc kubenswrapper[4958]: I1201 11:30:02.823002 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf" Dec 01 11:30:02 crc kubenswrapper[4958]: I1201 11:30:02.921487 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19ca6d04-613b-4a76-bcfd-bf3e86d635ce-config-volume\") pod \"19ca6d04-613b-4a76-bcfd-bf3e86d635ce\" (UID: \"19ca6d04-613b-4a76-bcfd-bf3e86d635ce\") " Dec 01 11:30:02 crc kubenswrapper[4958]: I1201 11:30:02.921560 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkl2q\" (UniqueName: \"kubernetes.io/projected/19ca6d04-613b-4a76-bcfd-bf3e86d635ce-kube-api-access-mkl2q\") pod \"19ca6d04-613b-4a76-bcfd-bf3e86d635ce\" (UID: \"19ca6d04-613b-4a76-bcfd-bf3e86d635ce\") " Dec 01 11:30:02 crc kubenswrapper[4958]: I1201 11:30:02.921591 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19ca6d04-613b-4a76-bcfd-bf3e86d635ce-secret-volume\") pod \"19ca6d04-613b-4a76-bcfd-bf3e86d635ce\" (UID: \"19ca6d04-613b-4a76-bcfd-bf3e86d635ce\") " Dec 01 11:30:02 crc kubenswrapper[4958]: I1201 11:30:02.922421 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19ca6d04-613b-4a76-bcfd-bf3e86d635ce-config-volume" (OuterVolumeSpecName: "config-volume") pod "19ca6d04-613b-4a76-bcfd-bf3e86d635ce" (UID: "19ca6d04-613b-4a76-bcfd-bf3e86d635ce"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:30:02 crc kubenswrapper[4958]: I1201 11:30:02.927930 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19ca6d04-613b-4a76-bcfd-bf3e86d635ce-kube-api-access-mkl2q" (OuterVolumeSpecName: "kube-api-access-mkl2q") pod "19ca6d04-613b-4a76-bcfd-bf3e86d635ce" (UID: "19ca6d04-613b-4a76-bcfd-bf3e86d635ce"). InnerVolumeSpecName "kube-api-access-mkl2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:30:02 crc kubenswrapper[4958]: I1201 11:30:02.927968 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19ca6d04-613b-4a76-bcfd-bf3e86d635ce-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "19ca6d04-613b-4a76-bcfd-bf3e86d635ce" (UID: "19ca6d04-613b-4a76-bcfd-bf3e86d635ce"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:30:03 crc kubenswrapper[4958]: I1201 11:30:03.023114 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19ca6d04-613b-4a76-bcfd-bf3e86d635ce-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 11:30:03 crc kubenswrapper[4958]: I1201 11:30:03.023573 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkl2q\" (UniqueName: \"kubernetes.io/projected/19ca6d04-613b-4a76-bcfd-bf3e86d635ce-kube-api-access-mkl2q\") on node \"crc\" DevicePath \"\"" Dec 01 11:30:03 crc kubenswrapper[4958]: I1201 11:30:03.023588 4958 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19ca6d04-613b-4a76-bcfd-bf3e86d635ce-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 11:30:03 crc kubenswrapper[4958]: I1201 11:30:03.509269 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf" event={"ID":"19ca6d04-613b-4a76-bcfd-bf3e86d635ce","Type":"ContainerDied","Data":"2b21058e7ec80f794da5e8d137780482ce7b05ac37f66797b9ee14b0de376a26"} Dec 01 11:30:03 crc kubenswrapper[4958]: I1201 11:30:03.509355 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b21058e7ec80f794da5e8d137780482ce7b05ac37f66797b9ee14b0de376a26" Dec 01 11:30:03 crc kubenswrapper[4958]: I1201 11:30:03.509353 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf" Dec 01 11:30:03 crc kubenswrapper[4958]: I1201 11:30:03.921366 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455"] Dec 01 11:30:03 crc kubenswrapper[4958]: I1201 11:30:03.932114 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-pr455"] Dec 01 11:30:05 crc kubenswrapper[4958]: I1201 11:30:05.810353 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25c13797-0e6d-44c2-a6fb-4bfb19907946" path="/var/lib/kubelet/pods/25c13797-0e6d-44c2-a6fb-4bfb19907946/volumes" Dec 01 11:30:15 crc kubenswrapper[4958]: I1201 11:30:15.487945 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gsjdj"] Dec 01 11:30:15 crc kubenswrapper[4958]: E1201 11:30:15.489097 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ca6d04-613b-4a76-bcfd-bf3e86d635ce" containerName="collect-profiles" Dec 01 11:30:15 crc kubenswrapper[4958]: I1201 11:30:15.489122 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ca6d04-613b-4a76-bcfd-bf3e86d635ce" containerName="collect-profiles" Dec 01 11:30:15 crc kubenswrapper[4958]: I1201 11:30:15.489471 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="19ca6d04-613b-4a76-bcfd-bf3e86d635ce" containerName="collect-profiles" Dec 01 11:30:15 crc kubenswrapper[4958]: I1201 11:30:15.491490 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gsjdj" Dec 01 11:30:15 crc kubenswrapper[4958]: I1201 11:30:15.509664 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa33c81-f29f-437d-ae19-86778dbb2361-catalog-content\") pod \"certified-operators-gsjdj\" (UID: \"cfa33c81-f29f-437d-ae19-86778dbb2361\") " pod="openshift-marketplace/certified-operators-gsjdj" Dec 01 11:30:15 crc kubenswrapper[4958]: I1201 11:30:15.509747 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa33c81-f29f-437d-ae19-86778dbb2361-utilities\") pod \"certified-operators-gsjdj\" (UID: \"cfa33c81-f29f-437d-ae19-86778dbb2361\") " pod="openshift-marketplace/certified-operators-gsjdj" Dec 01 11:30:15 crc kubenswrapper[4958]: I1201 11:30:15.509981 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzf9d\" (UniqueName: \"kubernetes.io/projected/cfa33c81-f29f-437d-ae19-86778dbb2361-kube-api-access-zzf9d\") pod \"certified-operators-gsjdj\" (UID: \"cfa33c81-f29f-437d-ae19-86778dbb2361\") " pod="openshift-marketplace/certified-operators-gsjdj" Dec 01 11:30:15 crc kubenswrapper[4958]: I1201 11:30:15.510299 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gsjdj"] Dec 01 11:30:15 crc kubenswrapper[4958]: I1201 11:30:15.611421 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa33c81-f29f-437d-ae19-86778dbb2361-catalog-content\") pod \"certified-operators-gsjdj\" (UID: \"cfa33c81-f29f-437d-ae19-86778dbb2361\") " pod="openshift-marketplace/certified-operators-gsjdj" Dec 01 11:30:15 crc kubenswrapper[4958]: I1201 11:30:15.611569 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa33c81-f29f-437d-ae19-86778dbb2361-utilities\") pod \"certified-operators-gsjdj\" (UID: \"cfa33c81-f29f-437d-ae19-86778dbb2361\") " pod="openshift-marketplace/certified-operators-gsjdj" Dec 01 11:30:15 crc kubenswrapper[4958]: I1201 11:30:15.611633 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzf9d\" (UniqueName: \"kubernetes.io/projected/cfa33c81-f29f-437d-ae19-86778dbb2361-kube-api-access-zzf9d\") pod \"certified-operators-gsjdj\" (UID: \"cfa33c81-f29f-437d-ae19-86778dbb2361\") " pod="openshift-marketplace/certified-operators-gsjdj" Dec 01 11:30:15 crc kubenswrapper[4958]: I1201 11:30:15.612122 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa33c81-f29f-437d-ae19-86778dbb2361-catalog-content\") pod \"certified-operators-gsjdj\" (UID: \"cfa33c81-f29f-437d-ae19-86778dbb2361\") " pod="openshift-marketplace/certified-operators-gsjdj" Dec 01 11:30:15 crc kubenswrapper[4958]: I1201 11:30:15.612985 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa33c81-f29f-437d-ae19-86778dbb2361-utilities\") pod \"certified-operators-gsjdj\" (UID: \"cfa33c81-f29f-437d-ae19-86778dbb2361\") " pod="openshift-marketplace/certified-operators-gsjdj" Dec 01 11:30:15 crc kubenswrapper[4958]: I1201 11:30:15.643654 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zzf9d\" (UniqueName: \"kubernetes.io/projected/cfa33c81-f29f-437d-ae19-86778dbb2361-kube-api-access-zzf9d\") pod \"certified-operators-gsjdj\" (UID: \"cfa33c81-f29f-437d-ae19-86778dbb2361\") " pod="openshift-marketplace/certified-operators-gsjdj" Dec 01 11:30:15 crc kubenswrapper[4958]: I1201 11:30:15.815193 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gsjdj" Dec 01 11:30:16 crc kubenswrapper[4958]: I1201 11:30:16.316478 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gsjdj"] Dec 01 11:30:16 crc kubenswrapper[4958]: W1201 11:30:16.320670 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfa33c81_f29f_437d_ae19_86778dbb2361.slice/crio-fb105e1c64f66c5ef2b19dc99cfdcd2d9df67c4ef2735e64daccb842b91fbaa9 WatchSource:0}: Error finding container fb105e1c64f66c5ef2b19dc99cfdcd2d9df67c4ef2735e64daccb842b91fbaa9: Status 404 returned error can't find the container with id fb105e1c64f66c5ef2b19dc99cfdcd2d9df67c4ef2735e64daccb842b91fbaa9 Dec 01 11:30:16 crc kubenswrapper[4958]: I1201 11:30:16.651138 4958 generic.go:334] "Generic (PLEG): container finished" podID="cfa33c81-f29f-437d-ae19-86778dbb2361" containerID="53f8aa365648b383d95bb5d793f74e0e4c6238ca42016109bf2f0375c37686dc" exitCode=0 Dec 01 11:30:16 crc kubenswrapper[4958]: I1201 11:30:16.651194 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsjdj" event={"ID":"cfa33c81-f29f-437d-ae19-86778dbb2361","Type":"ContainerDied","Data":"53f8aa365648b383d95bb5d793f74e0e4c6238ca42016109bf2f0375c37686dc"} Dec 01 11:30:16 crc kubenswrapper[4958]: I1201 11:30:16.651223 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsjdj" event={"ID":"cfa33c81-f29f-437d-ae19-86778dbb2361","Type":"ContainerStarted","Data":"fb105e1c64f66c5ef2b19dc99cfdcd2d9df67c4ef2735e64daccb842b91fbaa9"} Dec 01 11:30:18 crc kubenswrapper[4958]: I1201 11:30:18.713419 4958 generic.go:334] "Generic (PLEG): container finished" podID="cfa33c81-f29f-437d-ae19-86778dbb2361" containerID="35498c3fa0728a0696dd24b35bb223ee283caac4593dd2b3e2050e26a0f0b12f" exitCode=0 Dec 01 11:30:18 crc kubenswrapper[4958]: I1201 11:30:18.714009 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsjdj" event={"ID":"cfa33c81-f29f-437d-ae19-86778dbb2361","Type":"ContainerDied","Data":"35498c3fa0728a0696dd24b35bb223ee283caac4593dd2b3e2050e26a0f0b12f"} Dec 01 11:30:19 crc kubenswrapper[4958]: I1201 11:30:19.725343 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsjdj" event={"ID":"cfa33c81-f29f-437d-ae19-86778dbb2361","Type":"ContainerStarted","Data":"5d3bc4ef00cc2232277956b43374e2ec08c740465a61fc2504f863b950f31b35"} Dec 01 11:30:19 crc kubenswrapper[4958]: I1201 11:30:19.749380 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gsjdj" podStartSLOduration=1.952408708 podStartE2EDuration="4.749357576s" podCreationTimestamp="2025-12-01 11:30:15 +0000 UTC" firstStartedPulling="2025-12-01 11:30:16.652969545 +0000 UTC m=+5464.161758582" lastFinishedPulling="2025-12-01 11:30:19.449918403 +0000 UTC m=+5466.958707450" observedRunningTime="2025-12-01 11:30:19.740916427 +0000 UTC 
m=+5467.249705464" watchObservedRunningTime="2025-12-01 11:30:19.749357576 +0000 UTC m=+5467.258146623" Dec 01 11:30:23 crc kubenswrapper[4958]: I1201 11:30:23.025408 4958 scope.go:117] "RemoveContainer" containerID="1525ca706432f1241fa5f8f53918d4474da847aaf4cf78e681433c6dfd421bcf" Dec 01 11:30:25 crc kubenswrapper[4958]: I1201 11:30:25.818412 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gsjdj" Dec 01 11:30:25 crc kubenswrapper[4958]: I1201 11:30:25.819055 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gsjdj" Dec 01 11:30:25 crc kubenswrapper[4958]: I1201 11:30:25.864769 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gsjdj" Dec 01 11:30:26 crc kubenswrapper[4958]: I1201 11:30:26.850892 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gsjdj" Dec 01 11:30:28 crc kubenswrapper[4958]: I1201 11:30:28.210238 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:30:28 crc kubenswrapper[4958]: I1201 11:30:28.210339 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:30:30 crc kubenswrapper[4958]: I1201 11:30:30.507388 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gsjdj"] Dec 01 11:30:30 crc kubenswrapper[4958]: I1201 11:30:30.507805 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gsjdj" podUID="cfa33c81-f29f-437d-ae19-86778dbb2361" containerName="registry-server" containerID="cri-o://5d3bc4ef00cc2232277956b43374e2ec08c740465a61fc2504f863b950f31b35" gracePeriod=2 Dec 01 11:30:31 crc kubenswrapper[4958]: I1201 11:30:31.453258 4958 generic.go:334] "Generic (PLEG): container finished" podID="cfa33c81-f29f-437d-ae19-86778dbb2361" containerID="5d3bc4ef00cc2232277956b43374e2ec08c740465a61fc2504f863b950f31b35" exitCode=0 Dec 01 11:30:31 crc kubenswrapper[4958]: I1201 11:30:31.453357 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsjdj" event={"ID":"cfa33c81-f29f-437d-ae19-86778dbb2361","Type":"ContainerDied","Data":"5d3bc4ef00cc2232277956b43374e2ec08c740465a61fc2504f863b950f31b35"} Dec 01 11:30:31 crc kubenswrapper[4958]: I1201 11:30:31.552118 4958 util.go:48] "No ready sandbox for pod can be found. 
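The patch_prober.go/prober.go pair above is the recurring machine-config-daemon liveness failure; it fires every 30 s in this log (11:30:28, 11:30:58, 11:31:28) until kubelet finally restarts the container. When grep stops being enough, the structured fields can be extracted mechanically; a small sketch written against the exact "Probe failed" format in this log (the regex and the probe_failures helper are illustrative, not part of any kubelet tooling):

```python
import re

# Extract probeType / pod / output from prober.go "Probe failed" entries.
PROBE_RE = re.compile(
    r'prober\.go:\d+\] "Probe failed" probeType="(?P<type>\w+)" '
    r'pod="(?P<pod>[^"]+)".*?output="(?P<output>(?:[^"\\]|\\.)*)"'
)

def probe_failures(lines):
    for line in lines:
        m = PROBE_RE.search(line)
        if m:
            yield m.group("type"), m.group("pod"), m.group("output")

sample = ('Dec 01 11:30:28 crc kubenswrapper[4958]: I1201 11:30:28.210339 4958 '
          'prober.go:107] "Probe failed" probeType="Liveness" '
          'pod="openshift-machine-config-operator/machine-config-daemon-prmw7" '
          'podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" '
          'containerName="machine-config-daemon" probeResult="failure" '
          'output="Get \\"http://127.0.0.1:8798/health\\": dial tcp '
          '127.0.0.1:8798: connect: connection refused"')
print(list(probe_failures([sample])))
```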
Need to start a new one" pod="openshift-marketplace/certified-operators-gsjdj" Dec 01 11:30:31 crc kubenswrapper[4958]: I1201 11:30:31.648119 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa33c81-f29f-437d-ae19-86778dbb2361-catalog-content\") pod \"cfa33c81-f29f-437d-ae19-86778dbb2361\" (UID: \"cfa33c81-f29f-437d-ae19-86778dbb2361\") " Dec 01 11:30:31 crc kubenswrapper[4958]: I1201 11:30:31.648271 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa33c81-f29f-437d-ae19-86778dbb2361-utilities\") pod \"cfa33c81-f29f-437d-ae19-86778dbb2361\" (UID: \"cfa33c81-f29f-437d-ae19-86778dbb2361\") " Dec 01 11:30:31 crc kubenswrapper[4958]: I1201 11:30:31.648303 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzf9d\" (UniqueName: \"kubernetes.io/projected/cfa33c81-f29f-437d-ae19-86778dbb2361-kube-api-access-zzf9d\") pod \"cfa33c81-f29f-437d-ae19-86778dbb2361\" (UID: \"cfa33c81-f29f-437d-ae19-86778dbb2361\") " Dec 01 11:30:31 crc kubenswrapper[4958]: I1201 11:30:31.649400 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfa33c81-f29f-437d-ae19-86778dbb2361-utilities" (OuterVolumeSpecName: "utilities") pod "cfa33c81-f29f-437d-ae19-86778dbb2361" (UID: "cfa33c81-f29f-437d-ae19-86778dbb2361"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:30:31 crc kubenswrapper[4958]: I1201 11:30:31.661126 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa33c81-f29f-437d-ae19-86778dbb2361-kube-api-access-zzf9d" (OuterVolumeSpecName: "kube-api-access-zzf9d") pod "cfa33c81-f29f-437d-ae19-86778dbb2361" (UID: "cfa33c81-f29f-437d-ae19-86778dbb2361"). InnerVolumeSpecName "kube-api-access-zzf9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:30:31 crc kubenswrapper[4958]: I1201 11:30:31.711904 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfa33c81-f29f-437d-ae19-86778dbb2361-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfa33c81-f29f-437d-ae19-86778dbb2361" (UID: "cfa33c81-f29f-437d-ae19-86778dbb2361"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:30:31 crc kubenswrapper[4958]: I1201 11:30:31.750792 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa33c81-f29f-437d-ae19-86778dbb2361-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:30:31 crc kubenswrapper[4958]: I1201 11:30:31.750878 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa33c81-f29f-437d-ae19-86778dbb2361-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:30:31 crc kubenswrapper[4958]: I1201 11:30:31.750893 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzf9d\" (UniqueName: \"kubernetes.io/projected/cfa33c81-f29f-437d-ae19-86778dbb2361-kube-api-access-zzf9d\") on node \"crc\" DevicePath \"\"" Dec 01 11:30:32 crc kubenswrapper[4958]: I1201 11:30:32.467119 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gsjdj" event={"ID":"cfa33c81-f29f-437d-ae19-86778dbb2361","Type":"ContainerDied","Data":"fb105e1c64f66c5ef2b19dc99cfdcd2d9df67c4ef2735e64daccb842b91fbaa9"} Dec 01 11:30:32 crc kubenswrapper[4958]: I1201 11:30:32.467204 4958 scope.go:117] "RemoveContainer" containerID="5d3bc4ef00cc2232277956b43374e2ec08c740465a61fc2504f863b950f31b35" Dec 01 11:30:32 crc kubenswrapper[4958]: I1201 11:30:32.467216 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gsjdj" Dec 01 11:30:32 crc kubenswrapper[4958]: I1201 11:30:32.497088 4958 scope.go:117] "RemoveContainer" containerID="35498c3fa0728a0696dd24b35bb223ee283caac4593dd2b3e2050e26a0f0b12f" Dec 01 11:30:32 crc kubenswrapper[4958]: I1201 11:30:32.498831 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gsjdj"] Dec 01 11:30:32 crc kubenswrapper[4958]: I1201 11:30:32.504786 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gsjdj"] Dec 01 11:30:32 crc kubenswrapper[4958]: I1201 11:30:32.527437 4958 scope.go:117] "RemoveContainer" containerID="53f8aa365648b383d95bb5d793f74e0e4c6238ca42016109bf2f0375c37686dc" Dec 01 11:30:33 crc kubenswrapper[4958]: I1201 11:30:33.813881 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfa33c81-f29f-437d-ae19-86778dbb2361" path="/var/lib/kubelet/pods/cfa33c81-f29f-437d-ae19-86778dbb2361/volumes" Dec 01 11:30:58 crc kubenswrapper[4958]: I1201 11:30:58.210632 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:30:58 crc kubenswrapper[4958]: I1201 11:30:58.211564 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:31:28 crc kubenswrapper[4958]: I1201 11:31:28.210805 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:31:28 crc kubenswrapper[4958]: I1201 11:31:28.211438 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:31:28 crc kubenswrapper[4958]: I1201 11:31:28.211523 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 11:31:28 crc kubenswrapper[4958]: I1201 11:31:28.212696 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"991fdb4be35530b97fb80defdced702e6d810a5e7b47e601ff99ede2a4cc30ae"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 11:31:28 crc kubenswrapper[4958]: I1201 11:31:28.212806 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://991fdb4be35530b97fb80defdced702e6d810a5e7b47e601ff99ede2a4cc30ae" gracePeriod=600 Dec 01 11:31:29 crc kubenswrapper[4958]: I1201 11:31:29.034399 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="991fdb4be35530b97fb80defdced702e6d810a5e7b47e601ff99ede2a4cc30ae" exitCode=0 Dec 01 11:31:29 crc kubenswrapper[4958]: I1201 11:31:29.034431 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"991fdb4be35530b97fb80defdced702e6d810a5e7b47e601ff99ede2a4cc30ae"} Dec 01 11:31:29 crc kubenswrapper[4958]: I1201 11:31:29.034673 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc"} Dec 01 11:31:29 crc kubenswrapper[4958]: I1201 11:31:29.034696 4958 scope.go:117] "RemoveContainer" containerID="374f3b903e6468b6edbc6d8b7f48aac24289d7860d2810038a3a2558262b781d" Dec 01 11:31:53 crc kubenswrapper[4958]: I1201 11:31:53.847989 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sqn68"] Dec 01 11:31:53 crc kubenswrapper[4958]: E1201 11:31:53.849493 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa33c81-f29f-437d-ae19-86778dbb2361" containerName="registry-server" Dec 01 11:31:53 crc kubenswrapper[4958]: I1201 11:31:53.849527 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa33c81-f29f-437d-ae19-86778dbb2361" containerName="registry-server" Dec 01 11:31:53 crc kubenswrapper[4958]: E1201 11:31:53.849551 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa33c81-f29f-437d-ae19-86778dbb2361" containerName="extract-utilities" Dec 01 11:31:53 crc kubenswrapper[4958]: I1201 11:31:53.849569 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa33c81-f29f-437d-ae19-86778dbb2361" 
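
The 11:30:58 through 11:31:29 records show the liveness-probe restart path end to end: two failed HTTP probes about 30s apart, "SyncLoop (probe)" marking the container unhealthy, the kubelet killing it with gracePeriod=600, PLEG reporting ContainerDied (exitCode=0), and a replacement container (81d2c11c…) starting. A sketch of an HTTPGet probe with the shape these failures imply; host, path, and port are taken from the probe output, while the timing fields are assumptions, not the machine-config-daemon's actual configuration:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
        // A refused connection on 127.0.0.1:8798/health fails the probe;
        // after enough consecutive failures the kubelet kills and restarts
        // the container, exactly the sequence logged above.
        probe := &corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                HTTPGet: &corev1.HTTPGetAction{
                    Host: "127.0.0.1",
                    Path: "/health",
                    Port: intstr.FromInt(8798),
                },
            },
            PeriodSeconds:    30, // assumed; the log shows ~30s between probe attempts
            FailureThreshold: 3,  // assumed
        }
        fmt.Printf("probe: %+v\n", probe.HTTPGet)
    }
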
containerName="extract-utilities" Dec 01 11:31:53 crc kubenswrapper[4958]: E1201 11:31:53.849611 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa33c81-f29f-437d-ae19-86778dbb2361" containerName="extract-content" Dec 01 11:31:53 crc kubenswrapper[4958]: I1201 11:31:53.849631 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa33c81-f29f-437d-ae19-86778dbb2361" containerName="extract-content" Dec 01 11:31:53 crc kubenswrapper[4958]: I1201 11:31:53.850194 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfa33c81-f29f-437d-ae19-86778dbb2361" containerName="registry-server" Dec 01 11:31:53 crc kubenswrapper[4958]: I1201 11:31:53.853177 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqn68" Dec 01 11:31:53 crc kubenswrapper[4958]: I1201 11:31:53.882101 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33bef286-833f-41a6-9770-eec44ebfabd5-utilities\") pod \"redhat-marketplace-sqn68\" (UID: \"33bef286-833f-41a6-9770-eec44ebfabd5\") " pod="openshift-marketplace/redhat-marketplace-sqn68" Dec 01 11:31:53 crc kubenswrapper[4958]: I1201 11:31:53.882355 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mkhd\" (UniqueName: \"kubernetes.io/projected/33bef286-833f-41a6-9770-eec44ebfabd5-kube-api-access-9mkhd\") pod \"redhat-marketplace-sqn68\" (UID: \"33bef286-833f-41a6-9770-eec44ebfabd5\") " pod="openshift-marketplace/redhat-marketplace-sqn68" Dec 01 11:31:53 crc kubenswrapper[4958]: I1201 11:31:53.882481 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33bef286-833f-41a6-9770-eec44ebfabd5-catalog-content\") pod \"redhat-marketplace-sqn68\" (UID: \"33bef286-833f-41a6-9770-eec44ebfabd5\") " pod="openshift-marketplace/redhat-marketplace-sqn68" Dec 01 11:31:53 crc kubenswrapper[4958]: I1201 11:31:53.883399 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqn68"] Dec 01 11:31:53 crc kubenswrapper[4958]: I1201 11:31:53.995673 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mkhd\" (UniqueName: \"kubernetes.io/projected/33bef286-833f-41a6-9770-eec44ebfabd5-kube-api-access-9mkhd\") pod \"redhat-marketplace-sqn68\" (UID: \"33bef286-833f-41a6-9770-eec44ebfabd5\") " pod="openshift-marketplace/redhat-marketplace-sqn68" Dec 01 11:31:53 crc kubenswrapper[4958]: I1201 11:31:53.995771 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33bef286-833f-41a6-9770-eec44ebfabd5-catalog-content\") pod \"redhat-marketplace-sqn68\" (UID: \"33bef286-833f-41a6-9770-eec44ebfabd5\") " pod="openshift-marketplace/redhat-marketplace-sqn68" Dec 01 11:31:53 crc kubenswrapper[4958]: I1201 11:31:53.995918 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33bef286-833f-41a6-9770-eec44ebfabd5-utilities\") pod \"redhat-marketplace-sqn68\" (UID: \"33bef286-833f-41a6-9770-eec44ebfabd5\") " pod="openshift-marketplace/redhat-marketplace-sqn68" Dec 01 11:31:53 crc kubenswrapper[4958]: I1201 11:31:53.996792 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33bef286-833f-41a6-9770-eec44ebfabd5-catalog-content\") pod \"redhat-marketplace-sqn68\" (UID: \"33bef286-833f-41a6-9770-eec44ebfabd5\") " pod="openshift-marketplace/redhat-marketplace-sqn68" Dec 01 11:31:53 crc kubenswrapper[4958]: I1201 11:31:53.997123 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33bef286-833f-41a6-9770-eec44ebfabd5-utilities\") pod \"redhat-marketplace-sqn68\" (UID: \"33bef286-833f-41a6-9770-eec44ebfabd5\") " pod="openshift-marketplace/redhat-marketplace-sqn68" Dec 01 11:31:54 crc kubenswrapper[4958]: I1201 11:31:54.024495 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mkhd\" (UniqueName: \"kubernetes.io/projected/33bef286-833f-41a6-9770-eec44ebfabd5-kube-api-access-9mkhd\") pod \"redhat-marketplace-sqn68\" (UID: \"33bef286-833f-41a6-9770-eec44ebfabd5\") " pod="openshift-marketplace/redhat-marketplace-sqn68" Dec 01 11:31:54 crc kubenswrapper[4958]: I1201 11:31:54.241838 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqn68" Dec 01 11:31:54 crc kubenswrapper[4958]: I1201 11:31:54.730652 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqn68"] Dec 01 11:31:55 crc kubenswrapper[4958]: I1201 11:31:55.343707 4958 generic.go:334] "Generic (PLEG): container finished" podID="33bef286-833f-41a6-9770-eec44ebfabd5" containerID="2d6e467a8a84dccbff70e0f68ac83c9fb5c755c596c7bf346c646eddbb5e29b4" exitCode=0 Dec 01 11:31:55 crc kubenswrapper[4958]: I1201 11:31:55.343770 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqn68" event={"ID":"33bef286-833f-41a6-9770-eec44ebfabd5","Type":"ContainerDied","Data":"2d6e467a8a84dccbff70e0f68ac83c9fb5c755c596c7bf346c646eddbb5e29b4"} Dec 01 11:31:55 crc kubenswrapper[4958]: I1201 11:31:55.344007 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqn68" event={"ID":"33bef286-833f-41a6-9770-eec44ebfabd5","Type":"ContainerStarted","Data":"6ad29136cbc0da8468518f337861ef4c102d29cdf14045469a359efc4f5aa58e"} Dec 01 11:31:56 crc kubenswrapper[4958]: I1201 11:31:56.358347 4958 generic.go:334] "Generic (PLEG): container finished" podID="33bef286-833f-41a6-9770-eec44ebfabd5" containerID="c868d2fad1c3f57fb34322e047e18b91805fe8b1a46c9071328873d150fd3d0a" exitCode=0 Dec 01 11:31:56 crc kubenswrapper[4958]: I1201 11:31:56.358463 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqn68" event={"ID":"33bef286-833f-41a6-9770-eec44ebfabd5","Type":"ContainerDied","Data":"c868d2fad1c3f57fb34322e047e18b91805fe8b1a46c9071328873d150fd3d0a"} Dec 01 11:31:57 crc kubenswrapper[4958]: I1201 11:31:57.369942 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqn68" event={"ID":"33bef286-833f-41a6-9770-eec44ebfabd5","Type":"ContainerStarted","Data":"f7bcc13a9c526e321fd71db668e218f56c65bd0ffeb14b051807553b334da431"} Dec 01 11:31:57 crc kubenswrapper[4958]: I1201 11:31:57.390896 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sqn68" podStartSLOduration=2.898325605 podStartE2EDuration="4.390826866s" podCreationTimestamp="2025-12-01 11:31:53 +0000 UTC" firstStartedPulling="2025-12-01 11:31:55.345558515 
+0000 UTC m=+5562.854347562" lastFinishedPulling="2025-12-01 11:31:56.838059786 +0000 UTC m=+5564.346848823" observedRunningTime="2025-12-01 11:31:57.38707427 +0000 UTC m=+5564.895863317" watchObservedRunningTime="2025-12-01 11:31:57.390826866 +0000 UTC m=+5564.899615913" Dec 01 11:32:04 crc kubenswrapper[4958]: I1201 11:32:04.242096 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sqn68" Dec 01 11:32:04 crc kubenswrapper[4958]: I1201 11:32:04.242646 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sqn68" Dec 01 11:32:04 crc kubenswrapper[4958]: I1201 11:32:04.315694 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sqn68" Dec 01 11:32:04 crc kubenswrapper[4958]: I1201 11:32:04.535160 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sqn68" Dec 01 11:32:04 crc kubenswrapper[4958]: I1201 11:32:04.586339 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqn68"] Dec 01 11:32:06 crc kubenswrapper[4958]: I1201 11:32:06.482024 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sqn68" podUID="33bef286-833f-41a6-9770-eec44ebfabd5" containerName="registry-server" containerID="cri-o://f7bcc13a9c526e321fd71db668e218f56c65bd0ffeb14b051807553b334da431" gracePeriod=2 Dec 01 11:32:06 crc kubenswrapper[4958]: I1201 11:32:06.519257 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Dec 01 11:32:06 crc kubenswrapper[4958]: I1201 11:32:06.521012 4958 util.go:30] "No sandbox for pod can be found. 
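
The pod_startup_latency_tracker record above carries its own arithmetic: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling). A sketch that reruns the numbers from that record; the tracker itself has more cases (e.g. pods that pull nothing), and the result agrees with the logged 2.898325605 only up to the seconds-truncated creation timestamp shown in the log:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Timestamps copied from the redhat-marketplace-sqn68 record above.
        created := time.Date(2025, 12, 1, 11, 31, 53, 0, time.UTC)
        firstPull := time.Date(2025, 12, 1, 11, 31, 55, 345558515, time.UTC)
        lastPull := time.Date(2025, 12, 1, 11, 31, 56, 838059786, time.UTC)
        running := time.Date(2025, 12, 1, 11, 31, 57, 390826866, time.UTC) // watchObservedRunningTime

        e2e := running.Sub(created)          // ~4.390826866s = podStartE2EDuration
        slo := e2e - lastPull.Sub(firstPull) // ~2.898s = podStartSLOduration
        fmt.Println(e2e, slo)
    }
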
Dec 01 11:32:06 crc kubenswrapper[4958]: I1201 11:32:06.521012 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Dec 01 11:32:06 crc kubenswrapper[4958]: I1201 11:32:06.526232 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-r6bqk"
Dec 01 11:32:06 crc kubenswrapper[4958]: I1201 11:32:06.532281 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Dec 01 11:32:06 crc kubenswrapper[4958]: I1201 11:32:06.665957 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2cwz\" (UniqueName: \"kubernetes.io/projected/ef1fedf1-c218-452e-83b7-f8eb03827765-kube-api-access-c2cwz\") pod \"mariadb-copy-data\" (UID: \"ef1fedf1-c218-452e-83b7-f8eb03827765\") " pod="openstack/mariadb-copy-data"
Dec 01 11:32:06 crc kubenswrapper[4958]: I1201 11:32:06.666574 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2676a438-0674-4780-a7f7-e4cbe158a3ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2676a438-0674-4780-a7f7-e4cbe158a3ed\") pod \"mariadb-copy-data\" (UID: \"ef1fedf1-c218-452e-83b7-f8eb03827765\") " pod="openstack/mariadb-copy-data"
Dec 01 11:32:06 crc kubenswrapper[4958]: I1201 11:32:06.768049 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2cwz\" (UniqueName: \"kubernetes.io/projected/ef1fedf1-c218-452e-83b7-f8eb03827765-kube-api-access-c2cwz\") pod \"mariadb-copy-data\" (UID: \"ef1fedf1-c218-452e-83b7-f8eb03827765\") " pod="openstack/mariadb-copy-data"
Dec 01 11:32:06 crc kubenswrapper[4958]: I1201 11:32:06.768266 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2676a438-0674-4780-a7f7-e4cbe158a3ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2676a438-0674-4780-a7f7-e4cbe158a3ed\") pod \"mariadb-copy-data\" (UID: \"ef1fedf1-c218-452e-83b7-f8eb03827765\") " pod="openstack/mariadb-copy-data"
Dec 01 11:32:06 crc kubenswrapper[4958]: I1201 11:32:06.773648 4958 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
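
attacher.MountDevice is skipped here because kubevirt.io.hostpath-provisioner does not advertise the CSI STAGE_UNSTAGE_VOLUME node capability, so there is no NodeStageVolume step and the kubelet publishes the volume directly into the pod. For contrast, a driver that does stage would advertise the capability in its NodeGetCapabilities response, roughly as below (a sketch against the CSI spec's Go bindings, not the provisioner's code):

    package main

    import (
        "fmt"

        "github.com/container-storage-interface/spec/lib/go/csi"
    )

    func main() {
        // With this capability present, the kubelet calls NodeStageVolume
        // (logged as MountVolume.MountDevice) before NodePublishVolume;
        // without it, the stage step is skipped as in the record above.
        resp := &csi.NodeGetCapabilitiesResponse{
            Capabilities: []*csi.NodeServiceCapability{{
                Type: &csi.NodeServiceCapability_Rpc{
                    Rpc: &csi.NodeServiceCapability_RPC{
                        Type: csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME,
                    },
                },
            }},
        }
        fmt.Println(resp.Capabilities)
    }
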
Dec 01 11:32:06 crc kubenswrapper[4958]: I1201 11:32:06.773723 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2676a438-0674-4780-a7f7-e4cbe158a3ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2676a438-0674-4780-a7f7-e4cbe158a3ed\") pod \"mariadb-copy-data\" (UID: \"ef1fedf1-c218-452e-83b7-f8eb03827765\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/75ebfd9eeb18edc64746bd206c0b20fb47c0970e2e16c02918987f7bc070a02c/globalmount\"" pod="openstack/mariadb-copy-data"
Dec 01 11:32:06 crc kubenswrapper[4958]: I1201 11:32:06.812679 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2cwz\" (UniqueName: \"kubernetes.io/projected/ef1fedf1-c218-452e-83b7-f8eb03827765-kube-api-access-c2cwz\") pod \"mariadb-copy-data\" (UID: \"ef1fedf1-c218-452e-83b7-f8eb03827765\") " pod="openstack/mariadb-copy-data"
Dec 01 11:32:06 crc kubenswrapper[4958]: I1201 11:32:06.840450 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2676a438-0674-4780-a7f7-e4cbe158a3ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2676a438-0674-4780-a7f7-e4cbe158a3ed\") pod \"mariadb-copy-data\" (UID: \"ef1fedf1-c218-452e-83b7-f8eb03827765\") " pod="openstack/mariadb-copy-data"
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.003630 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqn68"
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.137711 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.179590 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33bef286-833f-41a6-9770-eec44ebfabd5-catalog-content\") pod \"33bef286-833f-41a6-9770-eec44ebfabd5\" (UID: \"33bef286-833f-41a6-9770-eec44ebfabd5\") "
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.179674 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mkhd\" (UniqueName: \"kubernetes.io/projected/33bef286-833f-41a6-9770-eec44ebfabd5-kube-api-access-9mkhd\") pod \"33bef286-833f-41a6-9770-eec44ebfabd5\" (UID: \"33bef286-833f-41a6-9770-eec44ebfabd5\") "
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.179705 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33bef286-833f-41a6-9770-eec44ebfabd5-utilities\") pod \"33bef286-833f-41a6-9770-eec44ebfabd5\" (UID: \"33bef286-833f-41a6-9770-eec44ebfabd5\") "
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.181368 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33bef286-833f-41a6-9770-eec44ebfabd5-utilities" (OuterVolumeSpecName: "utilities") pod "33bef286-833f-41a6-9770-eec44ebfabd5" (UID: "33bef286-833f-41a6-9770-eec44ebfabd5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.186448 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33bef286-833f-41a6-9770-eec44ebfabd5-kube-api-access-9mkhd" (OuterVolumeSpecName: "kube-api-access-9mkhd") pod "33bef286-833f-41a6-9770-eec44ebfabd5" (UID: "33bef286-833f-41a6-9770-eec44ebfabd5"). InnerVolumeSpecName "kube-api-access-9mkhd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.202396 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33bef286-833f-41a6-9770-eec44ebfabd5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33bef286-833f-41a6-9770-eec44ebfabd5" (UID: "33bef286-833f-41a6-9770-eec44ebfabd5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.281835 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33bef286-833f-41a6-9770-eec44ebfabd5-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.281885 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mkhd\" (UniqueName: \"kubernetes.io/projected/33bef286-833f-41a6-9770-eec44ebfabd5-kube-api-access-9mkhd\") on node \"crc\" DevicePath \"\""
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.281897 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33bef286-833f-41a6-9770-eec44ebfabd5-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.492758 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.495290 4958 generic.go:334] "Generic (PLEG): container finished" podID="33bef286-833f-41a6-9770-eec44ebfabd5" containerID="f7bcc13a9c526e321fd71db668e218f56c65bd0ffeb14b051807553b334da431" exitCode=0
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.495341 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqn68" event={"ID":"33bef286-833f-41a6-9770-eec44ebfabd5","Type":"ContainerDied","Data":"f7bcc13a9c526e321fd71db668e218f56c65bd0ffeb14b051807553b334da431"}
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.495368 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sqn68" event={"ID":"33bef286-833f-41a6-9770-eec44ebfabd5","Type":"ContainerDied","Data":"6ad29136cbc0da8468518f337861ef4c102d29cdf14045469a359efc4f5aa58e"}
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.495389 4958 scope.go:117] "RemoveContainer" containerID="f7bcc13a9c526e321fd71db668e218f56c65bd0ffeb14b051807553b334da431"
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.495546 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sqn68"
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.519995 4958 scope.go:117] "RemoveContainer" containerID="c868d2fad1c3f57fb34322e047e18b91805fe8b1a46c9071328873d150fd3d0a"
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.536798 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqn68"]
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.543120 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sqn68"]
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.552605 4958 scope.go:117] "RemoveContainer" containerID="2d6e467a8a84dccbff70e0f68ac83c9fb5c755c596c7bf346c646eddbb5e29b4"
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.571005 4958 scope.go:117] "RemoveContainer" containerID="f7bcc13a9c526e321fd71db668e218f56c65bd0ffeb14b051807553b334da431"
Dec 01 11:32:07 crc kubenswrapper[4958]: E1201 11:32:07.571464 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7bcc13a9c526e321fd71db668e218f56c65bd0ffeb14b051807553b334da431\": container with ID starting with f7bcc13a9c526e321fd71db668e218f56c65bd0ffeb14b051807553b334da431 not found: ID does not exist" containerID="f7bcc13a9c526e321fd71db668e218f56c65bd0ffeb14b051807553b334da431"
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.571510 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7bcc13a9c526e321fd71db668e218f56c65bd0ffeb14b051807553b334da431"} err="failed to get container status \"f7bcc13a9c526e321fd71db668e218f56c65bd0ffeb14b051807553b334da431\": rpc error: code = NotFound desc = could not find container \"f7bcc13a9c526e321fd71db668e218f56c65bd0ffeb14b051807553b334da431\": container with ID starting with f7bcc13a9c526e321fd71db668e218f56c65bd0ffeb14b051807553b334da431 not found: ID does not exist"
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.571539 4958 scope.go:117] "RemoveContainer" containerID="c868d2fad1c3f57fb34322e047e18b91805fe8b1a46c9071328873d150fd3d0a"
Dec 01 11:32:07 crc kubenswrapper[4958]: E1201 11:32:07.571860 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c868d2fad1c3f57fb34322e047e18b91805fe8b1a46c9071328873d150fd3d0a\": container with ID starting with c868d2fad1c3f57fb34322e047e18b91805fe8b1a46c9071328873d150fd3d0a not found: ID does not exist" containerID="c868d2fad1c3f57fb34322e047e18b91805fe8b1a46c9071328873d150fd3d0a"
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.571901 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c868d2fad1c3f57fb34322e047e18b91805fe8b1a46c9071328873d150fd3d0a"} err="failed to get container status \"c868d2fad1c3f57fb34322e047e18b91805fe8b1a46c9071328873d150fd3d0a\": rpc error: code = NotFound desc = could not find container \"c868d2fad1c3f57fb34322e047e18b91805fe8b1a46c9071328873d150fd3d0a\": container with ID starting with c868d2fad1c3f57fb34322e047e18b91805fe8b1a46c9071328873d150fd3d0a not found: ID does not exist"
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.571923 4958 scope.go:117] "RemoveContainer" containerID="2d6e467a8a84dccbff70e0f68ac83c9fb5c755c596c7bf346c646eddbb5e29b4"
Dec 01 11:32:07 crc kubenswrapper[4958]: E1201 11:32:07.621366 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d6e467a8a84dccbff70e0f68ac83c9fb5c755c596c7bf346c646eddbb5e29b4\": container with ID starting with 2d6e467a8a84dccbff70e0f68ac83c9fb5c755c596c7bf346c646eddbb5e29b4 not found: ID does not exist" containerID="2d6e467a8a84dccbff70e0f68ac83c9fb5c755c596c7bf346c646eddbb5e29b4"
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.621408 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d6e467a8a84dccbff70e0f68ac83c9fb5c755c596c7bf346c646eddbb5e29b4"} err="failed to get container status \"2d6e467a8a84dccbff70e0f68ac83c9fb5c755c596c7bf346c646eddbb5e29b4\": rpc error: code = NotFound desc = could not find container \"2d6e467a8a84dccbff70e0f68ac83c9fb5c755c596c7bf346c646eddbb5e29b4\": container with ID starting with 2d6e467a8a84dccbff70e0f68ac83c9fb5c755c596c7bf346c646eddbb5e29b4 not found: ID does not exist"
Dec 01 11:32:07 crc kubenswrapper[4958]: I1201 11:32:07.811498 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33bef286-833f-41a6-9770-eec44ebfabd5" path="/var/lib/kubelet/pods/33bef286-833f-41a6-9770-eec44ebfabd5/volumes"
Dec 01 11:32:08 crc kubenswrapper[4958]: I1201 11:32:08.505160 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"ef1fedf1-c218-452e-83b7-f8eb03827765","Type":"ContainerStarted","Data":"b5ab2cf0b330215648c8e5778f6b7788e2ed8c440232a7d836b821c697f5bf11"}
Dec 01 11:32:08 crc kubenswrapper[4958]: I1201 11:32:08.505214 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"ef1fedf1-c218-452e-83b7-f8eb03827765","Type":"ContainerStarted","Data":"f824534decc60ffb8b1d04b9169f5ff66be47afe425a1fae2e1c92dc671c66d5"}
Dec 01 11:32:08 crc kubenswrapper[4958]: I1201 11:32:08.527695 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.527669709 podStartE2EDuration="3.527669709s" podCreationTimestamp="2025-12-01 11:32:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:32:08.52203474 +0000 UTC m=+5576.030823787" watchObservedRunningTime="2025-12-01 11:32:08.527669709 +0000 UTC m=+5576.036458746"
Dec 01 11:32:11 crc kubenswrapper[4958]: I1201 11:32:11.514457 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Dec 01 11:32:11 crc kubenswrapper[4958]: E1201 11:32:11.515676 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33bef286-833f-41a6-9770-eec44ebfabd5" containerName="extract-content"
Dec 01 11:32:11 crc kubenswrapper[4958]: I1201 11:32:11.515703 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="33bef286-833f-41a6-9770-eec44ebfabd5" containerName="extract-content"
Dec 01 11:32:11 crc kubenswrapper[4958]: E1201 11:32:11.515734 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33bef286-833f-41a6-9770-eec44ebfabd5" containerName="extract-utilities"
Dec 01 11:32:11 crc kubenswrapper[4958]: I1201 11:32:11.515747 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="33bef286-833f-41a6-9770-eec44ebfabd5" containerName="extract-utilities"
Dec 01 11:32:11 crc kubenswrapper[4958]: E1201 11:32:11.515769 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33bef286-833f-41a6-9770-eec44ebfabd5" containerName="registry-server"
Dec 01 11:32:11 crc kubenswrapper[4958]: I1201 11:32:11.515781 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="33bef286-833f-41a6-9770-eec44ebfabd5" containerName="registry-server"
Dec 01 11:32:11 crc kubenswrapper[4958]: I1201 11:32:11.516068 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="33bef286-833f-41a6-9770-eec44ebfabd5" containerName="registry-server"
Dec 01 11:32:11 crc kubenswrapper[4958]: I1201 11:32:11.516908 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 01 11:32:11 crc kubenswrapper[4958]: I1201 11:32:11.537040 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Dec 01 11:32:11 crc kubenswrapper[4958]: I1201 11:32:11.620193 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w95zp\" (UniqueName: \"kubernetes.io/projected/a9663e90-82ad-4d9e-8e46-69a7feaf9e58-kube-api-access-w95zp\") pod \"mariadb-client\" (UID: \"a9663e90-82ad-4d9e-8e46-69a7feaf9e58\") " pod="openstack/mariadb-client"
Dec 01 11:32:11 crc kubenswrapper[4958]: I1201 11:32:11.722208 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w95zp\" (UniqueName: \"kubernetes.io/projected/a9663e90-82ad-4d9e-8e46-69a7feaf9e58-kube-api-access-w95zp\") pod \"mariadb-client\" (UID: \"a9663e90-82ad-4d9e-8e46-69a7feaf9e58\") " pod="openstack/mariadb-client"
Dec 01 11:32:11 crc kubenswrapper[4958]: I1201 11:32:11.757250 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w95zp\" (UniqueName: \"kubernetes.io/projected/a9663e90-82ad-4d9e-8e46-69a7feaf9e58-kube-api-access-w95zp\") pod \"mariadb-client\" (UID: \"a9663e90-82ad-4d9e-8e46-69a7feaf9e58\") " pod="openstack/mariadb-client"
Dec 01 11:32:11 crc kubenswrapper[4958]: I1201 11:32:11.844648 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 01 11:32:12 crc kubenswrapper[4958]: I1201 11:32:12.390729 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Dec 01 11:32:12 crc kubenswrapper[4958]: W1201 11:32:12.391378 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9663e90_82ad_4d9e_8e46_69a7feaf9e58.slice/crio-58cec4fcd55de26fb83745e2603caa4e862d6c6894474f1abd3a6a5f5126c240 WatchSource:0}: Error finding container 58cec4fcd55de26fb83745e2603caa4e862d6c6894474f1abd3a6a5f5126c240: Status 404 returned error can't find the container with id 58cec4fcd55de26fb83745e2603caa4e862d6c6894474f1abd3a6a5f5126c240
Dec 01 11:32:12 crc kubenswrapper[4958]: I1201 11:32:12.565211 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"a9663e90-82ad-4d9e-8e46-69a7feaf9e58","Type":"ContainerStarted","Data":"58cec4fcd55de26fb83745e2603caa4e862d6c6894474f1abd3a6a5f5126c240"}
Dec 01 11:32:13 crc kubenswrapper[4958]: I1201 11:32:13.578417 4958 generic.go:334] "Generic (PLEG): container finished" podID="a9663e90-82ad-4d9e-8e46-69a7feaf9e58" containerID="cb5803023318113065b29f3d87e23b5836b4ab13d6acd9e4275542212a1e5e8b" exitCode=0
Dec 01 11:32:13 crc kubenswrapper[4958]: I1201 11:32:13.578684 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"a9663e90-82ad-4d9e-8e46-69a7feaf9e58","Type":"ContainerDied","Data":"cb5803023318113065b29f3d87e23b5836b4ab13d6acd9e4275542212a1e5e8b"}
Dec 01 11:32:14 crc kubenswrapper[4958]: I1201 11:32:14.969771 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 01 11:32:14 crc kubenswrapper[4958]: I1201 11:32:14.993769 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_a9663e90-82ad-4d9e-8e46-69a7feaf9e58/mariadb-client/0.log"
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.032224 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.045068 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.081718 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w95zp\" (UniqueName: \"kubernetes.io/projected/a9663e90-82ad-4d9e-8e46-69a7feaf9e58-kube-api-access-w95zp\") pod \"a9663e90-82ad-4d9e-8e46-69a7feaf9e58\" (UID: \"a9663e90-82ad-4d9e-8e46-69a7feaf9e58\") "
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.089416 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9663e90-82ad-4d9e-8e46-69a7feaf9e58-kube-api-access-w95zp" (OuterVolumeSpecName: "kube-api-access-w95zp") pod "a9663e90-82ad-4d9e-8e46-69a7feaf9e58" (UID: "a9663e90-82ad-4d9e-8e46-69a7feaf9e58"). InnerVolumeSpecName "kube-api-access-w95zp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.177370 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Dec 01 11:32:15 crc kubenswrapper[4958]: E1201 11:32:15.178054 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9663e90-82ad-4d9e-8e46-69a7feaf9e58" containerName="mariadb-client"
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.178086 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9663e90-82ad-4d9e-8e46-69a7feaf9e58" containerName="mariadb-client"
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.178383 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9663e90-82ad-4d9e-8e46-69a7feaf9e58" containerName="mariadb-client"
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.179388 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.184058 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w95zp\" (UniqueName: \"kubernetes.io/projected/a9663e90-82ad-4d9e-8e46-69a7feaf9e58-kube-api-access-w95zp\") on node \"crc\" DevicePath \"\""
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.189066 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.285942 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccgxk\" (UniqueName: \"kubernetes.io/projected/8d4a34b5-64b8-45e7-94c6-692cef0c80b1-kube-api-access-ccgxk\") pod \"mariadb-client\" (UID: \"8d4a34b5-64b8-45e7-94c6-692cef0c80b1\") " pod="openstack/mariadb-client"
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.387625 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccgxk\" (UniqueName: \"kubernetes.io/projected/8d4a34b5-64b8-45e7-94c6-692cef0c80b1-kube-api-access-ccgxk\") pod \"mariadb-client\" (UID: \"8d4a34b5-64b8-45e7-94c6-692cef0c80b1\") " pod="openstack/mariadb-client"
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.418638 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccgxk\" (UniqueName: \"kubernetes.io/projected/8d4a34b5-64b8-45e7-94c6-692cef0c80b1-kube-api-access-ccgxk\") pod \"mariadb-client\" (UID: \"8d4a34b5-64b8-45e7-94c6-692cef0c80b1\") " pod="openstack/mariadb-client"
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.507776 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.598963 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58cec4fcd55de26fb83745e2603caa4e862d6c6894474f1abd3a6a5f5126c240"
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.599026 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.628209 4958 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="a9663e90-82ad-4d9e-8e46-69a7feaf9e58" podUID="8d4a34b5-64b8-45e7-94c6-692cef0c80b1"
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.811352 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9663e90-82ad-4d9e-8e46-69a7feaf9e58" path="/var/lib/kubelet/pods/a9663e90-82ad-4d9e-8e46-69a7feaf9e58/volumes"
Dec 01 11:32:15 crc kubenswrapper[4958]: I1201 11:32:15.956797 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Dec 01 11:32:16 crc kubenswrapper[4958]: I1201 11:32:16.612433 4958 generic.go:334] "Generic (PLEG): container finished" podID="8d4a34b5-64b8-45e7-94c6-692cef0c80b1" containerID="75c76a144b891b4118525d9449f2506d05a5237b3e9f32370f9898a7852ff3f7" exitCode=0
Dec 01 11:32:16 crc kubenswrapper[4958]: I1201 11:32:16.612494 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"8d4a34b5-64b8-45e7-94c6-692cef0c80b1","Type":"ContainerDied","Data":"75c76a144b891b4118525d9449f2506d05a5237b3e9f32370f9898a7852ff3f7"}
Dec 01 11:32:16 crc kubenswrapper[4958]: I1201 11:32:16.612571 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"8d4a34b5-64b8-45e7-94c6-692cef0c80b1","Type":"ContainerStarted","Data":"9e5e9a9c44b172e613e124f7033657faab83670877bc9b822c7d46ff8cacb80a"}
Dec 01 11:32:17 crc kubenswrapper[4958]: I1201 11:32:17.961421 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 01 11:32:17 crc kubenswrapper[4958]: I1201 11:32:17.988708 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_8d4a34b5-64b8-45e7-94c6-692cef0c80b1/mariadb-client/0.log"
Dec 01 11:32:18 crc kubenswrapper[4958]: I1201 11:32:18.031631 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Dec 01 11:32:18 crc kubenswrapper[4958]: I1201 11:32:18.040896 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Dec 01 11:32:18 crc kubenswrapper[4958]: I1201 11:32:18.085551 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccgxk\" (UniqueName: \"kubernetes.io/projected/8d4a34b5-64b8-45e7-94c6-692cef0c80b1-kube-api-access-ccgxk\") pod \"8d4a34b5-64b8-45e7-94c6-692cef0c80b1\" (UID: \"8d4a34b5-64b8-45e7-94c6-692cef0c80b1\") "
Dec 01 11:32:18 crc kubenswrapper[4958]: I1201 11:32:18.093070 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d4a34b5-64b8-45e7-94c6-692cef0c80b1-kube-api-access-ccgxk" (OuterVolumeSpecName: "kube-api-access-ccgxk") pod "8d4a34b5-64b8-45e7-94c6-692cef0c80b1" (UID: "8d4a34b5-64b8-45e7-94c6-692cef0c80b1"). InnerVolumeSpecName "kube-api-access-ccgxk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:32:18 crc kubenswrapper[4958]: I1201 11:32:18.187699 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccgxk\" (UniqueName: \"kubernetes.io/projected/8d4a34b5-64b8-45e7-94c6-692cef0c80b1-kube-api-access-ccgxk\") on node \"crc\" DevicePath \"\""
Dec 01 11:32:18 crc kubenswrapper[4958]: I1201 11:32:18.640991 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e5e9a9c44b172e613e124f7033657faab83670877bc9b822c7d46ff8cacb80a"
Dec 01 11:32:18 crc kubenswrapper[4958]: I1201 11:32:18.641102 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Dec 01 11:32:19 crc kubenswrapper[4958]: I1201 11:32:19.815951 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d4a34b5-64b8-45e7-94c6-692cef0c80b1" path="/var/lib/kubelet/pods/8d4a34b5-64b8-45e7-94c6-692cef0c80b1/volumes"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.241334 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 01 11:32:50 crc kubenswrapper[4958]: E1201 11:32:50.242219 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d4a34b5-64b8-45e7-94c6-692cef0c80b1" containerName="mariadb-client"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.242239 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4a34b5-64b8-45e7-94c6-692cef0c80b1" containerName="mariadb-client"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.242424 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d4a34b5-64b8-45e7-94c6-692cef0c80b1" containerName="mariadb-client"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.243397 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.246913 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.247341 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.251213 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.286352 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-pt99s"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.296308 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"]
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.298079 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.306583 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"]
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.308620 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.314722 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.323314 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.357540 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5035735-49bb-48eb-b42c-6f51d31d98d0-config\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") " pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.357602 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5035735-49bb-48eb-b42c-6f51d31d98d0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") " pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.357637 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ae52c437-a55a-4d5a-9030-f28cd3803ec9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae52c437-a55a-4d5a-9030-f28cd3803ec9\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") " pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.357941 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5035735-49bb-48eb-b42c-6f51d31d98d0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") " pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.358026 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c5035735-49bb-48eb-b42c-6f51d31d98d0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") " pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.358054 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xczs\" (UniqueName: \"kubernetes.io/projected/c5035735-49bb-48eb-b42c-6f51d31d98d0-kube-api-access-8xczs\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") " pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.432255 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.434983 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.440165 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-mff6f"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.440182 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.440296 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.442832 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.456191 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"]
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.458010 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.459343 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28a9e245-2ba8-4e4a-bf85-a969c65b6556-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") " pod="openstack/ovsdbserver-nb-1"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.459383 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a9e245-2ba8-4e4a-bf85-a969c65b6556-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") " pod="openstack/ovsdbserver-nb-1"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.459419 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-948c7644-0575-455b-85d3-b8f7aecb84f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-948c7644-0575-455b-85d3-b8f7aecb84f1\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") " pod="openstack/ovsdbserver-nb-2"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.459456 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28a9e245-2ba8-4e4a-bf85-a969c65b6556-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") " pod="openstack/ovsdbserver-nb-1"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.459508 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f235039-02d7-440f-97a5-6c5d8a68089e-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") " pod="openstack/ovsdbserver-nb-2"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.459537 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f235039-02d7-440f-97a5-6c5d8a68089e-config\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") " pod="openstack/ovsdbserver-nb-2"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.459564 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-443c1ea3-5c58-4e6b-a4f9-5be77fd72056\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-443c1ea3-5c58-4e6b-a4f9-5be77fd72056\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") " pod="openstack/ovsdbserver-nb-1"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.459610 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c5035735-49bb-48eb-b42c-6f51d31d98d0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") " pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.459654 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m77k\" (UniqueName: \"kubernetes.io/projected/2f235039-02d7-440f-97a5-6c5d8a68089e-kube-api-access-2m77k\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") " pod="openstack/ovsdbserver-nb-2"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.459678 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f235039-02d7-440f-97a5-6c5d8a68089e-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") " pod="openstack/ovsdbserver-nb-2"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.459698 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xczs\" (UniqueName: \"kubernetes.io/projected/c5035735-49bb-48eb-b42c-6f51d31d98d0-kube-api-access-8xczs\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") " pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.459720 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28a9e245-2ba8-4e4a-bf85-a969c65b6556-config\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") " pod="openstack/ovsdbserver-nb-1"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.459738 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2f235039-02d7-440f-97a5-6c5d8a68089e-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") " pod="openstack/ovsdbserver-nb-2"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.459762 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5035735-49bb-48eb-b42c-6f51d31d98d0-config\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") " pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.459790 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5035735-49bb-48eb-b42c-6f51d31d98d0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") " pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.459809 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ae52c437-a55a-4d5a-9030-f28cd3803ec9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae52c437-a55a-4d5a-9030-f28cd3803ec9\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") " pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.459872 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcckg\" (UniqueName: \"kubernetes.io/projected/28a9e245-2ba8-4e4a-bf85-a969c65b6556-kube-api-access-zcckg\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") " pod="openstack/ovsdbserver-nb-1"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.459890 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5035735-49bb-48eb-b42c-6f51d31d98d0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") " pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.460370 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c5035735-49bb-48eb-b42c-6f51d31d98d0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") " pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.461360 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5035735-49bb-48eb-b42c-6f51d31d98d0-config\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") " pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.461772 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5035735-49bb-48eb-b42c-6f51d31d98d0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") " pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.467166 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"]
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.468704 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.472167 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5035735-49bb-48eb-b42c-6f51d31d98d0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") " pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.476729 4958 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.476773 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ae52c437-a55a-4d5a-9030-f28cd3803ec9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae52c437-a55a-4d5a-9030-f28cd3803ec9\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2dad74d0cffec6297cb684f1523c6015644486edc77821d7f1bb096bd19bb7da/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.489764 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xczs\" (UniqueName: \"kubernetes.io/projected/c5035735-49bb-48eb-b42c-6f51d31d98d0-kube-api-access-8xczs\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") " pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.493985 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.500319 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.520631 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ae52c437-a55a-4d5a-9030-f28cd3803ec9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ae52c437-a55a-4d5a-9030-f28cd3803ec9\") pod \"ovsdbserver-nb-0\" (UID: \"c5035735-49bb-48eb-b42c-6f51d31d98d0\") " pod="openstack/ovsdbserver-nb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.560802 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5a920026-ed51-4397-af95-8c6c279be61a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a920026-ed51-4397-af95-8c6c279be61a\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") " pod="openstack/ovsdbserver-sb-1"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.561082 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") " pod="openstack/ovsdbserver-sb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.561222 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4522d0c9-4a90-4d92-8860-cd95a158c3a6-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") " pod="openstack/ovsdbserver-sb-1"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.561335 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4522d0c9-4a90-4d92-8860-cd95a158c3a6-config\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") " pod="openstack/ovsdbserver-sb-1"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.561458 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-306afe6f-a70b-4c9d-be9d-386c87f58f59\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-306afe6f-a70b-4c9d-be9d-386c87f58f59\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") " pod="openstack/ovsdbserver-sb-2"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.561574 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fssw\" (UniqueName: \"kubernetes.io/projected/66f03825-e8ba-4c46-bd51-f84d32938fa9-kube-api-access-9fssw\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") " pod="openstack/ovsdbserver-sb-2"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.561704 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") " pod="openstack/ovsdbserver-sb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.561806 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f969edaf-2baf-49a0-8213-a27d66488121\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f969edaf-2baf-49a0-8213-a27d66488121\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") " pod="openstack/ovsdbserver-sb-0"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.561971 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcckg\" (UniqueName: \"kubernetes.io/projected/28a9e245-2ba8-4e4a-bf85-a969c65b6556-kube-api-access-zcckg\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") " pod="openstack/ovsdbserver-nb-1"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.562089 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66f03825-e8ba-4c46-bd51-f84d32938fa9-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") " pod="openstack/ovsdbserver-sb-2"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.562193 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28a9e245-2ba8-4e4a-bf85-a969c65b6556-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") " pod="openstack/ovsdbserver-nb-1"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.562293 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a9e245-2ba8-4e4a-bf85-a969c65b6556-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") " pod="openstack/ovsdbserver-nb-1"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.562409 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f03825-e8ba-4c46-bd51-f84d32938fa9-config\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") " pod="openstack/ovsdbserver-sb-2"
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.562545 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-948c7644-0575-455b-85d3-b8f7aecb84f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-948c7644-0575-455b-85d3-b8f7aecb84f1\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") " pod="openstack/ovsdbserver-nb-2"
Dec 01 11:32:50 crc
kubenswrapper[4958]: I1201 11:32:50.562652 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66f03825-e8ba-4c46-bd51-f84d32938fa9-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") " pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.562748 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28a9e245-2ba8-4e4a-bf85-a969c65b6556-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") " pod="openstack/ovsdbserver-nb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.562942 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4522d0c9-4a90-4d92-8860-cd95a158c3a6-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") " pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.563069 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") " pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.563168 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc-config\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") " pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.563284 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4522d0c9-4a90-4d92-8860-cd95a158c3a6-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") " pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.563401 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk7kh\" (UniqueName: \"kubernetes.io/projected/579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc-kube-api-access-dk7kh\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") " pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.563548 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f235039-02d7-440f-97a5-6c5d8a68089e-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") " pod="openstack/ovsdbserver-nb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.563677 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f235039-02d7-440f-97a5-6c5d8a68089e-config\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") " pod="openstack/ovsdbserver-nb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.563795 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-443c1ea3-5c58-4e6b-a4f9-5be77fd72056\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-443c1ea3-5c58-4e6b-a4f9-5be77fd72056\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") " pod="openstack/ovsdbserver-nb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.563946 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/66f03825-e8ba-4c46-bd51-f84d32938fa9-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") " pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.564063 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m77k\" (UniqueName: \"kubernetes.io/projected/2f235039-02d7-440f-97a5-6c5d8a68089e-kube-api-access-2m77k\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") " pod="openstack/ovsdbserver-nb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.564186 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f235039-02d7-440f-97a5-6c5d8a68089e-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") " pod="openstack/ovsdbserver-nb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.564306 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx7dj\" (UniqueName: \"kubernetes.io/projected/4522d0c9-4a90-4d92-8860-cd95a158c3a6-kube-api-access-hx7dj\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") " pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.564427 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28a9e245-2ba8-4e4a-bf85-a969c65b6556-config\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") " pod="openstack/ovsdbserver-nb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.564565 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2f235039-02d7-440f-97a5-6c5d8a68089e-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") " pod="openstack/ovsdbserver-nb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.563294 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/28a9e245-2ba8-4e4a-bf85-a969c65b6556-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") " pod="openstack/ovsdbserver-nb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.565106 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2f235039-02d7-440f-97a5-6c5d8a68089e-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") " pod="openstack/ovsdbserver-nb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.565528 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f235039-02d7-440f-97a5-6c5d8a68089e-config\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") " pod="openstack/ovsdbserver-nb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.565583 4958 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28a9e245-2ba8-4e4a-bf85-a969c65b6556-config\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") " pod="openstack/ovsdbserver-nb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.565673 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f235039-02d7-440f-97a5-6c5d8a68089e-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") " pod="openstack/ovsdbserver-nb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.567249 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28a9e245-2ba8-4e4a-bf85-a969c65b6556-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") " pod="openstack/ovsdbserver-nb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.567354 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f235039-02d7-440f-97a5-6c5d8a68089e-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") " pod="openstack/ovsdbserver-nb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.568494 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a9e245-2ba8-4e4a-bf85-a969c65b6556-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") " pod="openstack/ovsdbserver-nb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.568723 4958 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.568832 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-443c1ea3-5c58-4e6b-a4f9-5be77fd72056\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-443c1ea3-5c58-4e6b-a4f9-5be77fd72056\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3d46786ea2a786bee4a6afe89956d2057f99a1bd55df29c2ada6af9bd29e79a6/globalmount\"" pod="openstack/ovsdbserver-nb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.568753 4958 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.568926 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-948c7644-0575-455b-85d3-b8f7aecb84f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-948c7644-0575-455b-85d3-b8f7aecb84f1\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e6498f734a4ab0a8c6bee2dcd6f70c1ecb539c662bdf7b3cf2223a70da457da9/globalmount\"" pod="openstack/ovsdbserver-nb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.584581 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m77k\" (UniqueName: \"kubernetes.io/projected/2f235039-02d7-440f-97a5-6c5d8a68089e-kube-api-access-2m77k\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") " pod="openstack/ovsdbserver-nb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.584680 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcckg\" (UniqueName: \"kubernetes.io/projected/28a9e245-2ba8-4e4a-bf85-a969c65b6556-kube-api-access-zcckg\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") " pod="openstack/ovsdbserver-nb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.597469 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-948c7644-0575-455b-85d3-b8f7aecb84f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-948c7644-0575-455b-85d3-b8f7aecb84f1\") pod \"ovsdbserver-nb-2\" (UID: \"2f235039-02d7-440f-97a5-6c5d8a68089e\") " pod="openstack/ovsdbserver-nb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.598122 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.602074 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-443c1ea3-5c58-4e6b-a4f9-5be77fd72056\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-443c1ea3-5c58-4e6b-a4f9-5be77fd72056\") pod \"ovsdbserver-nb-1\" (UID: \"28a9e245-2ba8-4e4a-bf85-a969c65b6556\") " pod="openstack/ovsdbserver-nb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.633171 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.641326 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.670496 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66f03825-e8ba-4c46-bd51-f84d32938fa9-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") " pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.670608 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f03825-e8ba-4c46-bd51-f84d32938fa9-config\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") " pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.670663 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66f03825-e8ba-4c46-bd51-f84d32938fa9-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") " pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.670700 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4522d0c9-4a90-4d92-8860-cd95a158c3a6-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") " pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.670720 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") " pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.670744 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc-config\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") " pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.670792 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4522d0c9-4a90-4d92-8860-cd95a158c3a6-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") " pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.670863 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk7kh\" (UniqueName: \"kubernetes.io/projected/579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc-kube-api-access-dk7kh\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") " pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.670970 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/66f03825-e8ba-4c46-bd51-f84d32938fa9-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") " pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.671017 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx7dj\" (UniqueName: 
\"kubernetes.io/projected/4522d0c9-4a90-4d92-8860-cd95a158c3a6-kube-api-access-hx7dj\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") " pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.671106 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5a920026-ed51-4397-af95-8c6c279be61a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a920026-ed51-4397-af95-8c6c279be61a\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") " pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.671148 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") " pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.671201 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4522d0c9-4a90-4d92-8860-cd95a158c3a6-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") " pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.671220 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4522d0c9-4a90-4d92-8860-cd95a158c3a6-config\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") " pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.671246 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-306afe6f-a70b-4c9d-be9d-386c87f58f59\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-306afe6f-a70b-4c9d-be9d-386c87f58f59\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") " pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.671283 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fssw\" (UniqueName: \"kubernetes.io/projected/66f03825-e8ba-4c46-bd51-f84d32938fa9-kube-api-access-9fssw\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") " pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.671319 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") " pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.671359 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f969edaf-2baf-49a0-8213-a27d66488121\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f969edaf-2baf-49a0-8213-a27d66488121\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") " pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.671378 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") " 
pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.672030 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66f03825-e8ba-4c46-bd51-f84d32938fa9-config\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") " pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.673257 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4522d0c9-4a90-4d92-8860-cd95a158c3a6-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") " pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.674324 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4522d0c9-4a90-4d92-8860-cd95a158c3a6-config\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") " pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.675142 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/66f03825-e8ba-4c46-bd51-f84d32938fa9-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") " pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.675309 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") " pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.676223 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4522d0c9-4a90-4d92-8860-cd95a158c3a6-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") " pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.676273 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66f03825-e8ba-4c46-bd51-f84d32938fa9-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") " pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.676957 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc-config\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") " pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.677501 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66f03825-e8ba-4c46-bd51-f84d32938fa9-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") " pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.677954 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4522d0c9-4a90-4d92-8860-cd95a158c3a6-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") " pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:50 crc 
kubenswrapper[4958]: I1201 11:32:50.678403 4958 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.678451 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f969edaf-2baf-49a0-8213-a27d66488121\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f969edaf-2baf-49a0-8213-a27d66488121\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f3ff0aa1ee10d3d1f65f9134eda33eb0f347d0ebc1988c6af906dc73ecf69387/globalmount\"" pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.679244 4958 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.679270 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-306afe6f-a70b-4c9d-be9d-386c87f58f59\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-306afe6f-a70b-4c9d-be9d-386c87f58f59\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5acbb64160caae33c4aad59b16e2977b3931be991ef786803bf832bec06f7c34/globalmount\"" pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.681483 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") " pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.682015 4958 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.682046 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5a920026-ed51-4397-af95-8c6c279be61a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a920026-ed51-4397-af95-8c6c279be61a\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5e387c3069059a77df0ae3f19d58de51ced29270711baa7dd31f8c779cf3ba43/globalmount\"" pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.689196 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fssw\" (UniqueName: \"kubernetes.io/projected/66f03825-e8ba-4c46-bd51-f84d32938fa9-kube-api-access-9fssw\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") " pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.691602 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx7dj\" (UniqueName: \"kubernetes.io/projected/4522d0c9-4a90-4d92-8860-cd95a158c3a6-kube-api-access-hx7dj\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") " pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.697321 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk7kh\" (UniqueName: \"kubernetes.io/projected/579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc-kube-api-access-dk7kh\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") " pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.739800 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f969edaf-2baf-49a0-8213-a27d66488121\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f969edaf-2baf-49a0-8213-a27d66488121\") pod \"ovsdbserver-sb-0\" (UID: \"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc\") " pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.747271 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-306afe6f-a70b-4c9d-be9d-386c87f58f59\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-306afe6f-a70b-4c9d-be9d-386c87f58f59\") pod \"ovsdbserver-sb-2\" (UID: \"66f03825-e8ba-4c46-bd51-f84d32938fa9\") " pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.754451 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5a920026-ed51-4397-af95-8c6c279be61a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5a920026-ed51-4397-af95-8c6c279be61a\") pod \"ovsdbserver-sb-1\" (UID: \"4522d0c9-4a90-4d92-8860-cd95a158c3a6\") " pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.770310 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.843916 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:50 crc kubenswrapper[4958]: I1201 11:32:50.846290 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:51 crc kubenswrapper[4958]: I1201 11:32:51.238333 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 11:32:51 crc kubenswrapper[4958]: I1201 11:32:51.297245 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 01 11:32:51 crc kubenswrapper[4958]: W1201 11:32:51.306230 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f235039_02d7_440f_97a5_6c5d8a68089e.slice/crio-4f15ae37a36cadf6fa200a9ffb8a675127587d484d4a32281abb580441681483 WatchSource:0}: Error finding container 4f15ae37a36cadf6fa200a9ffb8a675127587d484d4a32281abb580441681483: Status 404 returned error can't find the container with id 4f15ae37a36cadf6fa200a9ffb8a675127587d484d4a32281abb580441681483 Dec 01 11:32:51 crc kubenswrapper[4958]: I1201 11:32:51.407467 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 01 11:32:51 crc kubenswrapper[4958]: I1201 11:32:51.493113 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 01 11:32:51 crc kubenswrapper[4958]: I1201 11:32:51.970995 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"28a9e245-2ba8-4e4a-bf85-a969c65b6556","Type":"ContainerStarted","Data":"4e8062a7fd1bf4c9ff91a2f19fc411b0a44a41e8363cb81b7af82acac38d6e72"} Dec 01 11:32:51 crc kubenswrapper[4958]: I1201 11:32:51.971386 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"28a9e245-2ba8-4e4a-bf85-a969c65b6556","Type":"ContainerStarted","Data":"a9a073b201b5375c03d41967054f71216d8341f12c2d96800d0f36bbf02eec39"} Dec 01 11:32:51 crc kubenswrapper[4958]: I1201 11:32:51.971435 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"28a9e245-2ba8-4e4a-bf85-a969c65b6556","Type":"ContainerStarted","Data":"ef01ed87068d6413fc0015df10ba4100651183e1df3d6d3fb7d1f60af9f22649"} Dec 01 11:32:51 crc kubenswrapper[4958]: I1201 11:32:51.973793 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"66f03825-e8ba-4c46-bd51-f84d32938fa9","Type":"ContainerStarted","Data":"a99e5325bf05c9027a2ae3f1b0624913ad724068b8584d05cefa62fe22c21841"} Dec 01 11:32:51 crc kubenswrapper[4958]: I1201 11:32:51.973828 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"66f03825-e8ba-4c46-bd51-f84d32938fa9","Type":"ContainerStarted","Data":"1f629b115de324c1024a31de488a03dadafa2196f1af5954a1cc0b6a6ac044b7"} Dec 01 11:32:51 crc kubenswrapper[4958]: I1201 11:32:51.973858 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"66f03825-e8ba-4c46-bd51-f84d32938fa9","Type":"ContainerStarted","Data":"7437c7b967de255b96402c40121563ea437c04d42c723c2193b84a1bdd1093f7"} Dec 01 11:32:51 crc kubenswrapper[4958]: I1201 11:32:51.977246 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c5035735-49bb-48eb-b42c-6f51d31d98d0","Type":"ContainerStarted","Data":"90497ae38a91e8870ab6441b1ba49e8c89787fad4c66b89e2f5aa4ba935107e2"} Dec 01 11:32:51 crc kubenswrapper[4958]: I1201 11:32:51.977291 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"c5035735-49bb-48eb-b42c-6f51d31d98d0","Type":"ContainerStarted","Data":"579a8bc96b84d0831a8ff597ead55495ce0a979ba88241b11f61e471a5382ba2"} Dec 01 11:32:51 crc kubenswrapper[4958]: I1201 11:32:51.977303 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c5035735-49bb-48eb-b42c-6f51d31d98d0","Type":"ContainerStarted","Data":"e9fc747a991d7d60a44c9f95674d97b98650924d5964a4f6dc09e779284961be"} Dec 01 11:32:51 crc kubenswrapper[4958]: I1201 11:32:51.982862 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"2f235039-02d7-440f-97a5-6c5d8a68089e","Type":"ContainerStarted","Data":"dc09020750b0ee32c8ed177723f7f08fce35ed1783851d7dfe0b868dda508010"} Dec 01 11:32:51 crc kubenswrapper[4958]: I1201 11:32:51.982900 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"2f235039-02d7-440f-97a5-6c5d8a68089e","Type":"ContainerStarted","Data":"0bae147c33fa1117a214d6bc7e8264a7d2fdd6b1d4b4ddc90219d422b2684998"} Dec 01 11:32:51 crc kubenswrapper[4958]: I1201 11:32:51.982912 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"2f235039-02d7-440f-97a5-6c5d8a68089e","Type":"ContainerStarted","Data":"4f15ae37a36cadf6fa200a9ffb8a675127587d484d4a32281abb580441681483"} Dec 01 11:32:52 crc kubenswrapper[4958]: I1201 11:32:52.004734 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.00471121 podStartE2EDuration="3.00471121s" podCreationTimestamp="2025-12-01 11:32:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:32:51.995452638 +0000 UTC m=+5619.504241705" watchObservedRunningTime="2025-12-01 11:32:52.00471121 +0000 UTC m=+5619.513500257" Dec 01 11:32:52 crc kubenswrapper[4958]: I1201 11:32:52.037155 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.037123218 podStartE2EDuration="3.037123218s" podCreationTimestamp="2025-12-01 11:32:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:32:52.01917294 +0000 UTC m=+5619.527961997" watchObservedRunningTime="2025-12-01 11:32:52.037123218 +0000 UTC m=+5619.545912265" Dec 01 11:32:52 crc kubenswrapper[4958]: W1201 11:32:52.050023 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod579c7d02_ce33_4b91_b42a_a5d6d8bdf3cc.slice/crio-ecac293f923c937c4e4bee8bfaabd5d445b9f943e0cf00327dd117d4069ed832 WatchSource:0}: Error finding container ecac293f923c937c4e4bee8bfaabd5d445b9f943e0cf00327dd117d4069ed832: Status 404 returned error can't find the container with id ecac293f923c937c4e4bee8bfaabd5d445b9f943e0cf00327dd117d4069ed832 Dec 01 11:32:52 crc kubenswrapper[4958]: I1201 11:32:52.054030 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 11:32:52 crc kubenswrapper[4958]: I1201 11:32:52.056717 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.056697083 podStartE2EDuration="3.056697083s" podCreationTimestamp="2025-12-01 11:32:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-12-01 11:32:52.049260182 +0000 UTC m=+5619.558049229" watchObservedRunningTime="2025-12-01 11:32:52.056697083 +0000 UTC m=+5619.565486120" Dec 01 11:32:52 crc kubenswrapper[4958]: I1201 11:32:52.069780 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.069748372 podStartE2EDuration="3.069748372s" podCreationTimestamp="2025-12-01 11:32:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:32:52.065516623 +0000 UTC m=+5619.574305660" watchObservedRunningTime="2025-12-01 11:32:52.069748372 +0000 UTC m=+5619.578537409" Dec 01 11:32:52 crc kubenswrapper[4958]: I1201 11:32:52.321972 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 01 11:32:52 crc kubenswrapper[4958]: W1201 11:32:52.326447 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4522d0c9_4a90_4d92_8860_cd95a158c3a6.slice/crio-9b3f6fef9c9f27ec7e61dc30b18a064749d96bf5e1b51a23ef9eaaa87bbf16f0 WatchSource:0}: Error finding container 9b3f6fef9c9f27ec7e61dc30b18a064749d96bf5e1b51a23ef9eaaa87bbf16f0: Status 404 returned error can't find the container with id 9b3f6fef9c9f27ec7e61dc30b18a064749d96bf5e1b51a23ef9eaaa87bbf16f0 Dec 01 11:32:52 crc kubenswrapper[4958]: I1201 11:32:52.996257 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc","Type":"ContainerStarted","Data":"2cf64b6644eaf685d7bf90ae7e618a4f210d495a5a617ff22d99b49be529941a"} Dec 01 11:32:52 crc kubenswrapper[4958]: I1201 11:32:52.997559 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc","Type":"ContainerStarted","Data":"07c1311e489982bea470475a9b5c69fe16b889082baa9b70c2dd58572275dd10"} Dec 01 11:32:52 crc kubenswrapper[4958]: I1201 11:32:52.997602 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc","Type":"ContainerStarted","Data":"ecac293f923c937c4e4bee8bfaabd5d445b9f943e0cf00327dd117d4069ed832"} Dec 01 11:32:53 crc kubenswrapper[4958]: I1201 11:32:52.999996 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"4522d0c9-4a90-4d92-8860-cd95a158c3a6","Type":"ContainerStarted","Data":"f76e1486c6192f3ebe5bab4a6dd54c71c89c6227dc74ff56913a6cfa6975c4e1"} Dec 01 11:32:53 crc kubenswrapper[4958]: I1201 11:32:53.000060 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"4522d0c9-4a90-4d92-8860-cd95a158c3a6","Type":"ContainerStarted","Data":"88baf6ed175465a62aa2eed705f6b0902650f74c65f98203564beca2dfc9b4c9"} Dec 01 11:32:53 crc kubenswrapper[4958]: I1201 11:32:53.000085 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"4522d0c9-4a90-4d92-8860-cd95a158c3a6","Type":"ContainerStarted","Data":"9b3f6fef9c9f27ec7e61dc30b18a064749d96bf5e1b51a23ef9eaaa87bbf16f0"} Dec 01 11:32:53 crc kubenswrapper[4958]: I1201 11:32:53.038179 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.038153227 podStartE2EDuration="4.038153227s" podCreationTimestamp="2025-12-01 11:32:49 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:32:53.024950063 +0000 UTC m=+5620.533739110" watchObservedRunningTime="2025-12-01 11:32:53.038153227 +0000 UTC m=+5620.546942284" Dec 01 11:32:53 crc kubenswrapper[4958]: I1201 11:32:53.060650 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.060623964 podStartE2EDuration="4.060623964s" podCreationTimestamp="2025-12-01 11:32:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:32:53.057274149 +0000 UTC m=+5620.566063216" watchObservedRunningTime="2025-12-01 11:32:53.060623964 +0000 UTC m=+5620.569413011" Dec 01 11:32:53 crc kubenswrapper[4958]: I1201 11:32:53.598271 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 01 11:32:53 crc kubenswrapper[4958]: I1201 11:32:53.633924 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Dec 01 11:32:53 crc kubenswrapper[4958]: I1201 11:32:53.642253 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Dec 01 11:32:53 crc kubenswrapper[4958]: I1201 11:32:53.771160 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:53 crc kubenswrapper[4958]: I1201 11:32:53.845514 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:53 crc kubenswrapper[4958]: I1201 11:32:53.846612 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:55 crc kubenswrapper[4958]: I1201 11:32:55.598252 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 01 11:32:55 crc kubenswrapper[4958]: I1201 11:32:55.637465 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Dec 01 11:32:55 crc kubenswrapper[4958]: I1201 11:32:55.642035 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Dec 01 11:32:55 crc kubenswrapper[4958]: I1201 11:32:55.770837 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:55 crc kubenswrapper[4958]: I1201 11:32:55.846897 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:55 crc kubenswrapper[4958]: I1201 11:32:55.847019 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:56 crc kubenswrapper[4958]: I1201 11:32:56.663525 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 01 11:32:56 crc kubenswrapper[4958]: I1201 11:32:56.717618 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Dec 01 11:32:56 crc kubenswrapper[4958]: I1201 11:32:56.729448 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Dec 01 11:32:56 crc kubenswrapper[4958]: I1201 11:32:56.752379 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 01 11:32:56 crc 
kubenswrapper[4958]: I1201 11:32:56.772991 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Dec 01 11:32:56 crc kubenswrapper[4958]: I1201 11:32:56.802198 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Dec 01 11:32:56 crc kubenswrapper[4958]: I1201 11:32:56.846402 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:56 crc kubenswrapper[4958]: I1201 11:32:56.907034 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:56 crc kubenswrapper[4958]: I1201 11:32:56.924784 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:56 crc kubenswrapper[4958]: I1201 11:32:56.970573 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.072884 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f7b485f7-6d6fw"] Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.085667 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.090036 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.091539 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f7b485f7-6d6fw"] Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.117016 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.125931 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.221706 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f05f2ec-b786-454e-86a5-506135244863-dns-svc\") pod \"dnsmasq-dns-6f7b485f7-6d6fw\" (UID: \"1f05f2ec-b786-454e-86a5-506135244863\") " pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.221795 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f05f2ec-b786-454e-86a5-506135244863-ovsdbserver-nb\") pod \"dnsmasq-dns-6f7b485f7-6d6fw\" (UID: \"1f05f2ec-b786-454e-86a5-506135244863\") " pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.221865 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cljr\" (UniqueName: \"kubernetes.io/projected/1f05f2ec-b786-454e-86a5-506135244863-kube-api-access-7cljr\") pod \"dnsmasq-dns-6f7b485f7-6d6fw\" (UID: \"1f05f2ec-b786-454e-86a5-506135244863\") " pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.221977 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f05f2ec-b786-454e-86a5-506135244863-config\") pod \"dnsmasq-dns-6f7b485f7-6d6fw\" (UID: 
\"1f05f2ec-b786-454e-86a5-506135244863\") " pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.324049 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f05f2ec-b786-454e-86a5-506135244863-dns-svc\") pod \"dnsmasq-dns-6f7b485f7-6d6fw\" (UID: \"1f05f2ec-b786-454e-86a5-506135244863\") " pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.324123 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f05f2ec-b786-454e-86a5-506135244863-ovsdbserver-nb\") pod \"dnsmasq-dns-6f7b485f7-6d6fw\" (UID: \"1f05f2ec-b786-454e-86a5-506135244863\") " pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.324153 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cljr\" (UniqueName: \"kubernetes.io/projected/1f05f2ec-b786-454e-86a5-506135244863-kube-api-access-7cljr\") pod \"dnsmasq-dns-6f7b485f7-6d6fw\" (UID: \"1f05f2ec-b786-454e-86a5-506135244863\") " pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.324229 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f05f2ec-b786-454e-86a5-506135244863-config\") pod \"dnsmasq-dns-6f7b485f7-6d6fw\" (UID: \"1f05f2ec-b786-454e-86a5-506135244863\") " pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.325240 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f05f2ec-b786-454e-86a5-506135244863-config\") pod \"dnsmasq-dns-6f7b485f7-6d6fw\" (UID: \"1f05f2ec-b786-454e-86a5-506135244863\") " pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.326074 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f05f2ec-b786-454e-86a5-506135244863-ovsdbserver-nb\") pod \"dnsmasq-dns-6f7b485f7-6d6fw\" (UID: \"1f05f2ec-b786-454e-86a5-506135244863\") " pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.326345 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f05f2ec-b786-454e-86a5-506135244863-dns-svc\") pod \"dnsmasq-dns-6f7b485f7-6d6fw\" (UID: \"1f05f2ec-b786-454e-86a5-506135244863\") " pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.353219 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cljr\" (UniqueName: \"kubernetes.io/projected/1f05f2ec-b786-454e-86a5-506135244863-kube-api-access-7cljr\") pod \"dnsmasq-dns-6f7b485f7-6d6fw\" (UID: \"1f05f2ec-b786-454e-86a5-506135244863\") " pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.403447 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f7b485f7-6d6fw"] Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.404651 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.434257 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ffb5cc57c-hwcl8"] Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.436110 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.439159 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.454437 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ffb5cc57c-hwcl8"] Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.533057 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srhj8\" (UniqueName: \"kubernetes.io/projected/06419480-e9d9-4e57-9080-1e5aac676f05-kube-api-access-srhj8\") pod \"dnsmasq-dns-7ffb5cc57c-hwcl8\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") " pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.533381 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-config\") pod \"dnsmasq-dns-7ffb5cc57c-hwcl8\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") " pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.533442 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-ovsdbserver-nb\") pod \"dnsmasq-dns-7ffb5cc57c-hwcl8\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") " pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.533486 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-dns-svc\") pod \"dnsmasq-dns-7ffb5cc57c-hwcl8\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") " pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.533539 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-ovsdbserver-sb\") pod \"dnsmasq-dns-7ffb5cc57c-hwcl8\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") " pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.634825 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-dns-svc\") pod \"dnsmasq-dns-7ffb5cc57c-hwcl8\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") " pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.634936 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-ovsdbserver-sb\") pod \"dnsmasq-dns-7ffb5cc57c-hwcl8\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") " pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:32:57 crc 
kubenswrapper[4958]: I1201 11:32:57.634992 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srhj8\" (UniqueName: \"kubernetes.io/projected/06419480-e9d9-4e57-9080-1e5aac676f05-kube-api-access-srhj8\") pod \"dnsmasq-dns-7ffb5cc57c-hwcl8\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") " pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.635011 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-config\") pod \"dnsmasq-dns-7ffb5cc57c-hwcl8\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") " pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.635054 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-ovsdbserver-nb\") pod \"dnsmasq-dns-7ffb5cc57c-hwcl8\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") " pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.636088 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-ovsdbserver-nb\") pod \"dnsmasq-dns-7ffb5cc57c-hwcl8\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") " pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.636586 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-config\") pod \"dnsmasq-dns-7ffb5cc57c-hwcl8\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") " pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.636748 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-ovsdbserver-sb\") pod \"dnsmasq-dns-7ffb5cc57c-hwcl8\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") " pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.637135 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-dns-svc\") pod \"dnsmasq-dns-7ffb5cc57c-hwcl8\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") " pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.657163 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srhj8\" (UniqueName: \"kubernetes.io/projected/06419480-e9d9-4e57-9080-1e5aac676f05-kube-api-access-srhj8\") pod \"dnsmasq-dns-7ffb5cc57c-hwcl8\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") " pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.806641 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:32:57 crc kubenswrapper[4958]: I1201 11:32:57.921781 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f7b485f7-6d6fw"] Dec 01 11:32:58 crc kubenswrapper[4958]: I1201 11:32:58.069167 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" event={"ID":"1f05f2ec-b786-454e-86a5-506135244863","Type":"ContainerStarted","Data":"c1ab37e0a6643aeea8801bb1dd89c4b62d1c0c88dc1d5e8ff2a35307c27e773d"} Dec 01 11:32:58 crc kubenswrapper[4958]: I1201 11:32:58.277984 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ffb5cc57c-hwcl8"] Dec 01 11:32:58 crc kubenswrapper[4958]: W1201 11:32:58.285958 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06419480_e9d9_4e57_9080_1e5aac676f05.slice/crio-7b9f5966a137e811bec3f77622c895f573b0c944b7662e8b6a08a2db43f32611 WatchSource:0}: Error finding container 7b9f5966a137e811bec3f77622c895f573b0c944b7662e8b6a08a2db43f32611: Status 404 returned error can't find the container with id 7b9f5966a137e811bec3f77622c895f573b0c944b7662e8b6a08a2db43f32611 Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.084201 4958 generic.go:334] "Generic (PLEG): container finished" podID="06419480-e9d9-4e57-9080-1e5aac676f05" containerID="f7933375cb2deb864ff1237210ccedec5cfe83c179e088c76cc0134d5d379ceb" exitCode=0 Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.084313 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" event={"ID":"06419480-e9d9-4e57-9080-1e5aac676f05","Type":"ContainerDied","Data":"f7933375cb2deb864ff1237210ccedec5cfe83c179e088c76cc0134d5d379ceb"} Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.084559 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" event={"ID":"06419480-e9d9-4e57-9080-1e5aac676f05","Type":"ContainerStarted","Data":"7b9f5966a137e811bec3f77622c895f573b0c944b7662e8b6a08a2db43f32611"} Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.096032 4958 generic.go:334] "Generic (PLEG): container finished" podID="1f05f2ec-b786-454e-86a5-506135244863" containerID="991eb900a75e3ca1fa821f408a647fd2405d5bda6a4409730cd01dbbb228b4ab" exitCode=0 Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.096090 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" event={"ID":"1f05f2ec-b786-454e-86a5-506135244863","Type":"ContainerDied","Data":"991eb900a75e3ca1fa821f408a647fd2405d5bda6a4409730cd01dbbb228b4ab"} Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.480138 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.566629 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f05f2ec-b786-454e-86a5-506135244863-dns-svc\") pod \"1f05f2ec-b786-454e-86a5-506135244863\" (UID: \"1f05f2ec-b786-454e-86a5-506135244863\") " Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.567039 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f05f2ec-b786-454e-86a5-506135244863-config\") pod \"1f05f2ec-b786-454e-86a5-506135244863\" (UID: \"1f05f2ec-b786-454e-86a5-506135244863\") " Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.567195 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cljr\" (UniqueName: \"kubernetes.io/projected/1f05f2ec-b786-454e-86a5-506135244863-kube-api-access-7cljr\") pod \"1f05f2ec-b786-454e-86a5-506135244863\" (UID: \"1f05f2ec-b786-454e-86a5-506135244863\") " Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.567332 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f05f2ec-b786-454e-86a5-506135244863-ovsdbserver-nb\") pod \"1f05f2ec-b786-454e-86a5-506135244863\" (UID: \"1f05f2ec-b786-454e-86a5-506135244863\") " Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.582419 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f05f2ec-b786-454e-86a5-506135244863-kube-api-access-7cljr" (OuterVolumeSpecName: "kube-api-access-7cljr") pod "1f05f2ec-b786-454e-86a5-506135244863" (UID: "1f05f2ec-b786-454e-86a5-506135244863"). InnerVolumeSpecName "kube-api-access-7cljr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.591902 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f05f2ec-b786-454e-86a5-506135244863-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1f05f2ec-b786-454e-86a5-506135244863" (UID: "1f05f2ec-b786-454e-86a5-506135244863"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.595151 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f05f2ec-b786-454e-86a5-506135244863-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1f05f2ec-b786-454e-86a5-506135244863" (UID: "1f05f2ec-b786-454e-86a5-506135244863"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.606535 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f05f2ec-b786-454e-86a5-506135244863-config" (OuterVolumeSpecName: "config") pod "1f05f2ec-b786-454e-86a5-506135244863" (UID: "1f05f2ec-b786-454e-86a5-506135244863"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.672475 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f05f2ec-b786-454e-86a5-506135244863-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.672510 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f05f2ec-b786-454e-86a5-506135244863-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.672520 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f05f2ec-b786-454e-86a5-506135244863-config\") on node \"crc\" DevicePath \"\"" Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.672530 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cljr\" (UniqueName: \"kubernetes.io/projected/1f05f2ec-b786-454e-86a5-506135244863-kube-api-access-7cljr\") on node \"crc\" DevicePath \"\"" Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.831319 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Dec 01 11:32:59 crc kubenswrapper[4958]: E1201 11:32:59.831863 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f05f2ec-b786-454e-86a5-506135244863" containerName="init" Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.831890 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f05f2ec-b786-454e-86a5-506135244863" containerName="init" Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.832134 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f05f2ec-b786-454e-86a5-506135244863" containerName="init" Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.833173 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.838140 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.861283 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.980765 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpz22\" (UniqueName: \"kubernetes.io/projected/56957e12-7edf-40ac-accd-bb5f1997e0ab-kube-api-access-lpz22\") pod \"ovn-copy-data\" (UID: \"56957e12-7edf-40ac-accd-bb5f1997e0ab\") " pod="openstack/ovn-copy-data" Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.981348 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5369ab31-22fb-4d2a-9544-e615f94c7a14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5369ab31-22fb-4d2a-9544-e615f94c7a14\") pod \"ovn-copy-data\" (UID: \"56957e12-7edf-40ac-accd-bb5f1997e0ab\") " pod="openstack/ovn-copy-data" Dec 01 11:32:59 crc kubenswrapper[4958]: I1201 11:32:59.981622 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/56957e12-7edf-40ac-accd-bb5f1997e0ab-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"56957e12-7edf-40ac-accd-bb5f1997e0ab\") " pod="openstack/ovn-copy-data" Dec 01 11:33:00 crc kubenswrapper[4958]: I1201 11:33:00.082735 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5369ab31-22fb-4d2a-9544-e615f94c7a14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5369ab31-22fb-4d2a-9544-e615f94c7a14\") pod \"ovn-copy-data\" (UID: \"56957e12-7edf-40ac-accd-bb5f1997e0ab\") " pod="openstack/ovn-copy-data" Dec 01 11:33:00 crc kubenswrapper[4958]: I1201 11:33:00.082887 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/56957e12-7edf-40ac-accd-bb5f1997e0ab-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"56957e12-7edf-40ac-accd-bb5f1997e0ab\") " pod="openstack/ovn-copy-data" Dec 01 11:33:00 crc kubenswrapper[4958]: I1201 11:33:00.083013 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpz22\" (UniqueName: \"kubernetes.io/projected/56957e12-7edf-40ac-accd-bb5f1997e0ab-kube-api-access-lpz22\") pod \"ovn-copy-data\" (UID: \"56957e12-7edf-40ac-accd-bb5f1997e0ab\") " pod="openstack/ovn-copy-data" Dec 01 11:33:00 crc kubenswrapper[4958]: I1201 11:33:00.087153 4958 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 01 11:33:00 crc kubenswrapper[4958]: I1201 11:33:00.087222 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5369ab31-22fb-4d2a-9544-e615f94c7a14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5369ab31-22fb-4d2a-9544-e615f94c7a14\") pod \"ovn-copy-data\" (UID: \"56957e12-7edf-40ac-accd-bb5f1997e0ab\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6570a7a4703af0e6402c1f38c7d2aa26d99774af8bf25ed8c92c2295eda70451/globalmount\"" pod="openstack/ovn-copy-data" Dec 01 11:33:00 crc kubenswrapper[4958]: I1201 11:33:00.091768 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/56957e12-7edf-40ac-accd-bb5f1997e0ab-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"56957e12-7edf-40ac-accd-bb5f1997e0ab\") " pod="openstack/ovn-copy-data" Dec 01 11:33:00 crc kubenswrapper[4958]: I1201 11:33:00.114188 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" event={"ID":"1f05f2ec-b786-454e-86a5-506135244863","Type":"ContainerDied","Data":"c1ab37e0a6643aeea8801bb1dd89c4b62d1c0c88dc1d5e8ff2a35307c27e773d"} Dec 01 11:33:00 crc kubenswrapper[4958]: I1201 11:33:00.114256 4958 scope.go:117] "RemoveContainer" containerID="991eb900a75e3ca1fa821f408a647fd2405d5bda6a4409730cd01dbbb228b4ab" Dec 01 11:33:00 crc kubenswrapper[4958]: I1201 11:33:00.114752 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f7b485f7-6d6fw" Dec 01 11:33:00 crc kubenswrapper[4958]: I1201 11:33:00.116581 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpz22\" (UniqueName: \"kubernetes.io/projected/56957e12-7edf-40ac-accd-bb5f1997e0ab-kube-api-access-lpz22\") pod \"ovn-copy-data\" (UID: \"56957e12-7edf-40ac-accd-bb5f1997e0ab\") " pod="openstack/ovn-copy-data" Dec 01 11:33:00 crc kubenswrapper[4958]: I1201 11:33:00.119115 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" event={"ID":"06419480-e9d9-4e57-9080-1e5aac676f05","Type":"ContainerStarted","Data":"a16bfb47a667a399c19240d153c6612fbb7212b04ca237dc20c07a5d72ff02ba"} Dec 01 11:33:00 crc kubenswrapper[4958]: I1201 11:33:00.119723 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:33:00 crc kubenswrapper[4958]: I1201 11:33:00.136911 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5369ab31-22fb-4d2a-9544-e615f94c7a14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5369ab31-22fb-4d2a-9544-e615f94c7a14\") pod \"ovn-copy-data\" (UID: \"56957e12-7edf-40ac-accd-bb5f1997e0ab\") " pod="openstack/ovn-copy-data" Dec 01 11:33:00 crc kubenswrapper[4958]: I1201 11:33:00.157891 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" podStartSLOduration=3.157872346 podStartE2EDuration="3.157872346s" podCreationTimestamp="2025-12-01 11:32:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:33:00.145936048 +0000 UTC m=+5627.654725095" watchObservedRunningTime="2025-12-01 11:33:00.157872346 +0000 UTC m=+5627.666661383" Dec 01 11:33:00 crc kubenswrapper[4958]: I1201 11:33:00.159750 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 01 11:33:00 crc kubenswrapper[4958]: I1201 11:33:00.207643 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f7b485f7-6d6fw"] Dec 01 11:33:00 crc kubenswrapper[4958]: I1201 11:33:00.216176 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f7b485f7-6d6fw"] Dec 01 11:33:00 crc kubenswrapper[4958]: I1201 11:33:00.791251 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 01 11:33:01 crc kubenswrapper[4958]: I1201 11:33:01.136063 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"56957e12-7edf-40ac-accd-bb5f1997e0ab","Type":"ContainerStarted","Data":"52d7b0bd5baa41e9b5ec3be62efa178bcd91afe6898bc2ba32e70d97a6ec92eb"} Dec 01 11:33:01 crc kubenswrapper[4958]: I1201 11:33:01.136138 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"56957e12-7edf-40ac-accd-bb5f1997e0ab","Type":"ContainerStarted","Data":"7aa8b780f15d705b52ea62050fd99bfc8e342daaed08b90dcc7792661aaec969"} Dec 01 11:33:01 crc kubenswrapper[4958]: I1201 11:33:01.154706 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.154684645 podStartE2EDuration="3.154684645s" podCreationTimestamp="2025-12-01 11:32:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:33:01.152776841 +0000 UTC m=+5628.661565878" watchObservedRunningTime="2025-12-01 11:33:01.154684645 +0000 UTC m=+5628.663473682" Dec 01 11:33:01 crc kubenswrapper[4958]: I1201 11:33:01.817738 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f05f2ec-b786-454e-86a5-506135244863" path="/var/lib/kubelet/pods/1f05f2ec-b786-454e-86a5-506135244863/volumes" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.369888 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.374638 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.378493 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-8ll2w" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.379571 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.379742 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.390398 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.558243 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/644cabd8-e449-4d33-96ea-8d76b59b0157-config\") pod \"ovn-northd-0\" (UID: \"644cabd8-e449-4d33-96ea-8d76b59b0157\") " pod="openstack/ovn-northd-0" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.558550 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/644cabd8-e449-4d33-96ea-8d76b59b0157-scripts\") pod \"ovn-northd-0\" (UID: \"644cabd8-e449-4d33-96ea-8d76b59b0157\") " pod="openstack/ovn-northd-0" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.558642 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/644cabd8-e449-4d33-96ea-8d76b59b0157-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"644cabd8-e449-4d33-96ea-8d76b59b0157\") " pod="openstack/ovn-northd-0" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.558689 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrlg6\" (UniqueName: \"kubernetes.io/projected/644cabd8-e449-4d33-96ea-8d76b59b0157-kube-api-access-qrlg6\") pod \"ovn-northd-0\" (UID: \"644cabd8-e449-4d33-96ea-8d76b59b0157\") " pod="openstack/ovn-northd-0" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.558734 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644cabd8-e449-4d33-96ea-8d76b59b0157-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"644cabd8-e449-4d33-96ea-8d76b59b0157\") " pod="openstack/ovn-northd-0" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.660221 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/644cabd8-e449-4d33-96ea-8d76b59b0157-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"644cabd8-e449-4d33-96ea-8d76b59b0157\") " pod="openstack/ovn-northd-0" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.660307 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrlg6\" (UniqueName: \"kubernetes.io/projected/644cabd8-e449-4d33-96ea-8d76b59b0157-kube-api-access-qrlg6\") pod \"ovn-northd-0\" (UID: \"644cabd8-e449-4d33-96ea-8d76b59b0157\") " pod="openstack/ovn-northd-0" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.660367 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644cabd8-e449-4d33-96ea-8d76b59b0157-combined-ca-bundle\") 
pod \"ovn-northd-0\" (UID: \"644cabd8-e449-4d33-96ea-8d76b59b0157\") " pod="openstack/ovn-northd-0" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.660427 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/644cabd8-e449-4d33-96ea-8d76b59b0157-config\") pod \"ovn-northd-0\" (UID: \"644cabd8-e449-4d33-96ea-8d76b59b0157\") " pod="openstack/ovn-northd-0" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.660451 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/644cabd8-e449-4d33-96ea-8d76b59b0157-scripts\") pod \"ovn-northd-0\" (UID: \"644cabd8-e449-4d33-96ea-8d76b59b0157\") " pod="openstack/ovn-northd-0" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.661009 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/644cabd8-e449-4d33-96ea-8d76b59b0157-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"644cabd8-e449-4d33-96ea-8d76b59b0157\") " pod="openstack/ovn-northd-0" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.661516 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/644cabd8-e449-4d33-96ea-8d76b59b0157-scripts\") pod \"ovn-northd-0\" (UID: \"644cabd8-e449-4d33-96ea-8d76b59b0157\") " pod="openstack/ovn-northd-0" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.661823 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/644cabd8-e449-4d33-96ea-8d76b59b0157-config\") pod \"ovn-northd-0\" (UID: \"644cabd8-e449-4d33-96ea-8d76b59b0157\") " pod="openstack/ovn-northd-0" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.672786 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644cabd8-e449-4d33-96ea-8d76b59b0157-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"644cabd8-e449-4d33-96ea-8d76b59b0157\") " pod="openstack/ovn-northd-0" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.679938 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrlg6\" (UniqueName: \"kubernetes.io/projected/644cabd8-e449-4d33-96ea-8d76b59b0157-kube-api-access-qrlg6\") pod \"ovn-northd-0\" (UID: \"644cabd8-e449-4d33-96ea-8d76b59b0157\") " pod="openstack/ovn-northd-0" Dec 01 11:33:06 crc kubenswrapper[4958]: I1201 11:33:06.704854 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 11:33:07 crc kubenswrapper[4958]: I1201 11:33:07.116959 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 11:33:07 crc kubenswrapper[4958]: I1201 11:33:07.203889 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"644cabd8-e449-4d33-96ea-8d76b59b0157","Type":"ContainerStarted","Data":"44cd35337462700c2327c97008cb66c736bb1b0eafd5c4da8b130b853d2e4d4f"} Dec 01 11:33:07 crc kubenswrapper[4958]: I1201 11:33:07.808968 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" Dec 01 11:33:07 crc kubenswrapper[4958]: I1201 11:33:07.891705 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-pm49z"] Dec 01 11:33:07 crc kubenswrapper[4958]: I1201 11:33:07.892005 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z" podUID="1c4a10a2-79f1-40f5-9061-0c0d8626a150" containerName="dnsmasq-dns" containerID="cri-o://e0a76210329b60b9b44cfb4873ba156cecc26ae8030b934f643d8f34e9621ab7" gracePeriod=10 Dec 01 11:33:08 crc kubenswrapper[4958]: I1201 11:33:08.215566 4958 generic.go:334] "Generic (PLEG): container finished" podID="1c4a10a2-79f1-40f5-9061-0c0d8626a150" containerID="e0a76210329b60b9b44cfb4873ba156cecc26ae8030b934f643d8f34e9621ab7" exitCode=0 Dec 01 11:33:08 crc kubenswrapper[4958]: I1201 11:33:08.215662 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z" event={"ID":"1c4a10a2-79f1-40f5-9061-0c0d8626a150","Type":"ContainerDied","Data":"e0a76210329b60b9b44cfb4873ba156cecc26ae8030b934f643d8f34e9621ab7"} Dec 01 11:33:08 crc kubenswrapper[4958]: I1201 11:33:08.218188 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"644cabd8-e449-4d33-96ea-8d76b59b0157","Type":"ContainerStarted","Data":"fd9af51950847e47f2feaa831d7495cd0bdd0d049566fa5ba0046a7fdf228665"} Dec 01 11:33:08 crc kubenswrapper[4958]: I1201 11:33:08.218236 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"644cabd8-e449-4d33-96ea-8d76b59b0157","Type":"ContainerStarted","Data":"70c7acb66100a70140d8c42ba740dc4347536e6aa05790251b1faa42e7507d49"} Dec 01 11:33:08 crc kubenswrapper[4958]: I1201 11:33:08.219437 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 01 11:33:08 crc kubenswrapper[4958]: I1201 11:33:08.249564 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.249538561 podStartE2EDuration="2.249538561s" podCreationTimestamp="2025-12-01 11:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:33:08.23928428 +0000 UTC m=+5635.748073317" watchObservedRunningTime="2025-12-01 11:33:08.249538561 +0000 UTC m=+5635.758327598" Dec 01 11:33:08 crc kubenswrapper[4958]: I1201 11:33:08.359550 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z" Dec 01 11:33:08 crc kubenswrapper[4958]: I1201 11:33:08.496198 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c4a10a2-79f1-40f5-9061-0c0d8626a150-config\") pod \"1c4a10a2-79f1-40f5-9061-0c0d8626a150\" (UID: \"1c4a10a2-79f1-40f5-9061-0c0d8626a150\") " Dec 01 11:33:08 crc kubenswrapper[4958]: I1201 11:33:08.496272 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgq6m\" (UniqueName: \"kubernetes.io/projected/1c4a10a2-79f1-40f5-9061-0c0d8626a150-kube-api-access-jgq6m\") pod \"1c4a10a2-79f1-40f5-9061-0c0d8626a150\" (UID: \"1c4a10a2-79f1-40f5-9061-0c0d8626a150\") " Dec 01 11:33:08 crc kubenswrapper[4958]: I1201 11:33:08.496377 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c4a10a2-79f1-40f5-9061-0c0d8626a150-dns-svc\") pod \"1c4a10a2-79f1-40f5-9061-0c0d8626a150\" (UID: \"1c4a10a2-79f1-40f5-9061-0c0d8626a150\") " Dec 01 11:33:08 crc kubenswrapper[4958]: I1201 11:33:08.506251 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c4a10a2-79f1-40f5-9061-0c0d8626a150-kube-api-access-jgq6m" (OuterVolumeSpecName: "kube-api-access-jgq6m") pod "1c4a10a2-79f1-40f5-9061-0c0d8626a150" (UID: "1c4a10a2-79f1-40f5-9061-0c0d8626a150"). InnerVolumeSpecName "kube-api-access-jgq6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:33:08 crc kubenswrapper[4958]: I1201 11:33:08.546425 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c4a10a2-79f1-40f5-9061-0c0d8626a150-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1c4a10a2-79f1-40f5-9061-0c0d8626a150" (UID: "1c4a10a2-79f1-40f5-9061-0c0d8626a150"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:33:08 crc kubenswrapper[4958]: I1201 11:33:08.551686 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c4a10a2-79f1-40f5-9061-0c0d8626a150-config" (OuterVolumeSpecName: "config") pod "1c4a10a2-79f1-40f5-9061-0c0d8626a150" (UID: "1c4a10a2-79f1-40f5-9061-0c0d8626a150"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:33:08 crc kubenswrapper[4958]: I1201 11:33:08.598387 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c4a10a2-79f1-40f5-9061-0c0d8626a150-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 11:33:08 crc kubenswrapper[4958]: I1201 11:33:08.598415 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c4a10a2-79f1-40f5-9061-0c0d8626a150-config\") on node \"crc\" DevicePath \"\"" Dec 01 11:33:08 crc kubenswrapper[4958]: I1201 11:33:08.598440 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgq6m\" (UniqueName: \"kubernetes.io/projected/1c4a10a2-79f1-40f5-9061-0c0d8626a150-kube-api-access-jgq6m\") on node \"crc\" DevicePath \"\"" Dec 01 11:33:09 crc kubenswrapper[4958]: I1201 11:33:09.230143 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z" event={"ID":"1c4a10a2-79f1-40f5-9061-0c0d8626a150","Type":"ContainerDied","Data":"a9dbc4e4d00f537f935cd9eadd8d528422d5a4555a2b6023985043c7e4793682"} Dec 01 11:33:09 crc kubenswrapper[4958]: I1201 11:33:09.230197 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-pm49z" Dec 01 11:33:09 crc kubenswrapper[4958]: I1201 11:33:09.230246 4958 scope.go:117] "RemoveContainer" containerID="e0a76210329b60b9b44cfb4873ba156cecc26ae8030b934f643d8f34e9621ab7" Dec 01 11:33:09 crc kubenswrapper[4958]: I1201 11:33:09.263974 4958 scope.go:117] "RemoveContainer" containerID="da77a2be8453461a4e3955f3670ef87a6cbd8fee93e8cccf261f766b31c957b2" Dec 01 11:33:09 crc kubenswrapper[4958]: I1201 11:33:09.312643 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-pm49z"] Dec 01 11:33:09 crc kubenswrapper[4958]: I1201 11:33:09.324991 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-pm49z"] Dec 01 11:33:09 crc kubenswrapper[4958]: I1201 11:33:09.814066 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c4a10a2-79f1-40f5-9061-0c0d8626a150" path="/var/lib/kubelet/pods/1c4a10a2-79f1-40f5-9061-0c0d8626a150/volumes" Dec 01 11:33:11 crc kubenswrapper[4958]: I1201 11:33:11.378396 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-lz786"] Dec 01 11:33:11 crc kubenswrapper[4958]: E1201 11:33:11.379611 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c4a10a2-79f1-40f5-9061-0c0d8626a150" containerName="init" Dec 01 11:33:11 crc kubenswrapper[4958]: I1201 11:33:11.379710 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c4a10a2-79f1-40f5-9061-0c0d8626a150" containerName="init" Dec 01 11:33:11 crc kubenswrapper[4958]: E1201 11:33:11.379821 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c4a10a2-79f1-40f5-9061-0c0d8626a150" containerName="dnsmasq-dns" Dec 01 11:33:11 crc kubenswrapper[4958]: I1201 11:33:11.379931 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c4a10a2-79f1-40f5-9061-0c0d8626a150" containerName="dnsmasq-dns" Dec 01 11:33:11 crc kubenswrapper[4958]: I1201 11:33:11.380190 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c4a10a2-79f1-40f5-9061-0c0d8626a150" containerName="dnsmasq-dns" Dec 01 11:33:11 crc kubenswrapper[4958]: I1201 11:33:11.381037 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-lz786" Dec 01 11:33:11 crc kubenswrapper[4958]: I1201 11:33:11.390328 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-lz786"] Dec 01 11:33:11 crc kubenswrapper[4958]: I1201 11:33:11.558592 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwhvz\" (UniqueName: \"kubernetes.io/projected/c6bd4921-dbde-4dc2-9396-c9892bba83cc-kube-api-access-rwhvz\") pod \"keystone-db-create-lz786\" (UID: \"c6bd4921-dbde-4dc2-9396-c9892bba83cc\") " pod="openstack/keystone-db-create-lz786" Dec 01 11:33:11 crc kubenswrapper[4958]: I1201 11:33:11.660464 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwhvz\" (UniqueName: \"kubernetes.io/projected/c6bd4921-dbde-4dc2-9396-c9892bba83cc-kube-api-access-rwhvz\") pod \"keystone-db-create-lz786\" (UID: \"c6bd4921-dbde-4dc2-9396-c9892bba83cc\") " pod="openstack/keystone-db-create-lz786" Dec 01 11:33:11 crc kubenswrapper[4958]: I1201 11:33:11.680513 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwhvz\" (UniqueName: \"kubernetes.io/projected/c6bd4921-dbde-4dc2-9396-c9892bba83cc-kube-api-access-rwhvz\") pod \"keystone-db-create-lz786\" (UID: \"c6bd4921-dbde-4dc2-9396-c9892bba83cc\") " pod="openstack/keystone-db-create-lz786" Dec 01 11:33:11 crc kubenswrapper[4958]: I1201 11:33:11.762496 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-lz786" Dec 01 11:33:12 crc kubenswrapper[4958]: I1201 11:33:12.266885 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-lz786"] Dec 01 11:33:12 crc kubenswrapper[4958]: W1201 11:33:12.278438 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6bd4921_dbde_4dc2_9396_c9892bba83cc.slice/crio-708d28b38d0c5f33674f780aaa17ea65349b36771438bb3fbc1796333d01c8d6 WatchSource:0}: Error finding container 708d28b38d0c5f33674f780aaa17ea65349b36771438bb3fbc1796333d01c8d6: Status 404 returned error can't find the container with id 708d28b38d0c5f33674f780aaa17ea65349b36771438bb3fbc1796333d01c8d6 Dec 01 11:33:13 crc kubenswrapper[4958]: I1201 11:33:13.277615 4958 generic.go:334] "Generic (PLEG): container finished" podID="c6bd4921-dbde-4dc2-9396-c9892bba83cc" containerID="7832d518f10a8ce29b134e30b4e0bac59d73d3813683aa11fac3457f0effadaa" exitCode=0 Dec 01 11:33:13 crc kubenswrapper[4958]: I1201 11:33:13.277712 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lz786" event={"ID":"c6bd4921-dbde-4dc2-9396-c9892bba83cc","Type":"ContainerDied","Data":"7832d518f10a8ce29b134e30b4e0bac59d73d3813683aa11fac3457f0effadaa"} Dec 01 11:33:13 crc kubenswrapper[4958]: I1201 11:33:13.278167 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lz786" event={"ID":"c6bd4921-dbde-4dc2-9396-c9892bba83cc","Type":"ContainerStarted","Data":"708d28b38d0c5f33674f780aaa17ea65349b36771438bb3fbc1796333d01c8d6"} Dec 01 11:33:14 crc kubenswrapper[4958]: I1201 11:33:14.676156 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-lz786" Dec 01 11:33:14 crc kubenswrapper[4958]: I1201 11:33:14.843535 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwhvz\" (UniqueName: \"kubernetes.io/projected/c6bd4921-dbde-4dc2-9396-c9892bba83cc-kube-api-access-rwhvz\") pod \"c6bd4921-dbde-4dc2-9396-c9892bba83cc\" (UID: \"c6bd4921-dbde-4dc2-9396-c9892bba83cc\") " Dec 01 11:33:14 crc kubenswrapper[4958]: I1201 11:33:14.853459 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6bd4921-dbde-4dc2-9396-c9892bba83cc-kube-api-access-rwhvz" (OuterVolumeSpecName: "kube-api-access-rwhvz") pod "c6bd4921-dbde-4dc2-9396-c9892bba83cc" (UID: "c6bd4921-dbde-4dc2-9396-c9892bba83cc"). InnerVolumeSpecName "kube-api-access-rwhvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:33:14 crc kubenswrapper[4958]: I1201 11:33:14.946910 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwhvz\" (UniqueName: \"kubernetes.io/projected/c6bd4921-dbde-4dc2-9396-c9892bba83cc-kube-api-access-rwhvz\") on node \"crc\" DevicePath \"\"" Dec 01 11:33:15 crc kubenswrapper[4958]: I1201 11:33:15.305103 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lz786" event={"ID":"c6bd4921-dbde-4dc2-9396-c9892bba83cc","Type":"ContainerDied","Data":"708d28b38d0c5f33674f780aaa17ea65349b36771438bb3fbc1796333d01c8d6"} Dec 01 11:33:15 crc kubenswrapper[4958]: I1201 11:33:15.305165 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="708d28b38d0c5f33674f780aaa17ea65349b36771438bb3fbc1796333d01c8d6" Dec 01 11:33:15 crc kubenswrapper[4958]: I1201 11:33:15.305302 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-lz786" Dec 01 11:33:21 crc kubenswrapper[4958]: I1201 11:33:21.409072 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4ace-account-create-gpn2t"] Dec 01 11:33:21 crc kubenswrapper[4958]: E1201 11:33:21.410587 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6bd4921-dbde-4dc2-9396-c9892bba83cc" containerName="mariadb-database-create" Dec 01 11:33:21 crc kubenswrapper[4958]: I1201 11:33:21.410607 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6bd4921-dbde-4dc2-9396-c9892bba83cc" containerName="mariadb-database-create" Dec 01 11:33:21 crc kubenswrapper[4958]: I1201 11:33:21.411094 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6bd4921-dbde-4dc2-9396-c9892bba83cc" containerName="mariadb-database-create" Dec 01 11:33:21 crc kubenswrapper[4958]: I1201 11:33:21.411960 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4ace-account-create-gpn2t" Dec 01 11:33:21 crc kubenswrapper[4958]: I1201 11:33:21.417367 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4ace-account-create-gpn2t"] Dec 01 11:33:21 crc kubenswrapper[4958]: I1201 11:33:21.420379 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 01 11:33:21 crc kubenswrapper[4958]: I1201 11:33:21.484252 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbscp\" (UniqueName: \"kubernetes.io/projected/d392c2fc-42de-4446-98e7-9f6cd36b696d-kube-api-access-fbscp\") pod \"keystone-4ace-account-create-gpn2t\" (UID: \"d392c2fc-42de-4446-98e7-9f6cd36b696d\") " pod="openstack/keystone-4ace-account-create-gpn2t" Dec 01 11:33:21 crc kubenswrapper[4958]: I1201 11:33:21.586617 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbscp\" (UniqueName: \"kubernetes.io/projected/d392c2fc-42de-4446-98e7-9f6cd36b696d-kube-api-access-fbscp\") pod \"keystone-4ace-account-create-gpn2t\" (UID: \"d392c2fc-42de-4446-98e7-9f6cd36b696d\") " pod="openstack/keystone-4ace-account-create-gpn2t" Dec 01 11:33:21 crc kubenswrapper[4958]: I1201 11:33:21.623092 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbscp\" (UniqueName: \"kubernetes.io/projected/d392c2fc-42de-4446-98e7-9f6cd36b696d-kube-api-access-fbscp\") pod \"keystone-4ace-account-create-gpn2t\" (UID: \"d392c2fc-42de-4446-98e7-9f6cd36b696d\") " pod="openstack/keystone-4ace-account-create-gpn2t" Dec 01 11:33:21 crc kubenswrapper[4958]: I1201 11:33:21.743376 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4ace-account-create-gpn2t" Dec 01 11:33:21 crc kubenswrapper[4958]: I1201 11:33:21.811833 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 01 11:33:22 crc kubenswrapper[4958]: I1201 11:33:22.250536 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4ace-account-create-gpn2t"] Dec 01 11:33:22 crc kubenswrapper[4958]: W1201 11:33:22.263620 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd392c2fc_42de_4446_98e7_9f6cd36b696d.slice/crio-fc28fa921e8e2d7d6c441ef13837c0f38272032dd7f1fa03d715d5d2d11c5d9d WatchSource:0}: Error finding container fc28fa921e8e2d7d6c441ef13837c0f38272032dd7f1fa03d715d5d2d11c5d9d: Status 404 returned error can't find the container with id fc28fa921e8e2d7d6c441ef13837c0f38272032dd7f1fa03d715d5d2d11c5d9d Dec 01 11:33:22 crc kubenswrapper[4958]: I1201 11:33:22.399109 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4ace-account-create-gpn2t" event={"ID":"d392c2fc-42de-4446-98e7-9f6cd36b696d","Type":"ContainerStarted","Data":"fc28fa921e8e2d7d6c441ef13837c0f38272032dd7f1fa03d715d5d2d11c5d9d"} Dec 01 11:33:23 crc kubenswrapper[4958]: I1201 11:33:23.187555 4958 scope.go:117] "RemoveContainer" containerID="9e735b1d505d442db7fae37e4dc51b628aa167567ff830189b6cc0641f3acb9f" Dec 01 11:33:23 crc kubenswrapper[4958]: I1201 11:33:23.215814 4958 scope.go:117] "RemoveContainer" containerID="278e2d5adfc41e35aea2a997b38492f0d38081915dd7cf85526ef3b49063c666" Dec 01 11:33:23 crc kubenswrapper[4958]: I1201 11:33:23.408597 4958 generic.go:334] "Generic (PLEG): container finished" podID="d392c2fc-42de-4446-98e7-9f6cd36b696d" containerID="c8c7f98ebcd23e516dd019572137279d96813d84f44c473e20c231f4f1764345" exitCode=0 Dec 01 11:33:23 crc kubenswrapper[4958]: I1201 11:33:23.408660 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4ace-account-create-gpn2t" event={"ID":"d392c2fc-42de-4446-98e7-9f6cd36b696d","Type":"ContainerDied","Data":"c8c7f98ebcd23e516dd019572137279d96813d84f44c473e20c231f4f1764345"} Dec 01 11:33:24 crc kubenswrapper[4958]: I1201 11:33:24.850502 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4ace-account-create-gpn2t" Dec 01 11:33:24 crc kubenswrapper[4958]: I1201 11:33:24.981989 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbscp\" (UniqueName: \"kubernetes.io/projected/d392c2fc-42de-4446-98e7-9f6cd36b696d-kube-api-access-fbscp\") pod \"d392c2fc-42de-4446-98e7-9f6cd36b696d\" (UID: \"d392c2fc-42de-4446-98e7-9f6cd36b696d\") " Dec 01 11:33:24 crc kubenswrapper[4958]: I1201 11:33:24.988219 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d392c2fc-42de-4446-98e7-9f6cd36b696d-kube-api-access-fbscp" (OuterVolumeSpecName: "kube-api-access-fbscp") pod "d392c2fc-42de-4446-98e7-9f6cd36b696d" (UID: "d392c2fc-42de-4446-98e7-9f6cd36b696d"). InnerVolumeSpecName "kube-api-access-fbscp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:33:25 crc kubenswrapper[4958]: I1201 11:33:25.084907 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbscp\" (UniqueName: \"kubernetes.io/projected/d392c2fc-42de-4446-98e7-9f6cd36b696d-kube-api-access-fbscp\") on node \"crc\" DevicePath \"\"" Dec 01 11:33:25 crc kubenswrapper[4958]: I1201 11:33:25.447140 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4ace-account-create-gpn2t" event={"ID":"d392c2fc-42de-4446-98e7-9f6cd36b696d","Type":"ContainerDied","Data":"fc28fa921e8e2d7d6c441ef13837c0f38272032dd7f1fa03d715d5d2d11c5d9d"} Dec 01 11:33:25 crc kubenswrapper[4958]: I1201 11:33:25.447211 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc28fa921e8e2d7d6c441ef13837c0f38272032dd7f1fa03d715d5d2d11c5d9d" Dec 01 11:33:25 crc kubenswrapper[4958]: I1201 11:33:25.447248 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4ace-account-create-gpn2t" Dec 01 11:33:26 crc kubenswrapper[4958]: I1201 11:33:26.891163 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-7g42p"] Dec 01 11:33:26 crc kubenswrapper[4958]: E1201 11:33:26.891687 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d392c2fc-42de-4446-98e7-9f6cd36b696d" containerName="mariadb-account-create" Dec 01 11:33:26 crc kubenswrapper[4958]: I1201 11:33:26.891705 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d392c2fc-42de-4446-98e7-9f6cd36b696d" containerName="mariadb-account-create" Dec 01 11:33:26 crc kubenswrapper[4958]: I1201 11:33:26.892231 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d392c2fc-42de-4446-98e7-9f6cd36b696d" containerName="mariadb-account-create" Dec 01 11:33:26 crc kubenswrapper[4958]: I1201 11:33:26.893040 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-7g42p" Dec 01 11:33:26 crc kubenswrapper[4958]: I1201 11:33:26.904423 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 11:33:26 crc kubenswrapper[4958]: I1201 11:33:26.905028 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 11:33:26 crc kubenswrapper[4958]: I1201 11:33:26.905178 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qtvq7" Dec 01 11:33:26 crc kubenswrapper[4958]: I1201 11:33:26.905370 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 11:33:26 crc kubenswrapper[4958]: I1201 11:33:26.924241 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7g42p"] Dec 01 11:33:27 crc kubenswrapper[4958]: I1201 11:33:27.022958 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d28e16e-7406-4b48-be5b-8afb83ac7e5b-combined-ca-bundle\") pod \"keystone-db-sync-7g42p\" (UID: \"7d28e16e-7406-4b48-be5b-8afb83ac7e5b\") " pod="openstack/keystone-db-sync-7g42p" Dec 01 11:33:27 crc kubenswrapper[4958]: I1201 11:33:27.023040 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pqhx\" (UniqueName: \"kubernetes.io/projected/7d28e16e-7406-4b48-be5b-8afb83ac7e5b-kube-api-access-5pqhx\") pod \"keystone-db-sync-7g42p\" (UID: \"7d28e16e-7406-4b48-be5b-8afb83ac7e5b\") " pod="openstack/keystone-db-sync-7g42p" Dec 01 11:33:27 crc kubenswrapper[4958]: I1201 11:33:27.023548 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d28e16e-7406-4b48-be5b-8afb83ac7e5b-config-data\") pod \"keystone-db-sync-7g42p\" (UID: \"7d28e16e-7406-4b48-be5b-8afb83ac7e5b\") " pod="openstack/keystone-db-sync-7g42p" Dec 01 11:33:27 crc kubenswrapper[4958]: I1201 11:33:27.125717 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d28e16e-7406-4b48-be5b-8afb83ac7e5b-config-data\") pod \"keystone-db-sync-7g42p\" (UID: \"7d28e16e-7406-4b48-be5b-8afb83ac7e5b\") " pod="openstack/keystone-db-sync-7g42p" Dec 01 11:33:27 crc kubenswrapper[4958]: I1201 11:33:27.125835 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d28e16e-7406-4b48-be5b-8afb83ac7e5b-combined-ca-bundle\") pod \"keystone-db-sync-7g42p\" (UID: \"7d28e16e-7406-4b48-be5b-8afb83ac7e5b\") " pod="openstack/keystone-db-sync-7g42p" Dec 01 11:33:27 crc kubenswrapper[4958]: I1201 11:33:27.125926 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pqhx\" (UniqueName: \"kubernetes.io/projected/7d28e16e-7406-4b48-be5b-8afb83ac7e5b-kube-api-access-5pqhx\") pod \"keystone-db-sync-7g42p\" (UID: \"7d28e16e-7406-4b48-be5b-8afb83ac7e5b\") " pod="openstack/keystone-db-sync-7g42p" Dec 01 11:33:27 crc kubenswrapper[4958]: I1201 11:33:27.130437 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d28e16e-7406-4b48-be5b-8afb83ac7e5b-config-data\") pod \"keystone-db-sync-7g42p\" (UID: \"7d28e16e-7406-4b48-be5b-8afb83ac7e5b\") " 
pod="openstack/keystone-db-sync-7g42p" Dec 01 11:33:27 crc kubenswrapper[4958]: I1201 11:33:27.130525 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d28e16e-7406-4b48-be5b-8afb83ac7e5b-combined-ca-bundle\") pod \"keystone-db-sync-7g42p\" (UID: \"7d28e16e-7406-4b48-be5b-8afb83ac7e5b\") " pod="openstack/keystone-db-sync-7g42p" Dec 01 11:33:27 crc kubenswrapper[4958]: I1201 11:33:27.144103 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pqhx\" (UniqueName: \"kubernetes.io/projected/7d28e16e-7406-4b48-be5b-8afb83ac7e5b-kube-api-access-5pqhx\") pod \"keystone-db-sync-7g42p\" (UID: \"7d28e16e-7406-4b48-be5b-8afb83ac7e5b\") " pod="openstack/keystone-db-sync-7g42p" Dec 01 11:33:27 crc kubenswrapper[4958]: I1201 11:33:27.220630 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7g42p" Dec 01 11:33:27 crc kubenswrapper[4958]: I1201 11:33:27.687255 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7g42p"] Dec 01 11:33:28 crc kubenswrapper[4958]: I1201 11:33:28.221794 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:33:28 crc kubenswrapper[4958]: I1201 11:33:28.221880 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:33:28 crc kubenswrapper[4958]: I1201 11:33:28.479537 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7g42p" event={"ID":"7d28e16e-7406-4b48-be5b-8afb83ac7e5b","Type":"ContainerStarted","Data":"0c01c075269e1b45d04c4992417487ffb1df8638ff12160ee591c14a222803fb"} Dec 01 11:33:28 crc kubenswrapper[4958]: I1201 11:33:28.479996 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7g42p" event={"ID":"7d28e16e-7406-4b48-be5b-8afb83ac7e5b","Type":"ContainerStarted","Data":"424c77ff0fb31123670880f4fff710e7d5f0bfdafcb087ffa4978245f2177d6d"} Dec 01 11:33:28 crc kubenswrapper[4958]: I1201 11:33:28.504254 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-7g42p" podStartSLOduration=2.504211549 podStartE2EDuration="2.504211549s" podCreationTimestamp="2025-12-01 11:33:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:33:28.494118783 +0000 UTC m=+5656.002907830" watchObservedRunningTime="2025-12-01 11:33:28.504211549 +0000 UTC m=+5656.013000596" Dec 01 11:33:30 crc kubenswrapper[4958]: I1201 11:33:30.498565 4958 generic.go:334] "Generic (PLEG): container finished" podID="7d28e16e-7406-4b48-be5b-8afb83ac7e5b" containerID="0c01c075269e1b45d04c4992417487ffb1df8638ff12160ee591c14a222803fb" exitCode=0 Dec 01 11:33:30 crc kubenswrapper[4958]: I1201 11:33:30.498644 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7g42p" 
event={"ID":"7d28e16e-7406-4b48-be5b-8afb83ac7e5b","Type":"ContainerDied","Data":"0c01c075269e1b45d04c4992417487ffb1df8638ff12160ee591c14a222803fb"}
Dec 01 11:33:31 crc kubenswrapper[4958]: I1201 11:33:31.983013 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7g42p"
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.149683 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d28e16e-7406-4b48-be5b-8afb83ac7e5b-combined-ca-bundle\") pod \"7d28e16e-7406-4b48-be5b-8afb83ac7e5b\" (UID: \"7d28e16e-7406-4b48-be5b-8afb83ac7e5b\") "
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.149816 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pqhx\" (UniqueName: \"kubernetes.io/projected/7d28e16e-7406-4b48-be5b-8afb83ac7e5b-kube-api-access-5pqhx\") pod \"7d28e16e-7406-4b48-be5b-8afb83ac7e5b\" (UID: \"7d28e16e-7406-4b48-be5b-8afb83ac7e5b\") "
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.149981 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d28e16e-7406-4b48-be5b-8afb83ac7e5b-config-data\") pod \"7d28e16e-7406-4b48-be5b-8afb83ac7e5b\" (UID: \"7d28e16e-7406-4b48-be5b-8afb83ac7e5b\") "
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.156426 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d28e16e-7406-4b48-be5b-8afb83ac7e5b-kube-api-access-5pqhx" (OuterVolumeSpecName: "kube-api-access-5pqhx") pod "7d28e16e-7406-4b48-be5b-8afb83ac7e5b" (UID: "7d28e16e-7406-4b48-be5b-8afb83ac7e5b"). InnerVolumeSpecName "kube-api-access-5pqhx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.191172 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d28e16e-7406-4b48-be5b-8afb83ac7e5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d28e16e-7406-4b48-be5b-8afb83ac7e5b" (UID: "7d28e16e-7406-4b48-be5b-8afb83ac7e5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.226016 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d28e16e-7406-4b48-be5b-8afb83ac7e5b-config-data" (OuterVolumeSpecName: "config-data") pod "7d28e16e-7406-4b48-be5b-8afb83ac7e5b" (UID: "7d28e16e-7406-4b48-be5b-8afb83ac7e5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.252257 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pqhx\" (UniqueName: \"kubernetes.io/projected/7d28e16e-7406-4b48-be5b-8afb83ac7e5b-kube-api-access-5pqhx\") on node \"crc\" DevicePath \"\""
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.252331 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d28e16e-7406-4b48-be5b-8afb83ac7e5b-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.252359 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d28e16e-7406-4b48-be5b-8afb83ac7e5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.531811 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7g42p"
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.532027 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7g42p" event={"ID":"7d28e16e-7406-4b48-be5b-8afb83ac7e5b","Type":"ContainerDied","Data":"424c77ff0fb31123670880f4fff710e7d5f0bfdafcb087ffa4978245f2177d6d"}
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.532173 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="424c77ff0fb31123670880f4fff710e7d5f0bfdafcb087ffa4978245f2177d6d"
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.769549 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-779b6bf5fc-xbklh"]
Dec 01 11:33:32 crc kubenswrapper[4958]: E1201 11:33:32.770105 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d28e16e-7406-4b48-be5b-8afb83ac7e5b" containerName="keystone-db-sync"
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.770131 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d28e16e-7406-4b48-be5b-8afb83ac7e5b" containerName="keystone-db-sync"
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.770347 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d28e16e-7406-4b48-be5b-8afb83ac7e5b" containerName="keystone-db-sync"
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.771512 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh"
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.790150 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-779b6bf5fc-xbklh"]
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.842172 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-b66b2"]
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.844399 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b66b2"
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.850232 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.850320 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.850247 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.850659 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qtvq7"
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.853352 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b66b2"]
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.868440 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-ovsdbserver-sb\") pod \"dnsmasq-dns-779b6bf5fc-xbklh\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh"
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.868545 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-dns-svc\") pod \"dnsmasq-dns-779b6bf5fc-xbklh\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh"
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.868590 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-ovsdbserver-nb\") pod \"dnsmasq-dns-779b6bf5fc-xbklh\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh"
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.868628 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nhh5\" (UniqueName: \"kubernetes.io/projected/f063f400-6953-430b-8035-7c648b20d92e-kube-api-access-6nhh5\") pod \"dnsmasq-dns-779b6bf5fc-xbklh\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh"
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.868654 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-config\") pod \"dnsmasq-dns-779b6bf5fc-xbklh\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh"
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.970279 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-scripts\") pod \"keystone-bootstrap-b66b2\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") " pod="openstack/keystone-bootstrap-b66b2"
Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.970582 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-dns-svc\") pod \"dnsmasq-dns-779b6bf5fc-xbklh\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh"
\"dnsmasq-dns-779b6bf5fc-xbklh\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.970632 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-ovsdbserver-nb\") pod \"dnsmasq-dns-779b6bf5fc-xbklh\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.970653 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58hvf\" (UniqueName: \"kubernetes.io/projected/c08f909e-e70c-4a83-ac67-89d707cef580-kube-api-access-58hvf\") pod \"keystone-bootstrap-b66b2\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") " pod="openstack/keystone-bootstrap-b66b2" Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.970682 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nhh5\" (UniqueName: \"kubernetes.io/projected/f063f400-6953-430b-8035-7c648b20d92e-kube-api-access-6nhh5\") pod \"dnsmasq-dns-779b6bf5fc-xbklh\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.970703 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-config\") pod \"dnsmasq-dns-779b6bf5fc-xbklh\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.970748 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-config-data\") pod \"keystone-bootstrap-b66b2\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") " pod="openstack/keystone-bootstrap-b66b2" Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.970934 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-combined-ca-bundle\") pod \"keystone-bootstrap-b66b2\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") " pod="openstack/keystone-bootstrap-b66b2" Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.971079 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-fernet-keys\") pod \"keystone-bootstrap-b66b2\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") " pod="openstack/keystone-bootstrap-b66b2" Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.971116 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-credential-keys\") pod \"keystone-bootstrap-b66b2\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") " pod="openstack/keystone-bootstrap-b66b2" Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.971313 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-ovsdbserver-sb\") pod 
\"dnsmasq-dns-779b6bf5fc-xbklh\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.971576 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-ovsdbserver-nb\") pod \"dnsmasq-dns-779b6bf5fc-xbklh\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.971926 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-config\") pod \"dnsmasq-dns-779b6bf5fc-xbklh\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.972307 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-ovsdbserver-sb\") pod \"dnsmasq-dns-779b6bf5fc-xbklh\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.972439 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-dns-svc\") pod \"dnsmasq-dns-779b6bf5fc-xbklh\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" Dec 01 11:33:32 crc kubenswrapper[4958]: I1201 11:33:32.992815 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nhh5\" (UniqueName: \"kubernetes.io/projected/f063f400-6953-430b-8035-7c648b20d92e-kube-api-access-6nhh5\") pod \"dnsmasq-dns-779b6bf5fc-xbklh\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" Dec 01 11:33:33 crc kubenswrapper[4958]: I1201 11:33:33.072923 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58hvf\" (UniqueName: \"kubernetes.io/projected/c08f909e-e70c-4a83-ac67-89d707cef580-kube-api-access-58hvf\") pod \"keystone-bootstrap-b66b2\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") " pod="openstack/keystone-bootstrap-b66b2" Dec 01 11:33:33 crc kubenswrapper[4958]: I1201 11:33:33.073032 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-config-data\") pod \"keystone-bootstrap-b66b2\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") " pod="openstack/keystone-bootstrap-b66b2" Dec 01 11:33:33 crc kubenswrapper[4958]: I1201 11:33:33.073068 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-combined-ca-bundle\") pod \"keystone-bootstrap-b66b2\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") " pod="openstack/keystone-bootstrap-b66b2" Dec 01 11:33:33 crc kubenswrapper[4958]: I1201 11:33:33.073095 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-fernet-keys\") pod \"keystone-bootstrap-b66b2\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") " pod="openstack/keystone-bootstrap-b66b2" Dec 01 
11:33:33 crc kubenswrapper[4958]: I1201 11:33:33.073149 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-credential-keys\") pod \"keystone-bootstrap-b66b2\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") " pod="openstack/keystone-bootstrap-b66b2" Dec 01 11:33:33 crc kubenswrapper[4958]: I1201 11:33:33.073242 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-scripts\") pod \"keystone-bootstrap-b66b2\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") " pod="openstack/keystone-bootstrap-b66b2" Dec 01 11:33:33 crc kubenswrapper[4958]: I1201 11:33:33.077094 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-scripts\") pod \"keystone-bootstrap-b66b2\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") " pod="openstack/keystone-bootstrap-b66b2" Dec 01 11:33:33 crc kubenswrapper[4958]: I1201 11:33:33.077711 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-combined-ca-bundle\") pod \"keystone-bootstrap-b66b2\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") " pod="openstack/keystone-bootstrap-b66b2" Dec 01 11:33:33 crc kubenswrapper[4958]: I1201 11:33:33.078132 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-config-data\") pod \"keystone-bootstrap-b66b2\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") " pod="openstack/keystone-bootstrap-b66b2" Dec 01 11:33:33 crc kubenswrapper[4958]: I1201 11:33:33.079746 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-credential-keys\") pod \"keystone-bootstrap-b66b2\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") " pod="openstack/keystone-bootstrap-b66b2" Dec 01 11:33:33 crc kubenswrapper[4958]: I1201 11:33:33.081561 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-fernet-keys\") pod \"keystone-bootstrap-b66b2\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") " pod="openstack/keystone-bootstrap-b66b2" Dec 01 11:33:33 crc kubenswrapper[4958]: I1201 11:33:33.091039 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58hvf\" (UniqueName: \"kubernetes.io/projected/c08f909e-e70c-4a83-ac67-89d707cef580-kube-api-access-58hvf\") pod \"keystone-bootstrap-b66b2\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") " pod="openstack/keystone-bootstrap-b66b2" Dec 01 11:33:33 crc kubenswrapper[4958]: I1201 11:33:33.107189 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" Dec 01 11:33:33 crc kubenswrapper[4958]: I1201 11:33:33.163443 4958 util.go:30] "No sandbox for pod can be found. 
Dec 01 11:33:33 crc kubenswrapper[4958]: I1201 11:33:33.604378 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-779b6bf5fc-xbklh"]
Dec 01 11:33:33 crc kubenswrapper[4958]: W1201 11:33:33.612160 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf063f400_6953_430b_8035_7c648b20d92e.slice/crio-4b67cc2718daff9aa87937b198d5d55420fb8010d5200d71a4b7ec2df73cbdd4 WatchSource:0}: Error finding container 4b67cc2718daff9aa87937b198d5d55420fb8010d5200d71a4b7ec2df73cbdd4: Status 404 returned error can't find the container with id 4b67cc2718daff9aa87937b198d5d55420fb8010d5200d71a4b7ec2df73cbdd4
Dec 01 11:33:33 crc kubenswrapper[4958]: I1201 11:33:33.690401 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b66b2"]
Dec 01 11:33:34 crc kubenswrapper[4958]: I1201 11:33:34.554414 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b66b2" event={"ID":"c08f909e-e70c-4a83-ac67-89d707cef580","Type":"ContainerStarted","Data":"39460ecd53f06e7a2d28c12c68cdceca45b831240cc69bd129fd9ec640c469b2"}
Dec 01 11:33:34 crc kubenswrapper[4958]: I1201 11:33:34.554792 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b66b2" event={"ID":"c08f909e-e70c-4a83-ac67-89d707cef580","Type":"ContainerStarted","Data":"214b9bd2aa43fed4fe33586a4492684c2fd31dffa12904ea5d5ded5a1249e272"}
Dec 01 11:33:34 crc kubenswrapper[4958]: I1201 11:33:34.560153 4958 generic.go:334] "Generic (PLEG): container finished" podID="f063f400-6953-430b-8035-7c648b20d92e" containerID="f6133af47d16a674549c8df0995206eda43be7d05b2a267e9b658ea3c9105912" exitCode=0
Dec 01 11:33:34 crc kubenswrapper[4958]: I1201 11:33:34.560194 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" event={"ID":"f063f400-6953-430b-8035-7c648b20d92e","Type":"ContainerDied","Data":"f6133af47d16a674549c8df0995206eda43be7d05b2a267e9b658ea3c9105912"}
Dec 01 11:33:34 crc kubenswrapper[4958]: I1201 11:33:34.560216 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" event={"ID":"f063f400-6953-430b-8035-7c648b20d92e","Type":"ContainerStarted","Data":"4b67cc2718daff9aa87937b198d5d55420fb8010d5200d71a4b7ec2df73cbdd4"}
Dec 01 11:33:34 crc kubenswrapper[4958]: I1201 11:33:34.599381 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-b66b2" podStartSLOduration=2.5993540619999997 podStartE2EDuration="2.599354062s" podCreationTimestamp="2025-12-01 11:33:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:33:34.573573841 +0000 UTC m=+5662.082362878" watchObservedRunningTime="2025-12-01 11:33:34.599354062 +0000 UTC m=+5662.108143109"
Dec 01 11:33:35 crc kubenswrapper[4958]: I1201 11:33:35.615800 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" event={"ID":"f063f400-6953-430b-8035-7c648b20d92e","Type":"ContainerStarted","Data":"b1a6d7044cd3639bea39d4c8a8ae8da85a5c3edc7277a63e841e896c81e7db36"}
Dec 01 11:33:35 crc kubenswrapper[4958]: I1201 11:33:35.617196 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh"
Dec 01 11:33:35 crc kubenswrapper[4958]: I1201 11:33:35.650672 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" podStartSLOduration=3.650648175 podStartE2EDuration="3.650648175s" podCreationTimestamp="2025-12-01 11:33:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:33:35.648966157 +0000 UTC m=+5663.157755194" watchObservedRunningTime="2025-12-01 11:33:35.650648175 +0000 UTC m=+5663.159437232"
Dec 01 11:33:37 crc kubenswrapper[4958]: I1201 11:33:37.631171 4958 generic.go:334] "Generic (PLEG): container finished" podID="c08f909e-e70c-4a83-ac67-89d707cef580" containerID="39460ecd53f06e7a2d28c12c68cdceca45b831240cc69bd129fd9ec640c469b2" exitCode=0
Dec 01 11:33:37 crc kubenswrapper[4958]: I1201 11:33:37.631246 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b66b2" event={"ID":"c08f909e-e70c-4a83-ac67-89d707cef580","Type":"ContainerDied","Data":"39460ecd53f06e7a2d28c12c68cdceca45b831240cc69bd129fd9ec640c469b2"}
Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.078014 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b66b2"
Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.188666 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-combined-ca-bundle\") pod \"c08f909e-e70c-4a83-ac67-89d707cef580\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") "
Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.188750 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-credential-keys\") pod \"c08f909e-e70c-4a83-ac67-89d707cef580\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") "
Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.188805 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-fernet-keys\") pod \"c08f909e-e70c-4a83-ac67-89d707cef580\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") "
Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.188895 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-scripts\") pod \"c08f909e-e70c-4a83-ac67-89d707cef580\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") "
Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.188954 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-config-data\") pod \"c08f909e-e70c-4a83-ac67-89d707cef580\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") "
Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.189047 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58hvf\" (UniqueName: \"kubernetes.io/projected/c08f909e-e70c-4a83-ac67-89d707cef580-kube-api-access-58hvf\") pod \"c08f909e-e70c-4a83-ac67-89d707cef580\" (UID: \"c08f909e-e70c-4a83-ac67-89d707cef580\") "
Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.196226 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c08f909e-e70c-4a83-ac67-89d707cef580-kube-api-access-58hvf" (OuterVolumeSpecName: "kube-api-access-58hvf") pod "c08f909e-e70c-4a83-ac67-89d707cef580" (UID: "c08f909e-e70c-4a83-ac67-89d707cef580"). InnerVolumeSpecName "kube-api-access-58hvf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.197932 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c08f909e-e70c-4a83-ac67-89d707cef580" (UID: "c08f909e-e70c-4a83-ac67-89d707cef580"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.203812 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c08f909e-e70c-4a83-ac67-89d707cef580" (UID: "c08f909e-e70c-4a83-ac67-89d707cef580"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.209146 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-scripts" (OuterVolumeSpecName: "scripts") pod "c08f909e-e70c-4a83-ac67-89d707cef580" (UID: "c08f909e-e70c-4a83-ac67-89d707cef580"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.236025 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c08f909e-e70c-4a83-ac67-89d707cef580" (UID: "c08f909e-e70c-4a83-ac67-89d707cef580"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.238161 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-config-data" (OuterVolumeSpecName: "config-data") pod "c08f909e-e70c-4a83-ac67-89d707cef580" (UID: "c08f909e-e70c-4a83-ac67-89d707cef580"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.291581 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.291639 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.291656 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58hvf\" (UniqueName: \"kubernetes.io/projected/c08f909e-e70c-4a83-ac67-89d707cef580-kube-api-access-58hvf\") on node \"crc\" DevicePath \"\"" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.291670 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.291684 4958 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.291700 4958 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c08f909e-e70c-4a83-ac67-89d707cef580-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.660766 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b66b2" event={"ID":"c08f909e-e70c-4a83-ac67-89d707cef580","Type":"ContainerDied","Data":"214b9bd2aa43fed4fe33586a4492684c2fd31dffa12904ea5d5ded5a1249e272"} Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.660806 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="214b9bd2aa43fed4fe33586a4492684c2fd31dffa12904ea5d5ded5a1249e272" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.660886 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b66b2" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.743796 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-b66b2"] Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.761272 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-b66b2"] Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.811032 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c08f909e-e70c-4a83-ac67-89d707cef580" path="/var/lib/kubelet/pods/c08f909e-e70c-4a83-ac67-89d707cef580/volumes" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.834304 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-flqzt"] Dec 01 11:33:39 crc kubenswrapper[4958]: E1201 11:33:39.834732 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c08f909e-e70c-4a83-ac67-89d707cef580" containerName="keystone-bootstrap" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.834809 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08f909e-e70c-4a83-ac67-89d707cef580" containerName="keystone-bootstrap" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.835062 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c08f909e-e70c-4a83-ac67-89d707cef580" containerName="keystone-bootstrap" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.835812 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-flqzt" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.844045 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.844078 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.844166 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-flqzt"] Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.844209 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.846391 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qtvq7" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.907635 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8gvh\" (UniqueName: \"kubernetes.io/projected/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-kube-api-access-r8gvh\") pod \"keystone-bootstrap-flqzt\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") " pod="openstack/keystone-bootstrap-flqzt" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.907702 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-fernet-keys\") pod \"keystone-bootstrap-flqzt\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") " pod="openstack/keystone-bootstrap-flqzt" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.907753 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-combined-ca-bundle\") pod \"keystone-bootstrap-flqzt\" (UID: 
\"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") " pod="openstack/keystone-bootstrap-flqzt" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.908140 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-credential-keys\") pod \"keystone-bootstrap-flqzt\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") " pod="openstack/keystone-bootstrap-flqzt" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.908229 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-config-data\") pod \"keystone-bootstrap-flqzt\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") " pod="openstack/keystone-bootstrap-flqzt" Dec 01 11:33:39 crc kubenswrapper[4958]: I1201 11:33:39.908314 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-scripts\") pod \"keystone-bootstrap-flqzt\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") " pod="openstack/keystone-bootstrap-flqzt" Dec 01 11:33:40 crc kubenswrapper[4958]: I1201 11:33:40.010475 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-combined-ca-bundle\") pod \"keystone-bootstrap-flqzt\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") " pod="openstack/keystone-bootstrap-flqzt" Dec 01 11:33:40 crc kubenswrapper[4958]: I1201 11:33:40.010609 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-credential-keys\") pod \"keystone-bootstrap-flqzt\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") " pod="openstack/keystone-bootstrap-flqzt" Dec 01 11:33:40 crc kubenswrapper[4958]: I1201 11:33:40.010641 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-config-data\") pod \"keystone-bootstrap-flqzt\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") " pod="openstack/keystone-bootstrap-flqzt" Dec 01 11:33:40 crc kubenswrapper[4958]: I1201 11:33:40.010671 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-scripts\") pod \"keystone-bootstrap-flqzt\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") " pod="openstack/keystone-bootstrap-flqzt" Dec 01 11:33:40 crc kubenswrapper[4958]: I1201 11:33:40.010779 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8gvh\" (UniqueName: \"kubernetes.io/projected/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-kube-api-access-r8gvh\") pod \"keystone-bootstrap-flqzt\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") " pod="openstack/keystone-bootstrap-flqzt" Dec 01 11:33:40 crc kubenswrapper[4958]: I1201 11:33:40.010816 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-fernet-keys\") pod \"keystone-bootstrap-flqzt\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") " pod="openstack/keystone-bootstrap-flqzt" Dec 01 11:33:40 crc 
Dec 01 11:33:40 crc kubenswrapper[4958]: I1201 11:33:40.015583 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-credential-keys\") pod \"keystone-bootstrap-flqzt\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") " pod="openstack/keystone-bootstrap-flqzt"
Dec 01 11:33:40 crc kubenswrapper[4958]: I1201 11:33:40.016315 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-combined-ca-bundle\") pod \"keystone-bootstrap-flqzt\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") " pod="openstack/keystone-bootstrap-flqzt"
Dec 01 11:33:40 crc kubenswrapper[4958]: I1201 11:33:40.016637 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-config-data\") pod \"keystone-bootstrap-flqzt\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") " pod="openstack/keystone-bootstrap-flqzt"
Dec 01 11:33:40 crc kubenswrapper[4958]: I1201 11:33:40.017639 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-fernet-keys\") pod \"keystone-bootstrap-flqzt\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") " pod="openstack/keystone-bootstrap-flqzt"
Dec 01 11:33:40 crc kubenswrapper[4958]: I1201 11:33:40.032488 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8gvh\" (UniqueName: \"kubernetes.io/projected/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-kube-api-access-r8gvh\") pod \"keystone-bootstrap-flqzt\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") " pod="openstack/keystone-bootstrap-flqzt"
Dec 01 11:33:40 crc kubenswrapper[4958]: I1201 11:33:40.158434 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-flqzt"
Dec 01 11:33:40 crc kubenswrapper[4958]: I1201 11:33:40.753818 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-flqzt"]
Dec 01 11:33:41 crc kubenswrapper[4958]: I1201 11:33:41.684962 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-flqzt" event={"ID":"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda","Type":"ContainerStarted","Data":"eac53bcf8d62ebfe124c5752f43fa899bbf88da6dfb347a889ef8083c7649fff"}
Dec 01 11:33:41 crc kubenswrapper[4958]: I1201 11:33:41.686370 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-flqzt" event={"ID":"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda","Type":"ContainerStarted","Data":"fe8ec5207881161fcbbff309e3c77dd7ba14cb22b5b16345fd4d6d7b700690ef"}
Dec 01 11:33:41 crc kubenswrapper[4958]: I1201 11:33:41.712745 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-flqzt" podStartSLOduration=2.71268657 podStartE2EDuration="2.71268657s" podCreationTimestamp="2025-12-01 11:33:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:33:41.706554597 +0000 UTC m=+5669.215343634" watchObservedRunningTime="2025-12-01 11:33:41.71268657 +0000 UTC m=+5669.221475637"
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.109147 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh"
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.188377 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ffb5cc57c-hwcl8"]
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.188640 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" podUID="06419480-e9d9-4e57-9080-1e5aac676f05" containerName="dnsmasq-dns" containerID="cri-o://a16bfb47a667a399c19240d153c6612fbb7212b04ca237dc20c07a5d72ff02ba" gracePeriod=10
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.698073 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8"
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.721504 4958 generic.go:334] "Generic (PLEG): container finished" podID="06419480-e9d9-4e57-9080-1e5aac676f05" containerID="a16bfb47a667a399c19240d153c6612fbb7212b04ca237dc20c07a5d72ff02ba" exitCode=0
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.721972 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8"
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.722786 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" event={"ID":"06419480-e9d9-4e57-9080-1e5aac676f05","Type":"ContainerDied","Data":"a16bfb47a667a399c19240d153c6612fbb7212b04ca237dc20c07a5d72ff02ba"}
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.722826 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ffb5cc57c-hwcl8" event={"ID":"06419480-e9d9-4e57-9080-1e5aac676f05","Type":"ContainerDied","Data":"7b9f5966a137e811bec3f77622c895f573b0c944b7662e8b6a08a2db43f32611"}
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.722871 4958 scope.go:117] "RemoveContainer" containerID="a16bfb47a667a399c19240d153c6612fbb7212b04ca237dc20c07a5d72ff02ba"
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.755450 4958 scope.go:117] "RemoveContainer" containerID="f7933375cb2deb864ff1237210ccedec5cfe83c179e088c76cc0134d5d379ceb"
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.778713 4958 scope.go:117] "RemoveContainer" containerID="a16bfb47a667a399c19240d153c6612fbb7212b04ca237dc20c07a5d72ff02ba"
Dec 01 11:33:43 crc kubenswrapper[4958]: E1201 11:33:43.779278 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a16bfb47a667a399c19240d153c6612fbb7212b04ca237dc20c07a5d72ff02ba\": container with ID starting with a16bfb47a667a399c19240d153c6612fbb7212b04ca237dc20c07a5d72ff02ba not found: ID does not exist" containerID="a16bfb47a667a399c19240d153c6612fbb7212b04ca237dc20c07a5d72ff02ba"
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.779338 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a16bfb47a667a399c19240d153c6612fbb7212b04ca237dc20c07a5d72ff02ba"} err="failed to get container status \"a16bfb47a667a399c19240d153c6612fbb7212b04ca237dc20c07a5d72ff02ba\": rpc error: code = NotFound desc = could not find container \"a16bfb47a667a399c19240d153c6612fbb7212b04ca237dc20c07a5d72ff02ba\": container with ID starting with a16bfb47a667a399c19240d153c6612fbb7212b04ca237dc20c07a5d72ff02ba not found: ID does not exist"
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.779375 4958 scope.go:117] "RemoveContainer" containerID="f7933375cb2deb864ff1237210ccedec5cfe83c179e088c76cc0134d5d379ceb"
Dec 01 11:33:43 crc kubenswrapper[4958]: E1201 11:33:43.780296 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7933375cb2deb864ff1237210ccedec5cfe83c179e088c76cc0134d5d379ceb\": container with ID starting with f7933375cb2deb864ff1237210ccedec5cfe83c179e088c76cc0134d5d379ceb not found: ID does not exist" containerID="f7933375cb2deb864ff1237210ccedec5cfe83c179e088c76cc0134d5d379ceb"
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.780356 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7933375cb2deb864ff1237210ccedec5cfe83c179e088c76cc0134d5d379ceb"} err="failed to get container status \"f7933375cb2deb864ff1237210ccedec5cfe83c179e088c76cc0134d5d379ceb\": rpc error: code = NotFound desc = could not find container \"f7933375cb2deb864ff1237210ccedec5cfe83c179e088c76cc0134d5d379ceb\": container with ID starting with f7933375cb2deb864ff1237210ccedec5cfe83c179e088c76cc0134d5d379ceb not found: ID does not exist"
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.899123 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srhj8\" (UniqueName: \"kubernetes.io/projected/06419480-e9d9-4e57-9080-1e5aac676f05-kube-api-access-srhj8\") pod \"06419480-e9d9-4e57-9080-1e5aac676f05\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") "
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.899216 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-config\") pod \"06419480-e9d9-4e57-9080-1e5aac676f05\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") "
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.899291 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-ovsdbserver-nb\") pod \"06419480-e9d9-4e57-9080-1e5aac676f05\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") "
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.899333 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-ovsdbserver-sb\") pod \"06419480-e9d9-4e57-9080-1e5aac676f05\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") "
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.899406 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-dns-svc\") pod \"06419480-e9d9-4e57-9080-1e5aac676f05\" (UID: \"06419480-e9d9-4e57-9080-1e5aac676f05\") "
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.911340 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06419480-e9d9-4e57-9080-1e5aac676f05-kube-api-access-srhj8" (OuterVolumeSpecName: "kube-api-access-srhj8") pod "06419480-e9d9-4e57-9080-1e5aac676f05" (UID: "06419480-e9d9-4e57-9080-1e5aac676f05"). InnerVolumeSpecName "kube-api-access-srhj8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.945677 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-config" (OuterVolumeSpecName: "config") pod "06419480-e9d9-4e57-9080-1e5aac676f05" (UID: "06419480-e9d9-4e57-9080-1e5aac676f05"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.950862 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "06419480-e9d9-4e57-9080-1e5aac676f05" (UID: "06419480-e9d9-4e57-9080-1e5aac676f05"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.955497 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "06419480-e9d9-4e57-9080-1e5aac676f05" (UID: "06419480-e9d9-4e57-9080-1e5aac676f05"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:33:43 crc kubenswrapper[4958]: I1201 11:33:43.962256 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "06419480-e9d9-4e57-9080-1e5aac676f05" (UID: "06419480-e9d9-4e57-9080-1e5aac676f05"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:33:44 crc kubenswrapper[4958]: I1201 11:33:44.001959 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 01 11:33:44 crc kubenswrapper[4958]: I1201 11:33:44.002011 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 01 11:33:44 crc kubenswrapper[4958]: I1201 11:33:44.002022 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srhj8\" (UniqueName: \"kubernetes.io/projected/06419480-e9d9-4e57-9080-1e5aac676f05-kube-api-access-srhj8\") on node \"crc\" DevicePath \"\""
Dec 01 11:33:44 crc kubenswrapper[4958]: I1201 11:33:44.002033 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-config\") on node \"crc\" DevicePath \"\""
Dec 01 11:33:44 crc kubenswrapper[4958]: I1201 11:33:44.002043 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06419480-e9d9-4e57-9080-1e5aac676f05-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 01 11:33:44 crc kubenswrapper[4958]: I1201 11:33:44.072611 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ffb5cc57c-hwcl8"]
Dec 01 11:33:44 crc kubenswrapper[4958]: I1201 11:33:44.077772 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ffb5cc57c-hwcl8"]
Dec 01 11:33:44 crc kubenswrapper[4958]: I1201 11:33:44.737022 4958 generic.go:334] "Generic (PLEG): container finished" podID="9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda" containerID="eac53bcf8d62ebfe124c5752f43fa899bbf88da6dfb347a889ef8083c7649fff" exitCode=0
Dec 01 11:33:44 crc kubenswrapper[4958]: I1201 11:33:44.737122 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-flqzt" event={"ID":"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda","Type":"ContainerDied","Data":"eac53bcf8d62ebfe124c5752f43fa899bbf88da6dfb347a889ef8083c7649fff"}
Dec 01 11:33:45 crc kubenswrapper[4958]: I1201 11:33:45.807355 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06419480-e9d9-4e57-9080-1e5aac676f05" path="/var/lib/kubelet/pods/06419480-e9d9-4e57-9080-1e5aac676f05/volumes"
Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.074000 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-flqzt"
Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.248377 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-credential-keys\") pod \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") "
Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.248432 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-combined-ca-bundle\") pod \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") "
Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.248464 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-config-data\") pod \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") "
Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.248483 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-scripts\") pod \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") "
Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.248503 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8gvh\" (UniqueName: \"kubernetes.io/projected/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-kube-api-access-r8gvh\") pod \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") "
Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.248549 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-fernet-keys\") pod \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\" (UID: \"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda\") "
Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.256109 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda" (UID: "9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.256165 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-scripts" (OuterVolumeSpecName: "scripts") pod "9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda" (UID: "9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.256447 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-kube-api-access-r8gvh" (OuterVolumeSpecName: "kube-api-access-r8gvh") pod "9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda" (UID: "9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda"). InnerVolumeSpecName "kube-api-access-r8gvh". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.258216 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda" (UID: "9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.275355 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda" (UID: "9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.278915 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-config-data" (OuterVolumeSpecName: "config-data") pod "9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda" (UID: "9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.350584 4958 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.350971 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.350992 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.351006 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.351028 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8gvh\" (UniqueName: \"kubernetes.io/projected/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-kube-api-access-r8gvh\") on node \"crc\" DevicePath \"\"" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.351044 4958 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.758999 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-flqzt" event={"ID":"9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda","Type":"ContainerDied","Data":"fe8ec5207881161fcbbff309e3c77dd7ba14cb22b5b16345fd4d6d7b700690ef"} Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.759077 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe8ec5207881161fcbbff309e3c77dd7ba14cb22b5b16345fd4d6d7b700690ef" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.759093 4958 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-flqzt" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.882701 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7dc7b798fb-n8x4t"] Dec 01 11:33:46 crc kubenswrapper[4958]: E1201 11:33:46.883090 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda" containerName="keystone-bootstrap" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.883103 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda" containerName="keystone-bootstrap" Dec 01 11:33:46 crc kubenswrapper[4958]: E1201 11:33:46.883133 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06419480-e9d9-4e57-9080-1e5aac676f05" containerName="dnsmasq-dns" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.883140 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="06419480-e9d9-4e57-9080-1e5aac676f05" containerName="dnsmasq-dns" Dec 01 11:33:46 crc kubenswrapper[4958]: E1201 11:33:46.883157 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06419480-e9d9-4e57-9080-1e5aac676f05" containerName="init" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.883165 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="06419480-e9d9-4e57-9080-1e5aac676f05" containerName="init" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.883356 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="06419480-e9d9-4e57-9080-1e5aac676f05" containerName="dnsmasq-dns" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.883390 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda" containerName="keystone-bootstrap" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.884134 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7dc7b798fb-n8x4t" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.892499 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.895460 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.895495 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.896301 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qtvq7" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.935022 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7dc7b798fb-n8x4t"] Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.970966 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30f22ff2-e8d7-42de-beb4-1e598d294527-fernet-keys\") pod \"keystone-7dc7b798fb-n8x4t\" (UID: \"30f22ff2-e8d7-42de-beb4-1e598d294527\") " pod="openstack/keystone-7dc7b798fb-n8x4t" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.971039 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f22ff2-e8d7-42de-beb4-1e598d294527-combined-ca-bundle\") pod \"keystone-7dc7b798fb-n8x4t\" (UID: \"30f22ff2-e8d7-42de-beb4-1e598d294527\") " pod="openstack/keystone-7dc7b798fb-n8x4t" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.971083 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f22ff2-e8d7-42de-beb4-1e598d294527-config-data\") pod \"keystone-7dc7b798fb-n8x4t\" (UID: \"30f22ff2-e8d7-42de-beb4-1e598d294527\") " pod="openstack/keystone-7dc7b798fb-n8x4t" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.971176 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4854l\" (UniqueName: \"kubernetes.io/projected/30f22ff2-e8d7-42de-beb4-1e598d294527-kube-api-access-4854l\") pod \"keystone-7dc7b798fb-n8x4t\" (UID: \"30f22ff2-e8d7-42de-beb4-1e598d294527\") " pod="openstack/keystone-7dc7b798fb-n8x4t" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.971202 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f22ff2-e8d7-42de-beb4-1e598d294527-scripts\") pod \"keystone-7dc7b798fb-n8x4t\" (UID: \"30f22ff2-e8d7-42de-beb4-1e598d294527\") " pod="openstack/keystone-7dc7b798fb-n8x4t" Dec 01 11:33:46 crc kubenswrapper[4958]: I1201 11:33:46.971240 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30f22ff2-e8d7-42de-beb4-1e598d294527-credential-keys\") pod \"keystone-7dc7b798fb-n8x4t\" (UID: \"30f22ff2-e8d7-42de-beb4-1e598d294527\") " pod="openstack/keystone-7dc7b798fb-n8x4t" Dec 01 11:33:47 crc kubenswrapper[4958]: I1201 11:33:47.075020 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4854l\" (UniqueName: \"kubernetes.io/projected/30f22ff2-e8d7-42de-beb4-1e598d294527-kube-api-access-4854l\") pod 
\"keystone-7dc7b798fb-n8x4t\" (UID: \"30f22ff2-e8d7-42de-beb4-1e598d294527\") " pod="openstack/keystone-7dc7b798fb-n8x4t" Dec 01 11:33:47 crc kubenswrapper[4958]: I1201 11:33:47.075087 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f22ff2-e8d7-42de-beb4-1e598d294527-scripts\") pod \"keystone-7dc7b798fb-n8x4t\" (UID: \"30f22ff2-e8d7-42de-beb4-1e598d294527\") " pod="openstack/keystone-7dc7b798fb-n8x4t" Dec 01 11:33:47 crc kubenswrapper[4958]: I1201 11:33:47.075131 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30f22ff2-e8d7-42de-beb4-1e598d294527-credential-keys\") pod \"keystone-7dc7b798fb-n8x4t\" (UID: \"30f22ff2-e8d7-42de-beb4-1e598d294527\") " pod="openstack/keystone-7dc7b798fb-n8x4t" Dec 01 11:33:47 crc kubenswrapper[4958]: I1201 11:33:47.075179 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30f22ff2-e8d7-42de-beb4-1e598d294527-fernet-keys\") pod \"keystone-7dc7b798fb-n8x4t\" (UID: \"30f22ff2-e8d7-42de-beb4-1e598d294527\") " pod="openstack/keystone-7dc7b798fb-n8x4t" Dec 01 11:33:47 crc kubenswrapper[4958]: I1201 11:33:47.075210 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f22ff2-e8d7-42de-beb4-1e598d294527-combined-ca-bundle\") pod \"keystone-7dc7b798fb-n8x4t\" (UID: \"30f22ff2-e8d7-42de-beb4-1e598d294527\") " pod="openstack/keystone-7dc7b798fb-n8x4t" Dec 01 11:33:47 crc kubenswrapper[4958]: I1201 11:33:47.075246 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f22ff2-e8d7-42de-beb4-1e598d294527-config-data\") pod \"keystone-7dc7b798fb-n8x4t\" (UID: \"30f22ff2-e8d7-42de-beb4-1e598d294527\") " pod="openstack/keystone-7dc7b798fb-n8x4t" Dec 01 11:33:47 crc kubenswrapper[4958]: I1201 11:33:47.096265 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30f22ff2-e8d7-42de-beb4-1e598d294527-fernet-keys\") pod \"keystone-7dc7b798fb-n8x4t\" (UID: \"30f22ff2-e8d7-42de-beb4-1e598d294527\") " pod="openstack/keystone-7dc7b798fb-n8x4t" Dec 01 11:33:47 crc kubenswrapper[4958]: I1201 11:33:47.096332 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30f22ff2-e8d7-42de-beb4-1e598d294527-credential-keys\") pod \"keystone-7dc7b798fb-n8x4t\" (UID: \"30f22ff2-e8d7-42de-beb4-1e598d294527\") " pod="openstack/keystone-7dc7b798fb-n8x4t" Dec 01 11:33:47 crc kubenswrapper[4958]: I1201 11:33:47.096524 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f22ff2-e8d7-42de-beb4-1e598d294527-scripts\") pod \"keystone-7dc7b798fb-n8x4t\" (UID: \"30f22ff2-e8d7-42de-beb4-1e598d294527\") " pod="openstack/keystone-7dc7b798fb-n8x4t" Dec 01 11:33:47 crc kubenswrapper[4958]: I1201 11:33:47.097303 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f22ff2-e8d7-42de-beb4-1e598d294527-combined-ca-bundle\") pod \"keystone-7dc7b798fb-n8x4t\" (UID: \"30f22ff2-e8d7-42de-beb4-1e598d294527\") " pod="openstack/keystone-7dc7b798fb-n8x4t" Dec 01 11:33:47 crc kubenswrapper[4958]: I1201 11:33:47.104535 4958 
Dec 01 11:33:47 crc kubenswrapper[4958]: I1201 11:33:47.104535 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f22ff2-e8d7-42de-beb4-1e598d294527-config-data\") pod \"keystone-7dc7b798fb-n8x4t\" (UID: \"30f22ff2-e8d7-42de-beb4-1e598d294527\") " pod="openstack/keystone-7dc7b798fb-n8x4t"
Dec 01 11:33:47 crc kubenswrapper[4958]: I1201 11:33:47.109920 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4854l\" (UniqueName: \"kubernetes.io/projected/30f22ff2-e8d7-42de-beb4-1e598d294527-kube-api-access-4854l\") pod \"keystone-7dc7b798fb-n8x4t\" (UID: \"30f22ff2-e8d7-42de-beb4-1e598d294527\") " pod="openstack/keystone-7dc7b798fb-n8x4t"
Dec 01 11:33:47 crc kubenswrapper[4958]: I1201 11:33:47.209539 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7dc7b798fb-n8x4t"
Dec 01 11:33:47 crc kubenswrapper[4958]: I1201 11:33:47.656529 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7dc7b798fb-n8x4t"]
Dec 01 11:33:47 crc kubenswrapper[4958]: I1201 11:33:47.770299 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7dc7b798fb-n8x4t" event={"ID":"30f22ff2-e8d7-42de-beb4-1e598d294527","Type":"ContainerStarted","Data":"4574046b4f144640d62e463d42d0ac5b2dca4ae5e42f783c98cbb8551f99d866"}
Dec 01 11:33:48 crc kubenswrapper[4958]: I1201 11:33:48.792900 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7dc7b798fb-n8x4t" event={"ID":"30f22ff2-e8d7-42de-beb4-1e598d294527","Type":"ContainerStarted","Data":"7ad3aa57da79c6b8873984452b29e1c507254fb55fc6a6ca7fcf131d46cb7b79"}
Dec 01 11:33:48 crc kubenswrapper[4958]: I1201 11:33:48.793162 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7dc7b798fb-n8x4t"
Dec 01 11:33:48 crc kubenswrapper[4958]: I1201 11:33:48.833382 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7dc7b798fb-n8x4t" podStartSLOduration=2.833355357 podStartE2EDuration="2.833355357s" podCreationTimestamp="2025-12-01 11:33:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:33:48.823157758 +0000 UTC m=+5676.331946825" watchObservedRunningTime="2025-12-01 11:33:48.833355357 +0000 UTC m=+5676.342144434"
Dec 01 11:33:58 crc kubenswrapper[4958]: I1201 11:33:58.210510 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 11:33:58 crc kubenswrapper[4958]: I1201 11:33:58.211245 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 11:34:18 crc kubenswrapper[4958]: I1201 11:34:18.785120 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7dc7b798fb-n8x4t"
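Note: the podStartSLOduration above is straightforward arithmetic: observedRunningTime (11:33:48.833) minus podCreationTimestamp (11:33:46) gives 2.833355357s, and the "0001-01-01" pull timestamps are Go zero times, i.e. no image pull was recorded for this start (the image was already present). A quick check of both facts:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	created, _ := time.Parse(time.RFC3339Nano, "2025-12-01T11:33:46Z")
	running, _ := time.Parse(time.RFC3339Nano, "2025-12-01T11:33:48.833355357Z")
	fmt.Println(running.Sub(created)) // 2.833355357s, matching podStartE2EDuration

	var pull time.Time // zero value; kubelet prints it as 0001-01-01 00:00:00 +0000 UTC
	fmt.Println(pull.IsZero(), pull.UTC())
}
```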
Dec 01 11:34:21 crc kubenswrapper[4958]: I1201 11:34:21.438682 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Dec 01 11:34:21 crc kubenswrapper[4958]: I1201 11:34:21.441474 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 01 11:34:21 crc kubenswrapper[4958]: I1201 11:34:21.444298 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Dec 01 11:34:21 crc kubenswrapper[4958]: I1201 11:34:21.444678 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-wpgcb"
Dec 01 11:34:21 crc kubenswrapper[4958]: I1201 11:34:21.444680 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Dec 01 11:34:21 crc kubenswrapper[4958]: I1201 11:34:21.471823 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 01 11:34:21 crc kubenswrapper[4958]: I1201 11:34:21.545703 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c556ea7c-505e-4575-a0cf-f313ec8058b1-openstack-config\") pod \"openstackclient\" (UID: \"c556ea7c-505e-4575-a0cf-f313ec8058b1\") " pod="openstack/openstackclient"
Dec 01 11:34:21 crc kubenswrapper[4958]: I1201 11:34:21.545784 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c556ea7c-505e-4575-a0cf-f313ec8058b1-openstack-config-secret\") pod \"openstackclient\" (UID: \"c556ea7c-505e-4575-a0cf-f313ec8058b1\") " pod="openstack/openstackclient"
Dec 01 11:34:21 crc kubenswrapper[4958]: I1201 11:34:21.545834 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqgng\" (UniqueName: \"kubernetes.io/projected/c556ea7c-505e-4575-a0cf-f313ec8058b1-kube-api-access-mqgng\") pod \"openstackclient\" (UID: \"c556ea7c-505e-4575-a0cf-f313ec8058b1\") " pod="openstack/openstackclient"
Dec 01 11:34:21 crc kubenswrapper[4958]: I1201 11:34:21.647998 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c556ea7c-505e-4575-a0cf-f313ec8058b1-openstack-config-secret\") pod \"openstackclient\" (UID: \"c556ea7c-505e-4575-a0cf-f313ec8058b1\") " pod="openstack/openstackclient"
Dec 01 11:34:21 crc kubenswrapper[4958]: I1201 11:34:21.648114 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqgng\" (UniqueName: \"kubernetes.io/projected/c556ea7c-505e-4575-a0cf-f313ec8058b1-kube-api-access-mqgng\") pod \"openstackclient\" (UID: \"c556ea7c-505e-4575-a0cf-f313ec8058b1\") " pod="openstack/openstackclient"
Dec 01 11:34:21 crc kubenswrapper[4958]: I1201 11:34:21.648240 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c556ea7c-505e-4575-a0cf-f313ec8058b1-openstack-config\") pod \"openstackclient\" (UID: \"c556ea7c-505e-4575-a0cf-f313ec8058b1\") " pod="openstack/openstackclient"
Dec 01 11:34:21 crc kubenswrapper[4958]: I1201 11:34:21.649756 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c556ea7c-505e-4575-a0cf-f313ec8058b1-openstack-config\") pod \"openstackclient\" (UID: \"c556ea7c-505e-4575-a0cf-f313ec8058b1\") " pod="openstack/openstackclient"
volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c556ea7c-505e-4575-a0cf-f313ec8058b1-openstack-config-secret\") pod \"openstackclient\" (UID: \"c556ea7c-505e-4575-a0cf-f313ec8058b1\") " pod="openstack/openstackclient" Dec 01 11:34:21 crc kubenswrapper[4958]: I1201 11:34:21.672014 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqgng\" (UniqueName: \"kubernetes.io/projected/c556ea7c-505e-4575-a0cf-f313ec8058b1-kube-api-access-mqgng\") pod \"openstackclient\" (UID: \"c556ea7c-505e-4575-a0cf-f313ec8058b1\") " pod="openstack/openstackclient" Dec 01 11:34:21 crc kubenswrapper[4958]: I1201 11:34:21.787350 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 01 11:34:22 crc kubenswrapper[4958]: I1201 11:34:22.068486 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 11:34:22 crc kubenswrapper[4958]: I1201 11:34:22.151967 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c556ea7c-505e-4575-a0cf-f313ec8058b1","Type":"ContainerStarted","Data":"63f4e940c78bf3afb62cbea024ff5b16cff2ef31695cd61afb2d9c70b5891ff0"} Dec 01 11:34:23 crc kubenswrapper[4958]: I1201 11:34:23.169936 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c556ea7c-505e-4575-a0cf-f313ec8058b1","Type":"ContainerStarted","Data":"4a4ea7fe5561c436cc39172b8f7674a704670cf922c5ebf27c16cc15a6f80bb2"} Dec 01 11:34:23 crc kubenswrapper[4958]: I1201 11:34:23.196220 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.196189223 podStartE2EDuration="2.196189223s" podCreationTimestamp="2025-12-01 11:34:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:34:23.19042358 +0000 UTC m=+5710.699212657" watchObservedRunningTime="2025-12-01 11:34:23.196189223 +0000 UTC m=+5710.704978290" Dec 01 11:34:23 crc kubenswrapper[4958]: I1201 11:34:23.344469 4958 scope.go:117] "RemoveContainer" containerID="57fa0953e730b318253e6ea0dbed922e8cfabca9448c2737109a69fc38c63272" Dec 01 11:34:23 crc kubenswrapper[4958]: I1201 11:34:23.372294 4958 scope.go:117] "RemoveContainer" containerID="567c43ead8823c632514df8ad47fe4240adb6595cf8348f208335e3f5a183de8" Dec 01 11:34:23 crc kubenswrapper[4958]: I1201 11:34:23.431942 4958 scope.go:117] "RemoveContainer" containerID="e730031dbfd93660790bbe7b721be5f143e63e92d9567c9ca62a92d16284f732" Dec 01 11:34:23 crc kubenswrapper[4958]: I1201 11:34:23.479208 4958 scope.go:117] "RemoveContainer" containerID="41ce6b8f69be673c83334ebd5433d62b818c62f481264232c1638a350c339a6d" Dec 01 11:34:23 crc kubenswrapper[4958]: I1201 11:34:23.507395 4958 scope.go:117] "RemoveContainer" containerID="ffa06db896bf51dc3b6eff02ad8700cf3855ea2c79d24d6d8a7c22f7d3d00603" Dec 01 11:34:28 crc kubenswrapper[4958]: I1201 11:34:28.210432 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:34:28 crc kubenswrapper[4958]: I1201 11:34:28.211313 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" 
podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:34:28 crc kubenswrapper[4958]: I1201 11:34:28.211394 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 11:34:28 crc kubenswrapper[4958]: I1201 11:34:28.212787 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 11:34:28 crc kubenswrapper[4958]: I1201 11:34:28.212961 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" gracePeriod=600 Dec 01 11:34:28 crc kubenswrapper[4958]: E1201 11:34:28.341775 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:34:29 crc kubenswrapper[4958]: I1201 11:34:29.264447 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" exitCode=0 Dec 01 11:34:29 crc kubenswrapper[4958]: I1201 11:34:29.265108 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc"} Dec 01 11:34:29 crc kubenswrapper[4958]: I1201 11:34:29.265188 4958 scope.go:117] "RemoveContainer" containerID="991fdb4be35530b97fb80defdced702e6d810a5e7b47e601ff99ede2a4cc30ae" Dec 01 11:34:29 crc kubenswrapper[4958]: I1201 11:34:29.267232 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:34:29 crc kubenswrapper[4958]: E1201 11:34:29.268510 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:34:40 crc kubenswrapper[4958]: I1201 11:34:40.798738 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:34:40 crc kubenswrapper[4958]: E1201 11:34:40.800023 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:34:54 crc kubenswrapper[4958]: I1201 11:34:54.797433 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:34:54 crc kubenswrapper[4958]: E1201 11:34:54.798232 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:35:08 crc kubenswrapper[4958]: I1201 11:35:08.797487 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:35:08 crc kubenswrapper[4958]: E1201 11:35:08.798360 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:35:22 crc kubenswrapper[4958]: I1201 11:35:22.798887 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:35:22 crc kubenswrapper[4958]: E1201 11:35:22.799751 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:35:33 crc kubenswrapper[4958]: I1201 11:35:33.804229 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:35:33 crc kubenswrapper[4958]: E1201 11:35:33.806976 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:35:47 crc kubenswrapper[4958]: I1201 11:35:47.801254 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:35:47 crc kubenswrapper[4958]: E1201 11:35:47.802085 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:36:02 crc kubenswrapper[4958]: I1201 11:36:02.798119 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:36:02 crc kubenswrapper[4958]: E1201 11:36:02.799055 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:36:03 crc kubenswrapper[4958]: I1201 11:36:03.727293 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-q5vbv"] Dec 01 11:36:03 crc kubenswrapper[4958]: I1201 11:36:03.733532 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-q5vbv" Dec 01 11:36:03 crc kubenswrapper[4958]: I1201 11:36:03.746546 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-q5vbv"] Dec 01 11:36:03 crc kubenswrapper[4958]: I1201 11:36:03.833381 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqnhd\" (UniqueName: \"kubernetes.io/projected/c35348c1-21cb-436d-92a2-be0b35bae865-kube-api-access-wqnhd\") pod \"barbican-db-create-q5vbv\" (UID: \"c35348c1-21cb-436d-92a2-be0b35bae865\") " pod="openstack/barbican-db-create-q5vbv" Dec 01 11:36:03 crc kubenswrapper[4958]: I1201 11:36:03.934381 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqnhd\" (UniqueName: \"kubernetes.io/projected/c35348c1-21cb-436d-92a2-be0b35bae865-kube-api-access-wqnhd\") pod \"barbican-db-create-q5vbv\" (UID: \"c35348c1-21cb-436d-92a2-be0b35bae865\") " pod="openstack/barbican-db-create-q5vbv" Dec 01 11:36:03 crc kubenswrapper[4958]: I1201 11:36:03.951520 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqnhd\" (UniqueName: \"kubernetes.io/projected/c35348c1-21cb-436d-92a2-be0b35bae865-kube-api-access-wqnhd\") pod \"barbican-db-create-q5vbv\" (UID: \"c35348c1-21cb-436d-92a2-be0b35bae865\") " pod="openstack/barbican-db-create-q5vbv" Dec 01 11:36:04 crc kubenswrapper[4958]: I1201 11:36:04.058093 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-q5vbv" Dec 01 11:36:04 crc kubenswrapper[4958]: I1201 11:36:04.583070 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-q5vbv"] Dec 01 11:36:05 crc kubenswrapper[4958]: I1201 11:36:05.288255 4958 generic.go:334] "Generic (PLEG): container finished" podID="c35348c1-21cb-436d-92a2-be0b35bae865" containerID="02fe415f1ca925a8048ef4a1fb987d78f51fc8ff77b6c8445f0ae7f44cc517af" exitCode=0 Dec 01 11:36:05 crc kubenswrapper[4958]: I1201 11:36:05.288478 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-q5vbv" event={"ID":"c35348c1-21cb-436d-92a2-be0b35bae865","Type":"ContainerDied","Data":"02fe415f1ca925a8048ef4a1fb987d78f51fc8ff77b6c8445f0ae7f44cc517af"} Dec 01 11:36:05 crc kubenswrapper[4958]: I1201 11:36:05.288582 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-q5vbv" event={"ID":"c35348c1-21cb-436d-92a2-be0b35bae865","Type":"ContainerStarted","Data":"a5c3db064b8de781bf72e29c1d0962ae6f6f3a6561c1592684751386fcfc4bda"} Dec 01 11:36:06 crc kubenswrapper[4958]: I1201 11:36:06.748373 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-q5vbv" Dec 01 11:36:06 crc kubenswrapper[4958]: I1201 11:36:06.786216 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqnhd\" (UniqueName: \"kubernetes.io/projected/c35348c1-21cb-436d-92a2-be0b35bae865-kube-api-access-wqnhd\") pod \"c35348c1-21cb-436d-92a2-be0b35bae865\" (UID: \"c35348c1-21cb-436d-92a2-be0b35bae865\") " Dec 01 11:36:06 crc kubenswrapper[4958]: I1201 11:36:06.890107 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c35348c1-21cb-436d-92a2-be0b35bae865-kube-api-access-wqnhd" (OuterVolumeSpecName: "kube-api-access-wqnhd") pod "c35348c1-21cb-436d-92a2-be0b35bae865" (UID: "c35348c1-21cb-436d-92a2-be0b35bae865"). InnerVolumeSpecName "kube-api-access-wqnhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:36:06 crc kubenswrapper[4958]: I1201 11:36:06.891768 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqnhd\" (UniqueName: \"kubernetes.io/projected/c35348c1-21cb-436d-92a2-be0b35bae865-kube-api-access-wqnhd\") on node \"crc\" DevicePath \"\"" Dec 01 11:36:07 crc kubenswrapper[4958]: I1201 11:36:07.308729 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-q5vbv" event={"ID":"c35348c1-21cb-436d-92a2-be0b35bae865","Type":"ContainerDied","Data":"a5c3db064b8de781bf72e29c1d0962ae6f6f3a6561c1592684751386fcfc4bda"} Dec 01 11:36:07 crc kubenswrapper[4958]: I1201 11:36:07.308798 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5c3db064b8de781bf72e29c1d0962ae6f6f3a6561c1592684751386fcfc4bda" Dec 01 11:36:07 crc kubenswrapper[4958]: I1201 11:36:07.308810 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-q5vbv" Dec 01 11:36:13 crc kubenswrapper[4958]: I1201 11:36:13.662037 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-9a62-account-create-sqhzj"] Dec 01 11:36:13 crc kubenswrapper[4958]: E1201 11:36:13.663004 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c35348c1-21cb-436d-92a2-be0b35bae865" containerName="mariadb-database-create" Dec 01 11:36:13 crc kubenswrapper[4958]: I1201 11:36:13.663021 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c35348c1-21cb-436d-92a2-be0b35bae865" containerName="mariadb-database-create" Dec 01 11:36:13 crc kubenswrapper[4958]: I1201 11:36:13.663280 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c35348c1-21cb-436d-92a2-be0b35bae865" containerName="mariadb-database-create" Dec 01 11:36:13 crc kubenswrapper[4958]: I1201 11:36:13.664135 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9a62-account-create-sqhzj" Dec 01 11:36:13 crc kubenswrapper[4958]: I1201 11:36:13.673133 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgr8v\" (UniqueName: \"kubernetes.io/projected/96bf6ac8-2df1-48b4-a3b7-286728a26999-kube-api-access-fgr8v\") pod \"barbican-9a62-account-create-sqhzj\" (UID: \"96bf6ac8-2df1-48b4-a3b7-286728a26999\") " pod="openstack/barbican-9a62-account-create-sqhzj" Dec 01 11:36:13 crc kubenswrapper[4958]: I1201 11:36:13.674162 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9a62-account-create-sqhzj"] Dec 01 11:36:13 crc kubenswrapper[4958]: I1201 11:36:13.677096 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 01 11:36:13 crc kubenswrapper[4958]: I1201 11:36:13.774996 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgr8v\" (UniqueName: \"kubernetes.io/projected/96bf6ac8-2df1-48b4-a3b7-286728a26999-kube-api-access-fgr8v\") pod \"barbican-9a62-account-create-sqhzj\" (UID: \"96bf6ac8-2df1-48b4-a3b7-286728a26999\") " pod="openstack/barbican-9a62-account-create-sqhzj" Dec 01 11:36:13 crc kubenswrapper[4958]: I1201 11:36:13.796556 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgr8v\" (UniqueName: \"kubernetes.io/projected/96bf6ac8-2df1-48b4-a3b7-286728a26999-kube-api-access-fgr8v\") pod \"barbican-9a62-account-create-sqhzj\" (UID: \"96bf6ac8-2df1-48b4-a3b7-286728a26999\") " pod="openstack/barbican-9a62-account-create-sqhzj" Dec 01 11:36:13 crc kubenswrapper[4958]: I1201 11:36:13.997816 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-9a62-account-create-sqhzj" Dec 01 11:36:14 crc kubenswrapper[4958]: I1201 11:36:14.523029 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9a62-account-create-sqhzj"] Dec 01 11:36:14 crc kubenswrapper[4958]: I1201 11:36:14.540201 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 01 11:36:14 crc kubenswrapper[4958]: I1201 11:36:14.798125 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:36:14 crc kubenswrapper[4958]: E1201 11:36:14.798620 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:36:15 crc kubenswrapper[4958]: I1201 11:36:15.384611 4958 generic.go:334] "Generic (PLEG): container finished" podID="96bf6ac8-2df1-48b4-a3b7-286728a26999" containerID="0508062a0bcde9e5e856144828a1560b634ee7bc468d6ebd1bd2cd020befc620" exitCode=0 Dec 01 11:36:15 crc kubenswrapper[4958]: I1201 11:36:15.384695 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9a62-account-create-sqhzj" event={"ID":"96bf6ac8-2df1-48b4-a3b7-286728a26999","Type":"ContainerDied","Data":"0508062a0bcde9e5e856144828a1560b634ee7bc468d6ebd1bd2cd020befc620"} Dec 01 11:36:15 crc kubenswrapper[4958]: I1201 11:36:15.384792 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9a62-account-create-sqhzj" event={"ID":"96bf6ac8-2df1-48b4-a3b7-286728a26999","Type":"ContainerStarted","Data":"04a3d7f5f4488f5b25e5064bb8043d10207c1196d2c1958521a7234e86618c08"} Dec 01 11:36:16 crc kubenswrapper[4958]: I1201 11:36:16.854763 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9a62-account-create-sqhzj" Dec 01 11:36:16 crc kubenswrapper[4958]: I1201 11:36:16.863669 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgr8v\" (UniqueName: \"kubernetes.io/projected/96bf6ac8-2df1-48b4-a3b7-286728a26999-kube-api-access-fgr8v\") pod \"96bf6ac8-2df1-48b4-a3b7-286728a26999\" (UID: \"96bf6ac8-2df1-48b4-a3b7-286728a26999\") " Dec 01 11:36:16 crc kubenswrapper[4958]: I1201 11:36:16.873679 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96bf6ac8-2df1-48b4-a3b7-286728a26999-kube-api-access-fgr8v" (OuterVolumeSpecName: "kube-api-access-fgr8v") pod "96bf6ac8-2df1-48b4-a3b7-286728a26999" (UID: "96bf6ac8-2df1-48b4-a3b7-286728a26999"). InnerVolumeSpecName "kube-api-access-fgr8v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:36:16 crc kubenswrapper[4958]: I1201 11:36:16.966589 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgr8v\" (UniqueName: \"kubernetes.io/projected/96bf6ac8-2df1-48b4-a3b7-286728a26999-kube-api-access-fgr8v\") on node \"crc\" DevicePath \"\"" Dec 01 11:36:17 crc kubenswrapper[4958]: I1201 11:36:17.404247 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9a62-account-create-sqhzj" event={"ID":"96bf6ac8-2df1-48b4-a3b7-286728a26999","Type":"ContainerDied","Data":"04a3d7f5f4488f5b25e5064bb8043d10207c1196d2c1958521a7234e86618c08"} Dec 01 11:36:17 crc kubenswrapper[4958]: I1201 11:36:17.404536 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04a3d7f5f4488f5b25e5064bb8043d10207c1196d2c1958521a7234e86618c08" Dec 01 11:36:17 crc kubenswrapper[4958]: I1201 11:36:17.404350 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9a62-account-create-sqhzj" Dec 01 11:36:19 crc kubenswrapper[4958]: I1201 11:36:19.035795 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-mbbfx"] Dec 01 11:36:19 crc kubenswrapper[4958]: E1201 11:36:19.036261 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96bf6ac8-2df1-48b4-a3b7-286728a26999" containerName="mariadb-account-create" Dec 01 11:36:19 crc kubenswrapper[4958]: I1201 11:36:19.036281 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="96bf6ac8-2df1-48b4-a3b7-286728a26999" containerName="mariadb-account-create" Dec 01 11:36:19 crc kubenswrapper[4958]: I1201 11:36:19.036540 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="96bf6ac8-2df1-48b4-a3b7-286728a26999" containerName="mariadb-account-create" Dec 01 11:36:19 crc kubenswrapper[4958]: I1201 11:36:19.037352 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mbbfx" Dec 01 11:36:19 crc kubenswrapper[4958]: I1201 11:36:19.039766 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4vr6l" Dec 01 11:36:19 crc kubenswrapper[4958]: I1201 11:36:19.040126 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 01 11:36:19 crc kubenswrapper[4958]: I1201 11:36:19.051076 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mbbfx"] Dec 01 11:36:19 crc kubenswrapper[4958]: I1201 11:36:19.109237 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e65fdf38-96f7-4d0c-939a-46f91892bc94-db-sync-config-data\") pod \"barbican-db-sync-mbbfx\" (UID: \"e65fdf38-96f7-4d0c-939a-46f91892bc94\") " pod="openstack/barbican-db-sync-mbbfx" Dec 01 11:36:19 crc kubenswrapper[4958]: I1201 11:36:19.109317 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lks4k\" (UniqueName: \"kubernetes.io/projected/e65fdf38-96f7-4d0c-939a-46f91892bc94-kube-api-access-lks4k\") pod \"barbican-db-sync-mbbfx\" (UID: \"e65fdf38-96f7-4d0c-939a-46f91892bc94\") " pod="openstack/barbican-db-sync-mbbfx" Dec 01 11:36:19 crc kubenswrapper[4958]: I1201 11:36:19.109483 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e65fdf38-96f7-4d0c-939a-46f91892bc94-combined-ca-bundle\") pod \"barbican-db-sync-mbbfx\" (UID: \"e65fdf38-96f7-4d0c-939a-46f91892bc94\") " pod="openstack/barbican-db-sync-mbbfx" Dec 01 11:36:19 crc kubenswrapper[4958]: I1201 11:36:19.211282 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e65fdf38-96f7-4d0c-939a-46f91892bc94-combined-ca-bundle\") pod \"barbican-db-sync-mbbfx\" (UID: \"e65fdf38-96f7-4d0c-939a-46f91892bc94\") " pod="openstack/barbican-db-sync-mbbfx" Dec 01 11:36:19 crc kubenswrapper[4958]: I1201 11:36:19.211645 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e65fdf38-96f7-4d0c-939a-46f91892bc94-db-sync-config-data\") pod \"barbican-db-sync-mbbfx\" (UID: \"e65fdf38-96f7-4d0c-939a-46f91892bc94\") " pod="openstack/barbican-db-sync-mbbfx" Dec 01 11:36:19 crc kubenswrapper[4958]: I1201 11:36:19.211869 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lks4k\" (UniqueName: \"kubernetes.io/projected/e65fdf38-96f7-4d0c-939a-46f91892bc94-kube-api-access-lks4k\") pod \"barbican-db-sync-mbbfx\" (UID: \"e65fdf38-96f7-4d0c-939a-46f91892bc94\") " pod="openstack/barbican-db-sync-mbbfx" Dec 01 11:36:19 crc kubenswrapper[4958]: I1201 11:36:19.216244 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e65fdf38-96f7-4d0c-939a-46f91892bc94-db-sync-config-data\") pod \"barbican-db-sync-mbbfx\" (UID: \"e65fdf38-96f7-4d0c-939a-46f91892bc94\") " pod="openstack/barbican-db-sync-mbbfx" Dec 01 11:36:19 crc kubenswrapper[4958]: I1201 11:36:19.216539 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e65fdf38-96f7-4d0c-939a-46f91892bc94-combined-ca-bundle\") pod \"barbican-db-sync-mbbfx\" (UID: \"e65fdf38-96f7-4d0c-939a-46f91892bc94\") " pod="openstack/barbican-db-sync-mbbfx" Dec 01 11:36:19 crc kubenswrapper[4958]: I1201 11:36:19.233222 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lks4k\" (UniqueName: \"kubernetes.io/projected/e65fdf38-96f7-4d0c-939a-46f91892bc94-kube-api-access-lks4k\") pod \"barbican-db-sync-mbbfx\" (UID: \"e65fdf38-96f7-4d0c-939a-46f91892bc94\") " pod="openstack/barbican-db-sync-mbbfx" Dec 01 11:36:19 crc kubenswrapper[4958]: I1201 11:36:19.377751 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mbbfx" Dec 01 11:36:19 crc kubenswrapper[4958]: I1201 11:36:19.835210 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mbbfx"] Dec 01 11:36:20 crc kubenswrapper[4958]: I1201 11:36:20.442441 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mbbfx" event={"ID":"e65fdf38-96f7-4d0c-939a-46f91892bc94","Type":"ContainerStarted","Data":"47ae2568e6f41867471eeb4e41f5dbb8d85e695c479d2a909abdfb4607f00d8d"} Dec 01 11:36:20 crc kubenswrapper[4958]: I1201 11:36:20.442720 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mbbfx" event={"ID":"e65fdf38-96f7-4d0c-939a-46f91892bc94","Type":"ContainerStarted","Data":"e6e284856d66f3bf96ba081f3cd858ab27ea3f700d7da63fadf3feea880b0bd5"} Dec 01 11:36:20 crc kubenswrapper[4958]: I1201 11:36:20.459267 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-mbbfx" podStartSLOduration=1.4592393289999999 podStartE2EDuration="1.459239329s" podCreationTimestamp="2025-12-01 11:36:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:36:20.458768516 +0000 UTC m=+5827.967557553" watchObservedRunningTime="2025-12-01 11:36:20.459239329 +0000 UTC m=+5827.968028366" Dec 01 11:36:21 crc kubenswrapper[4958]: I1201 11:36:21.452717 4958 generic.go:334] "Generic (PLEG): container finished" podID="e65fdf38-96f7-4d0c-939a-46f91892bc94" containerID="47ae2568e6f41867471eeb4e41f5dbb8d85e695c479d2a909abdfb4607f00d8d" exitCode=0 Dec 01 11:36:21 crc kubenswrapper[4958]: I1201 11:36:21.452767 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mbbfx" event={"ID":"e65fdf38-96f7-4d0c-939a-46f91892bc94","Type":"ContainerDied","Data":"47ae2568e6f41867471eeb4e41f5dbb8d85e695c479d2a909abdfb4607f00d8d"} Dec 01 11:36:22 crc kubenswrapper[4958]: I1201 11:36:22.826353 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mbbfx" Dec 01 11:36:22 crc kubenswrapper[4958]: I1201 11:36:22.978594 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lks4k\" (UniqueName: \"kubernetes.io/projected/e65fdf38-96f7-4d0c-939a-46f91892bc94-kube-api-access-lks4k\") pod \"e65fdf38-96f7-4d0c-939a-46f91892bc94\" (UID: \"e65fdf38-96f7-4d0c-939a-46f91892bc94\") " Dec 01 11:36:22 crc kubenswrapper[4958]: I1201 11:36:22.978702 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e65fdf38-96f7-4d0c-939a-46f91892bc94-db-sync-config-data\") pod \"e65fdf38-96f7-4d0c-939a-46f91892bc94\" (UID: \"e65fdf38-96f7-4d0c-939a-46f91892bc94\") " Dec 01 11:36:22 crc kubenswrapper[4958]: I1201 11:36:22.978785 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e65fdf38-96f7-4d0c-939a-46f91892bc94-combined-ca-bundle\") pod \"e65fdf38-96f7-4d0c-939a-46f91892bc94\" (UID: \"e65fdf38-96f7-4d0c-939a-46f91892bc94\") " Dec 01 11:36:22 crc kubenswrapper[4958]: I1201 11:36:22.985436 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e65fdf38-96f7-4d0c-939a-46f91892bc94-kube-api-access-lks4k" (OuterVolumeSpecName: "kube-api-access-lks4k") pod "e65fdf38-96f7-4d0c-939a-46f91892bc94" (UID: "e65fdf38-96f7-4d0c-939a-46f91892bc94"). InnerVolumeSpecName "kube-api-access-lks4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:36:22 crc kubenswrapper[4958]: I1201 11:36:22.991970 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e65fdf38-96f7-4d0c-939a-46f91892bc94-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e65fdf38-96f7-4d0c-939a-46f91892bc94" (UID: "e65fdf38-96f7-4d0c-939a-46f91892bc94"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.012025 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e65fdf38-96f7-4d0c-939a-46f91892bc94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e65fdf38-96f7-4d0c-939a-46f91892bc94" (UID: "e65fdf38-96f7-4d0c-939a-46f91892bc94"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.080618 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lks4k\" (UniqueName: \"kubernetes.io/projected/e65fdf38-96f7-4d0c-939a-46f91892bc94-kube-api-access-lks4k\") on node \"crc\" DevicePath \"\"" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.080657 4958 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e65fdf38-96f7-4d0c-939a-46f91892bc94-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.080665 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e65fdf38-96f7-4d0c-939a-46f91892bc94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.479403 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mbbfx" event={"ID":"e65fdf38-96f7-4d0c-939a-46f91892bc94","Type":"ContainerDied","Data":"e6e284856d66f3bf96ba081f3cd858ab27ea3f700d7da63fadf3feea880b0bd5"} Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.479491 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6e284856d66f3bf96ba081f3cd858ab27ea3f700d7da63fadf3feea880b0bd5" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.479602 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mbbfx" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.739400 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-644f5cb446-jk7bl"] Dec 01 11:36:23 crc kubenswrapper[4958]: E1201 11:36:23.739811 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e65fdf38-96f7-4d0c-939a-46f91892bc94" containerName="barbican-db-sync" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.739827 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e65fdf38-96f7-4d0c-939a-46f91892bc94" containerName="barbican-db-sync" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.740019 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="e65fdf38-96f7-4d0c-939a-46f91892bc94" containerName="barbican-db-sync" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.741021 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.745524 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.745825 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.746000 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4vr6l" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.759219 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-c5d7654b7-786gr"] Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.760818 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-c5d7654b7-786gr" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.763263 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.783076 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-644f5cb446-jk7bl"] Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.874411 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-c5d7654b7-786gr"] Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.923070 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f-config-data\") pod \"barbican-worker-c5d7654b7-786gr\" (UID: \"4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f\") " pod="openstack/barbican-worker-c5d7654b7-786gr" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.923160 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f-combined-ca-bundle\") pod \"barbican-worker-c5d7654b7-786gr\" (UID: \"4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f\") " pod="openstack/barbican-worker-c5d7654b7-786gr" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.923188 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq4sf\" (UniqueName: \"kubernetes.io/projected/f2d1f043-186b-408d-af6c-f22c0b4e5171-kube-api-access-mq4sf\") pod \"barbican-keystone-listener-644f5cb446-jk7bl\" (UID: \"f2d1f043-186b-408d-af6c-f22c0b4e5171\") " pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.923222 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2d1f043-186b-408d-af6c-f22c0b4e5171-config-data-custom\") pod \"barbican-keystone-listener-644f5cb446-jk7bl\" (UID: \"f2d1f043-186b-408d-af6c-f22c0b4e5171\") " pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.923241 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2d1f043-186b-408d-af6c-f22c0b4e5171-logs\") pod \"barbican-keystone-listener-644f5cb446-jk7bl\" (UID: \"f2d1f043-186b-408d-af6c-f22c0b4e5171\") " pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.923322 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f-config-data-custom\") pod \"barbican-worker-c5d7654b7-786gr\" (UID: \"4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f\") " pod="openstack/barbican-worker-c5d7654b7-786gr" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.923362 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2d1f043-186b-408d-af6c-f22c0b4e5171-config-data\") pod \"barbican-keystone-listener-644f5cb446-jk7bl\" (UID: \"f2d1f043-186b-408d-af6c-f22c0b4e5171\") " 
pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.923407 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f-logs\") pod \"barbican-worker-c5d7654b7-786gr\" (UID: \"4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f\") " pod="openstack/barbican-worker-c5d7654b7-786gr" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.923443 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d1f043-186b-408d-af6c-f22c0b4e5171-combined-ca-bundle\") pod \"barbican-keystone-listener-644f5cb446-jk7bl\" (UID: \"f2d1f043-186b-408d-af6c-f22c0b4e5171\") " pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.923483 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzkjr\" (UniqueName: \"kubernetes.io/projected/4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f-kube-api-access-xzkjr\") pod \"barbican-worker-c5d7654b7-786gr\" (UID: \"4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f\") " pod="openstack/barbican-worker-c5d7654b7-786gr" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.951268 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56f67bddd7-jz8qt"] Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.954749 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:23 crc kubenswrapper[4958]: I1201 11:36:23.974661 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56f67bddd7-jz8qt"] Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.010782 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7785bcc974-dndp9"] Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.016361 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.026128 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f-combined-ca-bundle\") pod \"barbican-worker-c5d7654b7-786gr\" (UID: \"4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f\") " pod="openstack/barbican-worker-c5d7654b7-786gr" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.026175 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq4sf\" (UniqueName: \"kubernetes.io/projected/f2d1f043-186b-408d-af6c-f22c0b4e5171-kube-api-access-mq4sf\") pod \"barbican-keystone-listener-644f5cb446-jk7bl\" (UID: \"f2d1f043-186b-408d-af6c-f22c0b4e5171\") " pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.026202 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2d1f043-186b-408d-af6c-f22c0b4e5171-config-data-custom\") pod \"barbican-keystone-listener-644f5cb446-jk7bl\" (UID: \"f2d1f043-186b-408d-af6c-f22c0b4e5171\") " pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.026222 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2d1f043-186b-408d-af6c-f22c0b4e5171-logs\") pod \"barbican-keystone-listener-644f5cb446-jk7bl\" (UID: \"f2d1f043-186b-408d-af6c-f22c0b4e5171\") " pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.026265 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f-config-data-custom\") pod \"barbican-worker-c5d7654b7-786gr\" (UID: \"4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f\") " pod="openstack/barbican-worker-c5d7654b7-786gr" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.026292 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2d1f043-186b-408d-af6c-f22c0b4e5171-config-data\") pod \"barbican-keystone-listener-644f5cb446-jk7bl\" (UID: \"f2d1f043-186b-408d-af6c-f22c0b4e5171\") " pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.026327 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f-logs\") pod \"barbican-worker-c5d7654b7-786gr\" (UID: \"4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f\") " pod="openstack/barbican-worker-c5d7654b7-786gr" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.026354 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d1f043-186b-408d-af6c-f22c0b4e5171-combined-ca-bundle\") pod \"barbican-keystone-listener-644f5cb446-jk7bl\" (UID: \"f2d1f043-186b-408d-af6c-f22c0b4e5171\") " pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.026388 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzkjr\" (UniqueName: 
\"kubernetes.io/projected/4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f-kube-api-access-xzkjr\") pod \"barbican-worker-c5d7654b7-786gr\" (UID: \"4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f\") " pod="openstack/barbican-worker-c5d7654b7-786gr" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.026412 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f-config-data\") pod \"barbican-worker-c5d7654b7-786gr\" (UID: \"4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f\") " pod="openstack/barbican-worker-c5d7654b7-786gr" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.029887 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2d1f043-186b-408d-af6c-f22c0b4e5171-logs\") pod \"barbican-keystone-listener-644f5cb446-jk7bl\" (UID: \"f2d1f043-186b-408d-af6c-f22c0b4e5171\") " pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.029954 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7785bcc974-dndp9"] Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.030136 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f-logs\") pod \"barbican-worker-c5d7654b7-786gr\" (UID: \"4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f\") " pod="openstack/barbican-worker-c5d7654b7-786gr" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.030620 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.055829 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2d1f043-186b-408d-af6c-f22c0b4e5171-config-data\") pod \"barbican-keystone-listener-644f5cb446-jk7bl\" (UID: \"f2d1f043-186b-408d-af6c-f22c0b4e5171\") " pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.057946 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f-combined-ca-bundle\") pod \"barbican-worker-c5d7654b7-786gr\" (UID: \"4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f\") " pod="openstack/barbican-worker-c5d7654b7-786gr" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.059506 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f-config-data-custom\") pod \"barbican-worker-c5d7654b7-786gr\" (UID: \"4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f\") " pod="openstack/barbican-worker-c5d7654b7-786gr" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.060133 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d1f043-186b-408d-af6c-f22c0b4e5171-combined-ca-bundle\") pod \"barbican-keystone-listener-644f5cb446-jk7bl\" (UID: \"f2d1f043-186b-408d-af6c-f22c0b4e5171\") " pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.060545 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f-config-data\") pod 
\"barbican-worker-c5d7654b7-786gr\" (UID: \"4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f\") " pod="openstack/barbican-worker-c5d7654b7-786gr" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.068698 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2d1f043-186b-408d-af6c-f22c0b4e5171-config-data-custom\") pod \"barbican-keystone-listener-644f5cb446-jk7bl\" (UID: \"f2d1f043-186b-408d-af6c-f22c0b4e5171\") " pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.075606 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzkjr\" (UniqueName: \"kubernetes.io/projected/4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f-kube-api-access-xzkjr\") pod \"barbican-worker-c5d7654b7-786gr\" (UID: \"4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f\") " pod="openstack/barbican-worker-c5d7654b7-786gr" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.076513 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq4sf\" (UniqueName: \"kubernetes.io/projected/f2d1f043-186b-408d-af6c-f22c0b4e5171-kube-api-access-mq4sf\") pod \"barbican-keystone-listener-644f5cb446-jk7bl\" (UID: \"f2d1f043-186b-408d-af6c-f22c0b4e5171\") " pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.112564 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-c5d7654b7-786gr" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.128786 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a261ecd4-5929-49b5-b2b5-e2d3f7e4683e-combined-ca-bundle\") pod \"barbican-api-7785bcc974-dndp9\" (UID: \"a261ecd4-5929-49b5-b2b5-e2d3f7e4683e\") " pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.128898 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a261ecd4-5929-49b5-b2b5-e2d3f7e4683e-config-data-custom\") pod \"barbican-api-7785bcc974-dndp9\" (UID: \"a261ecd4-5929-49b5-b2b5-e2d3f7e4683e\") " pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.128977 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a261ecd4-5929-49b5-b2b5-e2d3f7e4683e-logs\") pod \"barbican-api-7785bcc974-dndp9\" (UID: \"a261ecd4-5929-49b5-b2b5-e2d3f7e4683e\") " pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.129132 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-config\") pod \"dnsmasq-dns-56f67bddd7-jz8qt\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") " pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.129366 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxfxl\" (UniqueName: \"kubernetes.io/projected/a261ecd4-5929-49b5-b2b5-e2d3f7e4683e-kube-api-access-zxfxl\") pod \"barbican-api-7785bcc974-dndp9\" (UID: \"a261ecd4-5929-49b5-b2b5-e2d3f7e4683e\") " 
pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.129450 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-dns-svc\") pod \"dnsmasq-dns-56f67bddd7-jz8qt\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") " pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.129496 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjbmj\" (UniqueName: \"kubernetes.io/projected/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-kube-api-access-zjbmj\") pod \"dnsmasq-dns-56f67bddd7-jz8qt\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") " pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.129604 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-ovsdbserver-sb\") pod \"dnsmasq-dns-56f67bddd7-jz8qt\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") " pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.129687 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a261ecd4-5929-49b5-b2b5-e2d3f7e4683e-config-data\") pod \"barbican-api-7785bcc974-dndp9\" (UID: \"a261ecd4-5929-49b5-b2b5-e2d3f7e4683e\") " pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.129702 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-ovsdbserver-nb\") pod \"dnsmasq-dns-56f67bddd7-jz8qt\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") " pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.231133 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxfxl\" (UniqueName: \"kubernetes.io/projected/a261ecd4-5929-49b5-b2b5-e2d3f7e4683e-kube-api-access-zxfxl\") pod \"barbican-api-7785bcc974-dndp9\" (UID: \"a261ecd4-5929-49b5-b2b5-e2d3f7e4683e\") " pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.231178 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-dns-svc\") pod \"dnsmasq-dns-56f67bddd7-jz8qt\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") " pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.231206 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjbmj\" (UniqueName: \"kubernetes.io/projected/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-kube-api-access-zjbmj\") pod \"dnsmasq-dns-56f67bddd7-jz8qt\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") " pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.231252 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-ovsdbserver-sb\") pod 
\"dnsmasq-dns-56f67bddd7-jz8qt\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") " pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.231287 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a261ecd4-5929-49b5-b2b5-e2d3f7e4683e-config-data\") pod \"barbican-api-7785bcc974-dndp9\" (UID: \"a261ecd4-5929-49b5-b2b5-e2d3f7e4683e\") " pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.231304 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-ovsdbserver-nb\") pod \"dnsmasq-dns-56f67bddd7-jz8qt\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") " pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.231339 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a261ecd4-5929-49b5-b2b5-e2d3f7e4683e-combined-ca-bundle\") pod \"barbican-api-7785bcc974-dndp9\" (UID: \"a261ecd4-5929-49b5-b2b5-e2d3f7e4683e\") " pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.231359 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a261ecd4-5929-49b5-b2b5-e2d3f7e4683e-config-data-custom\") pod \"barbican-api-7785bcc974-dndp9\" (UID: \"a261ecd4-5929-49b5-b2b5-e2d3f7e4683e\") " pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.231387 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a261ecd4-5929-49b5-b2b5-e2d3f7e4683e-logs\") pod \"barbican-api-7785bcc974-dndp9\" (UID: \"a261ecd4-5929-49b5-b2b5-e2d3f7e4683e\") " pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.231438 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-config\") pod \"dnsmasq-dns-56f67bddd7-jz8qt\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") " pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.232435 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-config\") pod \"dnsmasq-dns-56f67bddd7-jz8qt\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") " pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.234143 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-ovsdbserver-nb\") pod \"dnsmasq-dns-56f67bddd7-jz8qt\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") " pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.234348 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-ovsdbserver-sb\") pod \"dnsmasq-dns-56f67bddd7-jz8qt\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") " 
pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.234466 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-dns-svc\") pod \"dnsmasq-dns-56f67bddd7-jz8qt\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") " pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.241715 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a261ecd4-5929-49b5-b2b5-e2d3f7e4683e-logs\") pod \"barbican-api-7785bcc974-dndp9\" (UID: \"a261ecd4-5929-49b5-b2b5-e2d3f7e4683e\") " pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.242522 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a261ecd4-5929-49b5-b2b5-e2d3f7e4683e-combined-ca-bundle\") pod \"barbican-api-7785bcc974-dndp9\" (UID: \"a261ecd4-5929-49b5-b2b5-e2d3f7e4683e\") " pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.249742 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a261ecd4-5929-49b5-b2b5-e2d3f7e4683e-config-data-custom\") pod \"barbican-api-7785bcc974-dndp9\" (UID: \"a261ecd4-5929-49b5-b2b5-e2d3f7e4683e\") " pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.251010 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a261ecd4-5929-49b5-b2b5-e2d3f7e4683e-config-data\") pod \"barbican-api-7785bcc974-dndp9\" (UID: \"a261ecd4-5929-49b5-b2b5-e2d3f7e4683e\") " pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.256728 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxfxl\" (UniqueName: \"kubernetes.io/projected/a261ecd4-5929-49b5-b2b5-e2d3f7e4683e-kube-api-access-zxfxl\") pod \"barbican-api-7785bcc974-dndp9\" (UID: \"a261ecd4-5929-49b5-b2b5-e2d3f7e4683e\") " pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.258664 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjbmj\" (UniqueName: \"kubernetes.io/projected/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-kube-api-access-zjbmj\") pod \"dnsmasq-dns-56f67bddd7-jz8qt\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") " pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.274603 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.363654 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.424006 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.627666 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-c5d7654b7-786gr"] Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.796979 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56f67bddd7-jz8qt"] Dec 01 11:36:24 crc kubenswrapper[4958]: I1201 11:36:24.933751 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-644f5cb446-jk7bl"] Dec 01 11:36:24 crc kubenswrapper[4958]: W1201 11:36:24.933887 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2d1f043_186b_408d_af6c_f22c0b4e5171.slice/crio-78e1e5f05508047c2acf8cd77e8c481f0e99895a0b9a70df71de0027fae04eb2 WatchSource:0}: Error finding container 78e1e5f05508047c2acf8cd77e8c481f0e99895a0b9a70df71de0027fae04eb2: Status 404 returned error can't find the container with id 78e1e5f05508047c2acf8cd77e8c481f0e99895a0b9a70df71de0027fae04eb2 Dec 01 11:36:25 crc kubenswrapper[4958]: I1201 11:36:25.025495 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7785bcc974-dndp9"] Dec 01 11:36:25 crc kubenswrapper[4958]: W1201 11:36:25.090800 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda261ecd4_5929_49b5_b2b5_e2d3f7e4683e.slice/crio-3be3eb5aa996447fa46c19eb816ba0097100d39bc4fcdecc0eeda296d67b8fc1 WatchSource:0}: Error finding container 3be3eb5aa996447fa46c19eb816ba0097100d39bc4fcdecc0eeda296d67b8fc1: Status 404 returned error can't find the container with id 3be3eb5aa996447fa46c19eb816ba0097100d39bc4fcdecc0eeda296d67b8fc1 Dec 01 11:36:25 crc kubenswrapper[4958]: I1201 11:36:25.502891 4958 generic.go:334] "Generic (PLEG): container finished" podID="cbd8bc53-cfed-443b-b838-7c5493e8e9b5" containerID="d79c7aaf8d6415b940cbcde68ae0da25d13e11d37e452cbed528fa774b505e40" exitCode=0 Dec 01 11:36:25 crc kubenswrapper[4958]: I1201 11:36:25.503014 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" event={"ID":"cbd8bc53-cfed-443b-b838-7c5493e8e9b5","Type":"ContainerDied","Data":"d79c7aaf8d6415b940cbcde68ae0da25d13e11d37e452cbed528fa774b505e40"} Dec 01 11:36:25 crc kubenswrapper[4958]: I1201 11:36:25.503397 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" event={"ID":"cbd8bc53-cfed-443b-b838-7c5493e8e9b5","Type":"ContainerStarted","Data":"36811d82b02d6ecf8f5abc5b379794f8caf77195338be5880fa34e9dca53b3ab"} Dec 01 11:36:25 crc kubenswrapper[4958]: I1201 11:36:25.506975 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7785bcc974-dndp9" event={"ID":"a261ecd4-5929-49b5-b2b5-e2d3f7e4683e","Type":"ContainerStarted","Data":"75d1076679f4036db1ad4fdfad70e76ce72d925112e772463edd02014f612763"} Dec 01 11:36:25 crc kubenswrapper[4958]: I1201 11:36:25.507026 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7785bcc974-dndp9" event={"ID":"a261ecd4-5929-49b5-b2b5-e2d3f7e4683e","Type":"ContainerStarted","Data":"15c6b2d0221003c06a6141e5c3e13c17b3041b1b2cfea7887a4e61975714d859"} Dec 01 11:36:25 crc kubenswrapper[4958]: I1201 11:36:25.507037 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7785bcc974-dndp9" 
event={"ID":"a261ecd4-5929-49b5-b2b5-e2d3f7e4683e","Type":"ContainerStarted","Data":"3be3eb5aa996447fa46c19eb816ba0097100d39bc4fcdecc0eeda296d67b8fc1"} Dec 01 11:36:25 crc kubenswrapper[4958]: I1201 11:36:25.507141 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:25 crc kubenswrapper[4958]: I1201 11:36:25.512179 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" event={"ID":"f2d1f043-186b-408d-af6c-f22c0b4e5171","Type":"ContainerStarted","Data":"e056e85853a10a0ea66197f537bf839cb847a8fb6b54b41a1be7154dd4dc28a7"} Dec 01 11:36:25 crc kubenswrapper[4958]: I1201 11:36:25.512236 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" event={"ID":"f2d1f043-186b-408d-af6c-f22c0b4e5171","Type":"ContainerStarted","Data":"fe5da2b309e36f0c106b9c5de637735616c10bc7b7b9c4fe1630e8cd15a42d19"} Dec 01 11:36:25 crc kubenswrapper[4958]: I1201 11:36:25.512248 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" event={"ID":"f2d1f043-186b-408d-af6c-f22c0b4e5171","Type":"ContainerStarted","Data":"78e1e5f05508047c2acf8cd77e8c481f0e99895a0b9a70df71de0027fae04eb2"} Dec 01 11:36:25 crc kubenswrapper[4958]: I1201 11:36:25.513665 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c5d7654b7-786gr" event={"ID":"4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f","Type":"ContainerStarted","Data":"ef1338b703c132b97cf2fdcb2d6cc9189eca4fdf865c7f66571e7b45ae446b49"} Dec 01 11:36:25 crc kubenswrapper[4958]: I1201 11:36:25.513705 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c5d7654b7-786gr" event={"ID":"4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f","Type":"ContainerStarted","Data":"b7ed53a53bf5466ee510929b52caf213fe3b4df3a97bd63a843c524605bdef80"} Dec 01 11:36:25 crc kubenswrapper[4958]: I1201 11:36:25.513720 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c5d7654b7-786gr" event={"ID":"4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f","Type":"ContainerStarted","Data":"0c0eff1da5826bfb18169f20aeb7c80f3e346332aa16bf4f3caaf6900ae25f56"} Dec 01 11:36:25 crc kubenswrapper[4958]: I1201 11:36:25.549790 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7785bcc974-dndp9" podStartSLOduration=2.549766442 podStartE2EDuration="2.549766442s" podCreationTimestamp="2025-12-01 11:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:36:25.537466964 +0000 UTC m=+5833.046256001" watchObservedRunningTime="2025-12-01 11:36:25.549766442 +0000 UTC m=+5833.058555479" Dec 01 11:36:25 crc kubenswrapper[4958]: I1201 11:36:25.579113 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-c5d7654b7-786gr" podStartSLOduration=2.579070262 podStartE2EDuration="2.579070262s" podCreationTimestamp="2025-12-01 11:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:36:25.574304477 +0000 UTC m=+5833.083093524" watchObservedRunningTime="2025-12-01 11:36:25.579070262 +0000 UTC m=+5833.087859289" Dec 01 11:36:25 crc kubenswrapper[4958]: I1201 11:36:25.603263 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-keystone-listener-644f5cb446-jk7bl" podStartSLOduration=2.603248737 podStartE2EDuration="2.603248737s" podCreationTimestamp="2025-12-01 11:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:36:25.600604162 +0000 UTC m=+5833.109393199" watchObservedRunningTime="2025-12-01 11:36:25.603248737 +0000 UTC m=+5833.112037764" Dec 01 11:36:26 crc kubenswrapper[4958]: I1201 11:36:26.526038 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" event={"ID":"cbd8bc53-cfed-443b-b838-7c5493e8e9b5","Type":"ContainerStarted","Data":"414d8932026915d151266f22691f7e7cc35cb8ecab688de2d2face12eea55f62"} Dec 01 11:36:26 crc kubenswrapper[4958]: I1201 11:36:26.526591 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:26 crc kubenswrapper[4958]: I1201 11:36:26.526819 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:26 crc kubenswrapper[4958]: I1201 11:36:26.552600 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" podStartSLOduration=3.552571892 podStartE2EDuration="3.552571892s" podCreationTimestamp="2025-12-01 11:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:36:26.544797861 +0000 UTC m=+5834.053586978" watchObservedRunningTime="2025-12-01 11:36:26.552571892 +0000 UTC m=+5834.061360969" Dec 01 11:36:28 crc kubenswrapper[4958]: I1201 11:36:28.797631 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:36:28 crc kubenswrapper[4958]: E1201 11:36:28.798376 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:36:34 crc kubenswrapper[4958]: I1201 11:36:34.277117 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" Dec 01 11:36:34 crc kubenswrapper[4958]: I1201 11:36:34.372337 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-779b6bf5fc-xbklh"] Dec 01 11:36:34 crc kubenswrapper[4958]: I1201 11:36:34.372655 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" podUID="f063f400-6953-430b-8035-7c648b20d92e" containerName="dnsmasq-dns" containerID="cri-o://b1a6d7044cd3639bea39d4c8a8ae8da85a5c3edc7277a63e841e896c81e7db36" gracePeriod=10 Dec 01 11:36:34 crc kubenswrapper[4958]: I1201 11:36:34.608760 4958 generic.go:334] "Generic (PLEG): container finished" podID="f063f400-6953-430b-8035-7c648b20d92e" containerID="b1a6d7044cd3639bea39d4c8a8ae8da85a5c3edc7277a63e841e896c81e7db36" exitCode=0 Dec 01 11:36:34 crc kubenswrapper[4958]: I1201 11:36:34.608811 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" 
event={"ID":"f063f400-6953-430b-8035-7c648b20d92e","Type":"ContainerDied","Data":"b1a6d7044cd3639bea39d4c8a8ae8da85a5c3edc7277a63e841e896c81e7db36"} Dec 01 11:36:34 crc kubenswrapper[4958]: I1201 11:36:34.903077 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.019729 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-config\") pod \"f063f400-6953-430b-8035-7c648b20d92e\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.019777 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-ovsdbserver-nb\") pod \"f063f400-6953-430b-8035-7c648b20d92e\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.019975 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nhh5\" (UniqueName: \"kubernetes.io/projected/f063f400-6953-430b-8035-7c648b20d92e-kube-api-access-6nhh5\") pod \"f063f400-6953-430b-8035-7c648b20d92e\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.020026 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-ovsdbserver-sb\") pod \"f063f400-6953-430b-8035-7c648b20d92e\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.020047 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-dns-svc\") pod \"f063f400-6953-430b-8035-7c648b20d92e\" (UID: \"f063f400-6953-430b-8035-7c648b20d92e\") " Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.025816 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f063f400-6953-430b-8035-7c648b20d92e-kube-api-access-6nhh5" (OuterVolumeSpecName: "kube-api-access-6nhh5") pod "f063f400-6953-430b-8035-7c648b20d92e" (UID: "f063f400-6953-430b-8035-7c648b20d92e"). InnerVolumeSpecName "kube-api-access-6nhh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.071050 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f063f400-6953-430b-8035-7c648b20d92e" (UID: "f063f400-6953-430b-8035-7c648b20d92e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.077186 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f063f400-6953-430b-8035-7c648b20d92e" (UID: "f063f400-6953-430b-8035-7c648b20d92e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.077965 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f063f400-6953-430b-8035-7c648b20d92e" (UID: "f063f400-6953-430b-8035-7c648b20d92e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.081476 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-config" (OuterVolumeSpecName: "config") pod "f063f400-6953-430b-8035-7c648b20d92e" (UID: "f063f400-6953-430b-8035-7c648b20d92e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.122367 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.122411 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.122423 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-config\") on node \"crc\" DevicePath \"\"" Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.122434 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f063f400-6953-430b-8035-7c648b20d92e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.122449 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nhh5\" (UniqueName: \"kubernetes.io/projected/f063f400-6953-430b-8035-7c648b20d92e-kube-api-access-6nhh5\") on node \"crc\" DevicePath \"\"" Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.633577 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" event={"ID":"f063f400-6953-430b-8035-7c648b20d92e","Type":"ContainerDied","Data":"4b67cc2718daff9aa87937b198d5d55420fb8010d5200d71a4b7ec2df73cbdd4"} Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.633634 4958 scope.go:117] "RemoveContainer" containerID="b1a6d7044cd3639bea39d4c8a8ae8da85a5c3edc7277a63e841e896c81e7db36" Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.633742 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-779b6bf5fc-xbklh" Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.699300 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-779b6bf5fc-xbklh"] Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.700936 4958 scope.go:117] "RemoveContainer" containerID="f6133af47d16a674549c8df0995206eda43be7d05b2a267e9b658ea3c9105912" Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.710547 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-779b6bf5fc-xbklh"] Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.812995 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f063f400-6953-430b-8035-7c648b20d92e" path="/var/lib/kubelet/pods/f063f400-6953-430b-8035-7c648b20d92e/volumes" Dec 01 11:36:35 crc kubenswrapper[4958]: I1201 11:36:35.980189 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:36 crc kubenswrapper[4958]: I1201 11:36:36.074400 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7785bcc974-dndp9" Dec 01 11:36:39 crc kubenswrapper[4958]: I1201 11:36:39.798449 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:36:39 crc kubenswrapper[4958]: E1201 11:36:39.799188 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:36:48 crc kubenswrapper[4958]: I1201 11:36:48.441914 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-nrgc9"] Dec 01 11:36:48 crc kubenswrapper[4958]: E1201 11:36:48.443007 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f063f400-6953-430b-8035-7c648b20d92e" containerName="init" Dec 01 11:36:48 crc kubenswrapper[4958]: I1201 11:36:48.443025 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f063f400-6953-430b-8035-7c648b20d92e" containerName="init" Dec 01 11:36:48 crc kubenswrapper[4958]: E1201 11:36:48.443058 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f063f400-6953-430b-8035-7c648b20d92e" containerName="dnsmasq-dns" Dec 01 11:36:48 crc kubenswrapper[4958]: I1201 11:36:48.443066 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f063f400-6953-430b-8035-7c648b20d92e" containerName="dnsmasq-dns" Dec 01 11:36:48 crc kubenswrapper[4958]: I1201 11:36:48.443274 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f063f400-6953-430b-8035-7c648b20d92e" containerName="dnsmasq-dns" Dec 01 11:36:48 crc kubenswrapper[4958]: I1201 11:36:48.444838 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-nrgc9" Dec 01 11:36:48 crc kubenswrapper[4958]: I1201 11:36:48.466279 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-nrgc9"] Dec 01 11:36:48 crc kubenswrapper[4958]: I1201 11:36:48.475938 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bncw\" (UniqueName: \"kubernetes.io/projected/9a0a2476-9374-4bbb-aba1-d6861a36cc56-kube-api-access-8bncw\") pod \"neutron-db-create-nrgc9\" (UID: \"9a0a2476-9374-4bbb-aba1-d6861a36cc56\") " pod="openstack/neutron-db-create-nrgc9" Dec 01 11:36:48 crc kubenswrapper[4958]: I1201 11:36:48.577285 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bncw\" (UniqueName: \"kubernetes.io/projected/9a0a2476-9374-4bbb-aba1-d6861a36cc56-kube-api-access-8bncw\") pod \"neutron-db-create-nrgc9\" (UID: \"9a0a2476-9374-4bbb-aba1-d6861a36cc56\") " pod="openstack/neutron-db-create-nrgc9" Dec 01 11:36:48 crc kubenswrapper[4958]: I1201 11:36:48.601561 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bncw\" (UniqueName: \"kubernetes.io/projected/9a0a2476-9374-4bbb-aba1-d6861a36cc56-kube-api-access-8bncw\") pod \"neutron-db-create-nrgc9\" (UID: \"9a0a2476-9374-4bbb-aba1-d6861a36cc56\") " pod="openstack/neutron-db-create-nrgc9" Dec 01 11:36:48 crc kubenswrapper[4958]: I1201 11:36:48.779279 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nrgc9" Dec 01 11:36:49 crc kubenswrapper[4958]: I1201 11:36:49.370792 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-nrgc9"] Dec 01 11:36:49 crc kubenswrapper[4958]: I1201 11:36:49.766348 4958 generic.go:334] "Generic (PLEG): container finished" podID="9a0a2476-9374-4bbb-aba1-d6861a36cc56" containerID="620b27c1173d320928855d927b9cf9b145577183ba351a15a60e7bfbb0578c05" exitCode=0 Dec 01 11:36:49 crc kubenswrapper[4958]: I1201 11:36:49.766537 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nrgc9" event={"ID":"9a0a2476-9374-4bbb-aba1-d6861a36cc56","Type":"ContainerDied","Data":"620b27c1173d320928855d927b9cf9b145577183ba351a15a60e7bfbb0578c05"} Dec 01 11:36:49 crc kubenswrapper[4958]: I1201 11:36:49.766658 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nrgc9" event={"ID":"9a0a2476-9374-4bbb-aba1-d6861a36cc56","Type":"ContainerStarted","Data":"bfa273d821d3051789dc66cd388b994e02df1e3c5ec0de9101590172c184387d"} Dec 01 11:36:50 crc kubenswrapper[4958]: I1201 11:36:50.797780 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:36:50 crc kubenswrapper[4958]: E1201 11:36:50.798186 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:36:51 crc kubenswrapper[4958]: I1201 11:36:51.140349 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-nrgc9" Dec 01 11:36:51 crc kubenswrapper[4958]: I1201 11:36:51.244503 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bncw\" (UniqueName: \"kubernetes.io/projected/9a0a2476-9374-4bbb-aba1-d6861a36cc56-kube-api-access-8bncw\") pod \"9a0a2476-9374-4bbb-aba1-d6861a36cc56\" (UID: \"9a0a2476-9374-4bbb-aba1-d6861a36cc56\") " Dec 01 11:36:51 crc kubenswrapper[4958]: I1201 11:36:51.257085 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a0a2476-9374-4bbb-aba1-d6861a36cc56-kube-api-access-8bncw" (OuterVolumeSpecName: "kube-api-access-8bncw") pod "9a0a2476-9374-4bbb-aba1-d6861a36cc56" (UID: "9a0a2476-9374-4bbb-aba1-d6861a36cc56"). InnerVolumeSpecName "kube-api-access-8bncw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:36:51 crc kubenswrapper[4958]: I1201 11:36:51.346927 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bncw\" (UniqueName: \"kubernetes.io/projected/9a0a2476-9374-4bbb-aba1-d6861a36cc56-kube-api-access-8bncw\") on node \"crc\" DevicePath \"\"" Dec 01 11:36:51 crc kubenswrapper[4958]: I1201 11:36:51.785066 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nrgc9" event={"ID":"9a0a2476-9374-4bbb-aba1-d6861a36cc56","Type":"ContainerDied","Data":"bfa273d821d3051789dc66cd388b994e02df1e3c5ec0de9101590172c184387d"} Dec 01 11:36:51 crc kubenswrapper[4958]: I1201 11:36:51.785365 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfa273d821d3051789dc66cd388b994e02df1e3c5ec0de9101590172c184387d" Dec 01 11:36:51 crc kubenswrapper[4958]: I1201 11:36:51.785127 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nrgc9" Dec 01 11:36:58 crc kubenswrapper[4958]: I1201 11:36:58.588231 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5efc-account-create-bxbx6"] Dec 01 11:36:58 crc kubenswrapper[4958]: E1201 11:36:58.590060 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a0a2476-9374-4bbb-aba1-d6861a36cc56" containerName="mariadb-database-create" Dec 01 11:36:58 crc kubenswrapper[4958]: I1201 11:36:58.590094 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a0a2476-9374-4bbb-aba1-d6861a36cc56" containerName="mariadb-database-create" Dec 01 11:36:58 crc kubenswrapper[4958]: I1201 11:36:58.590289 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a0a2476-9374-4bbb-aba1-d6861a36cc56" containerName="mariadb-database-create" Dec 01 11:36:58 crc kubenswrapper[4958]: I1201 11:36:58.590940 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5efc-account-create-bxbx6" Dec 01 11:36:58 crc kubenswrapper[4958]: I1201 11:36:58.592701 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 01 11:36:58 crc kubenswrapper[4958]: I1201 11:36:58.600114 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5efc-account-create-bxbx6"] Dec 01 11:36:58 crc kubenswrapper[4958]: I1201 11:36:58.691680 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4f9v\" (UniqueName: \"kubernetes.io/projected/13f7171b-238f-414a-9ba5-162eaf91718a-kube-api-access-s4f9v\") pod \"neutron-5efc-account-create-bxbx6\" (UID: \"13f7171b-238f-414a-9ba5-162eaf91718a\") " pod="openstack/neutron-5efc-account-create-bxbx6" Dec 01 11:36:58 crc kubenswrapper[4958]: I1201 11:36:58.792877 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4f9v\" (UniqueName: \"kubernetes.io/projected/13f7171b-238f-414a-9ba5-162eaf91718a-kube-api-access-s4f9v\") pod \"neutron-5efc-account-create-bxbx6\" (UID: \"13f7171b-238f-414a-9ba5-162eaf91718a\") " pod="openstack/neutron-5efc-account-create-bxbx6" Dec 01 11:36:58 crc kubenswrapper[4958]: I1201 11:36:58.818323 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4f9v\" (UniqueName: \"kubernetes.io/projected/13f7171b-238f-414a-9ba5-162eaf91718a-kube-api-access-s4f9v\") pod \"neutron-5efc-account-create-bxbx6\" (UID: \"13f7171b-238f-414a-9ba5-162eaf91718a\") " pod="openstack/neutron-5efc-account-create-bxbx6" Dec 01 11:36:58 crc kubenswrapper[4958]: I1201 11:36:58.907895 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5efc-account-create-bxbx6" Dec 01 11:36:59 crc kubenswrapper[4958]: I1201 11:36:59.429567 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5efc-account-create-bxbx6"] Dec 01 11:36:59 crc kubenswrapper[4958]: I1201 11:36:59.876806 4958 generic.go:334] "Generic (PLEG): container finished" podID="13f7171b-238f-414a-9ba5-162eaf91718a" containerID="5c11d9bb17982367b09aa71a4b0c77616aead9aa990179649735d4e11a07cc74" exitCode=0 Dec 01 11:36:59 crc kubenswrapper[4958]: I1201 11:36:59.877196 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5efc-account-create-bxbx6" event={"ID":"13f7171b-238f-414a-9ba5-162eaf91718a","Type":"ContainerDied","Data":"5c11d9bb17982367b09aa71a4b0c77616aead9aa990179649735d4e11a07cc74"} Dec 01 11:36:59 crc kubenswrapper[4958]: I1201 11:36:59.877231 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5efc-account-create-bxbx6" event={"ID":"13f7171b-238f-414a-9ba5-162eaf91718a","Type":"ContainerStarted","Data":"0288437d7977ea2063b9a4aab2fd99f3aea20084aa0130642cefc884d99326f6"} Dec 01 11:37:01 crc kubenswrapper[4958]: I1201 11:37:01.244071 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5efc-account-create-bxbx6" Dec 01 11:37:01 crc kubenswrapper[4958]: I1201 11:37:01.443335 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4f9v\" (UniqueName: \"kubernetes.io/projected/13f7171b-238f-414a-9ba5-162eaf91718a-kube-api-access-s4f9v\") pod \"13f7171b-238f-414a-9ba5-162eaf91718a\" (UID: \"13f7171b-238f-414a-9ba5-162eaf91718a\") " Dec 01 11:37:01 crc kubenswrapper[4958]: I1201 11:37:01.453196 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f7171b-238f-414a-9ba5-162eaf91718a-kube-api-access-s4f9v" (OuterVolumeSpecName: "kube-api-access-s4f9v") pod "13f7171b-238f-414a-9ba5-162eaf91718a" (UID: "13f7171b-238f-414a-9ba5-162eaf91718a"). InnerVolumeSpecName "kube-api-access-s4f9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:37:01 crc kubenswrapper[4958]: I1201 11:37:01.545758 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4f9v\" (UniqueName: \"kubernetes.io/projected/13f7171b-238f-414a-9ba5-162eaf91718a-kube-api-access-s4f9v\") on node \"crc\" DevicePath \"\"" Dec 01 11:37:01 crc kubenswrapper[4958]: I1201 11:37:01.901913 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5efc-account-create-bxbx6" event={"ID":"13f7171b-238f-414a-9ba5-162eaf91718a","Type":"ContainerDied","Data":"0288437d7977ea2063b9a4aab2fd99f3aea20084aa0130642cefc884d99326f6"} Dec 01 11:37:01 crc kubenswrapper[4958]: I1201 11:37:01.902001 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0288437d7977ea2063b9a4aab2fd99f3aea20084aa0130642cefc884d99326f6" Dec 01 11:37:01 crc kubenswrapper[4958]: I1201 11:37:01.902054 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5efc-account-create-bxbx6" Dec 01 11:37:03 crc kubenswrapper[4958]: I1201 11:37:03.863796 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-fnwgw"] Dec 01 11:37:03 crc kubenswrapper[4958]: E1201 11:37:03.864674 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13f7171b-238f-414a-9ba5-162eaf91718a" containerName="mariadb-account-create" Dec 01 11:37:03 crc kubenswrapper[4958]: I1201 11:37:03.864697 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13f7171b-238f-414a-9ba5-162eaf91718a" containerName="mariadb-account-create" Dec 01 11:37:03 crc kubenswrapper[4958]: I1201 11:37:03.864965 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13f7171b-238f-414a-9ba5-162eaf91718a" containerName="mariadb-account-create" Dec 01 11:37:03 crc kubenswrapper[4958]: I1201 11:37:03.865717 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fnwgw" Dec 01 11:37:03 crc kubenswrapper[4958]: I1201 11:37:03.867760 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-9x4hq" Dec 01 11:37:03 crc kubenswrapper[4958]: I1201 11:37:03.869075 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 01 11:37:03 crc kubenswrapper[4958]: I1201 11:37:03.870109 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 01 11:37:03 crc kubenswrapper[4958]: I1201 11:37:03.880014 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fnwgw"] Dec 01 11:37:03 crc kubenswrapper[4958]: I1201 11:37:03.996336 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kkpf\" (UniqueName: \"kubernetes.io/projected/207b3a13-aef6-4b53-9856-773957da8b9f-kube-api-access-5kkpf\") pod \"neutron-db-sync-fnwgw\" (UID: \"207b3a13-aef6-4b53-9856-773957da8b9f\") " pod="openstack/neutron-db-sync-fnwgw" Dec 01 11:37:03 crc kubenswrapper[4958]: I1201 11:37:03.996457 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/207b3a13-aef6-4b53-9856-773957da8b9f-config\") pod \"neutron-db-sync-fnwgw\" (UID: \"207b3a13-aef6-4b53-9856-773957da8b9f\") " pod="openstack/neutron-db-sync-fnwgw" Dec 01 11:37:03 crc kubenswrapper[4958]: I1201 11:37:03.996512 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207b3a13-aef6-4b53-9856-773957da8b9f-combined-ca-bundle\") pod \"neutron-db-sync-fnwgw\" (UID: \"207b3a13-aef6-4b53-9856-773957da8b9f\") " pod="openstack/neutron-db-sync-fnwgw" Dec 01 11:37:04 crc kubenswrapper[4958]: I1201 11:37:04.098760 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207b3a13-aef6-4b53-9856-773957da8b9f-combined-ca-bundle\") pod \"neutron-db-sync-fnwgw\" (UID: \"207b3a13-aef6-4b53-9856-773957da8b9f\") " pod="openstack/neutron-db-sync-fnwgw" Dec 01 11:37:04 crc kubenswrapper[4958]: I1201 11:37:04.099129 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kkpf\" (UniqueName: \"kubernetes.io/projected/207b3a13-aef6-4b53-9856-773957da8b9f-kube-api-access-5kkpf\") pod \"neutron-db-sync-fnwgw\" (UID: \"207b3a13-aef6-4b53-9856-773957da8b9f\") " pod="openstack/neutron-db-sync-fnwgw" Dec 01 11:37:04 crc kubenswrapper[4958]: I1201 11:37:04.099899 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/207b3a13-aef6-4b53-9856-773957da8b9f-config\") pod \"neutron-db-sync-fnwgw\" (UID: \"207b3a13-aef6-4b53-9856-773957da8b9f\") " pod="openstack/neutron-db-sync-fnwgw" Dec 01 11:37:04 crc kubenswrapper[4958]: I1201 11:37:04.109485 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207b3a13-aef6-4b53-9856-773957da8b9f-combined-ca-bundle\") pod \"neutron-db-sync-fnwgw\" (UID: \"207b3a13-aef6-4b53-9856-773957da8b9f\") " pod="openstack/neutron-db-sync-fnwgw" Dec 01 11:37:04 crc kubenswrapper[4958]: I1201 11:37:04.109773 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/207b3a13-aef6-4b53-9856-773957da8b9f-config\") pod \"neutron-db-sync-fnwgw\" (UID: \"207b3a13-aef6-4b53-9856-773957da8b9f\") " pod="openstack/neutron-db-sync-fnwgw" Dec 01 11:37:04 crc kubenswrapper[4958]: I1201 11:37:04.130375 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kkpf\" (UniqueName: \"kubernetes.io/projected/207b3a13-aef6-4b53-9856-773957da8b9f-kube-api-access-5kkpf\") pod \"neutron-db-sync-fnwgw\" (UID: \"207b3a13-aef6-4b53-9856-773957da8b9f\") " pod="openstack/neutron-db-sync-fnwgw" Dec 01 11:37:04 crc kubenswrapper[4958]: I1201 11:37:04.198587 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fnwgw" Dec 01 11:37:04 crc kubenswrapper[4958]: I1201 11:37:04.760939 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fnwgw"] Dec 01 11:37:04 crc kubenswrapper[4958]: I1201 11:37:04.930715 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fnwgw" event={"ID":"207b3a13-aef6-4b53-9856-773957da8b9f","Type":"ContainerStarted","Data":"5f248fdbef07e9f0ce51ead3f8d3b21ca110a971f4e8e967c03f88c55d641c41"} Dec 01 11:37:04 crc kubenswrapper[4958]: I1201 11:37:04.930772 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fnwgw" event={"ID":"207b3a13-aef6-4b53-9856-773957da8b9f","Type":"ContainerStarted","Data":"0305de07ce134f378a536399023eadd98095016025212040056bd91edc4d6019"} Dec 01 11:37:04 crc kubenswrapper[4958]: I1201 11:37:04.953460 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-fnwgw" podStartSLOduration=1.953424064 podStartE2EDuration="1.953424064s" podCreationTimestamp="2025-12-01 11:37:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:37:04.952512629 +0000 UTC m=+5872.461301666" watchObservedRunningTime="2025-12-01 11:37:04.953424064 +0000 UTC m=+5872.462213131" Dec 01 11:37:05 crc kubenswrapper[4958]: I1201 11:37:05.798912 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:37:05 crc kubenswrapper[4958]: E1201 11:37:05.799354 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:37:08 crc kubenswrapper[4958]: I1201 11:37:08.984182 4958 generic.go:334] "Generic (PLEG): container finished" podID="207b3a13-aef6-4b53-9856-773957da8b9f" containerID="5f248fdbef07e9f0ce51ead3f8d3b21ca110a971f4e8e967c03f88c55d641c41" exitCode=0 Dec 01 11:37:08 crc kubenswrapper[4958]: I1201 11:37:08.984274 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fnwgw" event={"ID":"207b3a13-aef6-4b53-9856-773957da8b9f","Type":"ContainerDied","Data":"5f248fdbef07e9f0ce51ead3f8d3b21ca110a971f4e8e967c03f88c55d641c41"} Dec 01 11:37:10 crc kubenswrapper[4958]: I1201 11:37:10.503433 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fnwgw" Dec 01 11:37:10 crc kubenswrapper[4958]: I1201 11:37:10.641391 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207b3a13-aef6-4b53-9856-773957da8b9f-combined-ca-bundle\") pod \"207b3a13-aef6-4b53-9856-773957da8b9f\" (UID: \"207b3a13-aef6-4b53-9856-773957da8b9f\") " Dec 01 11:37:10 crc kubenswrapper[4958]: I1201 11:37:10.641924 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/207b3a13-aef6-4b53-9856-773957da8b9f-config\") pod \"207b3a13-aef6-4b53-9856-773957da8b9f\" (UID: \"207b3a13-aef6-4b53-9856-773957da8b9f\") " Dec 01 11:37:10 crc kubenswrapper[4958]: I1201 11:37:10.641964 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kkpf\" (UniqueName: \"kubernetes.io/projected/207b3a13-aef6-4b53-9856-773957da8b9f-kube-api-access-5kkpf\") pod \"207b3a13-aef6-4b53-9856-773957da8b9f\" (UID: \"207b3a13-aef6-4b53-9856-773957da8b9f\") " Dec 01 11:37:10 crc kubenswrapper[4958]: I1201 11:37:10.649694 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/207b3a13-aef6-4b53-9856-773957da8b9f-kube-api-access-5kkpf" (OuterVolumeSpecName: "kube-api-access-5kkpf") pod "207b3a13-aef6-4b53-9856-773957da8b9f" (UID: "207b3a13-aef6-4b53-9856-773957da8b9f"). InnerVolumeSpecName "kube-api-access-5kkpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:37:10 crc kubenswrapper[4958]: I1201 11:37:10.680979 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/207b3a13-aef6-4b53-9856-773957da8b9f-config" (OuterVolumeSpecName: "config") pod "207b3a13-aef6-4b53-9856-773957da8b9f" (UID: "207b3a13-aef6-4b53-9856-773957da8b9f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:37:10 crc kubenswrapper[4958]: I1201 11:37:10.685386 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/207b3a13-aef6-4b53-9856-773957da8b9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "207b3a13-aef6-4b53-9856-773957da8b9f" (UID: "207b3a13-aef6-4b53-9856-773957da8b9f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:37:10 crc kubenswrapper[4958]: I1201 11:37:10.744000 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207b3a13-aef6-4b53-9856-773957da8b9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:37:10 crc kubenswrapper[4958]: I1201 11:37:10.744043 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/207b3a13-aef6-4b53-9856-773957da8b9f-config\") on node \"crc\" DevicePath \"\"" Dec 01 11:37:10 crc kubenswrapper[4958]: I1201 11:37:10.744054 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kkpf\" (UniqueName: \"kubernetes.io/projected/207b3a13-aef6-4b53-9856-773957da8b9f-kube-api-access-5kkpf\") on node \"crc\" DevicePath \"\"" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.055293 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fnwgw" event={"ID":"207b3a13-aef6-4b53-9856-773957da8b9f","Type":"ContainerDied","Data":"0305de07ce134f378a536399023eadd98095016025212040056bd91edc4d6019"} Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.055343 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0305de07ce134f378a536399023eadd98095016025212040056bd91edc4d6019" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.055376 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fnwgw" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.169655 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f6ccf868c-dvfgm"] Dec 01 11:37:11 crc kubenswrapper[4958]: E1201 11:37:11.174936 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="207b3a13-aef6-4b53-9856-773957da8b9f" containerName="neutron-db-sync" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.174974 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="207b3a13-aef6-4b53-9856-773957da8b9f" containerName="neutron-db-sync" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.175179 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="207b3a13-aef6-4b53-9856-773957da8b9f" containerName="neutron-db-sync" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.176225 4958 util.go:30] "No sandbox for pod can be found. 
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.176225 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm"
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.183044 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f6ccf868c-dvfgm"]
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.258683 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-ovsdbserver-nb\") pod \"dnsmasq-dns-5f6ccf868c-dvfgm\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm"
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.258734 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-config\") pod \"dnsmasq-dns-5f6ccf868c-dvfgm\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm"
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.258771 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-dns-svc\") pod \"dnsmasq-dns-5f6ccf868c-dvfgm\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm"
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.258836 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t6pw\" (UniqueName: \"kubernetes.io/projected/468cc986-0b2c-4965-9e74-6d0b828a5001-kube-api-access-5t6pw\") pod \"dnsmasq-dns-5f6ccf868c-dvfgm\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm"
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.258909 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-ovsdbserver-sb\") pod \"dnsmasq-dns-5f6ccf868c-dvfgm\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm"
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.320318 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-645fd5bd8c-6tbc4"]
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.321892 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-645fd5bd8c-6tbc4"
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.325162 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-9x4hq"
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.325284 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.330585 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.336723 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-645fd5bd8c-6tbc4"]
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.361734 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-config\") pod \"dnsmasq-dns-5f6ccf868c-dvfgm\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm"
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.361792 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a111ae96-b43a-475d-9364-31049f6ab7fc-config\") pod \"neutron-645fd5bd8c-6tbc4\" (UID: \"a111ae96-b43a-475d-9364-31049f6ab7fc\") " pod="openstack/neutron-645fd5bd8c-6tbc4"
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.361822 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-dns-svc\") pod \"dnsmasq-dns-5f6ccf868c-dvfgm\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm"
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.361894 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t6pw\" (UniqueName: \"kubernetes.io/projected/468cc986-0b2c-4965-9e74-6d0b828a5001-kube-api-access-5t6pw\") pod \"dnsmasq-dns-5f6ccf868c-dvfgm\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm"
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.361923 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-ovsdbserver-sb\") pod \"dnsmasq-dns-5f6ccf868c-dvfgm\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm"
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.361955 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a111ae96-b43a-475d-9364-31049f6ab7fc-httpd-config\") pod \"neutron-645fd5bd8c-6tbc4\" (UID: \"a111ae96-b43a-475d-9364-31049f6ab7fc\") " pod="openstack/neutron-645fd5bd8c-6tbc4"
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.361986 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjblh\" (UniqueName: \"kubernetes.io/projected/a111ae96-b43a-475d-9364-31049f6ab7fc-kube-api-access-xjblh\") pod \"neutron-645fd5bd8c-6tbc4\" (UID: \"a111ae96-b43a-475d-9364-31049f6ab7fc\") " pod="openstack/neutron-645fd5bd8c-6tbc4"
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-ovsdbserver-nb\") pod \"dnsmasq-dns-5f6ccf868c-dvfgm\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.362045 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a111ae96-b43a-475d-9364-31049f6ab7fc-combined-ca-bundle\") pod \"neutron-645fd5bd8c-6tbc4\" (UID: \"a111ae96-b43a-475d-9364-31049f6ab7fc\") " pod="openstack/neutron-645fd5bd8c-6tbc4" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.363035 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-config\") pod \"dnsmasq-dns-5f6ccf868c-dvfgm\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.363120 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-dns-svc\") pod \"dnsmasq-dns-5f6ccf868c-dvfgm\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.363519 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-ovsdbserver-nb\") pod \"dnsmasq-dns-5f6ccf868c-dvfgm\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.363559 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-ovsdbserver-sb\") pod \"dnsmasq-dns-5f6ccf868c-dvfgm\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.382179 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t6pw\" (UniqueName: \"kubernetes.io/projected/468cc986-0b2c-4965-9e74-6d0b828a5001-kube-api-access-5t6pw\") pod \"dnsmasq-dns-5f6ccf868c-dvfgm\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.464045 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a111ae96-b43a-475d-9364-31049f6ab7fc-httpd-config\") pod \"neutron-645fd5bd8c-6tbc4\" (UID: \"a111ae96-b43a-475d-9364-31049f6ab7fc\") " pod="openstack/neutron-645fd5bd8c-6tbc4" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.465408 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjblh\" (UniqueName: \"kubernetes.io/projected/a111ae96-b43a-475d-9364-31049f6ab7fc-kube-api-access-xjblh\") pod \"neutron-645fd5bd8c-6tbc4\" (UID: \"a111ae96-b43a-475d-9364-31049f6ab7fc\") " pod="openstack/neutron-645fd5bd8c-6tbc4" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.465558 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a111ae96-b43a-475d-9364-31049f6ab7fc-combined-ca-bundle\") pod \"neutron-645fd5bd8c-6tbc4\" (UID: \"a111ae96-b43a-475d-9364-31049f6ab7fc\") " pod="openstack/neutron-645fd5bd8c-6tbc4" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.465701 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a111ae96-b43a-475d-9364-31049f6ab7fc-config\") pod \"neutron-645fd5bd8c-6tbc4\" (UID: \"a111ae96-b43a-475d-9364-31049f6ab7fc\") " pod="openstack/neutron-645fd5bd8c-6tbc4" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.469066 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a111ae96-b43a-475d-9364-31049f6ab7fc-config\") pod \"neutron-645fd5bd8c-6tbc4\" (UID: \"a111ae96-b43a-475d-9364-31049f6ab7fc\") " pod="openstack/neutron-645fd5bd8c-6tbc4" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.469425 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a111ae96-b43a-475d-9364-31049f6ab7fc-httpd-config\") pod \"neutron-645fd5bd8c-6tbc4\" (UID: \"a111ae96-b43a-475d-9364-31049f6ab7fc\") " pod="openstack/neutron-645fd5bd8c-6tbc4" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.483050 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a111ae96-b43a-475d-9364-31049f6ab7fc-combined-ca-bundle\") pod \"neutron-645fd5bd8c-6tbc4\" (UID: \"a111ae96-b43a-475d-9364-31049f6ab7fc\") " pod="openstack/neutron-645fd5bd8c-6tbc4" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.486263 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjblh\" (UniqueName: \"kubernetes.io/projected/a111ae96-b43a-475d-9364-31049f6ab7fc-kube-api-access-xjblh\") pod \"neutron-645fd5bd8c-6tbc4\" (UID: \"a111ae96-b43a-475d-9364-31049f6ab7fc\") " pod="openstack/neutron-645fd5bd8c-6tbc4" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.552253 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm" Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.642435 4958 util.go:30] "No sandbox for pod can be found. 
Dec 01 11:37:11 crc kubenswrapper[4958]: I1201 11:37:11.642435 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-645fd5bd8c-6tbc4"
Dec 01 11:37:12 crc kubenswrapper[4958]: I1201 11:37:12.026603 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f6ccf868c-dvfgm"]
Dec 01 11:37:12 crc kubenswrapper[4958]: I1201 11:37:12.073919 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm" event={"ID":"468cc986-0b2c-4965-9e74-6d0b828a5001","Type":"ContainerStarted","Data":"4eea2cfcbd97ed5d2299b524aae1bd4d7397fadae11fd6bf4464b9526f54f9d4"}
Dec 01 11:37:12 crc kubenswrapper[4958]: I1201 11:37:12.247096 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-645fd5bd8c-6tbc4"]
Dec 01 11:37:12 crc kubenswrapper[4958]: W1201 11:37:12.286226 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda111ae96_b43a_475d_9364_31049f6ab7fc.slice/crio-3e07cfc31eec0c70d4731cbc7599eccf7612381dc55afc7bd3b888f9d00e30ff WatchSource:0}: Error finding container 3e07cfc31eec0c70d4731cbc7599eccf7612381dc55afc7bd3b888f9d00e30ff: Status 404 returned error can't find the container with id 3e07cfc31eec0c70d4731cbc7599eccf7612381dc55afc7bd3b888f9d00e30ff
Dec 01 11:37:13 crc kubenswrapper[4958]: I1201 11:37:13.088715 4958 generic.go:334] "Generic (PLEG): container finished" podID="468cc986-0b2c-4965-9e74-6d0b828a5001" containerID="d623411e61fe5a380262bc7dbb07f939e024bb06503a30af78500a00bd17d4a9" exitCode=0
Dec 01 11:37:13 crc kubenswrapper[4958]: I1201 11:37:13.089013 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm" event={"ID":"468cc986-0b2c-4965-9e74-6d0b828a5001","Type":"ContainerDied","Data":"d623411e61fe5a380262bc7dbb07f939e024bb06503a30af78500a00bd17d4a9"}
Dec 01 11:37:13 crc kubenswrapper[4958]: I1201 11:37:13.099389 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-645fd5bd8c-6tbc4" event={"ID":"a111ae96-b43a-475d-9364-31049f6ab7fc","Type":"ContainerStarted","Data":"c1ffac0de6edd4461f2d5c060f03478b9981dedffa213a3710ca84beea13d33b"}
Dec 01 11:37:13 crc kubenswrapper[4958]: I1201 11:37:13.099463 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-645fd5bd8c-6tbc4" event={"ID":"a111ae96-b43a-475d-9364-31049f6ab7fc","Type":"ContainerStarted","Data":"333e6b00c2fb1c7ba5475f5ae216c268b43a2750b69d5d7f34111febfef47897"}
Dec 01 11:37:13 crc kubenswrapper[4958]: I1201 11:37:13.099488 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-645fd5bd8c-6tbc4" event={"ID":"a111ae96-b43a-475d-9364-31049f6ab7fc","Type":"ContainerStarted","Data":"3e07cfc31eec0c70d4731cbc7599eccf7612381dc55afc7bd3b888f9d00e30ff"}
Dec 01 11:37:13 crc kubenswrapper[4958]: I1201 11:37:13.100103 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-645fd5bd8c-6tbc4"
Dec 01 11:37:13 crc kubenswrapper[4958]: I1201 11:37:13.137185 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-645fd5bd8c-6tbc4" podStartSLOduration=2.137163868 podStartE2EDuration="2.137163868s" podCreationTimestamp="2025-12-01 11:37:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:37:13.135586303 +0000 UTC m=+5880.644375350" watchObservedRunningTime="2025-12-01 11:37:13.137163868 +0000 UTC m=+5880.645952915"
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm" event={"ID":"468cc986-0b2c-4965-9e74-6d0b828a5001","Type":"ContainerStarted","Data":"e398332e9682cf3fe35ab5e59943ed9934ba5cbc0b9d60d5b61a949e68e77e6e"} Dec 01 11:37:14 crc kubenswrapper[4958]: I1201 11:37:14.112109 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm" Dec 01 11:37:14 crc kubenswrapper[4958]: I1201 11:37:14.144954 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm" podStartSLOduration=3.144931768 podStartE2EDuration="3.144931768s" podCreationTimestamp="2025-12-01 11:37:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:37:14.1319597 +0000 UTC m=+5881.640748747" watchObservedRunningTime="2025-12-01 11:37:14.144931768 +0000 UTC m=+5881.653720805" Dec 01 11:37:20 crc kubenswrapper[4958]: I1201 11:37:20.797621 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:37:20 crc kubenswrapper[4958]: E1201 11:37:20.800266 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:37:21 crc kubenswrapper[4958]: I1201 11:37:21.554138 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm" Dec 01 11:37:21 crc kubenswrapper[4958]: I1201 11:37:21.656969 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56f67bddd7-jz8qt"] Dec 01 11:37:21 crc kubenswrapper[4958]: I1201 11:37:21.657512 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" podUID="cbd8bc53-cfed-443b-b838-7c5493e8e9b5" containerName="dnsmasq-dns" containerID="cri-o://414d8932026915d151266f22691f7e7cc35cb8ecab688de2d2face12eea55f62" gracePeriod=10 Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.109130 4958 util.go:48] "No ready sandbox for pod can be found. 
Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.109130 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt"
Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.201628 4958 generic.go:334] "Generic (PLEG): container finished" podID="cbd8bc53-cfed-443b-b838-7c5493e8e9b5" containerID="414d8932026915d151266f22691f7e7cc35cb8ecab688de2d2face12eea55f62" exitCode=0
Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.201694 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" event={"ID":"cbd8bc53-cfed-443b-b838-7c5493e8e9b5","Type":"ContainerDied","Data":"414d8932026915d151266f22691f7e7cc35cb8ecab688de2d2face12eea55f62"}
Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.201740 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt" event={"ID":"cbd8bc53-cfed-443b-b838-7c5493e8e9b5","Type":"ContainerDied","Data":"36811d82b02d6ecf8f5abc5b379794f8caf77195338be5880fa34e9dca53b3ab"}
Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.201768 4958 scope.go:117] "RemoveContainer" containerID="414d8932026915d151266f22691f7e7cc35cb8ecab688de2d2face12eea55f62"
Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.202076 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56f67bddd7-jz8qt"
Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.227280 4958 scope.go:117] "RemoveContainer" containerID="d79c7aaf8d6415b940cbcde68ae0da25d13e11d37e452cbed528fa774b505e40"
Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.247500 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-ovsdbserver-nb\") pod \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") "
Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.247554 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-ovsdbserver-sb\") pod \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") "
Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.247619 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjbmj\" (UniqueName: \"kubernetes.io/projected/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-kube-api-access-zjbmj\") pod \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") "
Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.247665 4958 scope.go:117] "RemoveContainer" containerID="414d8932026915d151266f22691f7e7cc35cb8ecab688de2d2face12eea55f62"
Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.247688 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-dns-svc\") pod \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") "
Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.247746 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-config\") pod \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\" (UID: \"cbd8bc53-cfed-443b-b838-7c5493e8e9b5\") "
failed" err="rpc error: code = NotFound desc = could not find container \"414d8932026915d151266f22691f7e7cc35cb8ecab688de2d2face12eea55f62\": container with ID starting with 414d8932026915d151266f22691f7e7cc35cb8ecab688de2d2face12eea55f62 not found: ID does not exist" containerID="414d8932026915d151266f22691f7e7cc35cb8ecab688de2d2face12eea55f62" Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.248731 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"414d8932026915d151266f22691f7e7cc35cb8ecab688de2d2face12eea55f62"} err="failed to get container status \"414d8932026915d151266f22691f7e7cc35cb8ecab688de2d2face12eea55f62\": rpc error: code = NotFound desc = could not find container \"414d8932026915d151266f22691f7e7cc35cb8ecab688de2d2face12eea55f62\": container with ID starting with 414d8932026915d151266f22691f7e7cc35cb8ecab688de2d2face12eea55f62 not found: ID does not exist" Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.248757 4958 scope.go:117] "RemoveContainer" containerID="d79c7aaf8d6415b940cbcde68ae0da25d13e11d37e452cbed528fa774b505e40" Dec 01 11:37:22 crc kubenswrapper[4958]: E1201 11:37:22.252470 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d79c7aaf8d6415b940cbcde68ae0da25d13e11d37e452cbed528fa774b505e40\": container with ID starting with d79c7aaf8d6415b940cbcde68ae0da25d13e11d37e452cbed528fa774b505e40 not found: ID does not exist" containerID="d79c7aaf8d6415b940cbcde68ae0da25d13e11d37e452cbed528fa774b505e40" Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.252609 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d79c7aaf8d6415b940cbcde68ae0da25d13e11d37e452cbed528fa774b505e40"} err="failed to get container status \"d79c7aaf8d6415b940cbcde68ae0da25d13e11d37e452cbed528fa774b505e40\": rpc error: code = NotFound desc = could not find container \"d79c7aaf8d6415b940cbcde68ae0da25d13e11d37e452cbed528fa774b505e40\": container with ID starting with d79c7aaf8d6415b940cbcde68ae0da25d13e11d37e452cbed528fa774b505e40 not found: ID does not exist" Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.253214 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-kube-api-access-zjbmj" (OuterVolumeSpecName: "kube-api-access-zjbmj") pod "cbd8bc53-cfed-443b-b838-7c5493e8e9b5" (UID: "cbd8bc53-cfed-443b-b838-7c5493e8e9b5"). InnerVolumeSpecName "kube-api-access-zjbmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.290477 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cbd8bc53-cfed-443b-b838-7c5493e8e9b5" (UID: "cbd8bc53-cfed-443b-b838-7c5493e8e9b5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.297617 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cbd8bc53-cfed-443b-b838-7c5493e8e9b5" (UID: "cbd8bc53-cfed-443b-b838-7c5493e8e9b5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.312402 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-config" (OuterVolumeSpecName: "config") pod "cbd8bc53-cfed-443b-b838-7c5493e8e9b5" (UID: "cbd8bc53-cfed-443b-b838-7c5493e8e9b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.313815 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cbd8bc53-cfed-443b-b838-7c5493e8e9b5" (UID: "cbd8bc53-cfed-443b-b838-7c5493e8e9b5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.350235 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.350270 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.350280 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjbmj\" (UniqueName: \"kubernetes.io/projected/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-kube-api-access-zjbmj\") on node \"crc\" DevicePath \"\"" Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.350291 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.350300 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbd8bc53-cfed-443b-b838-7c5493e8e9b5-config\") on node \"crc\" DevicePath \"\"" Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.532615 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56f67bddd7-jz8qt"] Dec 01 11:37:22 crc kubenswrapper[4958]: I1201 11:37:22.539679 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56f67bddd7-jz8qt"] Dec 01 11:37:23 crc kubenswrapper[4958]: I1201 11:37:23.813024 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbd8bc53-cfed-443b-b838-7c5493e8e9b5" path="/var/lib/kubelet/pods/cbd8bc53-cfed-443b-b838-7c5493e8e9b5/volumes" Dec 01 11:37:32 crc kubenswrapper[4958]: I1201 11:37:32.798270 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:37:32 crc kubenswrapper[4958]: E1201 11:37:32.799278 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:37:41 crc kubenswrapper[4958]: I1201 11:37:41.657076 
Dec 01 11:37:41 crc kubenswrapper[4958]: I1201 11:37:41.657076 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-645fd5bd8c-6tbc4"
Dec 01 11:37:45 crc kubenswrapper[4958]: I1201 11:37:45.798033 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc"
Dec 01 11:37:45 crc kubenswrapper[4958]: E1201 11:37:45.799636 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:37:49 crc kubenswrapper[4958]: I1201 11:37:49.557510 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8wzxc"]
Dec 01 11:37:49 crc kubenswrapper[4958]: E1201 11:37:49.558253 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd8bc53-cfed-443b-b838-7c5493e8e9b5" containerName="init"
Dec 01 11:37:49 crc kubenswrapper[4958]: I1201 11:37:49.558268 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd8bc53-cfed-443b-b838-7c5493e8e9b5" containerName="init"
Dec 01 11:37:49 crc kubenswrapper[4958]: E1201 11:37:49.558289 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd8bc53-cfed-443b-b838-7c5493e8e9b5" containerName="dnsmasq-dns"
Dec 01 11:37:49 crc kubenswrapper[4958]: I1201 11:37:49.558295 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd8bc53-cfed-443b-b838-7c5493e8e9b5" containerName="dnsmasq-dns"
Dec 01 11:37:49 crc kubenswrapper[4958]: I1201 11:37:49.558461 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd8bc53-cfed-443b-b838-7c5493e8e9b5" containerName="dnsmasq-dns"
Dec 01 11:37:49 crc kubenswrapper[4958]: I1201 11:37:49.559179 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8wzxc"
Dec 01 11:37:49 crc kubenswrapper[4958]: I1201 11:37:49.569270 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8wzxc"]
Dec 01 11:37:49 crc kubenswrapper[4958]: I1201 11:37:49.664865 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dtgl\" (UniqueName: \"kubernetes.io/projected/931deaef-d313-49ba-9fde-1d7eeb131d09-kube-api-access-8dtgl\") pod \"glance-db-create-8wzxc\" (UID: \"931deaef-d313-49ba-9fde-1d7eeb131d09\") " pod="openstack/glance-db-create-8wzxc"
Dec 01 11:37:49 crc kubenswrapper[4958]: I1201 11:37:49.767240 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dtgl\" (UniqueName: \"kubernetes.io/projected/931deaef-d313-49ba-9fde-1d7eeb131d09-kube-api-access-8dtgl\") pod \"glance-db-create-8wzxc\" (UID: \"931deaef-d313-49ba-9fde-1d7eeb131d09\") " pod="openstack/glance-db-create-8wzxc"
Dec 01 11:37:49 crc kubenswrapper[4958]: I1201 11:37:49.803675 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dtgl\" (UniqueName: \"kubernetes.io/projected/931deaef-d313-49ba-9fde-1d7eeb131d09-kube-api-access-8dtgl\") pod \"glance-db-create-8wzxc\" (UID: \"931deaef-d313-49ba-9fde-1d7eeb131d09\") " pod="openstack/glance-db-create-8wzxc"
Dec 01 11:37:49 crc kubenswrapper[4958]: I1201 11:37:49.883030 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8wzxc"
Dec 01 11:37:50 crc kubenswrapper[4958]: I1201 11:37:50.189817 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8wzxc"]
Dec 01 11:37:50 crc kubenswrapper[4958]: I1201 11:37:50.572183 4958 generic.go:334] "Generic (PLEG): container finished" podID="931deaef-d313-49ba-9fde-1d7eeb131d09" containerID="67ac8dbad7bcf79f5ad5c4e97f9bc12154b981b5a1effdba6b9fb8ef6a99c173" exitCode=0
Dec 01 11:37:50 crc kubenswrapper[4958]: I1201 11:37:50.572318 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8wzxc" event={"ID":"931deaef-d313-49ba-9fde-1d7eeb131d09","Type":"ContainerDied","Data":"67ac8dbad7bcf79f5ad5c4e97f9bc12154b981b5a1effdba6b9fb8ef6a99c173"}
Dec 01 11:37:50 crc kubenswrapper[4958]: I1201 11:37:50.572486 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8wzxc" event={"ID":"931deaef-d313-49ba-9fde-1d7eeb131d09","Type":"ContainerStarted","Data":"5f0e2b7c3a365d4f327d424de301daf8665964783ced5028e3111e204fe98f8c"}
Dec 01 11:37:52 crc kubenswrapper[4958]: I1201 11:37:52.023762 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8wzxc"
Dec 01 11:37:52 crc kubenswrapper[4958]: I1201 11:37:52.072951 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dtgl\" (UniqueName: \"kubernetes.io/projected/931deaef-d313-49ba-9fde-1d7eeb131d09-kube-api-access-8dtgl\") pod \"931deaef-d313-49ba-9fde-1d7eeb131d09\" (UID: \"931deaef-d313-49ba-9fde-1d7eeb131d09\") "
Dec 01 11:37:52 crc kubenswrapper[4958]: I1201 11:37:52.078768 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/931deaef-d313-49ba-9fde-1d7eeb131d09-kube-api-access-8dtgl" (OuterVolumeSpecName: "kube-api-access-8dtgl") pod "931deaef-d313-49ba-9fde-1d7eeb131d09" (UID: "931deaef-d313-49ba-9fde-1d7eeb131d09"). InnerVolumeSpecName "kube-api-access-8dtgl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:37:52 crc kubenswrapper[4958]: I1201 11:37:52.175692 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dtgl\" (UniqueName: \"kubernetes.io/projected/931deaef-d313-49ba-9fde-1d7eeb131d09-kube-api-access-8dtgl\") on node \"crc\" DevicePath \"\""
Dec 01 11:37:52 crc kubenswrapper[4958]: I1201 11:37:52.598750 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8wzxc" event={"ID":"931deaef-d313-49ba-9fde-1d7eeb131d09","Type":"ContainerDied","Data":"5f0e2b7c3a365d4f327d424de301daf8665964783ced5028e3111e204fe98f8c"}
Dec 01 11:37:52 crc kubenswrapper[4958]: I1201 11:37:52.598827 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f0e2b7c3a365d4f327d424de301daf8665964783ced5028e3111e204fe98f8c"
Dec 01 11:37:52 crc kubenswrapper[4958]: I1201 11:37:52.598790 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8wzxc"
Dec 01 11:37:56 crc kubenswrapper[4958]: I1201 11:37:56.803979 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc"
Dec 01 11:37:56 crc kubenswrapper[4958]: E1201 11:37:56.808780 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:37:59 crc kubenswrapper[4958]: I1201 11:37:59.697546 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-94c0-account-create-kcdk5"]
Dec 01 11:37:59 crc kubenswrapper[4958]: E1201 11:37:59.698574 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="931deaef-d313-49ba-9fde-1d7eeb131d09" containerName="mariadb-database-create"
Dec 01 11:37:59 crc kubenswrapper[4958]: I1201 11:37:59.698606 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="931deaef-d313-49ba-9fde-1d7eeb131d09" containerName="mariadb-database-create"
Dec 01 11:37:59 crc kubenswrapper[4958]: I1201 11:37:59.699080 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="931deaef-d313-49ba-9fde-1d7eeb131d09" containerName="mariadb-database-create"
Dec 01 11:37:59 crc kubenswrapper[4958]: I1201 11:37:59.700904 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-94c0-account-create-kcdk5"
Dec 01 11:37:59 crc kubenswrapper[4958]: I1201 11:37:59.705790 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Dec 01 11:37:59 crc kubenswrapper[4958]: I1201 11:37:59.714470 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-94c0-account-create-kcdk5"]
Dec 01 11:37:59 crc kubenswrapper[4958]: I1201 11:37:59.791334 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndmgq\" (UniqueName: \"kubernetes.io/projected/564ff6dd-d3c4-48d0-b416-0b3caee09947-kube-api-access-ndmgq\") pod \"glance-94c0-account-create-kcdk5\" (UID: \"564ff6dd-d3c4-48d0-b416-0b3caee09947\") " pod="openstack/glance-94c0-account-create-kcdk5"
Dec 01 11:37:59 crc kubenswrapper[4958]: I1201 11:37:59.893453 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndmgq\" (UniqueName: \"kubernetes.io/projected/564ff6dd-d3c4-48d0-b416-0b3caee09947-kube-api-access-ndmgq\") pod \"glance-94c0-account-create-kcdk5\" (UID: \"564ff6dd-d3c4-48d0-b416-0b3caee09947\") " pod="openstack/glance-94c0-account-create-kcdk5"
Dec 01 11:37:59 crc kubenswrapper[4958]: I1201 11:37:59.920449 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndmgq\" (UniqueName: \"kubernetes.io/projected/564ff6dd-d3c4-48d0-b416-0b3caee09947-kube-api-access-ndmgq\") pod \"glance-94c0-account-create-kcdk5\" (UID: \"564ff6dd-d3c4-48d0-b416-0b3caee09947\") " pod="openstack/glance-94c0-account-create-kcdk5"
Dec 01 11:38:00 crc kubenswrapper[4958]: I1201 11:38:00.040146 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-94c0-account-create-kcdk5"
Dec 01 11:38:00 crc kubenswrapper[4958]: I1201 11:38:00.587603 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-94c0-account-create-kcdk5"]
Dec 01 11:38:00 crc kubenswrapper[4958]: W1201 11:38:00.599047 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod564ff6dd_d3c4_48d0_b416_0b3caee09947.slice/crio-07f2a4d50d4e35d4b5e240ea39d558e8c737b2bf47217af54c79c1167e139008 WatchSource:0}: Error finding container 07f2a4d50d4e35d4b5e240ea39d558e8c737b2bf47217af54c79c1167e139008: Status 404 returned error can't find the container with id 07f2a4d50d4e35d4b5e240ea39d558e8c737b2bf47217af54c79c1167e139008
Dec 01 11:38:00 crc kubenswrapper[4958]: I1201 11:38:00.731588 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-94c0-account-create-kcdk5" event={"ID":"564ff6dd-d3c4-48d0-b416-0b3caee09947","Type":"ContainerStarted","Data":"07f2a4d50d4e35d4b5e240ea39d558e8c737b2bf47217af54c79c1167e139008"}
Dec 01 11:38:01 crc kubenswrapper[4958]: I1201 11:38:01.748447 4958 generic.go:334] "Generic (PLEG): container finished" podID="564ff6dd-d3c4-48d0-b416-0b3caee09947" containerID="aeb17b3447c9cc5280dccaecab40d5c30cef30ed3c76fea37040b51f2e7022e0" exitCode=0
Dec 01 11:38:01 crc kubenswrapper[4958]: I1201 11:38:01.748563 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-94c0-account-create-kcdk5" event={"ID":"564ff6dd-d3c4-48d0-b416-0b3caee09947","Type":"ContainerDied","Data":"aeb17b3447c9cc5280dccaecab40d5c30cef30ed3c76fea37040b51f2e7022e0"}
Dec 01 11:38:03 crc kubenswrapper[4958]: I1201 11:38:03.181802 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-94c0-account-create-kcdk5"
Dec 01 11:38:03 crc kubenswrapper[4958]: I1201 11:38:03.269417 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndmgq\" (UniqueName: \"kubernetes.io/projected/564ff6dd-d3c4-48d0-b416-0b3caee09947-kube-api-access-ndmgq\") pod \"564ff6dd-d3c4-48d0-b416-0b3caee09947\" (UID: \"564ff6dd-d3c4-48d0-b416-0b3caee09947\") "
Dec 01 11:38:03 crc kubenswrapper[4958]: I1201 11:38:03.276209 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564ff6dd-d3c4-48d0-b416-0b3caee09947-kube-api-access-ndmgq" (OuterVolumeSpecName: "kube-api-access-ndmgq") pod "564ff6dd-d3c4-48d0-b416-0b3caee09947" (UID: "564ff6dd-d3c4-48d0-b416-0b3caee09947"). InnerVolumeSpecName "kube-api-access-ndmgq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:38:03 crc kubenswrapper[4958]: I1201 11:38:03.372474 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndmgq\" (UniqueName: \"kubernetes.io/projected/564ff6dd-d3c4-48d0-b416-0b3caee09947-kube-api-access-ndmgq\") on node \"crc\" DevicePath \"\""
Dec 01 11:38:03 crc kubenswrapper[4958]: I1201 11:38:03.774886 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-94c0-account-create-kcdk5" event={"ID":"564ff6dd-d3c4-48d0-b416-0b3caee09947","Type":"ContainerDied","Data":"07f2a4d50d4e35d4b5e240ea39d558e8c737b2bf47217af54c79c1167e139008"}
Dec 01 11:38:03 crc kubenswrapper[4958]: I1201 11:38:03.775003 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-94c0-account-create-kcdk5"
Dec 01 11:38:03 crc kubenswrapper[4958]: I1201 11:38:03.774991 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07f2a4d50d4e35d4b5e240ea39d558e8c737b2bf47217af54c79c1167e139008"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.153509 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nnnhv"]
Dec 01 11:38:04 crc kubenswrapper[4958]: E1201 11:38:04.154934 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564ff6dd-d3c4-48d0-b416-0b3caee09947" containerName="mariadb-account-create"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.155186 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="564ff6dd-d3c4-48d0-b416-0b3caee09947" containerName="mariadb-account-create"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.155934 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="564ff6dd-d3c4-48d0-b416-0b3caee09947" containerName="mariadb-account-create"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.159465 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nnnhv"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.176966 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nnnhv"]
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.289508 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgr6w\" (UniqueName: \"kubernetes.io/projected/75cb0552-9eb8-44a6-9dc5-f40adbd98fea-kube-api-access-rgr6w\") pod \"redhat-operators-nnnhv\" (UID: \"75cb0552-9eb8-44a6-9dc5-f40adbd98fea\") " pod="openshift-marketplace/redhat-operators-nnnhv"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.289608 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75cb0552-9eb8-44a6-9dc5-f40adbd98fea-catalog-content\") pod \"redhat-operators-nnnhv\" (UID: \"75cb0552-9eb8-44a6-9dc5-f40adbd98fea\") " pod="openshift-marketplace/redhat-operators-nnnhv"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.289648 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75cb0552-9eb8-44a6-9dc5-f40adbd98fea-utilities\") pod \"redhat-operators-nnnhv\" (UID: \"75cb0552-9eb8-44a6-9dc5-f40adbd98fea\") " pod="openshift-marketplace/redhat-operators-nnnhv"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.390956 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgr6w\" (UniqueName: \"kubernetes.io/projected/75cb0552-9eb8-44a6-9dc5-f40adbd98fea-kube-api-access-rgr6w\") pod \"redhat-operators-nnnhv\" (UID: \"75cb0552-9eb8-44a6-9dc5-f40adbd98fea\") " pod="openshift-marketplace/redhat-operators-nnnhv"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.391369 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75cb0552-9eb8-44a6-9dc5-f40adbd98fea-catalog-content\") pod \"redhat-operators-nnnhv\" (UID: \"75cb0552-9eb8-44a6-9dc5-f40adbd98fea\") " pod="openshift-marketplace/redhat-operators-nnnhv"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.391416 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75cb0552-9eb8-44a6-9dc5-f40adbd98fea-utilities\") pod \"redhat-operators-nnnhv\" (UID: \"75cb0552-9eb8-44a6-9dc5-f40adbd98fea\") " pod="openshift-marketplace/redhat-operators-nnnhv"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.392104 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75cb0552-9eb8-44a6-9dc5-f40adbd98fea-utilities\") pod \"redhat-operators-nnnhv\" (UID: \"75cb0552-9eb8-44a6-9dc5-f40adbd98fea\") " pod="openshift-marketplace/redhat-operators-nnnhv"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.392195 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75cb0552-9eb8-44a6-9dc5-f40adbd98fea-catalog-content\") pod \"redhat-operators-nnnhv\" (UID: \"75cb0552-9eb8-44a6-9dc5-f40adbd98fea\") " pod="openshift-marketplace/redhat-operators-nnnhv"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.424068 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgr6w\" (UniqueName: \"kubernetes.io/projected/75cb0552-9eb8-44a6-9dc5-f40adbd98fea-kube-api-access-rgr6w\") pod \"redhat-operators-nnnhv\" (UID: \"75cb0552-9eb8-44a6-9dc5-f40adbd98fea\") " pod="openshift-marketplace/redhat-operators-nnnhv"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.506961 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nnnhv"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.770895 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-p67pw"]
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.772712 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-p67pw"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.776508 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6zj4d"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.780115 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.816079 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-p67pw"]
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.901488 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af559844-10a7-49dd-8b8a-58885d2c2206-config-data\") pod \"glance-db-sync-p67pw\" (UID: \"af559844-10a7-49dd-8b8a-58885d2c2206\") " pod="openstack/glance-db-sync-p67pw"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.901551 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af559844-10a7-49dd-8b8a-58885d2c2206-db-sync-config-data\") pod \"glance-db-sync-p67pw\" (UID: \"af559844-10a7-49dd-8b8a-58885d2c2206\") " pod="openstack/glance-db-sync-p67pw"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.901590 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af559844-10a7-49dd-8b8a-58885d2c2206-combined-ca-bundle\") pod \"glance-db-sync-p67pw\" (UID: \"af559844-10a7-49dd-8b8a-58885d2c2206\") " pod="openstack/glance-db-sync-p67pw"
Dec 01 11:38:04 crc kubenswrapper[4958]: I1201 11:38:04.901758 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdr4k\" (UniqueName: \"kubernetes.io/projected/af559844-10a7-49dd-8b8a-58885d2c2206-kube-api-access-pdr4k\") pod \"glance-db-sync-p67pw\" (UID: \"af559844-10a7-49dd-8b8a-58885d2c2206\") " pod="openstack/glance-db-sync-p67pw"
Dec 01 11:38:05 crc kubenswrapper[4958]: I1201 11:38:05.003612 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdr4k\" (UniqueName: \"kubernetes.io/projected/af559844-10a7-49dd-8b8a-58885d2c2206-kube-api-access-pdr4k\") pod \"glance-db-sync-p67pw\" (UID: \"af559844-10a7-49dd-8b8a-58885d2c2206\") " pod="openstack/glance-db-sync-p67pw"
Dec 01 11:38:05 crc kubenswrapper[4958]: I1201 11:38:05.003923 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af559844-10a7-49dd-8b8a-58885d2c2206-config-data\") pod \"glance-db-sync-p67pw\" (UID: \"af559844-10a7-49dd-8b8a-58885d2c2206\") " pod="openstack/glance-db-sync-p67pw"
Dec 01 11:38:05 crc kubenswrapper[4958]: I1201 11:38:05.003951 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af559844-10a7-49dd-8b8a-58885d2c2206-db-sync-config-data\") pod \"glance-db-sync-p67pw\" (UID: \"af559844-10a7-49dd-8b8a-58885d2c2206\") " pod="openstack/glance-db-sync-p67pw"
\"glance-db-sync-p67pw\" (UID: \"af559844-10a7-49dd-8b8a-58885d2c2206\") " pod="openstack/glance-db-sync-p67pw" Dec 01 11:38:05 crc kubenswrapper[4958]: I1201 11:38:05.014194 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af559844-10a7-49dd-8b8a-58885d2c2206-combined-ca-bundle\") pod \"glance-db-sync-p67pw\" (UID: \"af559844-10a7-49dd-8b8a-58885d2c2206\") " pod="openstack/glance-db-sync-p67pw" Dec 01 11:38:05 crc kubenswrapper[4958]: I1201 11:38:05.014938 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nnnhv"] Dec 01 11:38:05 crc kubenswrapper[4958]: I1201 11:38:05.015778 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af559844-10a7-49dd-8b8a-58885d2c2206-config-data\") pod \"glance-db-sync-p67pw\" (UID: \"af559844-10a7-49dd-8b8a-58885d2c2206\") " pod="openstack/glance-db-sync-p67pw" Dec 01 11:38:05 crc kubenswrapper[4958]: I1201 11:38:05.019650 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af559844-10a7-49dd-8b8a-58885d2c2206-db-sync-config-data\") pod \"glance-db-sync-p67pw\" (UID: \"af559844-10a7-49dd-8b8a-58885d2c2206\") " pod="openstack/glance-db-sync-p67pw" Dec 01 11:38:05 crc kubenswrapper[4958]: I1201 11:38:05.023159 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdr4k\" (UniqueName: \"kubernetes.io/projected/af559844-10a7-49dd-8b8a-58885d2c2206-kube-api-access-pdr4k\") pod \"glance-db-sync-p67pw\" (UID: \"af559844-10a7-49dd-8b8a-58885d2c2206\") " pod="openstack/glance-db-sync-p67pw" Dec 01 11:38:05 crc kubenswrapper[4958]: W1201 11:38:05.028260 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75cb0552_9eb8_44a6_9dc5_f40adbd98fea.slice/crio-e68300ff71fb4574f79db1bf21252d13ab224ed7802cd459a0d315aaefc79157 WatchSource:0}: Error finding container e68300ff71fb4574f79db1bf21252d13ab224ed7802cd459a0d315aaefc79157: Status 404 returned error can't find the container with id e68300ff71fb4574f79db1bf21252d13ab224ed7802cd459a0d315aaefc79157 Dec 01 11:38:05 crc kubenswrapper[4958]: I1201 11:38:05.100837 4958 util.go:30] "No sandbox for pod can be found. 
Dec 01 11:38:05 crc kubenswrapper[4958]: I1201 11:38:05.100837 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-p67pw"
Dec 01 11:38:05 crc kubenswrapper[4958]: I1201 11:38:05.664318 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-p67pw"]
Dec 01 11:38:05 crc kubenswrapper[4958]: I1201 11:38:05.809545 4958 generic.go:334] "Generic (PLEG): container finished" podID="75cb0552-9eb8-44a6-9dc5-f40adbd98fea" containerID="a1f125c4483c083dab1995b742d8a68fda233f583a62155b34c8d85b6dad6df9" exitCode=0
Dec 01 11:38:05 crc kubenswrapper[4958]: I1201 11:38:05.811594 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 11:38:05 crc kubenswrapper[4958]: I1201 11:38:05.816528 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnhv" event={"ID":"75cb0552-9eb8-44a6-9dc5-f40adbd98fea","Type":"ContainerDied","Data":"a1f125c4483c083dab1995b742d8a68fda233f583a62155b34c8d85b6dad6df9"}
Dec 01 11:38:05 crc kubenswrapper[4958]: I1201 11:38:05.816567 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnhv" event={"ID":"75cb0552-9eb8-44a6-9dc5-f40adbd98fea","Type":"ContainerStarted","Data":"e68300ff71fb4574f79db1bf21252d13ab224ed7802cd459a0d315aaefc79157"}
Dec 01 11:38:05 crc kubenswrapper[4958]: I1201 11:38:05.816582 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-p67pw" event={"ID":"af559844-10a7-49dd-8b8a-58885d2c2206","Type":"ContainerStarted","Data":"f9b51bd6d9625402dcdf155dd021da384c4a81aad6774470a271181aa28b709d"}
Dec 01 11:38:06 crc kubenswrapper[4958]: I1201 11:38:06.823379 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-p67pw" event={"ID":"af559844-10a7-49dd-8b8a-58885d2c2206","Type":"ContainerStarted","Data":"606db69884e0cf0fb653ac2576d29d94f1bc4d8352295954366cc46f9c689c83"}
Dec 01 11:38:06 crc kubenswrapper[4958]: I1201 11:38:06.838278 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-p67pw" podStartSLOduration=2.838262123 podStartE2EDuration="2.838262123s" podCreationTimestamp="2025-12-01 11:38:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:38:06.835893226 +0000 UTC m=+5934.344682263" watchObservedRunningTime="2025-12-01 11:38:06.838262123 +0000 UTC m=+5934.347051160"
Dec 01 11:38:07 crc kubenswrapper[4958]: I1201 11:38:07.835568 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnhv" event={"ID":"75cb0552-9eb8-44a6-9dc5-f40adbd98fea","Type":"ContainerStarted","Data":"60161fcad07e187aa82a81de05feef6828c44b8fd9103ce3af142a6cb0917962"}
Dec 01 11:38:08 crc kubenswrapper[4958]: I1201 11:38:08.852644 4958 generic.go:334] "Generic (PLEG): container finished" podID="75cb0552-9eb8-44a6-9dc5-f40adbd98fea" containerID="60161fcad07e187aa82a81de05feef6828c44b8fd9103ce3af142a6cb0917962" exitCode=0
Dec 01 11:38:08 crc kubenswrapper[4958]: I1201 11:38:08.852703 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnhv" event={"ID":"75cb0552-9eb8-44a6-9dc5-f40adbd98fea","Type":"ContainerDied","Data":"60161fcad07e187aa82a81de05feef6828c44b8fd9103ce3af142a6cb0917962"}
Dec 01 11:38:09 crc kubenswrapper[4958]: I1201 11:38:09.797578 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc"
kubenswrapper[4958]: E1201 11:38:09.798129 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:38:09 crc kubenswrapper[4958]: I1201 11:38:09.871130 4958 generic.go:334] "Generic (PLEG): container finished" podID="af559844-10a7-49dd-8b8a-58885d2c2206" containerID="606db69884e0cf0fb653ac2576d29d94f1bc4d8352295954366cc46f9c689c83" exitCode=0 Dec 01 11:38:09 crc kubenswrapper[4958]: I1201 11:38:09.872130 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-p67pw" event={"ID":"af559844-10a7-49dd-8b8a-58885d2c2206","Type":"ContainerDied","Data":"606db69884e0cf0fb653ac2576d29d94f1bc4d8352295954366cc46f9c689c83"} Dec 01 11:38:09 crc kubenswrapper[4958]: I1201 11:38:09.875657 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnhv" event={"ID":"75cb0552-9eb8-44a6-9dc5-f40adbd98fea","Type":"ContainerStarted","Data":"bdec96ba5c5927f6715fe28893980ac515ae52e4c7cad30883b3f27c74f6ebe3"} Dec 01 11:38:11 crc kubenswrapper[4958]: I1201 11:38:11.360163 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-p67pw" Dec 01 11:38:11 crc kubenswrapper[4958]: I1201 11:38:11.380529 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nnnhv" podStartSLOduration=3.814063667 podStartE2EDuration="7.380509233s" podCreationTimestamp="2025-12-01 11:38:04 +0000 UTC" firstStartedPulling="2025-12-01 11:38:05.811200176 +0000 UTC m=+5933.319989213" lastFinishedPulling="2025-12-01 11:38:09.377645702 +0000 UTC m=+5936.886434779" observedRunningTime="2025-12-01 11:38:09.931786101 +0000 UTC m=+5937.440575178" watchObservedRunningTime="2025-12-01 11:38:11.380509233 +0000 UTC m=+5938.889298280" Dec 01 11:38:11 crc kubenswrapper[4958]: I1201 11:38:11.561389 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af559844-10a7-49dd-8b8a-58885d2c2206-config-data\") pod \"af559844-10a7-49dd-8b8a-58885d2c2206\" (UID: \"af559844-10a7-49dd-8b8a-58885d2c2206\") " Dec 01 11:38:11 crc kubenswrapper[4958]: I1201 11:38:11.561496 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af559844-10a7-49dd-8b8a-58885d2c2206-db-sync-config-data\") pod \"af559844-10a7-49dd-8b8a-58885d2c2206\" (UID: \"af559844-10a7-49dd-8b8a-58885d2c2206\") " Dec 01 11:38:11 crc kubenswrapper[4958]: I1201 11:38:11.561609 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af559844-10a7-49dd-8b8a-58885d2c2206-combined-ca-bundle\") pod \"af559844-10a7-49dd-8b8a-58885d2c2206\" (UID: \"af559844-10a7-49dd-8b8a-58885d2c2206\") " Dec 01 11:38:11 crc kubenswrapper[4958]: I1201 11:38:11.561631 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdr4k\" (UniqueName: \"kubernetes.io/projected/af559844-10a7-49dd-8b8a-58885d2c2206-kube-api-access-pdr4k\") pod \"af559844-10a7-49dd-8b8a-58885d2c2206\" 
(UID: \"af559844-10a7-49dd-8b8a-58885d2c2206\") " Dec 01 11:38:11 crc kubenswrapper[4958]: I1201 11:38:11.568422 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af559844-10a7-49dd-8b8a-58885d2c2206-kube-api-access-pdr4k" (OuterVolumeSpecName: "kube-api-access-pdr4k") pod "af559844-10a7-49dd-8b8a-58885d2c2206" (UID: "af559844-10a7-49dd-8b8a-58885d2c2206"). InnerVolumeSpecName "kube-api-access-pdr4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:38:11 crc kubenswrapper[4958]: I1201 11:38:11.576157 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af559844-10a7-49dd-8b8a-58885d2c2206-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "af559844-10a7-49dd-8b8a-58885d2c2206" (UID: "af559844-10a7-49dd-8b8a-58885d2c2206"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:38:11 crc kubenswrapper[4958]: I1201 11:38:11.600114 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af559844-10a7-49dd-8b8a-58885d2c2206-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af559844-10a7-49dd-8b8a-58885d2c2206" (UID: "af559844-10a7-49dd-8b8a-58885d2c2206"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:38:11 crc kubenswrapper[4958]: I1201 11:38:11.614561 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af559844-10a7-49dd-8b8a-58885d2c2206-config-data" (OuterVolumeSpecName: "config-data") pod "af559844-10a7-49dd-8b8a-58885d2c2206" (UID: "af559844-10a7-49dd-8b8a-58885d2c2206"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:38:11 crc kubenswrapper[4958]: I1201 11:38:11.664038 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af559844-10a7-49dd-8b8a-58885d2c2206-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:11 crc kubenswrapper[4958]: I1201 11:38:11.664079 4958 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af559844-10a7-49dd-8b8a-58885d2c2206-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:11 crc kubenswrapper[4958]: I1201 11:38:11.664094 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af559844-10a7-49dd-8b8a-58885d2c2206-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:11 crc kubenswrapper[4958]: I1201 11:38:11.664106 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdr4k\" (UniqueName: \"kubernetes.io/projected/af559844-10a7-49dd-8b8a-58885d2c2206-kube-api-access-pdr4k\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:11 crc kubenswrapper[4958]: I1201 11:38:11.895092 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-p67pw" event={"ID":"af559844-10a7-49dd-8b8a-58885d2c2206","Type":"ContainerDied","Data":"f9b51bd6d9625402dcdf155dd021da384c4a81aad6774470a271181aa28b709d"} Dec 01 11:38:11 crc kubenswrapper[4958]: I1201 11:38:11.895142 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9b51bd6d9625402dcdf155dd021da384c4a81aad6774470a271181aa28b709d" Dec 01 11:38:11 crc kubenswrapper[4958]: I1201 11:38:11.895149 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-p67pw" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.243102 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 11:38:12 crc kubenswrapper[4958]: E1201 11:38:12.243479 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af559844-10a7-49dd-8b8a-58885d2c2206" containerName="glance-db-sync" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.243495 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="af559844-10a7-49dd-8b8a-58885d2c2206" containerName="glance-db-sync" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.243670 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="af559844-10a7-49dd-8b8a-58885d2c2206" containerName="glance-db-sync" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.244607 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.252688 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.252714 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.253649 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6zj4d" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.260912 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.325416 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.377064 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-config-data\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.377157 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-scripts\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.378140 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-logs\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.378234 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.378300 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.378404 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq7g5\" (UniqueName: \"kubernetes.io/projected/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-kube-api-access-lq7g5\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.378433 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-ceph\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.378427 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5798555d65-82ng2"] Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.380409 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.401043 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5798555d65-82ng2"] Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.480023 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-config-data\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.480092 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-scripts\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.480248 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-logs\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.480303 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.480350 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.480400 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq7g5\" (UniqueName: \"kubernetes.io/projected/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-kube-api-access-lq7g5\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.480421 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-ceph\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.480891 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-logs\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.481715 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.484061 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-ceph\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.484470 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-config-data\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.499603 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-scripts\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.500747 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.502398 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.513144 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.513415 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.517190 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq7g5\" (UniqueName: \"kubernetes.io/projected/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-kube-api-access-lq7g5\") pod \"glance-default-external-api-0\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.534919 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.566087 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.583338 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-dns-svc\") pod \"dnsmasq-dns-5798555d65-82ng2\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.583410 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6dxt\" (UniqueName: \"kubernetes.io/projected/083b940b-03ea-4fb2-a422-a417e6d0db13-kube-api-access-v6dxt\") pod \"dnsmasq-dns-5798555d65-82ng2\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.583604 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-ovsdbserver-sb\") pod \"dnsmasq-dns-5798555d65-82ng2\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.583749 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-config\") pod \"dnsmasq-dns-5798555d65-82ng2\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.583779 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2gtl\" (UniqueName: \"kubernetes.io/projected/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-kube-api-access-x2gtl\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.583821 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-ovsdbserver-nb\") pod \"dnsmasq-dns-5798555d65-82ng2\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.583843 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.583900 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.583937 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.583972 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.583999 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.584045 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.684724 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.685131 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.685164 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.685192 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.685215 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.685246 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-combined-ca-bundle\") 
pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.685296 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-dns-svc\") pod \"dnsmasq-dns-5798555d65-82ng2\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.685328 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6dxt\" (UniqueName: \"kubernetes.io/projected/083b940b-03ea-4fb2-a422-a417e6d0db13-kube-api-access-v6dxt\") pod \"dnsmasq-dns-5798555d65-82ng2\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.685374 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-ovsdbserver-sb\") pod \"dnsmasq-dns-5798555d65-82ng2\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.685434 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-config\") pod \"dnsmasq-dns-5798555d65-82ng2\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.685458 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2gtl\" (UniqueName: \"kubernetes.io/projected/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-kube-api-access-x2gtl\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.685488 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-ovsdbserver-nb\") pod \"dnsmasq-dns-5798555d65-82ng2\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.686507 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-ovsdbserver-nb\") pod \"dnsmasq-dns-5798555d65-82ng2\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.687456 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-logs\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.689164 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-ovsdbserver-sb\") pod \"dnsmasq-dns-5798555d65-82ng2\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " 
pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.689741 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-dns-svc\") pod \"dnsmasq-dns-5798555d65-82ng2\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.692408 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.693857 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-config\") pod \"dnsmasq-dns-5798555d65-82ng2\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.696661 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.715574 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.717435 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.726112 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.729880 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2gtl\" (UniqueName: \"kubernetes.io/projected/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-kube-api-access-x2gtl\") pod \"glance-default-internal-api-0\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.733807 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6dxt\" (UniqueName: \"kubernetes.io/projected/083b940b-03ea-4fb2-a422-a417e6d0db13-kube-api-access-v6dxt\") pod \"dnsmasq-dns-5798555d65-82ng2\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:12 crc kubenswrapper[4958]: I1201 11:38:12.874288 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 11:38:13 crc kubenswrapper[4958]: I1201 11:38:13.007632 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:13 crc kubenswrapper[4958]: I1201 11:38:13.261067 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 11:38:13 crc kubenswrapper[4958]: W1201 11:38:13.267919 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf5cf244_4d1e_4aae_b550_ae0489ea4d56.slice/crio-d9a0f3753e5078262c19d0f862368bc5461389e642f215a34e8ac89b475ae90c WatchSource:0}: Error finding container d9a0f3753e5078262c19d0f862368bc5461389e642f215a34e8ac89b475ae90c: Status 404 returned error can't find the container with id d9a0f3753e5078262c19d0f862368bc5461389e642f215a34e8ac89b475ae90c Dec 01 11:38:13 crc kubenswrapper[4958]: I1201 11:38:13.514776 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 11:38:13 crc kubenswrapper[4958]: I1201 11:38:13.584191 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5798555d65-82ng2"] Dec 01 11:38:13 crc kubenswrapper[4958]: I1201 11:38:13.603035 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 11:38:13 crc kubenswrapper[4958]: I1201 11:38:13.927928 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf5cf244-4d1e-4aae-b550-ae0489ea4d56","Type":"ContainerStarted","Data":"d9a0f3753e5078262c19d0f862368bc5461389e642f215a34e8ac89b475ae90c"} Dec 01 11:38:13 crc kubenswrapper[4958]: I1201 11:38:13.929506 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5798555d65-82ng2" event={"ID":"083b940b-03ea-4fb2-a422-a417e6d0db13","Type":"ContainerStarted","Data":"3753374131ba58ccdf54ca61afac9fd1cb09f8fa97af5768d3da1991f9ce67d2"} Dec 01 11:38:13 crc kubenswrapper[4958]: I1201 11:38:13.930754 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f","Type":"ContainerStarted","Data":"b9ca6eb156bc8890f88c580b77905f0d87fe9d19ca5cafe2cdd2c30cc883d32d"} Dec 01 11:38:14 crc kubenswrapper[4958]: I1201 11:38:14.507125 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nnnhv" Dec 01 11:38:14 crc kubenswrapper[4958]: I1201 11:38:14.507484 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nnnhv" Dec 01 11:38:14 crc kubenswrapper[4958]: E1201 11:38:14.715361 4958 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod083b940b_03ea_4fb2_a422_a417e6d0db13.slice/crio-conmon-3c2984e8562709643a675bba9f44e52a97a7a83b4afc9e478950de706826d8a7.scope\": RecentStats: unable to find data in memory cache]" Dec 01 11:38:14 crc kubenswrapper[4958]: I1201 11:38:14.943661 4958 generic.go:334] "Generic (PLEG): container finished" podID="083b940b-03ea-4fb2-a422-a417e6d0db13" containerID="3c2984e8562709643a675bba9f44e52a97a7a83b4afc9e478950de706826d8a7" exitCode=0 Dec 01 11:38:14 crc kubenswrapper[4958]: I1201 11:38:14.943764 4958 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-5798555d65-82ng2" event={"ID":"083b940b-03ea-4fb2-a422-a417e6d0db13","Type":"ContainerDied","Data":"3c2984e8562709643a675bba9f44e52a97a7a83b4afc9e478950de706826d8a7"} Dec 01 11:38:14 crc kubenswrapper[4958]: I1201 11:38:14.950370 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f","Type":"ContainerStarted","Data":"f682f0274cf245f42d7b6a6ce35771aed8ffb80abe97aec74865ac773ebe6509"} Dec 01 11:38:14 crc kubenswrapper[4958]: I1201 11:38:14.951832 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf5cf244-4d1e-4aae-b550-ae0489ea4d56","Type":"ContainerStarted","Data":"3b4f139e7ccc7b9704f4835f0698b7c87f5a06a82723fe0fdb922c3a0e1f1b7f"} Dec 01 11:38:15 crc kubenswrapper[4958]: I1201 11:38:15.561627 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nnnhv" podUID="75cb0552-9eb8-44a6-9dc5-f40adbd98fea" containerName="registry-server" probeResult="failure" output=< Dec 01 11:38:15 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s Dec 01 11:38:15 crc kubenswrapper[4958]: > Dec 01 11:38:15 crc kubenswrapper[4958]: I1201 11:38:15.673377 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 11:38:15 crc kubenswrapper[4958]: I1201 11:38:15.961196 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f","Type":"ContainerStarted","Data":"6ec371b1586a2bbf1edaf1c24d405233cfd1ef703703af8cfb74e50f97b68434"} Dec 01 11:38:15 crc kubenswrapper[4958]: I1201 11:38:15.961259 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f" containerName="glance-log" containerID="cri-o://f682f0274cf245f42d7b6a6ce35771aed8ffb80abe97aec74865ac773ebe6509" gracePeriod=30 Dec 01 11:38:15 crc kubenswrapper[4958]: I1201 11:38:15.961350 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f" containerName="glance-httpd" containerID="cri-o://6ec371b1586a2bbf1edaf1c24d405233cfd1ef703703af8cfb74e50f97b68434" gracePeriod=30 Dec 01 11:38:15 crc kubenswrapper[4958]: I1201 11:38:15.963429 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf5cf244-4d1e-4aae-b550-ae0489ea4d56","Type":"ContainerStarted","Data":"ea6dd1db15e08d63427524b9907cec86d72640cb2c35f33e220b623b7b61af1c"} Dec 01 11:38:15 crc kubenswrapper[4958]: I1201 11:38:15.963603 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bf5cf244-4d1e-4aae-b550-ae0489ea4d56" containerName="glance-log" containerID="cri-o://3b4f139e7ccc7b9704f4835f0698b7c87f5a06a82723fe0fdb922c3a0e1f1b7f" gracePeriod=30 Dec 01 11:38:15 crc kubenswrapper[4958]: I1201 11:38:15.963700 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bf5cf244-4d1e-4aae-b550-ae0489ea4d56" containerName="glance-httpd" containerID="cri-o://ea6dd1db15e08d63427524b9907cec86d72640cb2c35f33e220b623b7b61af1c" gracePeriod=30 Dec 01 11:38:15 crc kubenswrapper[4958]: I1201 
11:38:15.966994 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5798555d65-82ng2" event={"ID":"083b940b-03ea-4fb2-a422-a417e6d0db13","Type":"ContainerStarted","Data":"857bd95181ab17cba496f4289d4b22e44e9e1854b6307800b95828ced5eb4675"} Dec 01 11:38:15 crc kubenswrapper[4958]: I1201 11:38:15.967467 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:15 crc kubenswrapper[4958]: I1201 11:38:15.992382 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.992359684 podStartE2EDuration="3.992359684s" podCreationTimestamp="2025-12-01 11:38:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:38:15.989233915 +0000 UTC m=+5943.498022952" watchObservedRunningTime="2025-12-01 11:38:15.992359684 +0000 UTC m=+5943.501148721" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.015210 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5798555d65-82ng2" podStartSLOduration=4.015191261 podStartE2EDuration="4.015191261s" podCreationTimestamp="2025-12-01 11:38:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:38:16.009183791 +0000 UTC m=+5943.517972838" watchObservedRunningTime="2025-12-01 11:38:16.015191261 +0000 UTC m=+5943.523980298" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.044852 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.0448224 podStartE2EDuration="4.0448224s" podCreationTimestamp="2025-12-01 11:38:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:38:16.030287568 +0000 UTC m=+5943.539076605" watchObservedRunningTime="2025-12-01 11:38:16.0448224 +0000 UTC m=+5943.553611437" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.495978 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.508153 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-combined-ca-bundle\") pod \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.508250 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-ceph\") pod \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.508331 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-httpd-run\") pod \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.508400 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-logs\") pod \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.508502 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-config-data\") pod \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.508567 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2gtl\" (UniqueName: \"kubernetes.io/projected/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-kube-api-access-x2gtl\") pod \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.508639 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-scripts\") pod \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\" (UID: \"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f\") " Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.508911 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f" (UID: "8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.509214 4958 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.509768 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-logs" (OuterVolumeSpecName: "logs") pod "8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f" (UID: "8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.515581 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-scripts" (OuterVolumeSpecName: "scripts") pod "8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f" (UID: "8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.515414 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-ceph" (OuterVolumeSpecName: "ceph") pod "8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f" (UID: "8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.533296 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-kube-api-access-x2gtl" (OuterVolumeSpecName: "kube-api-access-x2gtl") pod "8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f" (UID: "8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f"). InnerVolumeSpecName "kube-api-access-x2gtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.550881 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f" (UID: "8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.563181 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-config-data" (OuterVolumeSpecName: "config-data") pod "8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f" (UID: "8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.627201 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-logs\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.627248 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.627263 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2gtl\" (UniqueName: \"kubernetes.io/projected/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-kube-api-access-x2gtl\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.627275 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.627286 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.627297 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.653887 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.728831 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-scripts\") pod \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.728971 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-logs\") pod \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.729004 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-combined-ca-bundle\") pod \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.729034 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-ceph\") pod \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.729061 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-config-data\") pod \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " Dec 01 11:38:16 crc kubenswrapper[4958]: 
I1201 11:38:16.729178 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-httpd-run\") pod \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.729231 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq7g5\" (UniqueName: \"kubernetes.io/projected/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-kube-api-access-lq7g5\") pod \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\" (UID: \"bf5cf244-4d1e-4aae-b550-ae0489ea4d56\") " Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.729528 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-logs" (OuterVolumeSpecName: "logs") pod "bf5cf244-4d1e-4aae-b550-ae0489ea4d56" (UID: "bf5cf244-4d1e-4aae-b550-ae0489ea4d56"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.729711 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bf5cf244-4d1e-4aae-b550-ae0489ea4d56" (UID: "bf5cf244-4d1e-4aae-b550-ae0489ea4d56"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.729936 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-logs\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.729956 4958 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.732273 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-scripts" (OuterVolumeSpecName: "scripts") pod "bf5cf244-4d1e-4aae-b550-ae0489ea4d56" (UID: "bf5cf244-4d1e-4aae-b550-ae0489ea4d56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.732823 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-kube-api-access-lq7g5" (OuterVolumeSpecName: "kube-api-access-lq7g5") pod "bf5cf244-4d1e-4aae-b550-ae0489ea4d56" (UID: "bf5cf244-4d1e-4aae-b550-ae0489ea4d56"). InnerVolumeSpecName "kube-api-access-lq7g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.751586 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf5cf244-4d1e-4aae-b550-ae0489ea4d56" (UID: "bf5cf244-4d1e-4aae-b550-ae0489ea4d56"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.752623 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-ceph" (OuterVolumeSpecName: "ceph") pod "bf5cf244-4d1e-4aae-b550-ae0489ea4d56" (UID: "bf5cf244-4d1e-4aae-b550-ae0489ea4d56"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.790443 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-config-data" (OuterVolumeSpecName: "config-data") pod "bf5cf244-4d1e-4aae-b550-ae0489ea4d56" (UID: "bf5cf244-4d1e-4aae-b550-ae0489ea4d56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.832445 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq7g5\" (UniqueName: \"kubernetes.io/projected/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-kube-api-access-lq7g5\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.832963 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.833015 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.833034 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.833047 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5cf244-4d1e-4aae-b550-ae0489ea4d56-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.979720 4958 generic.go:334] "Generic (PLEG): container finished" podID="8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f" containerID="6ec371b1586a2bbf1edaf1c24d405233cfd1ef703703af8cfb74e50f97b68434" exitCode=143 Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.979758 4958 generic.go:334] "Generic (PLEG): container finished" podID="8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f" containerID="f682f0274cf245f42d7b6a6ce35771aed8ffb80abe97aec74865ac773ebe6509" exitCode=143 Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.979771 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.979810 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f","Type":"ContainerDied","Data":"6ec371b1586a2bbf1edaf1c24d405233cfd1ef703703af8cfb74e50f97b68434"} Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.979863 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f","Type":"ContainerDied","Data":"f682f0274cf245f42d7b6a6ce35771aed8ffb80abe97aec74865ac773ebe6509"} Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.979884 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f","Type":"ContainerDied","Data":"b9ca6eb156bc8890f88c580b77905f0d87fe9d19ca5cafe2cdd2c30cc883d32d"} Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.979908 4958 scope.go:117] "RemoveContainer" containerID="6ec371b1586a2bbf1edaf1c24d405233cfd1ef703703af8cfb74e50f97b68434" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.984684 4958 generic.go:334] "Generic (PLEG): container finished" podID="bf5cf244-4d1e-4aae-b550-ae0489ea4d56" containerID="ea6dd1db15e08d63427524b9907cec86d72640cb2c35f33e220b623b7b61af1c" exitCode=0 Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.984744 4958 generic.go:334] "Generic (PLEG): container finished" podID="bf5cf244-4d1e-4aae-b550-ae0489ea4d56" containerID="3b4f139e7ccc7b9704f4835f0698b7c87f5a06a82723fe0fdb922c3a0e1f1b7f" exitCode=143 Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.985253 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.986280 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf5cf244-4d1e-4aae-b550-ae0489ea4d56","Type":"ContainerDied","Data":"ea6dd1db15e08d63427524b9907cec86d72640cb2c35f33e220b623b7b61af1c"} Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.986350 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf5cf244-4d1e-4aae-b550-ae0489ea4d56","Type":"ContainerDied","Data":"3b4f139e7ccc7b9704f4835f0698b7c87f5a06a82723fe0fdb922c3a0e1f1b7f"} Dec 01 11:38:16 crc kubenswrapper[4958]: I1201 11:38:16.986382 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf5cf244-4d1e-4aae-b550-ae0489ea4d56","Type":"ContainerDied","Data":"d9a0f3753e5078262c19d0f862368bc5461389e642f215a34e8ac89b475ae90c"} Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.035117 4958 scope.go:117] "RemoveContainer" containerID="f682f0274cf245f42d7b6a6ce35771aed8ffb80abe97aec74865ac773ebe6509" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.035489 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.069784 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.079012 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.081059 4958 scope.go:117] "RemoveContainer" containerID="6ec371b1586a2bbf1edaf1c24d405233cfd1ef703703af8cfb74e50f97b68434" Dec 01 11:38:17 crc kubenswrapper[4958]: E1201 11:38:17.083207 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ec371b1586a2bbf1edaf1c24d405233cfd1ef703703af8cfb74e50f97b68434\": container with ID starting with 6ec371b1586a2bbf1edaf1c24d405233cfd1ef703703af8cfb74e50f97b68434 not found: ID does not exist" containerID="6ec371b1586a2bbf1edaf1c24d405233cfd1ef703703af8cfb74e50f97b68434" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.083248 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ec371b1586a2bbf1edaf1c24d405233cfd1ef703703af8cfb74e50f97b68434"} err="failed to get container status \"6ec371b1586a2bbf1edaf1c24d405233cfd1ef703703af8cfb74e50f97b68434\": rpc error: code = NotFound desc = could not find container \"6ec371b1586a2bbf1edaf1c24d405233cfd1ef703703af8cfb74e50f97b68434\": container with ID starting with 6ec371b1586a2bbf1edaf1c24d405233cfd1ef703703af8cfb74e50f97b68434 not found: ID does not exist" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.083274 4958 scope.go:117] "RemoveContainer" containerID="f682f0274cf245f42d7b6a6ce35771aed8ffb80abe97aec74865ac773ebe6509" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.084592 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 11:38:17 crc kubenswrapper[4958]: E1201 11:38:17.084606 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f682f0274cf245f42d7b6a6ce35771aed8ffb80abe97aec74865ac773ebe6509\": container with ID starting with 
f682f0274cf245f42d7b6a6ce35771aed8ffb80abe97aec74865ac773ebe6509 not found: ID does not exist" containerID="f682f0274cf245f42d7b6a6ce35771aed8ffb80abe97aec74865ac773ebe6509" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.084653 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f682f0274cf245f42d7b6a6ce35771aed8ffb80abe97aec74865ac773ebe6509"} err="failed to get container status \"f682f0274cf245f42d7b6a6ce35771aed8ffb80abe97aec74865ac773ebe6509\": rpc error: code = NotFound desc = could not find container \"f682f0274cf245f42d7b6a6ce35771aed8ffb80abe97aec74865ac773ebe6509\": container with ID starting with f682f0274cf245f42d7b6a6ce35771aed8ffb80abe97aec74865ac773ebe6509 not found: ID does not exist" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.084694 4958 scope.go:117] "RemoveContainer" containerID="6ec371b1586a2bbf1edaf1c24d405233cfd1ef703703af8cfb74e50f97b68434" Dec 01 11:38:17 crc kubenswrapper[4958]: E1201 11:38:17.085128 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f" containerName="glance-httpd" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.085152 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f" containerName="glance-httpd" Dec 01 11:38:17 crc kubenswrapper[4958]: E1201 11:38:17.085180 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5cf244-4d1e-4aae-b550-ae0489ea4d56" containerName="glance-log" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.085193 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5cf244-4d1e-4aae-b550-ae0489ea4d56" containerName="glance-log" Dec 01 11:38:17 crc kubenswrapper[4958]: E1201 11:38:17.085218 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5cf244-4d1e-4aae-b550-ae0489ea4d56" containerName="glance-httpd" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.085226 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5cf244-4d1e-4aae-b550-ae0489ea4d56" containerName="glance-httpd" Dec 01 11:38:17 crc kubenswrapper[4958]: E1201 11:38:17.085248 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f" containerName="glance-log" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.085259 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f" containerName="glance-log" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.085513 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf5cf244-4d1e-4aae-b550-ae0489ea4d56" containerName="glance-log" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.085541 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf5cf244-4d1e-4aae-b550-ae0489ea4d56" containerName="glance-httpd" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.085561 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f" containerName="glance-log" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.085576 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f" containerName="glance-httpd" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.086376 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ec371b1586a2bbf1edaf1c24d405233cfd1ef703703af8cfb74e50f97b68434"} err="failed to get 
container status \"6ec371b1586a2bbf1edaf1c24d405233cfd1ef703703af8cfb74e50f97b68434\": rpc error: code = NotFound desc = could not find container \"6ec371b1586a2bbf1edaf1c24d405233cfd1ef703703af8cfb74e50f97b68434\": container with ID starting with 6ec371b1586a2bbf1edaf1c24d405233cfd1ef703703af8cfb74e50f97b68434 not found: ID does not exist" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.086399 4958 scope.go:117] "RemoveContainer" containerID="f682f0274cf245f42d7b6a6ce35771aed8ffb80abe97aec74865ac773ebe6509" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.086812 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.087348 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f682f0274cf245f42d7b6a6ce35771aed8ffb80abe97aec74865ac773ebe6509"} err="failed to get container status \"f682f0274cf245f42d7b6a6ce35771aed8ffb80abe97aec74865ac773ebe6509\": rpc error: code = NotFound desc = could not find container \"f682f0274cf245f42d7b6a6ce35771aed8ffb80abe97aec74865ac773ebe6509\": container with ID starting with f682f0274cf245f42d7b6a6ce35771aed8ffb80abe97aec74865ac773ebe6509 not found: ID does not exist" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.087383 4958 scope.go:117] "RemoveContainer" containerID="ea6dd1db15e08d63427524b9907cec86d72640cb2c35f33e220b623b7b61af1c" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.088686 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6zj4d" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.088983 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.089211 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.089316 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.095461 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.106196 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.107803 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.112233 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.113675 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.124063 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.141717 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-ceph\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.141835 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-logs\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.142017 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-logs\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.142033 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-ceph\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.142055 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.142078 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.142130 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-config-data\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.142157 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.142181 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-scripts\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.142212 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.142426 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb26s\" (UniqueName: \"kubernetes.io/projected/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-kube-api-access-gb26s\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.142491 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpxzf\" (UniqueName: \"kubernetes.io/projected/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-kube-api-access-kpxzf\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.142555 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.142650 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.144362 4958 scope.go:117] "RemoveContainer" containerID="3b4f139e7ccc7b9704f4835f0698b7c87f5a06a82723fe0fdb922c3a0e1f1b7f" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.178725 4958 scope.go:117] "RemoveContainer" containerID="ea6dd1db15e08d63427524b9907cec86d72640cb2c35f33e220b623b7b61af1c" Dec 01 11:38:17 crc kubenswrapper[4958]: E1201 11:38:17.179689 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea6dd1db15e08d63427524b9907cec86d72640cb2c35f33e220b623b7b61af1c\": container with ID starting with ea6dd1db15e08d63427524b9907cec86d72640cb2c35f33e220b623b7b61af1c not found: ID does not exist" containerID="ea6dd1db15e08d63427524b9907cec86d72640cb2c35f33e220b623b7b61af1c" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.179723 4958 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea6dd1db15e08d63427524b9907cec86d72640cb2c35f33e220b623b7b61af1c"} err="failed to get container status \"ea6dd1db15e08d63427524b9907cec86d72640cb2c35f33e220b623b7b61af1c\": rpc error: code = NotFound desc = could not find container \"ea6dd1db15e08d63427524b9907cec86d72640cb2c35f33e220b623b7b61af1c\": container with ID starting with ea6dd1db15e08d63427524b9907cec86d72640cb2c35f33e220b623b7b61af1c not found: ID does not exist" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.179755 4958 scope.go:117] "RemoveContainer" containerID="3b4f139e7ccc7b9704f4835f0698b7c87f5a06a82723fe0fdb922c3a0e1f1b7f" Dec 01 11:38:17 crc kubenswrapper[4958]: E1201 11:38:17.180230 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b4f139e7ccc7b9704f4835f0698b7c87f5a06a82723fe0fdb922c3a0e1f1b7f\": container with ID starting with 3b4f139e7ccc7b9704f4835f0698b7c87f5a06a82723fe0fdb922c3a0e1f1b7f not found: ID does not exist" containerID="3b4f139e7ccc7b9704f4835f0698b7c87f5a06a82723fe0fdb922c3a0e1f1b7f" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.180285 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b4f139e7ccc7b9704f4835f0698b7c87f5a06a82723fe0fdb922c3a0e1f1b7f"} err="failed to get container status \"3b4f139e7ccc7b9704f4835f0698b7c87f5a06a82723fe0fdb922c3a0e1f1b7f\": rpc error: code = NotFound desc = could not find container \"3b4f139e7ccc7b9704f4835f0698b7c87f5a06a82723fe0fdb922c3a0e1f1b7f\": container with ID starting with 3b4f139e7ccc7b9704f4835f0698b7c87f5a06a82723fe0fdb922c3a0e1f1b7f not found: ID does not exist" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.180334 4958 scope.go:117] "RemoveContainer" containerID="ea6dd1db15e08d63427524b9907cec86d72640cb2c35f33e220b623b7b61af1c" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.180653 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea6dd1db15e08d63427524b9907cec86d72640cb2c35f33e220b623b7b61af1c"} err="failed to get container status \"ea6dd1db15e08d63427524b9907cec86d72640cb2c35f33e220b623b7b61af1c\": rpc error: code = NotFound desc = could not find container \"ea6dd1db15e08d63427524b9907cec86d72640cb2c35f33e220b623b7b61af1c\": container with ID starting with ea6dd1db15e08d63427524b9907cec86d72640cb2c35f33e220b623b7b61af1c not found: ID does not exist" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.180676 4958 scope.go:117] "RemoveContainer" containerID="3b4f139e7ccc7b9704f4835f0698b7c87f5a06a82723fe0fdb922c3a0e1f1b7f" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.180909 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b4f139e7ccc7b9704f4835f0698b7c87f5a06a82723fe0fdb922c3a0e1f1b7f"} err="failed to get container status \"3b4f139e7ccc7b9704f4835f0698b7c87f5a06a82723fe0fdb922c3a0e1f1b7f\": rpc error: code = NotFound desc = could not find container \"3b4f139e7ccc7b9704f4835f0698b7c87f5a06a82723fe0fdb922c3a0e1f1b7f\": container with ID starting with 3b4f139e7ccc7b9704f4835f0698b7c87f5a06a82723fe0fdb922c3a0e1f1b7f not found: ID does not exist" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.244648 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.244713 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-logs\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.244734 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-ceph\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.244757 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.244774 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.244837 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-config-data\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.244860 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.244893 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-scripts\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.244924 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.244963 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb26s\" (UniqueName: \"kubernetes.io/projected/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-kube-api-access-gb26s\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " 
pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.244988 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpxzf\" (UniqueName: \"kubernetes.io/projected/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-kube-api-access-kpxzf\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.245015 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.245065 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.245086 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-ceph\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.245211 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-logs\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.245211 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-logs\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.245946 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.246148 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.250208 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-ceph\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.250358 4958 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.250708 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.250924 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.251713 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-scripts\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.252265 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-ceph\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.252751 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.254753 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-config-data\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.262109 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb26s\" (UniqueName: \"kubernetes.io/projected/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-kube-api-access-gb26s\") pod \"glance-default-internal-api-0\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.262483 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpxzf\" (UniqueName: \"kubernetes.io/projected/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-kube-api-access-kpxzf\") pod \"glance-default-external-api-0\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.425474 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.439627 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.814393 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f" path="/var/lib/kubelet/pods/8ab45c7f-ffb2-44e2-8776-4b9e88f0fb7f/volumes" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.816142 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf5cf244-4d1e-4aae-b550-ae0489ea4d56" path="/var/lib/kubelet/pods/bf5cf244-4d1e-4aae-b550-ae0489ea4d56/volumes" Dec 01 11:38:17 crc kubenswrapper[4958]: I1201 11:38:17.996736 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 11:38:18 crc kubenswrapper[4958]: I1201 11:38:18.100165 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 11:38:18 crc kubenswrapper[4958]: W1201 11:38:18.103921 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod735d94c9_1a28_41d5_ad7b_ce9d247bdff7.slice/crio-57b7720b4b12028fa168d7cd91fdee6ee10528a389e585d003e6fc7872fdc9e6 WatchSource:0}: Error finding container 57b7720b4b12028fa168d7cd91fdee6ee10528a389e585d003e6fc7872fdc9e6: Status 404 returned error can't find the container with id 57b7720b4b12028fa168d7cd91fdee6ee10528a389e585d003e6fc7872fdc9e6 Dec 01 11:38:19 crc kubenswrapper[4958]: I1201 11:38:19.016447 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a","Type":"ContainerStarted","Data":"cbebfdae0b767fbdf2b8955ba768a720154f73d1ca704c4af17e9caebd607a3d"} Dec 01 11:38:19 crc kubenswrapper[4958]: I1201 11:38:19.016806 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a","Type":"ContainerStarted","Data":"0f24379cb866c41a59c9b03209dfdaffc7ba10779adb58e412a75fdb170ee1da"} Dec 01 11:38:19 crc kubenswrapper[4958]: I1201 11:38:19.025550 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"735d94c9-1a28-41d5-ad7b-ce9d247bdff7","Type":"ContainerStarted","Data":"efb05262b5b46e12a167da1ac18dcff088498492ba3a10622cd142ecf97d29db"} Dec 01 11:38:19 crc kubenswrapper[4958]: I1201 11:38:19.025597 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"735d94c9-1a28-41d5-ad7b-ce9d247bdff7","Type":"ContainerStarted","Data":"57b7720b4b12028fa168d7cd91fdee6ee10528a389e585d003e6fc7872fdc9e6"} Dec 01 11:38:20 crc kubenswrapper[4958]: I1201 11:38:20.038869 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"735d94c9-1a28-41d5-ad7b-ce9d247bdff7","Type":"ContainerStarted","Data":"f5d3d24e1bd96f4f58364299107b6a03f47077a72296b98f20be0f9b504b4317"} Dec 01 11:38:20 crc kubenswrapper[4958]: I1201 11:38:20.042609 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a","Type":"ContainerStarted","Data":"600ed33174aa32b2178325e633f1d913824ebccf2419cde4691c89758c3c6f3f"} Dec 01 11:38:20 crc 
kubenswrapper[4958]: I1201 11:38:20.059649 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.059632297 podStartE2EDuration="3.059632297s" podCreationTimestamp="2025-12-01 11:38:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:38:20.05727067 +0000 UTC m=+5947.566059727" watchObservedRunningTime="2025-12-01 11:38:20.059632297 +0000 UTC m=+5947.568421344" Dec 01 11:38:20 crc kubenswrapper[4958]: I1201 11:38:20.092981 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.092956051 podStartE2EDuration="3.092956051s" podCreationTimestamp="2025-12-01 11:38:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:38:20.0788158 +0000 UTC m=+5947.587604877" watchObservedRunningTime="2025-12-01 11:38:20.092956051 +0000 UTC m=+5947.601745108" Dec 01 11:38:23 crc kubenswrapper[4958]: I1201 11:38:23.010105 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:38:23 crc kubenswrapper[4958]: I1201 11:38:23.105566 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f6ccf868c-dvfgm"] Dec 01 11:38:23 crc kubenswrapper[4958]: I1201 11:38:23.105946 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm" podUID="468cc986-0b2c-4965-9e74-6d0b828a5001" containerName="dnsmasq-dns" containerID="cri-o://e398332e9682cf3fe35ab5e59943ed9934ba5cbc0b9d60d5b61a949e68e77e6e" gracePeriod=10 Dec 01 11:38:23 crc kubenswrapper[4958]: I1201 11:38:23.793890 4958 scope.go:117] "RemoveContainer" containerID="75c76a144b891b4118525d9449f2506d05a5237b3e9f32370f9898a7852ff3f7" Dec 01 11:38:23 crc kubenswrapper[4958]: I1201 11:38:23.844572 4958 scope.go:117] "RemoveContainer" containerID="cb5803023318113065b29f3d87e23b5836b4ab13d6acd9e4275542212a1e5e8b" Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.114357 4958 generic.go:334] "Generic (PLEG): container finished" podID="468cc986-0b2c-4965-9e74-6d0b828a5001" containerID="e398332e9682cf3fe35ab5e59943ed9934ba5cbc0b9d60d5b61a949e68e77e6e" exitCode=0 Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.114442 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm" event={"ID":"468cc986-0b2c-4965-9e74-6d0b828a5001","Type":"ContainerDied","Data":"e398332e9682cf3fe35ab5e59943ed9934ba5cbc0b9d60d5b61a949e68e77e6e"} Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.114475 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm" event={"ID":"468cc986-0b2c-4965-9e74-6d0b828a5001","Type":"ContainerDied","Data":"4eea2cfcbd97ed5d2299b524aae1bd4d7397fadae11fd6bf4464b9526f54f9d4"} Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.114489 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4eea2cfcbd97ed5d2299b524aae1bd4d7397fadae11fd6bf4464b9526f54f9d4" Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.141473 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm" Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.186699 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-ovsdbserver-sb\") pod \"468cc986-0b2c-4965-9e74-6d0b828a5001\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.186766 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-ovsdbserver-nb\") pod \"468cc986-0b2c-4965-9e74-6d0b828a5001\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.186884 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t6pw\" (UniqueName: \"kubernetes.io/projected/468cc986-0b2c-4965-9e74-6d0b828a5001-kube-api-access-5t6pw\") pod \"468cc986-0b2c-4965-9e74-6d0b828a5001\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.186959 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-dns-svc\") pod \"468cc986-0b2c-4965-9e74-6d0b828a5001\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.186994 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-config\") pod \"468cc986-0b2c-4965-9e74-6d0b828a5001\" (UID: \"468cc986-0b2c-4965-9e74-6d0b828a5001\") " Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.206266 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/468cc986-0b2c-4965-9e74-6d0b828a5001-kube-api-access-5t6pw" (OuterVolumeSpecName: "kube-api-access-5t6pw") pod "468cc986-0b2c-4965-9e74-6d0b828a5001" (UID: "468cc986-0b2c-4965-9e74-6d0b828a5001"). InnerVolumeSpecName "kube-api-access-5t6pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.251000 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "468cc986-0b2c-4965-9e74-6d0b828a5001" (UID: "468cc986-0b2c-4965-9e74-6d0b828a5001"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.251101 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "468cc986-0b2c-4965-9e74-6d0b828a5001" (UID: "468cc986-0b2c-4965-9e74-6d0b828a5001"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.270090 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "468cc986-0b2c-4965-9e74-6d0b828a5001" (UID: "468cc986-0b2c-4965-9e74-6d0b828a5001"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.272623 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-config" (OuterVolumeSpecName: "config") pod "468cc986-0b2c-4965-9e74-6d0b828a5001" (UID: "468cc986-0b2c-4965-9e74-6d0b828a5001"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.289025 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.289074 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-config\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.289089 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.289102 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/468cc986-0b2c-4965-9e74-6d0b828a5001-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.289115 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t6pw\" (UniqueName: \"kubernetes.io/projected/468cc986-0b2c-4965-9e74-6d0b828a5001-kube-api-access-5t6pw\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.584082 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nnnhv" Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.640239 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nnnhv" Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.798566 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:38:24 crc kubenswrapper[4958]: E1201 11:38:24.799562 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:38:24 crc kubenswrapper[4958]: I1201 11:38:24.827334 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nnnhv"] Dec 01 11:38:25 crc kubenswrapper[4958]: I1201 11:38:25.124169 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f6ccf868c-dvfgm" Dec 01 11:38:25 crc kubenswrapper[4958]: I1201 11:38:25.172432 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f6ccf868c-dvfgm"] Dec 01 11:38:25 crc kubenswrapper[4958]: I1201 11:38:25.180091 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f6ccf868c-dvfgm"] Dec 01 11:38:25 crc kubenswrapper[4958]: I1201 11:38:25.819344 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="468cc986-0b2c-4965-9e74-6d0b828a5001" path="/var/lib/kubelet/pods/468cc986-0b2c-4965-9e74-6d0b828a5001/volumes" Dec 01 11:38:26 crc kubenswrapper[4958]: I1201 11:38:26.139432 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nnnhv" podUID="75cb0552-9eb8-44a6-9dc5-f40adbd98fea" containerName="registry-server" containerID="cri-o://bdec96ba5c5927f6715fe28893980ac515ae52e4c7cad30883b3f27c74f6ebe3" gracePeriod=2 Dec 01 11:38:26 crc kubenswrapper[4958]: I1201 11:38:26.718166 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nnnhv" Dec 01 11:38:26 crc kubenswrapper[4958]: I1201 11:38:26.737379 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgr6w\" (UniqueName: \"kubernetes.io/projected/75cb0552-9eb8-44a6-9dc5-f40adbd98fea-kube-api-access-rgr6w\") pod \"75cb0552-9eb8-44a6-9dc5-f40adbd98fea\" (UID: \"75cb0552-9eb8-44a6-9dc5-f40adbd98fea\") " Dec 01 11:38:26 crc kubenswrapper[4958]: I1201 11:38:26.737445 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75cb0552-9eb8-44a6-9dc5-f40adbd98fea-utilities\") pod \"75cb0552-9eb8-44a6-9dc5-f40adbd98fea\" (UID: \"75cb0552-9eb8-44a6-9dc5-f40adbd98fea\") " Dec 01 11:38:26 crc kubenswrapper[4958]: I1201 11:38:26.737541 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75cb0552-9eb8-44a6-9dc5-f40adbd98fea-catalog-content\") pod \"75cb0552-9eb8-44a6-9dc5-f40adbd98fea\" (UID: \"75cb0552-9eb8-44a6-9dc5-f40adbd98fea\") " Dec 01 11:38:26 crc kubenswrapper[4958]: I1201 11:38:26.740963 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75cb0552-9eb8-44a6-9dc5-f40adbd98fea-utilities" (OuterVolumeSpecName: "utilities") pod "75cb0552-9eb8-44a6-9dc5-f40adbd98fea" (UID: "75cb0552-9eb8-44a6-9dc5-f40adbd98fea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:38:26 crc kubenswrapper[4958]: I1201 11:38:26.749051 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75cb0552-9eb8-44a6-9dc5-f40adbd98fea-kube-api-access-rgr6w" (OuterVolumeSpecName: "kube-api-access-rgr6w") pod "75cb0552-9eb8-44a6-9dc5-f40adbd98fea" (UID: "75cb0552-9eb8-44a6-9dc5-f40adbd98fea"). InnerVolumeSpecName "kube-api-access-rgr6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:38:26 crc kubenswrapper[4958]: I1201 11:38:26.839678 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgr6w\" (UniqueName: \"kubernetes.io/projected/75cb0552-9eb8-44a6-9dc5-f40adbd98fea-kube-api-access-rgr6w\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:26 crc kubenswrapper[4958]: I1201 11:38:26.839711 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75cb0552-9eb8-44a6-9dc5-f40adbd98fea-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:26 crc kubenswrapper[4958]: I1201 11:38:26.909327 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75cb0552-9eb8-44a6-9dc5-f40adbd98fea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75cb0552-9eb8-44a6-9dc5-f40adbd98fea" (UID: "75cb0552-9eb8-44a6-9dc5-f40adbd98fea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:38:26 crc kubenswrapper[4958]: I1201 11:38:26.942127 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75cb0552-9eb8-44a6-9dc5-f40adbd98fea-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.158020 4958 generic.go:334] "Generic (PLEG): container finished" podID="75cb0552-9eb8-44a6-9dc5-f40adbd98fea" containerID="bdec96ba5c5927f6715fe28893980ac515ae52e4c7cad30883b3f27c74f6ebe3" exitCode=0 Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.158102 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnhv" event={"ID":"75cb0552-9eb8-44a6-9dc5-f40adbd98fea","Type":"ContainerDied","Data":"bdec96ba5c5927f6715fe28893980ac515ae52e4c7cad30883b3f27c74f6ebe3"} Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.158135 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nnnhv" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.158178 4958 scope.go:117] "RemoveContainer" containerID="bdec96ba5c5927f6715fe28893980ac515ae52e4c7cad30883b3f27c74f6ebe3" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.158155 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnnhv" event={"ID":"75cb0552-9eb8-44a6-9dc5-f40adbd98fea","Type":"ContainerDied","Data":"e68300ff71fb4574f79db1bf21252d13ab224ed7802cd459a0d315aaefc79157"} Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.187989 4958 scope.go:117] "RemoveContainer" containerID="60161fcad07e187aa82a81de05feef6828c44b8fd9103ce3af142a6cb0917962" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.209875 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nnnhv"] Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.219042 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nnnhv"] Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.229579 4958 scope.go:117] "RemoveContainer" containerID="a1f125c4483c083dab1995b742d8a68fda233f583a62155b34c8d85b6dad6df9" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.271663 4958 scope.go:117] "RemoveContainer" containerID="bdec96ba5c5927f6715fe28893980ac515ae52e4c7cad30883b3f27c74f6ebe3" Dec 01 11:38:27 crc kubenswrapper[4958]: E1201 11:38:27.272381 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdec96ba5c5927f6715fe28893980ac515ae52e4c7cad30883b3f27c74f6ebe3\": container with ID starting with bdec96ba5c5927f6715fe28893980ac515ae52e4c7cad30883b3f27c74f6ebe3 not found: ID does not exist" containerID="bdec96ba5c5927f6715fe28893980ac515ae52e4c7cad30883b3f27c74f6ebe3" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.272448 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdec96ba5c5927f6715fe28893980ac515ae52e4c7cad30883b3f27c74f6ebe3"} err="failed to get container status \"bdec96ba5c5927f6715fe28893980ac515ae52e4c7cad30883b3f27c74f6ebe3\": rpc error: code = NotFound desc = could not find container \"bdec96ba5c5927f6715fe28893980ac515ae52e4c7cad30883b3f27c74f6ebe3\": container with ID starting with bdec96ba5c5927f6715fe28893980ac515ae52e4c7cad30883b3f27c74f6ebe3 not found: ID does not exist" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.272494 4958 scope.go:117] "RemoveContainer" containerID="60161fcad07e187aa82a81de05feef6828c44b8fd9103ce3af142a6cb0917962" Dec 01 11:38:27 crc kubenswrapper[4958]: E1201 11:38:27.273136 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60161fcad07e187aa82a81de05feef6828c44b8fd9103ce3af142a6cb0917962\": container with ID starting with 60161fcad07e187aa82a81de05feef6828c44b8fd9103ce3af142a6cb0917962 not found: ID does not exist" containerID="60161fcad07e187aa82a81de05feef6828c44b8fd9103ce3af142a6cb0917962" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.273209 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60161fcad07e187aa82a81de05feef6828c44b8fd9103ce3af142a6cb0917962"} err="failed to get container status \"60161fcad07e187aa82a81de05feef6828c44b8fd9103ce3af142a6cb0917962\": rpc error: code = NotFound desc = could not find container 
\"60161fcad07e187aa82a81de05feef6828c44b8fd9103ce3af142a6cb0917962\": container with ID starting with 60161fcad07e187aa82a81de05feef6828c44b8fd9103ce3af142a6cb0917962 not found: ID does not exist" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.273236 4958 scope.go:117] "RemoveContainer" containerID="a1f125c4483c083dab1995b742d8a68fda233f583a62155b34c8d85b6dad6df9" Dec 01 11:38:27 crc kubenswrapper[4958]: E1201 11:38:27.273546 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1f125c4483c083dab1995b742d8a68fda233f583a62155b34c8d85b6dad6df9\": container with ID starting with a1f125c4483c083dab1995b742d8a68fda233f583a62155b34c8d85b6dad6df9 not found: ID does not exist" containerID="a1f125c4483c083dab1995b742d8a68fda233f583a62155b34c8d85b6dad6df9" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.273595 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f125c4483c083dab1995b742d8a68fda233f583a62155b34c8d85b6dad6df9"} err="failed to get container status \"a1f125c4483c083dab1995b742d8a68fda233f583a62155b34c8d85b6dad6df9\": rpc error: code = NotFound desc = could not find container \"a1f125c4483c083dab1995b742d8a68fda233f583a62155b34c8d85b6dad6df9\": container with ID starting with a1f125c4483c083dab1995b742d8a68fda233f583a62155b34c8d85b6dad6df9 not found: ID does not exist" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.425659 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.426103 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.439911 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.440239 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.477171 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.490974 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.504114 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.504700 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 11:38:27 crc kubenswrapper[4958]: I1201 11:38:27.827253 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75cb0552-9eb8-44a6-9dc5-f40adbd98fea" path="/var/lib/kubelet/pods/75cb0552-9eb8-44a6-9dc5-f40adbd98fea/volumes" Dec 01 11:38:28 crc kubenswrapper[4958]: I1201 11:38:28.169641 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 11:38:28 crc kubenswrapper[4958]: I1201 11:38:28.169677 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 11:38:28 crc kubenswrapper[4958]: 
I1201 11:38:28.169687 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 11:38:28 crc kubenswrapper[4958]: I1201 11:38:28.169698 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 11:38:30 crc kubenswrapper[4958]: I1201 11:38:30.090279 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 11:38:30 crc kubenswrapper[4958]: I1201 11:38:30.104943 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 11:38:30 crc kubenswrapper[4958]: I1201 11:38:30.109095 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 11:38:30 crc kubenswrapper[4958]: I1201 11:38:30.122296 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 11:38:38 crc kubenswrapper[4958]: I1201 11:38:38.645094 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-ssvm7"] Dec 01 11:38:38 crc kubenswrapper[4958]: E1201 11:38:38.646096 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75cb0552-9eb8-44a6-9dc5-f40adbd98fea" containerName="registry-server" Dec 01 11:38:38 crc kubenswrapper[4958]: I1201 11:38:38.646111 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="75cb0552-9eb8-44a6-9dc5-f40adbd98fea" containerName="registry-server" Dec 01 11:38:38 crc kubenswrapper[4958]: E1201 11:38:38.646126 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75cb0552-9eb8-44a6-9dc5-f40adbd98fea" containerName="extract-content" Dec 01 11:38:38 crc kubenswrapper[4958]: I1201 11:38:38.646131 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="75cb0552-9eb8-44a6-9dc5-f40adbd98fea" containerName="extract-content" Dec 01 11:38:38 crc kubenswrapper[4958]: E1201 11:38:38.646142 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="468cc986-0b2c-4965-9e74-6d0b828a5001" containerName="init" Dec 01 11:38:38 crc kubenswrapper[4958]: I1201 11:38:38.646150 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="468cc986-0b2c-4965-9e74-6d0b828a5001" containerName="init" Dec 01 11:38:38 crc kubenswrapper[4958]: E1201 11:38:38.646162 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="468cc986-0b2c-4965-9e74-6d0b828a5001" containerName="dnsmasq-dns" Dec 01 11:38:38 crc kubenswrapper[4958]: I1201 11:38:38.646168 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="468cc986-0b2c-4965-9e74-6d0b828a5001" containerName="dnsmasq-dns" Dec 01 11:38:38 crc kubenswrapper[4958]: E1201 11:38:38.646184 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75cb0552-9eb8-44a6-9dc5-f40adbd98fea" containerName="extract-utilities" Dec 01 11:38:38 crc kubenswrapper[4958]: I1201 11:38:38.646190 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="75cb0552-9eb8-44a6-9dc5-f40adbd98fea" containerName="extract-utilities" Dec 01 11:38:38 crc kubenswrapper[4958]: I1201 11:38:38.646366 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="468cc986-0b2c-4965-9e74-6d0b828a5001" containerName="dnsmasq-dns" Dec 01 11:38:38 crc kubenswrapper[4958]: I1201 11:38:38.646387 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="75cb0552-9eb8-44a6-9dc5-f40adbd98fea" 
containerName="registry-server" Dec 01 11:38:38 crc kubenswrapper[4958]: I1201 11:38:38.647094 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ssvm7" Dec 01 11:38:38 crc kubenswrapper[4958]: I1201 11:38:38.654998 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ssvm7"] Dec 01 11:38:38 crc kubenswrapper[4958]: I1201 11:38:38.719459 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldxb2\" (UniqueName: \"kubernetes.io/projected/1bed473b-45f4-4fbf-b7ea-c23b554f578d-kube-api-access-ldxb2\") pod \"placement-db-create-ssvm7\" (UID: \"1bed473b-45f4-4fbf-b7ea-c23b554f578d\") " pod="openstack/placement-db-create-ssvm7" Dec 01 11:38:38 crc kubenswrapper[4958]: I1201 11:38:38.798331 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:38:38 crc kubenswrapper[4958]: E1201 11:38:38.798657 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:38:38 crc kubenswrapper[4958]: I1201 11:38:38.821426 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldxb2\" (UniqueName: \"kubernetes.io/projected/1bed473b-45f4-4fbf-b7ea-c23b554f578d-kube-api-access-ldxb2\") pod \"placement-db-create-ssvm7\" (UID: \"1bed473b-45f4-4fbf-b7ea-c23b554f578d\") " pod="openstack/placement-db-create-ssvm7" Dec 01 11:38:38 crc kubenswrapper[4958]: I1201 11:38:38.845824 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldxb2\" (UniqueName: \"kubernetes.io/projected/1bed473b-45f4-4fbf-b7ea-c23b554f578d-kube-api-access-ldxb2\") pod \"placement-db-create-ssvm7\" (UID: \"1bed473b-45f4-4fbf-b7ea-c23b554f578d\") " pod="openstack/placement-db-create-ssvm7" Dec 01 11:38:38 crc kubenswrapper[4958]: I1201 11:38:38.974705 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ssvm7" Dec 01 11:38:39 crc kubenswrapper[4958]: I1201 11:38:39.429487 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ssvm7"] Dec 01 11:38:39 crc kubenswrapper[4958]: W1201 11:38:39.434416 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bed473b_45f4_4fbf_b7ea_c23b554f578d.slice/crio-a6528723d5798f8f49efbfb590bbcc481a552e012436245785d3f672b0cf58a4 WatchSource:0}: Error finding container a6528723d5798f8f49efbfb590bbcc481a552e012436245785d3f672b0cf58a4: Status 404 returned error can't find the container with id a6528723d5798f8f49efbfb590bbcc481a552e012436245785d3f672b0cf58a4 Dec 01 11:38:40 crc kubenswrapper[4958]: I1201 11:38:40.299604 4958 generic.go:334] "Generic (PLEG): container finished" podID="1bed473b-45f4-4fbf-b7ea-c23b554f578d" containerID="a423a22c79025a9e0f919f5d574502a05822e350af70fbc09516ebd09cd7ea35" exitCode=0 Dec 01 11:38:40 crc kubenswrapper[4958]: I1201 11:38:40.299696 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ssvm7" event={"ID":"1bed473b-45f4-4fbf-b7ea-c23b554f578d","Type":"ContainerDied","Data":"a423a22c79025a9e0f919f5d574502a05822e350af70fbc09516ebd09cd7ea35"} Dec 01 11:38:40 crc kubenswrapper[4958]: I1201 11:38:40.300178 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ssvm7" event={"ID":"1bed473b-45f4-4fbf-b7ea-c23b554f578d","Type":"ContainerStarted","Data":"a6528723d5798f8f49efbfb590bbcc481a552e012436245785d3f672b0cf58a4"} Dec 01 11:38:41 crc kubenswrapper[4958]: I1201 11:38:41.764761 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ssvm7" Dec 01 11:38:41 crc kubenswrapper[4958]: I1201 11:38:41.879014 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldxb2\" (UniqueName: \"kubernetes.io/projected/1bed473b-45f4-4fbf-b7ea-c23b554f578d-kube-api-access-ldxb2\") pod \"1bed473b-45f4-4fbf-b7ea-c23b554f578d\" (UID: \"1bed473b-45f4-4fbf-b7ea-c23b554f578d\") " Dec 01 11:38:41 crc kubenswrapper[4958]: I1201 11:38:41.886740 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bed473b-45f4-4fbf-b7ea-c23b554f578d-kube-api-access-ldxb2" (OuterVolumeSpecName: "kube-api-access-ldxb2") pod "1bed473b-45f4-4fbf-b7ea-c23b554f578d" (UID: "1bed473b-45f4-4fbf-b7ea-c23b554f578d"). InnerVolumeSpecName "kube-api-access-ldxb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:38:41 crc kubenswrapper[4958]: I1201 11:38:41.981909 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldxb2\" (UniqueName: \"kubernetes.io/projected/1bed473b-45f4-4fbf-b7ea-c23b554f578d-kube-api-access-ldxb2\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:42 crc kubenswrapper[4958]: I1201 11:38:42.331589 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ssvm7" event={"ID":"1bed473b-45f4-4fbf-b7ea-c23b554f578d","Type":"ContainerDied","Data":"a6528723d5798f8f49efbfb590bbcc481a552e012436245785d3f672b0cf58a4"} Dec 01 11:38:42 crc kubenswrapper[4958]: I1201 11:38:42.331641 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6528723d5798f8f49efbfb590bbcc481a552e012436245785d3f672b0cf58a4" Dec 01 11:38:42 crc kubenswrapper[4958]: I1201 11:38:42.331683 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ssvm7" Dec 01 11:38:48 crc kubenswrapper[4958]: I1201 11:38:48.741334 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-49a4-account-create-dh9qp"] Dec 01 11:38:48 crc kubenswrapper[4958]: E1201 11:38:48.742433 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bed473b-45f4-4fbf-b7ea-c23b554f578d" containerName="mariadb-database-create" Dec 01 11:38:48 crc kubenswrapper[4958]: I1201 11:38:48.742452 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bed473b-45f4-4fbf-b7ea-c23b554f578d" containerName="mariadb-database-create" Dec 01 11:38:48 crc kubenswrapper[4958]: I1201 11:38:48.742672 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bed473b-45f4-4fbf-b7ea-c23b554f578d" containerName="mariadb-database-create" Dec 01 11:38:48 crc kubenswrapper[4958]: I1201 11:38:48.743404 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-49a4-account-create-dh9qp" Dec 01 11:38:48 crc kubenswrapper[4958]: I1201 11:38:48.752578 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 01 11:38:48 crc kubenswrapper[4958]: I1201 11:38:48.754999 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-49a4-account-create-dh9qp"] Dec 01 11:38:48 crc kubenswrapper[4958]: I1201 11:38:48.848303 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dpdq\" (UniqueName: \"kubernetes.io/projected/68bf25e4-bda1-497a-9510-4a06a63e5138-kube-api-access-4dpdq\") pod \"placement-49a4-account-create-dh9qp\" (UID: \"68bf25e4-bda1-497a-9510-4a06a63e5138\") " pod="openstack/placement-49a4-account-create-dh9qp" Dec 01 11:38:48 crc kubenswrapper[4958]: I1201 11:38:48.949923 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dpdq\" (UniqueName: \"kubernetes.io/projected/68bf25e4-bda1-497a-9510-4a06a63e5138-kube-api-access-4dpdq\") pod \"placement-49a4-account-create-dh9qp\" (UID: \"68bf25e4-bda1-497a-9510-4a06a63e5138\") " pod="openstack/placement-49a4-account-create-dh9qp" Dec 01 11:38:48 crc kubenswrapper[4958]: I1201 11:38:48.971969 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dpdq\" (UniqueName: \"kubernetes.io/projected/68bf25e4-bda1-497a-9510-4a06a63e5138-kube-api-access-4dpdq\") pod \"placement-49a4-account-create-dh9qp\" (UID: \"68bf25e4-bda1-497a-9510-4a06a63e5138\") " pod="openstack/placement-49a4-account-create-dh9qp" Dec 01 11:38:49 crc kubenswrapper[4958]: I1201 11:38:49.112253 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-49a4-account-create-dh9qp" Dec 01 11:38:49 crc kubenswrapper[4958]: I1201 11:38:49.582959 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-49a4-account-create-dh9qp"] Dec 01 11:38:50 crc kubenswrapper[4958]: I1201 11:38:50.430234 4958 generic.go:334] "Generic (PLEG): container finished" podID="68bf25e4-bda1-497a-9510-4a06a63e5138" containerID="31696ea308d2423cc2605ca9821bc97f8e226233d3cc9b0bf9f306681d3d08b8" exitCode=0 Dec 01 11:38:50 crc kubenswrapper[4958]: I1201 11:38:50.430327 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-49a4-account-create-dh9qp" event={"ID":"68bf25e4-bda1-497a-9510-4a06a63e5138","Type":"ContainerDied","Data":"31696ea308d2423cc2605ca9821bc97f8e226233d3cc9b0bf9f306681d3d08b8"} Dec 01 11:38:50 crc kubenswrapper[4958]: I1201 11:38:50.430416 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-49a4-account-create-dh9qp" event={"ID":"68bf25e4-bda1-497a-9510-4a06a63e5138","Type":"ContainerStarted","Data":"c6329d1278856b79ab4e43bc8df17053517b8660bf0a2aa8bd02b82d6e4f4756"} Dec 01 11:38:50 crc kubenswrapper[4958]: I1201 11:38:50.797580 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:38:50 crc kubenswrapper[4958]: E1201 11:38:50.797930 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:38:51 crc kubenswrapper[4958]: I1201 11:38:51.788608 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-49a4-account-create-dh9qp" Dec 01 11:38:51 crc kubenswrapper[4958]: I1201 11:38:51.915300 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dpdq\" (UniqueName: \"kubernetes.io/projected/68bf25e4-bda1-497a-9510-4a06a63e5138-kube-api-access-4dpdq\") pod \"68bf25e4-bda1-497a-9510-4a06a63e5138\" (UID: \"68bf25e4-bda1-497a-9510-4a06a63e5138\") " Dec 01 11:38:51 crc kubenswrapper[4958]: I1201 11:38:51.927317 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68bf25e4-bda1-497a-9510-4a06a63e5138-kube-api-access-4dpdq" (OuterVolumeSpecName: "kube-api-access-4dpdq") pod "68bf25e4-bda1-497a-9510-4a06a63e5138" (UID: "68bf25e4-bda1-497a-9510-4a06a63e5138"). InnerVolumeSpecName "kube-api-access-4dpdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:38:52 crc kubenswrapper[4958]: I1201 11:38:52.018411 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dpdq\" (UniqueName: \"kubernetes.io/projected/68bf25e4-bda1-497a-9510-4a06a63e5138-kube-api-access-4dpdq\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:52 crc kubenswrapper[4958]: I1201 11:38:52.453663 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-49a4-account-create-dh9qp" Dec 01 11:38:52 crc kubenswrapper[4958]: I1201 11:38:52.453640 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-49a4-account-create-dh9qp" event={"ID":"68bf25e4-bda1-497a-9510-4a06a63e5138","Type":"ContainerDied","Data":"c6329d1278856b79ab4e43bc8df17053517b8660bf0a2aa8bd02b82d6e4f4756"} Dec 01 11:38:52 crc kubenswrapper[4958]: I1201 11:38:52.453913 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6329d1278856b79ab4e43bc8df17053517b8660bf0a2aa8bd02b82d6e4f4756" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.090474 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f96cb89d9-pz99v"] Dec 01 11:38:54 crc kubenswrapper[4958]: E1201 11:38:54.091184 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68bf25e4-bda1-497a-9510-4a06a63e5138" containerName="mariadb-account-create" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.091199 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="68bf25e4-bda1-497a-9510-4a06a63e5138" containerName="mariadb-account-create" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.091383 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="68bf25e4-bda1-497a-9510-4a06a63e5138" containerName="mariadb-account-create" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.092510 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.101082 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f96cb89d9-pz99v"] Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.141869 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-kfczd"] Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.143924 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.146573 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.149131 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.152331 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9q4w8" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.154073 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-kfczd"] Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.164545 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pz9s\" (UniqueName: \"kubernetes.io/projected/5ce26e97-9a7c-43b5-8355-6787ece6d948-kube-api-access-8pz9s\") pod \"dnsmasq-dns-5f96cb89d9-pz99v\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.164613 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-ovsdbserver-nb\") pod \"dnsmasq-dns-5f96cb89d9-pz99v\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.164690 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-ovsdbserver-sb\") pod \"dnsmasq-dns-5f96cb89d9-pz99v\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.164717 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-dns-svc\") pod \"dnsmasq-dns-5f96cb89d9-pz99v\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.164758 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-config\") pod \"dnsmasq-dns-5f96cb89d9-pz99v\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.266715 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71967bd5-6523-4a00-86c5-a43c5994f71a-logs\") pod \"placement-db-sync-kfczd\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.266778 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-ovsdbserver-sb\") pod \"dnsmasq-dns-5f96cb89d9-pz99v\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:38:54 crc 
kubenswrapper[4958]: I1201 11:38:54.266803 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-dns-svc\") pod \"dnsmasq-dns-5f96cb89d9-pz99v\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.266927 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-config\") pod \"dnsmasq-dns-5f96cb89d9-pz99v\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.266954 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71967bd5-6523-4a00-86c5-a43c5994f71a-combined-ca-bundle\") pod \"placement-db-sync-kfczd\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.266984 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71967bd5-6523-4a00-86c5-a43c5994f71a-scripts\") pod \"placement-db-sync-kfczd\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.267021 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pz9s\" (UniqueName: \"kubernetes.io/projected/5ce26e97-9a7c-43b5-8355-6787ece6d948-kube-api-access-8pz9s\") pod \"dnsmasq-dns-5f96cb89d9-pz99v\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.267038 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71967bd5-6523-4a00-86c5-a43c5994f71a-config-data\") pod \"placement-db-sync-kfczd\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.267062 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-ovsdbserver-nb\") pod \"dnsmasq-dns-5f96cb89d9-pz99v\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.267111 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm9x2\" (UniqueName: \"kubernetes.io/projected/71967bd5-6523-4a00-86c5-a43c5994f71a-kube-api-access-gm9x2\") pod \"placement-db-sync-kfczd\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.267785 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-ovsdbserver-sb\") pod \"dnsmasq-dns-5f96cb89d9-pz99v\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.267785 4958 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-dns-svc\") pod \"dnsmasq-dns-5f96cb89d9-pz99v\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.268228 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-ovsdbserver-nb\") pod \"dnsmasq-dns-5f96cb89d9-pz99v\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.268773 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-config\") pod \"dnsmasq-dns-5f96cb89d9-pz99v\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.286732 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pz9s\" (UniqueName: \"kubernetes.io/projected/5ce26e97-9a7c-43b5-8355-6787ece6d948-kube-api-access-8pz9s\") pod \"dnsmasq-dns-5f96cb89d9-pz99v\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.369677 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71967bd5-6523-4a00-86c5-a43c5994f71a-combined-ca-bundle\") pod \"placement-db-sync-kfczd\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.369929 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71967bd5-6523-4a00-86c5-a43c5994f71a-scripts\") pod \"placement-db-sync-kfczd\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.370074 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71967bd5-6523-4a00-86c5-a43c5994f71a-config-data\") pod \"placement-db-sync-kfczd\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.371961 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm9x2\" (UniqueName: \"kubernetes.io/projected/71967bd5-6523-4a00-86c5-a43c5994f71a-kube-api-access-gm9x2\") pod \"placement-db-sync-kfczd\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.372596 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71967bd5-6523-4a00-86c5-a43c5994f71a-logs\") pod \"placement-db-sync-kfczd\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.373282 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71967bd5-6523-4a00-86c5-a43c5994f71a-logs\") pod 
\"placement-db-sync-kfczd\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.377277 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71967bd5-6523-4a00-86c5-a43c5994f71a-scripts\") pod \"placement-db-sync-kfczd\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.377731 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71967bd5-6523-4a00-86c5-a43c5994f71a-combined-ca-bundle\") pod \"placement-db-sync-kfczd\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.377781 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71967bd5-6523-4a00-86c5-a43c5994f71a-config-data\") pod \"placement-db-sync-kfczd\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.390700 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm9x2\" (UniqueName: \"kubernetes.io/projected/71967bd5-6523-4a00-86c5-a43c5994f71a-kube-api-access-gm9x2\") pod \"placement-db-sync-kfczd\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.419398 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.461348 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:54 crc kubenswrapper[4958]: I1201 11:38:54.929462 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f96cb89d9-pz99v"] Dec 01 11:38:55 crc kubenswrapper[4958]: W1201 11:38:55.042303 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71967bd5_6523_4a00_86c5_a43c5994f71a.slice/crio-6471837ca6b4f2a06ba8b4dd6d2bf8b9f730cc653fccce3f321bcd5acc55f616 WatchSource:0}: Error finding container 6471837ca6b4f2a06ba8b4dd6d2bf8b9f730cc653fccce3f321bcd5acc55f616: Status 404 returned error can't find the container with id 6471837ca6b4f2a06ba8b4dd6d2bf8b9f730cc653fccce3f321bcd5acc55f616 Dec 01 11:38:55 crc kubenswrapper[4958]: I1201 11:38:55.050571 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-kfczd"] Dec 01 11:38:55 crc kubenswrapper[4958]: I1201 11:38:55.539203 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-kfczd" event={"ID":"71967bd5-6523-4a00-86c5-a43c5994f71a","Type":"ContainerStarted","Data":"83a43c55ef38b8c052a5ceebe966a0e40a437d035eea8358f995e418d88f7351"} Dec 01 11:38:55 crc kubenswrapper[4958]: I1201 11:38:55.539582 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-kfczd" event={"ID":"71967bd5-6523-4a00-86c5-a43c5994f71a","Type":"ContainerStarted","Data":"6471837ca6b4f2a06ba8b4dd6d2bf8b9f730cc653fccce3f321bcd5acc55f616"} Dec 01 11:38:55 crc kubenswrapper[4958]: I1201 11:38:55.555865 4958 generic.go:334] "Generic (PLEG): container finished" podID="5ce26e97-9a7c-43b5-8355-6787ece6d948" containerID="4660dcac7a423c49314b7edccb0bc5ff621cc8bcc1f6c5463f92fddb1ee83b57" exitCode=0 Dec 01 11:38:55 crc kubenswrapper[4958]: I1201 11:38:55.555914 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" event={"ID":"5ce26e97-9a7c-43b5-8355-6787ece6d948","Type":"ContainerDied","Data":"4660dcac7a423c49314b7edccb0bc5ff621cc8bcc1f6c5463f92fddb1ee83b57"} Dec 01 11:38:55 crc kubenswrapper[4958]: I1201 11:38:55.555940 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" event={"ID":"5ce26e97-9a7c-43b5-8355-6787ece6d948","Type":"ContainerStarted","Data":"78390a0e7b021f1def7af9410689cff8b01866a8ffda0f8be532d3751b178689"} Dec 01 11:38:55 crc kubenswrapper[4958]: I1201 11:38:55.594022 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-kfczd" podStartSLOduration=1.5939978529999999 podStartE2EDuration="1.593997853s" podCreationTimestamp="2025-12-01 11:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:38:55.568406028 +0000 UTC m=+5983.077195075" watchObservedRunningTime="2025-12-01 11:38:55.593997853 +0000 UTC m=+5983.102786890" Dec 01 11:38:56 crc kubenswrapper[4958]: I1201 11:38:56.572106 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" event={"ID":"5ce26e97-9a7c-43b5-8355-6787ece6d948","Type":"ContainerStarted","Data":"b5dbc0379892a37fd9c04b78af98b7de12b1aa42e5bfcd7452e07c49b66b0033"} Dec 01 11:38:56 crc kubenswrapper[4958]: I1201 11:38:56.572517 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:38:56 crc kubenswrapper[4958]: I1201 
11:38:56.614424 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" podStartSLOduration=2.614400861 podStartE2EDuration="2.614400861s" podCreationTimestamp="2025-12-01 11:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:38:56.598451439 +0000 UTC m=+5984.107240476" watchObservedRunningTime="2025-12-01 11:38:56.614400861 +0000 UTC m=+5984.123189898" Dec 01 11:38:57 crc kubenswrapper[4958]: I1201 11:38:57.580620 4958 generic.go:334] "Generic (PLEG): container finished" podID="71967bd5-6523-4a00-86c5-a43c5994f71a" containerID="83a43c55ef38b8c052a5ceebe966a0e40a437d035eea8358f995e418d88f7351" exitCode=0 Dec 01 11:38:57 crc kubenswrapper[4958]: I1201 11:38:57.581380 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-kfczd" event={"ID":"71967bd5-6523-4a00-86c5-a43c5994f71a","Type":"ContainerDied","Data":"83a43c55ef38b8c052a5ceebe966a0e40a437d035eea8358f995e418d88f7351"} Dec 01 11:38:58 crc kubenswrapper[4958]: I1201 11:38:58.954610 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.077149 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71967bd5-6523-4a00-86c5-a43c5994f71a-combined-ca-bundle\") pod \"71967bd5-6523-4a00-86c5-a43c5994f71a\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.077278 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71967bd5-6523-4a00-86c5-a43c5994f71a-logs\") pod \"71967bd5-6523-4a00-86c5-a43c5994f71a\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.077323 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71967bd5-6523-4a00-86c5-a43c5994f71a-scripts\") pod \"71967bd5-6523-4a00-86c5-a43c5994f71a\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.077368 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71967bd5-6523-4a00-86c5-a43c5994f71a-config-data\") pod \"71967bd5-6523-4a00-86c5-a43c5994f71a\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.077606 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm9x2\" (UniqueName: \"kubernetes.io/projected/71967bd5-6523-4a00-86c5-a43c5994f71a-kube-api-access-gm9x2\") pod \"71967bd5-6523-4a00-86c5-a43c5994f71a\" (UID: \"71967bd5-6523-4a00-86c5-a43c5994f71a\") " Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.079018 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71967bd5-6523-4a00-86c5-a43c5994f71a-logs" (OuterVolumeSpecName: "logs") pod "71967bd5-6523-4a00-86c5-a43c5994f71a" (UID: "71967bd5-6523-4a00-86c5-a43c5994f71a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.085510 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71967bd5-6523-4a00-86c5-a43c5994f71a-scripts" (OuterVolumeSpecName: "scripts") pod "71967bd5-6523-4a00-86c5-a43c5994f71a" (UID: "71967bd5-6523-4a00-86c5-a43c5994f71a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.087275 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71967bd5-6523-4a00-86c5-a43c5994f71a-kube-api-access-gm9x2" (OuterVolumeSpecName: "kube-api-access-gm9x2") pod "71967bd5-6523-4a00-86c5-a43c5994f71a" (UID: "71967bd5-6523-4a00-86c5-a43c5994f71a"). InnerVolumeSpecName "kube-api-access-gm9x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.114768 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71967bd5-6523-4a00-86c5-a43c5994f71a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71967bd5-6523-4a00-86c5-a43c5994f71a" (UID: "71967bd5-6523-4a00-86c5-a43c5994f71a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.117625 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71967bd5-6523-4a00-86c5-a43c5994f71a-config-data" (OuterVolumeSpecName: "config-data") pod "71967bd5-6523-4a00-86c5-a43c5994f71a" (UID: "71967bd5-6523-4a00-86c5-a43c5994f71a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.180144 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71967bd5-6523-4a00-86c5-a43c5994f71a-logs\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.180179 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71967bd5-6523-4a00-86c5-a43c5994f71a-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.180189 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71967bd5-6523-4a00-86c5-a43c5994f71a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.180202 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm9x2\" (UniqueName: \"kubernetes.io/projected/71967bd5-6523-4a00-86c5-a43c5994f71a-kube-api-access-gm9x2\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.180214 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71967bd5-6523-4a00-86c5-a43c5994f71a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.626360 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-kfczd" event={"ID":"71967bd5-6523-4a00-86c5-a43c5994f71a","Type":"ContainerDied","Data":"6471837ca6b4f2a06ba8b4dd6d2bf8b9f730cc653fccce3f321bcd5acc55f616"} Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.626410 4958 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="6471837ca6b4f2a06ba8b4dd6d2bf8b9f730cc653fccce3f321bcd5acc55f616" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.626497 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-kfczd" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.710745 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-599f975964-fzz2m"] Dec 01 11:38:59 crc kubenswrapper[4958]: E1201 11:38:59.711159 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71967bd5-6523-4a00-86c5-a43c5994f71a" containerName="placement-db-sync" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.711177 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="71967bd5-6523-4a00-86c5-a43c5994f71a" containerName="placement-db-sync" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.711447 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="71967bd5-6523-4a00-86c5-a43c5994f71a" containerName="placement-db-sync" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.712421 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-599f975964-fzz2m" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.714246 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.714681 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9q4w8" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.715651 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.719762 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-599f975964-fzz2m"] Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.792485 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077b5544-ef34-459b-9323-85996b4dbb12-combined-ca-bundle\") pod \"placement-599f975964-fzz2m\" (UID: \"077b5544-ef34-459b-9323-85996b4dbb12\") " pod="openstack/placement-599f975964-fzz2m" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.792539 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9h2z\" (UniqueName: \"kubernetes.io/projected/077b5544-ef34-459b-9323-85996b4dbb12-kube-api-access-j9h2z\") pod \"placement-599f975964-fzz2m\" (UID: \"077b5544-ef34-459b-9323-85996b4dbb12\") " pod="openstack/placement-599f975964-fzz2m" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.792578 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077b5544-ef34-459b-9323-85996b4dbb12-scripts\") pod \"placement-599f975964-fzz2m\" (UID: \"077b5544-ef34-459b-9323-85996b4dbb12\") " pod="openstack/placement-599f975964-fzz2m" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.792606 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077b5544-ef34-459b-9323-85996b4dbb12-config-data\") pod \"placement-599f975964-fzz2m\" (UID: \"077b5544-ef34-459b-9323-85996b4dbb12\") " pod="openstack/placement-599f975964-fzz2m" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 
11:38:59.792668 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/077b5544-ef34-459b-9323-85996b4dbb12-logs\") pod \"placement-599f975964-fzz2m\" (UID: \"077b5544-ef34-459b-9323-85996b4dbb12\") " pod="openstack/placement-599f975964-fzz2m" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.939750 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077b5544-ef34-459b-9323-85996b4dbb12-combined-ca-bundle\") pod \"placement-599f975964-fzz2m\" (UID: \"077b5544-ef34-459b-9323-85996b4dbb12\") " pod="openstack/placement-599f975964-fzz2m" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.939809 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9h2z\" (UniqueName: \"kubernetes.io/projected/077b5544-ef34-459b-9323-85996b4dbb12-kube-api-access-j9h2z\") pod \"placement-599f975964-fzz2m\" (UID: \"077b5544-ef34-459b-9323-85996b4dbb12\") " pod="openstack/placement-599f975964-fzz2m" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.939877 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077b5544-ef34-459b-9323-85996b4dbb12-scripts\") pod \"placement-599f975964-fzz2m\" (UID: \"077b5544-ef34-459b-9323-85996b4dbb12\") " pod="openstack/placement-599f975964-fzz2m" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.939904 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077b5544-ef34-459b-9323-85996b4dbb12-config-data\") pod \"placement-599f975964-fzz2m\" (UID: \"077b5544-ef34-459b-9323-85996b4dbb12\") " pod="openstack/placement-599f975964-fzz2m" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.939941 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/077b5544-ef34-459b-9323-85996b4dbb12-logs\") pod \"placement-599f975964-fzz2m\" (UID: \"077b5544-ef34-459b-9323-85996b4dbb12\") " pod="openstack/placement-599f975964-fzz2m" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.940379 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/077b5544-ef34-459b-9323-85996b4dbb12-logs\") pod \"placement-599f975964-fzz2m\" (UID: \"077b5544-ef34-459b-9323-85996b4dbb12\") " pod="openstack/placement-599f975964-fzz2m" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.946456 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077b5544-ef34-459b-9323-85996b4dbb12-scripts\") pod \"placement-599f975964-fzz2m\" (UID: \"077b5544-ef34-459b-9323-85996b4dbb12\") " pod="openstack/placement-599f975964-fzz2m" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.948800 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077b5544-ef34-459b-9323-85996b4dbb12-combined-ca-bundle\") pod \"placement-599f975964-fzz2m\" (UID: \"077b5544-ef34-459b-9323-85996b4dbb12\") " pod="openstack/placement-599f975964-fzz2m" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.957890 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/077b5544-ef34-459b-9323-85996b4dbb12-config-data\") pod \"placement-599f975964-fzz2m\" (UID: \"077b5544-ef34-459b-9323-85996b4dbb12\") " pod="openstack/placement-599f975964-fzz2m" Dec 01 11:38:59 crc kubenswrapper[4958]: I1201 11:38:59.960861 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9h2z\" (UniqueName: \"kubernetes.io/projected/077b5544-ef34-459b-9323-85996b4dbb12-kube-api-access-j9h2z\") pod \"placement-599f975964-fzz2m\" (UID: \"077b5544-ef34-459b-9323-85996b4dbb12\") " pod="openstack/placement-599f975964-fzz2m" Dec 01 11:39:00 crc kubenswrapper[4958]: I1201 11:39:00.088689 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-599f975964-fzz2m" Dec 01 11:39:00 crc kubenswrapper[4958]: I1201 11:39:00.580104 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-599f975964-fzz2m"] Dec 01 11:39:00 crc kubenswrapper[4958]: W1201 11:39:00.581824 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod077b5544_ef34_459b_9323_85996b4dbb12.slice/crio-594c8b926612bed3e8f062220aa40e33a696be735eeee9e0150afe197a932181 WatchSource:0}: Error finding container 594c8b926612bed3e8f062220aa40e33a696be735eeee9e0150afe197a932181: Status 404 returned error can't find the container with id 594c8b926612bed3e8f062220aa40e33a696be735eeee9e0150afe197a932181 Dec 01 11:39:00 crc kubenswrapper[4958]: I1201 11:39:00.636297 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-599f975964-fzz2m" event={"ID":"077b5544-ef34-459b-9323-85996b4dbb12","Type":"ContainerStarted","Data":"594c8b926612bed3e8f062220aa40e33a696be735eeee9e0150afe197a932181"} Dec 01 11:39:01 crc kubenswrapper[4958]: I1201 11:39:01.652051 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-599f975964-fzz2m" event={"ID":"077b5544-ef34-459b-9323-85996b4dbb12","Type":"ContainerStarted","Data":"7efa4ce24f1dbc7335512ae1875f0d7834dc8d0d378b9e2306518726e88f5509"} Dec 01 11:39:01 crc kubenswrapper[4958]: I1201 11:39:01.653103 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-599f975964-fzz2m" Dec 01 11:39:01 crc kubenswrapper[4958]: I1201 11:39:01.653162 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-599f975964-fzz2m" event={"ID":"077b5544-ef34-459b-9323-85996b4dbb12","Type":"ContainerStarted","Data":"982635772a1d7506e5682f299c73ecbc532c4424e265a228626ea4361a929625"} Dec 01 11:39:01 crc kubenswrapper[4958]: I1201 11:39:01.680962 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-599f975964-fzz2m" podStartSLOduration=2.680940965 podStartE2EDuration="2.680940965s" podCreationTimestamp="2025-12-01 11:38:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:39:01.679214206 +0000 UTC m=+5989.188003283" watchObservedRunningTime="2025-12-01 11:39:01.680940965 +0000 UTC m=+5989.189730022" Dec 01 11:39:02 crc kubenswrapper[4958]: I1201 11:39:02.666134 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-599f975964-fzz2m" Dec 01 11:39:02 crc kubenswrapper[4958]: I1201 11:39:02.798542 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc" Dec 01 11:39:02 crc 
kubenswrapper[4958]: E1201 11:39:02.799129 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:39:04 crc kubenswrapper[4958]: I1201 11:39:04.422184 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:39:04 crc kubenswrapper[4958]: I1201 11:39:04.507983 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5798555d65-82ng2"] Dec 01 11:39:04 crc kubenswrapper[4958]: I1201 11:39:04.508335 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5798555d65-82ng2" podUID="083b940b-03ea-4fb2-a422-a417e6d0db13" containerName="dnsmasq-dns" containerID="cri-o://857bd95181ab17cba496f4289d4b22e44e9e1854b6307800b95828ced5eb4675" gracePeriod=10 Dec 01 11:39:04 crc kubenswrapper[4958]: I1201 11:39:04.689191 4958 generic.go:334] "Generic (PLEG): container finished" podID="083b940b-03ea-4fb2-a422-a417e6d0db13" containerID="857bd95181ab17cba496f4289d4b22e44e9e1854b6307800b95828ced5eb4675" exitCode=0 Dec 01 11:39:04 crc kubenswrapper[4958]: I1201 11:39:04.689243 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5798555d65-82ng2" event={"ID":"083b940b-03ea-4fb2-a422-a417e6d0db13","Type":"ContainerDied","Data":"857bd95181ab17cba496f4289d4b22e44e9e1854b6307800b95828ced5eb4675"} Dec 01 11:39:04 crc kubenswrapper[4958]: I1201 11:39:04.998707 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5798555d65-82ng2" Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.147698 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6dxt\" (UniqueName: \"kubernetes.io/projected/083b940b-03ea-4fb2-a422-a417e6d0db13-kube-api-access-v6dxt\") pod \"083b940b-03ea-4fb2-a422-a417e6d0db13\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.147834 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-dns-svc\") pod \"083b940b-03ea-4fb2-a422-a417e6d0db13\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.147911 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-ovsdbserver-sb\") pod \"083b940b-03ea-4fb2-a422-a417e6d0db13\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.148073 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-ovsdbserver-nb\") pod \"083b940b-03ea-4fb2-a422-a417e6d0db13\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.148109 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-config\") pod \"083b940b-03ea-4fb2-a422-a417e6d0db13\" (UID: \"083b940b-03ea-4fb2-a422-a417e6d0db13\") " Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.153869 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/083b940b-03ea-4fb2-a422-a417e6d0db13-kube-api-access-v6dxt" (OuterVolumeSpecName: "kube-api-access-v6dxt") pod "083b940b-03ea-4fb2-a422-a417e6d0db13" (UID: "083b940b-03ea-4fb2-a422-a417e6d0db13"). InnerVolumeSpecName "kube-api-access-v6dxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.196957 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "083b940b-03ea-4fb2-a422-a417e6d0db13" (UID: "083b940b-03ea-4fb2-a422-a417e6d0db13"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.203604 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "083b940b-03ea-4fb2-a422-a417e6d0db13" (UID: "083b940b-03ea-4fb2-a422-a417e6d0db13"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.203741 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "083b940b-03ea-4fb2-a422-a417e6d0db13" (UID: "083b940b-03ea-4fb2-a422-a417e6d0db13"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.207192 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-config" (OuterVolumeSpecName: "config") pod "083b940b-03ea-4fb2-a422-a417e6d0db13" (UID: "083b940b-03ea-4fb2-a422-a417e6d0db13"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.250556 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.250593 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.250605 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.250614 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/083b940b-03ea-4fb2-a422-a417e6d0db13-config\") on node \"crc\" DevicePath \"\"" Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.250623 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6dxt\" (UniqueName: \"kubernetes.io/projected/083b940b-03ea-4fb2-a422-a417e6d0db13-kube-api-access-v6dxt\") on node \"crc\" DevicePath \"\"" Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.704304 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5798555d65-82ng2" event={"ID":"083b940b-03ea-4fb2-a422-a417e6d0db13","Type":"ContainerDied","Data":"3753374131ba58ccdf54ca61afac9fd1cb09f8fa97af5768d3da1991f9ce67d2"} Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.704370 4958 scope.go:117] "RemoveContainer" containerID="857bd95181ab17cba496f4289d4b22e44e9e1854b6307800b95828ced5eb4675" Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.704537 4958 util.go:48] "No ready sandbox for pod can be found. 
Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.742146 4958 scope.go:117] "RemoveContainer" containerID="3c2984e8562709643a675bba9f44e52a97a7a83b4afc9e478950de706826d8a7"
Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.751011 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5798555d65-82ng2"]
Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.761409 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5798555d65-82ng2"]
Dec 01 11:39:05 crc kubenswrapper[4958]: I1201 11:39:05.813895 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="083b940b-03ea-4fb2-a422-a417e6d0db13" path="/var/lib/kubelet/pods/083b940b-03ea-4fb2-a422-a417e6d0db13/volumes"
Dec 01 11:39:17 crc kubenswrapper[4958]: I1201 11:39:17.798345 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc"
Dec 01 11:39:17 crc kubenswrapper[4958]: E1201 11:39:17.799706 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:39:30 crc kubenswrapper[4958]: I1201 11:39:30.797194 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc"
Dec 01 11:39:31 crc kubenswrapper[4958]: I1201 11:39:31.140884 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-599f975964-fzz2m"
Dec 01 11:39:31 crc kubenswrapper[4958]: I1201 11:39:31.275776 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-599f975964-fzz2m"
Dec 01 11:39:32 crc kubenswrapper[4958]: I1201 11:39:32.031464 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"13cb6cc4ee40481ad64b2a05cd4d173653f521017a8a82d0d6cdc46228d66885"}
Dec 01 11:39:37 crc kubenswrapper[4958]: I1201 11:39:37.956222 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j7p47"]
Dec 01 11:39:37 crc kubenswrapper[4958]: E1201 11:39:37.959871 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="083b940b-03ea-4fb2-a422-a417e6d0db13" containerName="dnsmasq-dns"
Dec 01 11:39:37 crc kubenswrapper[4958]: I1201 11:39:37.959911 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="083b940b-03ea-4fb2-a422-a417e6d0db13" containerName="dnsmasq-dns"
Dec 01 11:39:37 crc kubenswrapper[4958]: E1201 11:39:37.959955 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="083b940b-03ea-4fb2-a422-a417e6d0db13" containerName="init"
Dec 01 11:39:37 crc kubenswrapper[4958]: I1201 11:39:37.959967 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="083b940b-03ea-4fb2-a422-a417e6d0db13" containerName="init"
Dec 01 11:39:37 crc kubenswrapper[4958]: I1201 11:39:37.961196 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="083b940b-03ea-4fb2-a422-a417e6d0db13" containerName="dnsmasq-dns"
Dec 01 11:39:37 crc kubenswrapper[4958]: I1201 11:39:37.964977 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j7p47"
Dec 01 11:39:37 crc kubenswrapper[4958]: I1201 11:39:37.978047 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j7p47"]
Dec 01 11:39:38 crc kubenswrapper[4958]: I1201 11:39:38.027454 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d543b8a8-4974-4d2f-980f-c7cffd3fca61-utilities\") pod \"community-operators-j7p47\" (UID: \"d543b8a8-4974-4d2f-980f-c7cffd3fca61\") " pod="openshift-marketplace/community-operators-j7p47"
Dec 01 11:39:38 crc kubenswrapper[4958]: I1201 11:39:38.027615 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8d4x\" (UniqueName: \"kubernetes.io/projected/d543b8a8-4974-4d2f-980f-c7cffd3fca61-kube-api-access-h8d4x\") pod \"community-operators-j7p47\" (UID: \"d543b8a8-4974-4d2f-980f-c7cffd3fca61\") " pod="openshift-marketplace/community-operators-j7p47"
Dec 01 11:39:38 crc kubenswrapper[4958]: I1201 11:39:38.027659 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d543b8a8-4974-4d2f-980f-c7cffd3fca61-catalog-content\") pod \"community-operators-j7p47\" (UID: \"d543b8a8-4974-4d2f-980f-c7cffd3fca61\") " pod="openshift-marketplace/community-operators-j7p47"
Dec 01 11:39:38 crc kubenswrapper[4958]: I1201 11:39:38.129432 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8d4x\" (UniqueName: \"kubernetes.io/projected/d543b8a8-4974-4d2f-980f-c7cffd3fca61-kube-api-access-h8d4x\") pod \"community-operators-j7p47\" (UID: \"d543b8a8-4974-4d2f-980f-c7cffd3fca61\") " pod="openshift-marketplace/community-operators-j7p47"
Dec 01 11:39:38 crc kubenswrapper[4958]: I1201 11:39:38.129498 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d543b8a8-4974-4d2f-980f-c7cffd3fca61-catalog-content\") pod \"community-operators-j7p47\" (UID: \"d543b8a8-4974-4d2f-980f-c7cffd3fca61\") " pod="openshift-marketplace/community-operators-j7p47"
Dec 01 11:39:38 crc kubenswrapper[4958]: I1201 11:39:38.129609 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d543b8a8-4974-4d2f-980f-c7cffd3fca61-utilities\") pod \"community-operators-j7p47\" (UID: \"d543b8a8-4974-4d2f-980f-c7cffd3fca61\") " pod="openshift-marketplace/community-operators-j7p47"
Dec 01 11:39:38 crc kubenswrapper[4958]: I1201 11:39:38.130410 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d543b8a8-4974-4d2f-980f-c7cffd3fca61-utilities\") pod \"community-operators-j7p47\" (UID: \"d543b8a8-4974-4d2f-980f-c7cffd3fca61\") " pod="openshift-marketplace/community-operators-j7p47"
Dec 01 11:39:38 crc kubenswrapper[4958]: I1201 11:39:38.130418 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d543b8a8-4974-4d2f-980f-c7cffd3fca61-catalog-content\") pod \"community-operators-j7p47\" (UID: \"d543b8a8-4974-4d2f-980f-c7cffd3fca61\") " pod="openshift-marketplace/community-operators-j7p47"
Dec 01 11:39:38 crc kubenswrapper[4958]: I1201 11:39:38.152443 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8d4x\" (UniqueName: \"kubernetes.io/projected/d543b8a8-4974-4d2f-980f-c7cffd3fca61-kube-api-access-h8d4x\") pod \"community-operators-j7p47\" (UID: \"d543b8a8-4974-4d2f-980f-c7cffd3fca61\") " pod="openshift-marketplace/community-operators-j7p47"
Dec 01 11:39:38 crc kubenswrapper[4958]: I1201 11:39:38.310864 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j7p47"
Dec 01 11:39:39 crc kubenswrapper[4958]: W1201 11:39:39.438466 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd543b8a8_4974_4d2f_980f_c7cffd3fca61.slice/crio-204e1b3ea7eb5ee8b24e7132fb3663a11605a48f574d526518afcfb4b7d9af32 WatchSource:0}: Error finding container 204e1b3ea7eb5ee8b24e7132fb3663a11605a48f574d526518afcfb4b7d9af32: Status 404 returned error can't find the container with id 204e1b3ea7eb5ee8b24e7132fb3663a11605a48f574d526518afcfb4b7d9af32
Dec 01 11:39:39 crc kubenswrapper[4958]: I1201 11:39:39.439414 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j7p47"]
Dec 01 11:39:40 crc kubenswrapper[4958]: I1201 11:39:40.115080 4958 generic.go:334] "Generic (PLEG): container finished" podID="d543b8a8-4974-4d2f-980f-c7cffd3fca61" containerID="e7507cc7ac70613ec3b4941240fd4cbfd2dac545b3b1c88acf38b294732666ed" exitCode=0
Dec 01 11:39:40 crc kubenswrapper[4958]: I1201 11:39:40.115184 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7p47" event={"ID":"d543b8a8-4974-4d2f-980f-c7cffd3fca61","Type":"ContainerDied","Data":"e7507cc7ac70613ec3b4941240fd4cbfd2dac545b3b1c88acf38b294732666ed"}
Dec 01 11:39:40 crc kubenswrapper[4958]: I1201 11:39:40.115720 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7p47" event={"ID":"d543b8a8-4974-4d2f-980f-c7cffd3fca61","Type":"ContainerStarted","Data":"204e1b3ea7eb5ee8b24e7132fb3663a11605a48f574d526518afcfb4b7d9af32"}
Dec 01 11:39:41 crc kubenswrapper[4958]: I1201 11:39:41.131233 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7p47" event={"ID":"d543b8a8-4974-4d2f-980f-c7cffd3fca61","Type":"ContainerStarted","Data":"e80b6fde790906ce111804d4fba98f011876c0d06431e1cf290d6cbd43c3e248"}
Dec 01 11:39:42 crc kubenswrapper[4958]: I1201 11:39:42.145325 4958 generic.go:334] "Generic (PLEG): container finished" podID="d543b8a8-4974-4d2f-980f-c7cffd3fca61" containerID="e80b6fde790906ce111804d4fba98f011876c0d06431e1cf290d6cbd43c3e248" exitCode=0
Dec 01 11:39:42 crc kubenswrapper[4958]: I1201 11:39:42.145432 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7p47" event={"ID":"d543b8a8-4974-4d2f-980f-c7cffd3fca61","Type":"ContainerDied","Data":"e80b6fde790906ce111804d4fba98f011876c0d06431e1cf290d6cbd43c3e248"}
Dec 01 11:39:43 crc kubenswrapper[4958]: I1201 11:39:43.157264 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7p47" event={"ID":"d543b8a8-4974-4d2f-980f-c7cffd3fca61","Type":"ContainerStarted","Data":"e120ed22e8152e166ae2c2c84da9216ced3fab7951eb96c4b927b69d6ec6a48f"}
Dec 01 11:39:43 crc kubenswrapper[4958]: I1201 11:39:43.189480 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j7p47" podStartSLOduration=3.647416162 podStartE2EDuration="6.189460088s" podCreationTimestamp="2025-12-01 11:39:37 +0000 UTC" firstStartedPulling="2025-12-01 11:39:40.119141016 +0000 UTC m=+6027.627930083" lastFinishedPulling="2025-12-01 11:39:42.661184932 +0000 UTC m=+6030.169974009" observedRunningTime="2025-12-01 11:39:43.178535408 +0000 UTC m=+6030.687324445" watchObservedRunningTime="2025-12-01 11:39:43.189460088 +0000 UTC m=+6030.698249135"
Dec 01 11:39:48 crc kubenswrapper[4958]: I1201 11:39:48.312003 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j7p47"
Dec 01 11:39:48 crc kubenswrapper[4958]: I1201 11:39:48.312712 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j7p47"
Dec 01 11:39:48 crc kubenswrapper[4958]: I1201 11:39:48.374312 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j7p47"
Dec 01 11:39:49 crc kubenswrapper[4958]: I1201 11:39:49.321931 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j7p47"
Dec 01 11:39:49 crc kubenswrapper[4958]: I1201 11:39:49.542872 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j7p47"]
Dec 01 11:39:51 crc kubenswrapper[4958]: I1201 11:39:51.251032 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j7p47" podUID="d543b8a8-4974-4d2f-980f-c7cffd3fca61" containerName="registry-server" containerID="cri-o://e120ed22e8152e166ae2c2c84da9216ced3fab7951eb96c4b927b69d6ec6a48f" gracePeriod=2
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.261757 4958 generic.go:334] "Generic (PLEG): container finished" podID="d543b8a8-4974-4d2f-980f-c7cffd3fca61" containerID="e120ed22e8152e166ae2c2c84da9216ced3fab7951eb96c4b927b69d6ec6a48f" exitCode=0
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.261873 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7p47" event={"ID":"d543b8a8-4974-4d2f-980f-c7cffd3fca61","Type":"ContainerDied","Data":"e120ed22e8152e166ae2c2c84da9216ced3fab7951eb96c4b927b69d6ec6a48f"}
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.262128 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7p47" event={"ID":"d543b8a8-4974-4d2f-980f-c7cffd3fca61","Type":"ContainerDied","Data":"204e1b3ea7eb5ee8b24e7132fb3663a11605a48f574d526518afcfb4b7d9af32"}
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.262148 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="204e1b3ea7eb5ee8b24e7132fb3663a11605a48f574d526518afcfb4b7d9af32"
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.281119 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j7p47"
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.344120 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d543b8a8-4974-4d2f-980f-c7cffd3fca61-catalog-content\") pod \"d543b8a8-4974-4d2f-980f-c7cffd3fca61\" (UID: \"d543b8a8-4974-4d2f-980f-c7cffd3fca61\") "
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.344292 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8d4x\" (UniqueName: \"kubernetes.io/projected/d543b8a8-4974-4d2f-980f-c7cffd3fca61-kube-api-access-h8d4x\") pod \"d543b8a8-4974-4d2f-980f-c7cffd3fca61\" (UID: \"d543b8a8-4974-4d2f-980f-c7cffd3fca61\") "
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.344343 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d543b8a8-4974-4d2f-980f-c7cffd3fca61-utilities\") pod \"d543b8a8-4974-4d2f-980f-c7cffd3fca61\" (UID: \"d543b8a8-4974-4d2f-980f-c7cffd3fca61\") "
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.346044 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d543b8a8-4974-4d2f-980f-c7cffd3fca61-utilities" (OuterVolumeSpecName: "utilities") pod "d543b8a8-4974-4d2f-980f-c7cffd3fca61" (UID: "d543b8a8-4974-4d2f-980f-c7cffd3fca61"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.352780 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d543b8a8-4974-4d2f-980f-c7cffd3fca61-kube-api-access-h8d4x" (OuterVolumeSpecName: "kube-api-access-h8d4x") pod "d543b8a8-4974-4d2f-980f-c7cffd3fca61" (UID: "d543b8a8-4974-4d2f-980f-c7cffd3fca61"). InnerVolumeSpecName "kube-api-access-h8d4x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.405366 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d543b8a8-4974-4d2f-980f-c7cffd3fca61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d543b8a8-4974-4d2f-980f-c7cffd3fca61" (UID: "d543b8a8-4974-4d2f-980f-c7cffd3fca61"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.446444 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d543b8a8-4974-4d2f-980f-c7cffd3fca61-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.446483 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8d4x\" (UniqueName: \"kubernetes.io/projected/d543b8a8-4974-4d2f-980f-c7cffd3fca61-kube-api-access-h8d4x\") on node \"crc\" DevicePath \"\""
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.446492 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d543b8a8-4974-4d2f-980f-c7cffd3fca61-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.855487 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-n4kzb"]
Dec 01 11:39:52 crc kubenswrapper[4958]: E1201 11:39:52.855919 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d543b8a8-4974-4d2f-980f-c7cffd3fca61" containerName="registry-server"
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.855962 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d543b8a8-4974-4d2f-980f-c7cffd3fca61" containerName="registry-server"
Dec 01 11:39:52 crc kubenswrapper[4958]: E1201 11:39:52.856019 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d543b8a8-4974-4d2f-980f-c7cffd3fca61" containerName="extract-content"
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.856026 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d543b8a8-4974-4d2f-980f-c7cffd3fca61" containerName="extract-content"
Dec 01 11:39:52 crc kubenswrapper[4958]: E1201 11:39:52.856035 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d543b8a8-4974-4d2f-980f-c7cffd3fca61" containerName="extract-utilities"
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.856042 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d543b8a8-4974-4d2f-980f-c7cffd3fca61" containerName="extract-utilities"
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.856239 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d543b8a8-4974-4d2f-980f-c7cffd3fca61" containerName="registry-server"
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.857011 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-n4kzb"
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.879242 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-n4kzb"]
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.946942 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-44xr7"]
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.948004 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-44xr7"
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.956351 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-44xr7"]
Dec 01 11:39:52 crc kubenswrapper[4958]: I1201 11:39:52.960113 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h7pq\" (UniqueName: \"kubernetes.io/projected/0188611e-a44a-49cf-9814-008b7487bb17-kube-api-access-7h7pq\") pod \"nova-api-db-create-n4kzb\" (UID: \"0188611e-a44a-49cf-9814-008b7487bb17\") " pod="openstack/nova-api-db-create-n4kzb"
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.061392 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h7pq\" (UniqueName: \"kubernetes.io/projected/0188611e-a44a-49cf-9814-008b7487bb17-kube-api-access-7h7pq\") pod \"nova-api-db-create-n4kzb\" (UID: \"0188611e-a44a-49cf-9814-008b7487bb17\") " pod="openstack/nova-api-db-create-n4kzb"
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.061494 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b85wk\" (UniqueName: \"kubernetes.io/projected/cf85e447-bf63-4cd6-a30c-b70b7bc75868-kube-api-access-b85wk\") pod \"nova-cell0-db-create-44xr7\" (UID: \"cf85e447-bf63-4cd6-a30c-b70b7bc75868\") " pod="openstack/nova-cell0-db-create-44xr7"
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.079593 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h7pq\" (UniqueName: \"kubernetes.io/projected/0188611e-a44a-49cf-9814-008b7487bb17-kube-api-access-7h7pq\") pod \"nova-api-db-create-n4kzb\" (UID: \"0188611e-a44a-49cf-9814-008b7487bb17\") " pod="openstack/nova-api-db-create-n4kzb"
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.149785 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-xn4w4"]
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.151027 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xn4w4"
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.160787 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-xn4w4"]
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.163121 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b85wk\" (UniqueName: \"kubernetes.io/projected/cf85e447-bf63-4cd6-a30c-b70b7bc75868-kube-api-access-b85wk\") pod \"nova-cell0-db-create-44xr7\" (UID: \"cf85e447-bf63-4cd6-a30c-b70b7bc75868\") " pod="openstack/nova-cell0-db-create-44xr7"
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.182557 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-n4kzb"
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.183676 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b85wk\" (UniqueName: \"kubernetes.io/projected/cf85e447-bf63-4cd6-a30c-b70b7bc75868-kube-api-access-b85wk\") pod \"nova-cell0-db-create-44xr7\" (UID: \"cf85e447-bf63-4cd6-a30c-b70b7bc75868\") " pod="openstack/nova-cell0-db-create-44xr7"
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.266070 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9slk\" (UniqueName: \"kubernetes.io/projected/df494e10-0447-4b53-a84e-82156db28933-kube-api-access-l9slk\") pod \"nova-cell1-db-create-xn4w4\" (UID: \"df494e10-0447-4b53-a84e-82156db28933\") " pod="openstack/nova-cell1-db-create-xn4w4"
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.276907 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-44xr7"
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.278333 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j7p47"
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.367953 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9slk\" (UniqueName: \"kubernetes.io/projected/df494e10-0447-4b53-a84e-82156db28933-kube-api-access-l9slk\") pod \"nova-cell1-db-create-xn4w4\" (UID: \"df494e10-0447-4b53-a84e-82156db28933\") " pod="openstack/nova-cell1-db-create-xn4w4"
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.410232 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9slk\" (UniqueName: \"kubernetes.io/projected/df494e10-0447-4b53-a84e-82156db28933-kube-api-access-l9slk\") pod \"nova-cell1-db-create-xn4w4\" (UID: \"df494e10-0447-4b53-a84e-82156db28933\") " pod="openstack/nova-cell1-db-create-xn4w4"
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.439174 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j7p47"]
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.459371 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j7p47"]
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.476893 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xn4w4"
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.701176 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-n4kzb"]
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.815175 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d543b8a8-4974-4d2f-980f-c7cffd3fca61" path="/var/lib/kubelet/pods/d543b8a8-4974-4d2f-980f-c7cffd3fca61/volumes"
Dec 01 11:39:53 crc kubenswrapper[4958]: I1201 11:39:53.899872 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-44xr7"]
Dec 01 11:39:53 crc kubenswrapper[4958]: W1201 11:39:53.901370 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf85e447_bf63_4cd6_a30c_b70b7bc75868.slice/crio-66f7391fe808f37ee4310aecd3dab3a5cc50075d803678b3bd51beb419315d7f WatchSource:0}: Error finding container 66f7391fe808f37ee4310aecd3dab3a5cc50075d803678b3bd51beb419315d7f: Status 404 returned error can't find the container with id 66f7391fe808f37ee4310aecd3dab3a5cc50075d803678b3bd51beb419315d7f
Dec 01 11:39:54 crc kubenswrapper[4958]: I1201 11:39:54.073598 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-xn4w4"]
Dec 01 11:39:54 crc kubenswrapper[4958]: W1201 11:39:54.077730 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf494e10_0447_4b53_a84e_82156db28933.slice/crio-1ecd0e4ab0fc65310d10d7eab5ddcf1422dfe48b179f515ab2159dba0497d45c WatchSource:0}: Error finding container 1ecd0e4ab0fc65310d10d7eab5ddcf1422dfe48b179f515ab2159dba0497d45c: Status 404 returned error can't find the container with id 1ecd0e4ab0fc65310d10d7eab5ddcf1422dfe48b179f515ab2159dba0497d45c
Dec 01 11:39:54 crc kubenswrapper[4958]: I1201 11:39:54.288112 4958 generic.go:334] "Generic (PLEG): container finished" podID="cf85e447-bf63-4cd6-a30c-b70b7bc75868" containerID="0f323b244b7fdbf654648873ed12dc2b7f7137c4a35fe1ae5adb12f1e7b9fbd1" exitCode=0
Dec 01 11:39:54 crc kubenswrapper[4958]: I1201 11:39:54.288188 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-44xr7" event={"ID":"cf85e447-bf63-4cd6-a30c-b70b7bc75868","Type":"ContainerDied","Data":"0f323b244b7fdbf654648873ed12dc2b7f7137c4a35fe1ae5adb12f1e7b9fbd1"}
Dec 01 11:39:54 crc kubenswrapper[4958]: I1201 11:39:54.288220 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-44xr7" event={"ID":"cf85e447-bf63-4cd6-a30c-b70b7bc75868","Type":"ContainerStarted","Data":"66f7391fe808f37ee4310aecd3dab3a5cc50075d803678b3bd51beb419315d7f"}
Dec 01 11:39:54 crc kubenswrapper[4958]: I1201 11:39:54.290325 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xn4w4" event={"ID":"df494e10-0447-4b53-a84e-82156db28933","Type":"ContainerStarted","Data":"1ecd0e4ab0fc65310d10d7eab5ddcf1422dfe48b179f515ab2159dba0497d45c"}
Dec 01 11:39:54 crc kubenswrapper[4958]: I1201 11:39:54.292317 4958 generic.go:334] "Generic (PLEG): container finished" podID="0188611e-a44a-49cf-9814-008b7487bb17" containerID="f094f2600ff580778a632f53749dc734c2d087e367f8413417ecc16f7c5a5fdf" exitCode=0
Dec 01 11:39:54 crc kubenswrapper[4958]: I1201 11:39:54.292356 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-n4kzb" event={"ID":"0188611e-a44a-49cf-9814-008b7487bb17","Type":"ContainerDied","Data":"f094f2600ff580778a632f53749dc734c2d087e367f8413417ecc16f7c5a5fdf"}
Dec 01 11:39:54 crc kubenswrapper[4958]: I1201 11:39:54.292442 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-n4kzb" event={"ID":"0188611e-a44a-49cf-9814-008b7487bb17","Type":"ContainerStarted","Data":"88aef071085664357e6c1428cc2740c9d750367d7aede9ac0265e8f01db9f511"}
Dec 01 11:39:55 crc kubenswrapper[4958]: I1201 11:39:55.302451 4958 generic.go:334] "Generic (PLEG): container finished" podID="df494e10-0447-4b53-a84e-82156db28933" containerID="bf9115d7ab3438479e0b01378564de2247c19ce96890443d01f3774d43973316" exitCode=0
Dec 01 11:39:55 crc kubenswrapper[4958]: I1201 11:39:55.302609 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xn4w4" event={"ID":"df494e10-0447-4b53-a84e-82156db28933","Type":"ContainerDied","Data":"bf9115d7ab3438479e0b01378564de2247c19ce96890443d01f3774d43973316"}
Dec 01 11:39:55 crc kubenswrapper[4958]: I1201 11:39:55.689380 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-n4kzb"
Dec 01 11:39:55 crc kubenswrapper[4958]: I1201 11:39:55.769038 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-44xr7"
Dec 01 11:39:55 crc kubenswrapper[4958]: I1201 11:39:55.850324 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h7pq\" (UniqueName: \"kubernetes.io/projected/0188611e-a44a-49cf-9814-008b7487bb17-kube-api-access-7h7pq\") pod \"0188611e-a44a-49cf-9814-008b7487bb17\" (UID: \"0188611e-a44a-49cf-9814-008b7487bb17\") "
Dec 01 11:39:55 crc kubenswrapper[4958]: I1201 11:39:55.856802 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0188611e-a44a-49cf-9814-008b7487bb17-kube-api-access-7h7pq" (OuterVolumeSpecName: "kube-api-access-7h7pq") pod "0188611e-a44a-49cf-9814-008b7487bb17" (UID: "0188611e-a44a-49cf-9814-008b7487bb17"). InnerVolumeSpecName "kube-api-access-7h7pq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:39:55 crc kubenswrapper[4958]: I1201 11:39:55.953295 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b85wk\" (UniqueName: \"kubernetes.io/projected/cf85e447-bf63-4cd6-a30c-b70b7bc75868-kube-api-access-b85wk\") pod \"cf85e447-bf63-4cd6-a30c-b70b7bc75868\" (UID: \"cf85e447-bf63-4cd6-a30c-b70b7bc75868\") "
Dec 01 11:39:55 crc kubenswrapper[4958]: I1201 11:39:55.953955 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h7pq\" (UniqueName: \"kubernetes.io/projected/0188611e-a44a-49cf-9814-008b7487bb17-kube-api-access-7h7pq\") on node \"crc\" DevicePath \"\""
Dec 01 11:39:55 crc kubenswrapper[4958]: I1201 11:39:55.956968 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf85e447-bf63-4cd6-a30c-b70b7bc75868-kube-api-access-b85wk" (OuterVolumeSpecName: "kube-api-access-b85wk") pod "cf85e447-bf63-4cd6-a30c-b70b7bc75868" (UID: "cf85e447-bf63-4cd6-a30c-b70b7bc75868"). InnerVolumeSpecName "kube-api-access-b85wk". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:39:56 crc kubenswrapper[4958]: I1201 11:39:56.055597 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b85wk\" (UniqueName: \"kubernetes.io/projected/cf85e447-bf63-4cd6-a30c-b70b7bc75868-kube-api-access-b85wk\") on node \"crc\" DevicePath \"\"" Dec 01 11:39:56 crc kubenswrapper[4958]: I1201 11:39:56.315531 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-44xr7" event={"ID":"cf85e447-bf63-4cd6-a30c-b70b7bc75868","Type":"ContainerDied","Data":"66f7391fe808f37ee4310aecd3dab3a5cc50075d803678b3bd51beb419315d7f"} Dec 01 11:39:56 crc kubenswrapper[4958]: I1201 11:39:56.316343 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66f7391fe808f37ee4310aecd3dab3a5cc50075d803678b3bd51beb419315d7f" Dec 01 11:39:56 crc kubenswrapper[4958]: I1201 11:39:56.315587 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-44xr7" Dec 01 11:39:56 crc kubenswrapper[4958]: I1201 11:39:56.319442 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-n4kzb" event={"ID":"0188611e-a44a-49cf-9814-008b7487bb17","Type":"ContainerDied","Data":"88aef071085664357e6c1428cc2740c9d750367d7aede9ac0265e8f01db9f511"} Dec 01 11:39:56 crc kubenswrapper[4958]: I1201 11:39:56.319696 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-n4kzb" Dec 01 11:39:56 crc kubenswrapper[4958]: I1201 11:39:56.319617 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88aef071085664357e6c1428cc2740c9d750367d7aede9ac0265e8f01db9f511" Dec 01 11:39:56 crc kubenswrapper[4958]: I1201 11:39:56.827471 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xn4w4" Dec 01 11:39:56 crc kubenswrapper[4958]: I1201 11:39:56.971687 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9slk\" (UniqueName: \"kubernetes.io/projected/df494e10-0447-4b53-a84e-82156db28933-kube-api-access-l9slk\") pod \"df494e10-0447-4b53-a84e-82156db28933\" (UID: \"df494e10-0447-4b53-a84e-82156db28933\") " Dec 01 11:39:56 crc kubenswrapper[4958]: I1201 11:39:56.978071 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df494e10-0447-4b53-a84e-82156db28933-kube-api-access-l9slk" (OuterVolumeSpecName: "kube-api-access-l9slk") pod "df494e10-0447-4b53-a84e-82156db28933" (UID: "df494e10-0447-4b53-a84e-82156db28933"). InnerVolumeSpecName "kube-api-access-l9slk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:39:57 crc kubenswrapper[4958]: I1201 11:39:57.073886 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9slk\" (UniqueName: \"kubernetes.io/projected/df494e10-0447-4b53-a84e-82156db28933-kube-api-access-l9slk\") on node \"crc\" DevicePath \"\"" Dec 01 11:39:57 crc kubenswrapper[4958]: I1201 11:39:57.348131 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xn4w4" event={"ID":"df494e10-0447-4b53-a84e-82156db28933","Type":"ContainerDied","Data":"1ecd0e4ab0fc65310d10d7eab5ddcf1422dfe48b179f515ab2159dba0497d45c"} Dec 01 11:39:57 crc kubenswrapper[4958]: I1201 11:39:57.348202 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ecd0e4ab0fc65310d10d7eab5ddcf1422dfe48b179f515ab2159dba0497d45c" Dec 01 11:39:57 crc kubenswrapper[4958]: I1201 11:39:57.348202 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xn4w4" Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.106261 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-473c-account-create-cc2kr"] Dec 01 11:40:03 crc kubenswrapper[4958]: E1201 11:40:03.107747 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0188611e-a44a-49cf-9814-008b7487bb17" containerName="mariadb-database-create" Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.107786 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="0188611e-a44a-49cf-9814-008b7487bb17" containerName="mariadb-database-create" Dec 01 11:40:03 crc kubenswrapper[4958]: E1201 11:40:03.107827 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf85e447-bf63-4cd6-a30c-b70b7bc75868" containerName="mariadb-database-create" Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.107883 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf85e447-bf63-4cd6-a30c-b70b7bc75868" containerName="mariadb-database-create" Dec 01 11:40:03 crc kubenswrapper[4958]: E1201 11:40:03.107945 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df494e10-0447-4b53-a84e-82156db28933" containerName="mariadb-database-create" Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.107966 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="df494e10-0447-4b53-a84e-82156db28933" containerName="mariadb-database-create" Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.108447 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="0188611e-a44a-49cf-9814-008b7487bb17" containerName="mariadb-database-create" Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.108498 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="df494e10-0447-4b53-a84e-82156db28933" containerName="mariadb-database-create" Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.108521 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf85e447-bf63-4cd6-a30c-b70b7bc75868" containerName="mariadb-database-create" Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.109931 4958 util.go:30] "No sandbox for pod can be found. 
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.113116 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.119180 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-473c-account-create-cc2kr"]
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.255949 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd7bq\" (UniqueName: \"kubernetes.io/projected/72d863b4-7434-4304-9098-ef73c392c23f-kube-api-access-vd7bq\") pod \"nova-api-473c-account-create-cc2kr\" (UID: \"72d863b4-7434-4304-9098-ef73c392c23f\") " pod="openstack/nova-api-473c-account-create-cc2kr"
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.293018 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-aafc-account-create-w2pcf"]
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.294427 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-aafc-account-create-w2pcf"
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.300344 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.306668 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-aafc-account-create-w2pcf"]
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.357346 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd7bq\" (UniqueName: \"kubernetes.io/projected/72d863b4-7434-4304-9098-ef73c392c23f-kube-api-access-vd7bq\") pod \"nova-api-473c-account-create-cc2kr\" (UID: \"72d863b4-7434-4304-9098-ef73c392c23f\") " pod="openstack/nova-api-473c-account-create-cc2kr"
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.380403 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd7bq\" (UniqueName: \"kubernetes.io/projected/72d863b4-7434-4304-9098-ef73c392c23f-kube-api-access-vd7bq\") pod \"nova-api-473c-account-create-cc2kr\" (UID: \"72d863b4-7434-4304-9098-ef73c392c23f\") " pod="openstack/nova-api-473c-account-create-cc2kr"
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.447679 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-473c-account-create-cc2kr"
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.459348 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm8h4\" (UniqueName: \"kubernetes.io/projected/ea5407f0-84dc-4fec-aec1-c744e117ef1c-kube-api-access-rm8h4\") pod \"nova-cell0-aafc-account-create-w2pcf\" (UID: \"ea5407f0-84dc-4fec-aec1-c744e117ef1c\") " pod="openstack/nova-cell0-aafc-account-create-w2pcf"
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.493085 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cad8-account-create-rffzk"]
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.494757 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cad8-account-create-rffzk"
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.497463 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.501522 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cad8-account-create-rffzk"]
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.560765 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm8h4\" (UniqueName: \"kubernetes.io/projected/ea5407f0-84dc-4fec-aec1-c744e117ef1c-kube-api-access-rm8h4\") pod \"nova-cell0-aafc-account-create-w2pcf\" (UID: \"ea5407f0-84dc-4fec-aec1-c744e117ef1c\") " pod="openstack/nova-cell0-aafc-account-create-w2pcf"
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.595067 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm8h4\" (UniqueName: \"kubernetes.io/projected/ea5407f0-84dc-4fec-aec1-c744e117ef1c-kube-api-access-rm8h4\") pod \"nova-cell0-aafc-account-create-w2pcf\" (UID: \"ea5407f0-84dc-4fec-aec1-c744e117ef1c\") " pod="openstack/nova-cell0-aafc-account-create-w2pcf"
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.614640 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-aafc-account-create-w2pcf"
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.662728 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zn54\" (UniqueName: \"kubernetes.io/projected/798fedf2-f691-4307-b222-993c0e027046-kube-api-access-9zn54\") pod \"nova-cell1-cad8-account-create-rffzk\" (UID: \"798fedf2-f691-4307-b222-993c0e027046\") " pod="openstack/nova-cell1-cad8-account-create-rffzk"
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.765455 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zn54\" (UniqueName: \"kubernetes.io/projected/798fedf2-f691-4307-b222-993c0e027046-kube-api-access-9zn54\") pod \"nova-cell1-cad8-account-create-rffzk\" (UID: \"798fedf2-f691-4307-b222-993c0e027046\") " pod="openstack/nova-cell1-cad8-account-create-rffzk"
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.782106 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zn54\" (UniqueName: \"kubernetes.io/projected/798fedf2-f691-4307-b222-993c0e027046-kube-api-access-9zn54\") pod \"nova-cell1-cad8-account-create-rffzk\" (UID: \"798fedf2-f691-4307-b222-993c0e027046\") " pod="openstack/nova-cell1-cad8-account-create-rffzk"
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.867313 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cad8-account-create-rffzk"
Dec 01 11:40:03 crc kubenswrapper[4958]: I1201 11:40:03.896359 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-473c-account-create-cc2kr"]
Dec 01 11:40:04 crc kubenswrapper[4958]: I1201 11:40:04.074244 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-aafc-account-create-w2pcf"]
Dec 01 11:40:04 crc kubenswrapper[4958]: I1201 11:40:04.383143 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cad8-account-create-rffzk"]
Dec 01 11:40:04 crc kubenswrapper[4958]: I1201 11:40:04.440966 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cad8-account-create-rffzk" event={"ID":"798fedf2-f691-4307-b222-993c0e027046","Type":"ContainerStarted","Data":"76132ef6c69d8b02a4237412045505be6b69071fad8e9bdd83252c3066ba0752"}
Dec 01 11:40:04 crc kubenswrapper[4958]: I1201 11:40:04.443024 4958 generic.go:334] "Generic (PLEG): container finished" podID="72d863b4-7434-4304-9098-ef73c392c23f" containerID="a4b2d5dc2a4b399dad04c44e02d31b99c1d70e35410e0cdd9b3fdc0a7890002c" exitCode=0
Dec 01 11:40:04 crc kubenswrapper[4958]: I1201 11:40:04.443117 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-473c-account-create-cc2kr" event={"ID":"72d863b4-7434-4304-9098-ef73c392c23f","Type":"ContainerDied","Data":"a4b2d5dc2a4b399dad04c44e02d31b99c1d70e35410e0cdd9b3fdc0a7890002c"}
Dec 01 11:40:04 crc kubenswrapper[4958]: I1201 11:40:04.443147 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-473c-account-create-cc2kr" event={"ID":"72d863b4-7434-4304-9098-ef73c392c23f","Type":"ContainerStarted","Data":"be6d0314c966c33d28d7adfd168d3be1178baf9c41fd984f9c22ed42cf26cbd8"}
Dec 01 11:40:04 crc kubenswrapper[4958]: I1201 11:40:04.448160 4958 generic.go:334] "Generic (PLEG): container finished" podID="ea5407f0-84dc-4fec-aec1-c744e117ef1c" containerID="b478729ae7239b1c67a9e1c302ebc35bfb6c533488e0480f346bdf83f7db827b" exitCode=0
Dec 01 11:40:04 crc kubenswrapper[4958]: I1201 11:40:04.448226 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aafc-account-create-w2pcf" event={"ID":"ea5407f0-84dc-4fec-aec1-c744e117ef1c","Type":"ContainerDied","Data":"b478729ae7239b1c67a9e1c302ebc35bfb6c533488e0480f346bdf83f7db827b"}
Dec 01 11:40:04 crc kubenswrapper[4958]: I1201 11:40:04.448266 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aafc-account-create-w2pcf" event={"ID":"ea5407f0-84dc-4fec-aec1-c744e117ef1c","Type":"ContainerStarted","Data":"f800e11225773925cf1e4a9e9fd23f368c082df1774dfca957f8e0ec9108b225"}
Dec 01 11:40:05 crc kubenswrapper[4958]: I1201 11:40:05.465209 4958 generic.go:334] "Generic (PLEG): container finished" podID="798fedf2-f691-4307-b222-993c0e027046" containerID="0e4dbb913579e44799bcfd95f28a0a272cd47aeee85f76dcaf8ffcf18e33d4f6" exitCode=0
Dec 01 11:40:05 crc kubenswrapper[4958]: I1201 11:40:05.465309 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cad8-account-create-rffzk" event={"ID":"798fedf2-f691-4307-b222-993c0e027046","Type":"ContainerDied","Data":"0e4dbb913579e44799bcfd95f28a0a272cd47aeee85f76dcaf8ffcf18e33d4f6"}
Dec 01 11:40:06 crc kubenswrapper[4958]: I1201 11:40:06.050936 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-473c-account-create-cc2kr"
Dec 01 11:40:06 crc kubenswrapper[4958]: I1201 11:40:06.056353 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-aafc-account-create-w2pcf"
Dec 01 11:40:06 crc kubenswrapper[4958]: I1201 11:40:06.155736 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd7bq\" (UniqueName: \"kubernetes.io/projected/72d863b4-7434-4304-9098-ef73c392c23f-kube-api-access-vd7bq\") pod \"72d863b4-7434-4304-9098-ef73c392c23f\" (UID: \"72d863b4-7434-4304-9098-ef73c392c23f\") "
Dec 01 11:40:06 crc kubenswrapper[4958]: I1201 11:40:06.155942 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm8h4\" (UniqueName: \"kubernetes.io/projected/ea5407f0-84dc-4fec-aec1-c744e117ef1c-kube-api-access-rm8h4\") pod \"ea5407f0-84dc-4fec-aec1-c744e117ef1c\" (UID: \"ea5407f0-84dc-4fec-aec1-c744e117ef1c\") "
Dec 01 11:40:06 crc kubenswrapper[4958]: I1201 11:40:06.162328 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72d863b4-7434-4304-9098-ef73c392c23f-kube-api-access-vd7bq" (OuterVolumeSpecName: "kube-api-access-vd7bq") pod "72d863b4-7434-4304-9098-ef73c392c23f" (UID: "72d863b4-7434-4304-9098-ef73c392c23f"). InnerVolumeSpecName "kube-api-access-vd7bq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:40:06 crc kubenswrapper[4958]: I1201 11:40:06.164379 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea5407f0-84dc-4fec-aec1-c744e117ef1c-kube-api-access-rm8h4" (OuterVolumeSpecName: "kube-api-access-rm8h4") pod "ea5407f0-84dc-4fec-aec1-c744e117ef1c" (UID: "ea5407f0-84dc-4fec-aec1-c744e117ef1c"). InnerVolumeSpecName "kube-api-access-rm8h4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:40:06 crc kubenswrapper[4958]: I1201 11:40:06.258600 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd7bq\" (UniqueName: \"kubernetes.io/projected/72d863b4-7434-4304-9098-ef73c392c23f-kube-api-access-vd7bq\") on node \"crc\" DevicePath \"\""
Dec 01 11:40:06 crc kubenswrapper[4958]: I1201 11:40:06.258651 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm8h4\" (UniqueName: \"kubernetes.io/projected/ea5407f0-84dc-4fec-aec1-c744e117ef1c-kube-api-access-rm8h4\") on node \"crc\" DevicePath \"\""
Dec 01 11:40:06 crc kubenswrapper[4958]: I1201 11:40:06.475937 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-473c-account-create-cc2kr"
Dec 01 11:40:06 crc kubenswrapper[4958]: I1201 11:40:06.475962 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-473c-account-create-cc2kr" event={"ID":"72d863b4-7434-4304-9098-ef73c392c23f","Type":"ContainerDied","Data":"be6d0314c966c33d28d7adfd168d3be1178baf9c41fd984f9c22ed42cf26cbd8"}
Dec 01 11:40:06 crc kubenswrapper[4958]: I1201 11:40:06.476299 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be6d0314c966c33d28d7adfd168d3be1178baf9c41fd984f9c22ed42cf26cbd8"
Dec 01 11:40:06 crc kubenswrapper[4958]: I1201 11:40:06.478365 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-aafc-account-create-w2pcf"
Dec 01 11:40:06 crc kubenswrapper[4958]: I1201 11:40:06.478361 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-aafc-account-create-w2pcf" event={"ID":"ea5407f0-84dc-4fec-aec1-c744e117ef1c","Type":"ContainerDied","Data":"f800e11225773925cf1e4a9e9fd23f368c082df1774dfca957f8e0ec9108b225"}
Dec 01 11:40:06 crc kubenswrapper[4958]: I1201 11:40:06.478419 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f800e11225773925cf1e4a9e9fd23f368c082df1774dfca957f8e0ec9108b225"
Dec 01 11:40:06 crc kubenswrapper[4958]: I1201 11:40:06.818167 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cad8-account-create-rffzk"
Dec 01 11:40:06 crc kubenswrapper[4958]: I1201 11:40:06.880673 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zn54\" (UniqueName: \"kubernetes.io/projected/798fedf2-f691-4307-b222-993c0e027046-kube-api-access-9zn54\") pod \"798fedf2-f691-4307-b222-993c0e027046\" (UID: \"798fedf2-f691-4307-b222-993c0e027046\") "
Dec 01 11:40:06 crc kubenswrapper[4958]: I1201 11:40:06.886129 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/798fedf2-f691-4307-b222-993c0e027046-kube-api-access-9zn54" (OuterVolumeSpecName: "kube-api-access-9zn54") pod "798fedf2-f691-4307-b222-993c0e027046" (UID: "798fedf2-f691-4307-b222-993c0e027046"). InnerVolumeSpecName "kube-api-access-9zn54". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:40:06 crc kubenswrapper[4958]: I1201 11:40:06.984877 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zn54\" (UniqueName: \"kubernetes.io/projected/798fedf2-f691-4307-b222-993c0e027046-kube-api-access-9zn54\") on node \"crc\" DevicePath \"\""
Dec 01 11:40:07 crc kubenswrapper[4958]: I1201 11:40:07.490498 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cad8-account-create-rffzk" event={"ID":"798fedf2-f691-4307-b222-993c0e027046","Type":"ContainerDied","Data":"76132ef6c69d8b02a4237412045505be6b69071fad8e9bdd83252c3066ba0752"}
Dec 01 11:40:07 crc kubenswrapper[4958]: I1201 11:40:07.490542 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76132ef6c69d8b02a4237412045505be6b69071fad8e9bdd83252c3066ba0752"
Dec 01 11:40:07 crc kubenswrapper[4958]: I1201 11:40:07.490577 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cad8-account-create-rffzk"
Dec 01 11:40:07 crc kubenswrapper[4958]: E1201 11:40:07.619403 4958 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod798fedf2_f691_4307_b222_993c0e027046.slice/crio-76132ef6c69d8b02a4237412045505be6b69071fad8e9bdd83252c3066ba0752\": RecentStats: unable to find data in memory cache]"
Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.559111 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xswp8"]
Dec 01 11:40:08 crc kubenswrapper[4958]: E1201 11:40:08.559705 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798fedf2-f691-4307-b222-993c0e027046" containerName="mariadb-account-create"
Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.559716 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="798fedf2-f691-4307-b222-993c0e027046" containerName="mariadb-account-create"
Dec 01 11:40:08 crc kubenswrapper[4958]: E1201 11:40:08.559746 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea5407f0-84dc-4fec-aec1-c744e117ef1c" containerName="mariadb-account-create"
Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.559753 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea5407f0-84dc-4fec-aec1-c744e117ef1c" containerName="mariadb-account-create"
Dec 01 11:40:08 crc kubenswrapper[4958]: E1201 11:40:08.559761 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72d863b4-7434-4304-9098-ef73c392c23f" containerName="mariadb-account-create"
Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.559767 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="72d863b4-7434-4304-9098-ef73c392c23f" containerName="mariadb-account-create"
Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.559969 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea5407f0-84dc-4fec-aec1-c744e117ef1c" containerName="mariadb-account-create"
Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.559987 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="72d863b4-7434-4304-9098-ef73c392c23f" containerName="mariadb-account-create"
Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.559997 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="798fedf2-f691-4307-b222-993c0e027046" containerName="mariadb-account-create"
Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.560575 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-xswp8"
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-xswp8" Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.565260 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.565315 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-8pxh5" Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.565505 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.571373 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xswp8"] Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.620008 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f956bfc8-32c9-4b41-877c-f80d520c011b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-xswp8\" (UID: \"f956bfc8-32c9-4b41-877c-f80d520c011b\") " pod="openstack/nova-cell0-conductor-db-sync-xswp8" Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.620087 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f956bfc8-32c9-4b41-877c-f80d520c011b-config-data\") pod \"nova-cell0-conductor-db-sync-xswp8\" (UID: \"f956bfc8-32c9-4b41-877c-f80d520c011b\") " pod="openstack/nova-cell0-conductor-db-sync-xswp8" Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.620138 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f956bfc8-32c9-4b41-877c-f80d520c011b-scripts\") pod \"nova-cell0-conductor-db-sync-xswp8\" (UID: \"f956bfc8-32c9-4b41-877c-f80d520c011b\") " pod="openstack/nova-cell0-conductor-db-sync-xswp8" Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.620221 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxmxc\" (UniqueName: \"kubernetes.io/projected/f956bfc8-32c9-4b41-877c-f80d520c011b-kube-api-access-xxmxc\") pod \"nova-cell0-conductor-db-sync-xswp8\" (UID: \"f956bfc8-32c9-4b41-877c-f80d520c011b\") " pod="openstack/nova-cell0-conductor-db-sync-xswp8" Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.721752 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxmxc\" (UniqueName: \"kubernetes.io/projected/f956bfc8-32c9-4b41-877c-f80d520c011b-kube-api-access-xxmxc\") pod \"nova-cell0-conductor-db-sync-xswp8\" (UID: \"f956bfc8-32c9-4b41-877c-f80d520c011b\") " pod="openstack/nova-cell0-conductor-db-sync-xswp8" Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.721895 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f956bfc8-32c9-4b41-877c-f80d520c011b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-xswp8\" (UID: \"f956bfc8-32c9-4b41-877c-f80d520c011b\") " pod="openstack/nova-cell0-conductor-db-sync-xswp8" Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.721937 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f956bfc8-32c9-4b41-877c-f80d520c011b-config-data\") pod \"nova-cell0-conductor-db-sync-xswp8\" 
(UID: \"f956bfc8-32c9-4b41-877c-f80d520c011b\") " pod="openstack/nova-cell0-conductor-db-sync-xswp8" Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.721972 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f956bfc8-32c9-4b41-877c-f80d520c011b-scripts\") pod \"nova-cell0-conductor-db-sync-xswp8\" (UID: \"f956bfc8-32c9-4b41-877c-f80d520c011b\") " pod="openstack/nova-cell0-conductor-db-sync-xswp8" Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.728256 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f956bfc8-32c9-4b41-877c-f80d520c011b-scripts\") pod \"nova-cell0-conductor-db-sync-xswp8\" (UID: \"f956bfc8-32c9-4b41-877c-f80d520c011b\") " pod="openstack/nova-cell0-conductor-db-sync-xswp8" Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.729524 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f956bfc8-32c9-4b41-877c-f80d520c011b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-xswp8\" (UID: \"f956bfc8-32c9-4b41-877c-f80d520c011b\") " pod="openstack/nova-cell0-conductor-db-sync-xswp8" Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.729667 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f956bfc8-32c9-4b41-877c-f80d520c011b-config-data\") pod \"nova-cell0-conductor-db-sync-xswp8\" (UID: \"f956bfc8-32c9-4b41-877c-f80d520c011b\") " pod="openstack/nova-cell0-conductor-db-sync-xswp8" Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.739601 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxmxc\" (UniqueName: \"kubernetes.io/projected/f956bfc8-32c9-4b41-877c-f80d520c011b-kube-api-access-xxmxc\") pod \"nova-cell0-conductor-db-sync-xswp8\" (UID: \"f956bfc8-32c9-4b41-877c-f80d520c011b\") " pod="openstack/nova-cell0-conductor-db-sync-xswp8" Dec 01 11:40:08 crc kubenswrapper[4958]: I1201 11:40:08.924023 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-xswp8" Dec 01 11:40:09 crc kubenswrapper[4958]: I1201 11:40:09.471700 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xswp8"] Dec 01 11:40:09 crc kubenswrapper[4958]: I1201 11:40:09.511741 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-xswp8" event={"ID":"f956bfc8-32c9-4b41-877c-f80d520c011b","Type":"ContainerStarted","Data":"60573fcffaa39fef4acdece75b510bd2bc216ab5dee5768774ddaf0c6aa278f9"} Dec 01 11:40:10 crc kubenswrapper[4958]: I1201 11:40:10.527490 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-xswp8" event={"ID":"f956bfc8-32c9-4b41-877c-f80d520c011b","Type":"ContainerStarted","Data":"c91efe3c9d3923d2f69c367f80b292f958b5f6877131e3708b7e876c94e6cf2e"} Dec 01 11:40:10 crc kubenswrapper[4958]: I1201 11:40:10.560720 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-xswp8" podStartSLOduration=2.560690032 podStartE2EDuration="2.560690032s" podCreationTimestamp="2025-12-01 11:40:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:40:10.550652558 +0000 UTC m=+6058.059441675" watchObservedRunningTime="2025-12-01 11:40:10.560690032 +0000 UTC m=+6058.069479099" Dec 01 11:40:15 crc kubenswrapper[4958]: I1201 11:40:15.589337 4958 generic.go:334] "Generic (PLEG): container finished" podID="f956bfc8-32c9-4b41-877c-f80d520c011b" containerID="c91efe3c9d3923d2f69c367f80b292f958b5f6877131e3708b7e876c94e6cf2e" exitCode=0 Dec 01 11:40:15 crc kubenswrapper[4958]: I1201 11:40:15.589491 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-xswp8" event={"ID":"f956bfc8-32c9-4b41-877c-f80d520c011b","Type":"ContainerDied","Data":"c91efe3c9d3923d2f69c367f80b292f958b5f6877131e3708b7e876c94e6cf2e"} Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.065881 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-xswp8" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.105622 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxmxc\" (UniqueName: \"kubernetes.io/projected/f956bfc8-32c9-4b41-877c-f80d520c011b-kube-api-access-xxmxc\") pod \"f956bfc8-32c9-4b41-877c-f80d520c011b\" (UID: \"f956bfc8-32c9-4b41-877c-f80d520c011b\") " Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.105711 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f956bfc8-32c9-4b41-877c-f80d520c011b-scripts\") pod \"f956bfc8-32c9-4b41-877c-f80d520c011b\" (UID: \"f956bfc8-32c9-4b41-877c-f80d520c011b\") " Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.105766 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f956bfc8-32c9-4b41-877c-f80d520c011b-combined-ca-bundle\") pod \"f956bfc8-32c9-4b41-877c-f80d520c011b\" (UID: \"f956bfc8-32c9-4b41-877c-f80d520c011b\") " Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.105861 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f956bfc8-32c9-4b41-877c-f80d520c011b-config-data\") pod \"f956bfc8-32c9-4b41-877c-f80d520c011b\" (UID: \"f956bfc8-32c9-4b41-877c-f80d520c011b\") " Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.143286 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f956bfc8-32c9-4b41-877c-f80d520c011b-scripts" (OuterVolumeSpecName: "scripts") pod "f956bfc8-32c9-4b41-877c-f80d520c011b" (UID: "f956bfc8-32c9-4b41-877c-f80d520c011b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.143411 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f956bfc8-32c9-4b41-877c-f80d520c011b-kube-api-access-xxmxc" (OuterVolumeSpecName: "kube-api-access-xxmxc") pod "f956bfc8-32c9-4b41-877c-f80d520c011b" (UID: "f956bfc8-32c9-4b41-877c-f80d520c011b"). InnerVolumeSpecName "kube-api-access-xxmxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.152939 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f956bfc8-32c9-4b41-877c-f80d520c011b-config-data" (OuterVolumeSpecName: "config-data") pod "f956bfc8-32c9-4b41-877c-f80d520c011b" (UID: "f956bfc8-32c9-4b41-877c-f80d520c011b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.192127 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f956bfc8-32c9-4b41-877c-f80d520c011b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f956bfc8-32c9-4b41-877c-f80d520c011b" (UID: "f956bfc8-32c9-4b41-877c-f80d520c011b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.207642 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxmxc\" (UniqueName: \"kubernetes.io/projected/f956bfc8-32c9-4b41-877c-f80d520c011b-kube-api-access-xxmxc\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.207684 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f956bfc8-32c9-4b41-877c-f80d520c011b-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.207697 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f956bfc8-32c9-4b41-877c-f80d520c011b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.207733 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f956bfc8-32c9-4b41-877c-f80d520c011b-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.618935 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-xswp8" event={"ID":"f956bfc8-32c9-4b41-877c-f80d520c011b","Type":"ContainerDied","Data":"60573fcffaa39fef4acdece75b510bd2bc216ab5dee5768774ddaf0c6aa278f9"} Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.619028 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60573fcffaa39fef4acdece75b510bd2bc216ab5dee5768774ddaf0c6aa278f9" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.619575 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-xswp8" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.792408 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 11:40:17 crc kubenswrapper[4958]: E1201 11:40:17.793090 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f956bfc8-32c9-4b41-877c-f80d520c011b" containerName="nova-cell0-conductor-db-sync" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.793203 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f956bfc8-32c9-4b41-877c-f80d520c011b" containerName="nova-cell0-conductor-db-sync" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.793445 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f956bfc8-32c9-4b41-877c-f80d520c011b" containerName="nova-cell0-conductor-db-sync" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.794287 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.799140 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-8pxh5" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.800255 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.827927 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.839666 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsjsr\" (UniqueName: \"kubernetes.io/projected/e20bdad3-ef47-4629-8812-d4c6f7345279-kube-api-access-zsjsr\") pod \"nova-cell0-conductor-0\" (UID: \"e20bdad3-ef47-4629-8812-d4c6f7345279\") " pod="openstack/nova-cell0-conductor-0" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.839812 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20bdad3-ef47-4629-8812-d4c6f7345279-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e20bdad3-ef47-4629-8812-d4c6f7345279\") " pod="openstack/nova-cell0-conductor-0" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.839888 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20bdad3-ef47-4629-8812-d4c6f7345279-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e20bdad3-ef47-4629-8812-d4c6f7345279\") " pod="openstack/nova-cell0-conductor-0" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.941658 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsjsr\" (UniqueName: \"kubernetes.io/projected/e20bdad3-ef47-4629-8812-d4c6f7345279-kube-api-access-zsjsr\") pod \"nova-cell0-conductor-0\" (UID: \"e20bdad3-ef47-4629-8812-d4c6f7345279\") " pod="openstack/nova-cell0-conductor-0" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.941790 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20bdad3-ef47-4629-8812-d4c6f7345279-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e20bdad3-ef47-4629-8812-d4c6f7345279\") " pod="openstack/nova-cell0-conductor-0" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.941879 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20bdad3-ef47-4629-8812-d4c6f7345279-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e20bdad3-ef47-4629-8812-d4c6f7345279\") " pod="openstack/nova-cell0-conductor-0" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.947204 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20bdad3-ef47-4629-8812-d4c6f7345279-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e20bdad3-ef47-4629-8812-d4c6f7345279\") " pod="openstack/nova-cell0-conductor-0" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.952235 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20bdad3-ef47-4629-8812-d4c6f7345279-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"e20bdad3-ef47-4629-8812-d4c6f7345279\") " pod="openstack/nova-cell0-conductor-0" Dec 01 11:40:17 crc kubenswrapper[4958]: I1201 11:40:17.961606 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsjsr\" (UniqueName: \"kubernetes.io/projected/e20bdad3-ef47-4629-8812-d4c6f7345279-kube-api-access-zsjsr\") pod \"nova-cell0-conductor-0\" (UID: \"e20bdad3-ef47-4629-8812-d4c6f7345279\") " pod="openstack/nova-cell0-conductor-0" Dec 01 11:40:18 crc kubenswrapper[4958]: I1201 11:40:18.131213 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 11:40:18 crc kubenswrapper[4958]: I1201 11:40:18.681238 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 11:40:18 crc kubenswrapper[4958]: W1201 11:40:18.688494 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode20bdad3_ef47_4629_8812_d4c6f7345279.slice/crio-3cf5c54a2baf442ba63f2a6cfe0f27e05dcb1ed809cd0aa401c54454d1c76a68 WatchSource:0}: Error finding container 3cf5c54a2baf442ba63f2a6cfe0f27e05dcb1ed809cd0aa401c54454d1c76a68: Status 404 returned error can't find the container with id 3cf5c54a2baf442ba63f2a6cfe0f27e05dcb1ed809cd0aa401c54454d1c76a68 Dec 01 11:40:19 crc kubenswrapper[4958]: I1201 11:40:19.644418 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e20bdad3-ef47-4629-8812-d4c6f7345279","Type":"ContainerStarted","Data":"9a352cd761547656d6731f89ba6379abaaa9c5ab39d60fcc184689cb07ba9a91"} Dec 01 11:40:19 crc kubenswrapper[4958]: I1201 11:40:19.645031 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 01 11:40:19 crc kubenswrapper[4958]: I1201 11:40:19.645052 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e20bdad3-ef47-4629-8812-d4c6f7345279","Type":"ContainerStarted","Data":"3cf5c54a2baf442ba63f2a6cfe0f27e05dcb1ed809cd0aa401c54454d1c76a68"} Dec 01 11:40:19 crc kubenswrapper[4958]: I1201 11:40:19.667722 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.6676937179999998 podStartE2EDuration="2.667693718s" podCreationTimestamp="2025-12-01 11:40:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:40:19.663418526 +0000 UTC m=+6067.172207653" watchObservedRunningTime="2025-12-01 11:40:19.667693718 +0000 UTC m=+6067.176482785" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.187742 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.720907 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-gmblw"] Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.722673 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gmblw" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.725674 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.730310 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.750675 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gmblw"] Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.865956 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b385e57-4fa1-4d10-afef-2cd942607c11-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gmblw\" (UID: \"0b385e57-4fa1-4d10-afef-2cd942607c11\") " pod="openstack/nova-cell0-cell-mapping-gmblw" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.866042 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b385e57-4fa1-4d10-afef-2cd942607c11-scripts\") pod \"nova-cell0-cell-mapping-gmblw\" (UID: \"0b385e57-4fa1-4d10-afef-2cd942607c11\") " pod="openstack/nova-cell0-cell-mapping-gmblw" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.866106 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9txf\" (UniqueName: \"kubernetes.io/projected/0b385e57-4fa1-4d10-afef-2cd942607c11-kube-api-access-p9txf\") pod \"nova-cell0-cell-mapping-gmblw\" (UID: \"0b385e57-4fa1-4d10-afef-2cd942607c11\") " pod="openstack/nova-cell0-cell-mapping-gmblw" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.866128 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b385e57-4fa1-4d10-afef-2cd942607c11-config-data\") pod \"nova-cell0-cell-mapping-gmblw\" (UID: \"0b385e57-4fa1-4d10-afef-2cd942607c11\") " pod="openstack/nova-cell0-cell-mapping-gmblw" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.874039 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.876125 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.881939 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.910477 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.969831 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b385e57-4fa1-4d10-afef-2cd942607c11-scripts\") pod \"nova-cell0-cell-mapping-gmblw\" (UID: \"0b385e57-4fa1-4d10-afef-2cd942607c11\") " pod="openstack/nova-cell0-cell-mapping-gmblw" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.970241 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szb74\" (UniqueName: \"kubernetes.io/projected/22070023-b108-4113-91e1-51c7cde2c3d9-kube-api-access-szb74\") pod \"nova-api-0\" (UID: \"22070023-b108-4113-91e1-51c7cde2c3d9\") " pod="openstack/nova-api-0" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.970300 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9txf\" (UniqueName: \"kubernetes.io/projected/0b385e57-4fa1-4d10-afef-2cd942607c11-kube-api-access-p9txf\") pod \"nova-cell0-cell-mapping-gmblw\" (UID: \"0b385e57-4fa1-4d10-afef-2cd942607c11\") " pod="openstack/nova-cell0-cell-mapping-gmblw" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.970324 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b385e57-4fa1-4d10-afef-2cd942607c11-config-data\") pod \"nova-cell0-cell-mapping-gmblw\" (UID: \"0b385e57-4fa1-4d10-afef-2cd942607c11\") " pod="openstack/nova-cell0-cell-mapping-gmblw" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.970365 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22070023-b108-4113-91e1-51c7cde2c3d9-config-data\") pod \"nova-api-0\" (UID: \"22070023-b108-4113-91e1-51c7cde2c3d9\") " pod="openstack/nova-api-0" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.970413 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22070023-b108-4113-91e1-51c7cde2c3d9-logs\") pod \"nova-api-0\" (UID: \"22070023-b108-4113-91e1-51c7cde2c3d9\") " pod="openstack/nova-api-0" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.970490 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22070023-b108-4113-91e1-51c7cde2c3d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"22070023-b108-4113-91e1-51c7cde2c3d9\") " pod="openstack/nova-api-0" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.970543 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b385e57-4fa1-4d10-afef-2cd942607c11-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gmblw\" (UID: \"0b385e57-4fa1-4d10-afef-2cd942607c11\") " pod="openstack/nova-cell0-cell-mapping-gmblw" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.985089 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/0b385e57-4fa1-4d10-afef-2cd942607c11-config-data\") pod \"nova-cell0-cell-mapping-gmblw\" (UID: \"0b385e57-4fa1-4d10-afef-2cd942607c11\") " pod="openstack/nova-cell0-cell-mapping-gmblw" Dec 01 11:40:23 crc kubenswrapper[4958]: I1201 11:40:23.987121 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:23.999051 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b385e57-4fa1-4d10-afef-2cd942607c11-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gmblw\" (UID: \"0b385e57-4fa1-4d10-afef-2cd942607c11\") " pod="openstack/nova-cell0-cell-mapping-gmblw" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.001442 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b385e57-4fa1-4d10-afef-2cd942607c11-scripts\") pod \"nova-cell0-cell-mapping-gmblw\" (UID: \"0b385e57-4fa1-4d10-afef-2cd942607c11\") " pod="openstack/nova-cell0-cell-mapping-gmblw" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.001611 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.004649 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.013906 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.028902 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9txf\" (UniqueName: \"kubernetes.io/projected/0b385e57-4fa1-4d10-afef-2cd942607c11-kube-api-access-p9txf\") pod \"nova-cell0-cell-mapping-gmblw\" (UID: \"0b385e57-4fa1-4d10-afef-2cd942607c11\") " pod="openstack/nova-cell0-cell-mapping-gmblw" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.050737 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.051926 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.054769 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.061441 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gmblw" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.071588 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22070023-b108-4113-91e1-51c7cde2c3d9-config-data\") pod \"nova-api-0\" (UID: \"22070023-b108-4113-91e1-51c7cde2c3d9\") " pod="openstack/nova-api-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.071649 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22070023-b108-4113-91e1-51c7cde2c3d9-logs\") pod \"nova-api-0\" (UID: \"22070023-b108-4113-91e1-51c7cde2c3d9\") " pod="openstack/nova-api-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.071687 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22070023-b108-4113-91e1-51c7cde2c3d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"22070023-b108-4113-91e1-51c7cde2c3d9\") " pod="openstack/nova-api-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.071769 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szb74\" (UniqueName: \"kubernetes.io/projected/22070023-b108-4113-91e1-51c7cde2c3d9-kube-api-access-szb74\") pod \"nova-api-0\" (UID: \"22070023-b108-4113-91e1-51c7cde2c3d9\") " pod="openstack/nova-api-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.078206 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22070023-b108-4113-91e1-51c7cde2c3d9-logs\") pod \"nova-api-0\" (UID: \"22070023-b108-4113-91e1-51c7cde2c3d9\") " pod="openstack/nova-api-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.084024 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.097912 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22070023-b108-4113-91e1-51c7cde2c3d9-config-data\") pod \"nova-api-0\" (UID: \"22070023-b108-4113-91e1-51c7cde2c3d9\") " pod="openstack/nova-api-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.098753 4958 scope.go:117] "RemoveContainer" containerID="39460ecd53f06e7a2d28c12c68cdceca45b831240cc69bd129fd9ec640c469b2" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.099534 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22070023-b108-4113-91e1-51c7cde2c3d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"22070023-b108-4113-91e1-51c7cde2c3d9\") " pod="openstack/nova-api-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.107480 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szb74\" (UniqueName: \"kubernetes.io/projected/22070023-b108-4113-91e1-51c7cde2c3d9-kube-api-access-szb74\") pod \"nova-api-0\" (UID: \"22070023-b108-4113-91e1-51c7cde2c3d9\") " pod="openstack/nova-api-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.145506 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-968fc5485-pmcmj"] Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.147171 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-968fc5485-pmcmj" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.158789 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-968fc5485-pmcmj"] Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.173901 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn5x8\" (UniqueName: \"kubernetes.io/projected/dfcf83f3-b3be-49b5-9719-57857e262839-kube-api-access-vn5x8\") pod \"nova-metadata-0\" (UID: \"dfcf83f3-b3be-49b5-9719-57857e262839\") " pod="openstack/nova-metadata-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.173967 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5fd96ac-8e63-438e-8f66-8435513697c9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5fd96ac-8e63-438e-8f66-8435513697c9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.174014 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfcf83f3-b3be-49b5-9719-57857e262839-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dfcf83f3-b3be-49b5-9719-57857e262839\") " pod="openstack/nova-metadata-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.174054 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfcf83f3-b3be-49b5-9719-57857e262839-logs\") pod \"nova-metadata-0\" (UID: \"dfcf83f3-b3be-49b5-9719-57857e262839\") " pod="openstack/nova-metadata-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.174121 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d725w\" (UniqueName: \"kubernetes.io/projected/f5fd96ac-8e63-438e-8f66-8435513697c9-kube-api-access-d725w\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5fd96ac-8e63-438e-8f66-8435513697c9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.174146 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfcf83f3-b3be-49b5-9719-57857e262839-config-data\") pod \"nova-metadata-0\" (UID: \"dfcf83f3-b3be-49b5-9719-57857e262839\") " pod="openstack/nova-metadata-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.174425 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5fd96ac-8e63-438e-8f66-8435513697c9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5fd96ac-8e63-438e-8f66-8435513697c9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.189999 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.197275 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.212232 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.214425 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.221507 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.275972 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfcf83f3-b3be-49b5-9719-57857e262839-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dfcf83f3-b3be-49b5-9719-57857e262839\") " pod="openstack/nova-metadata-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.276032 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfcf83f3-b3be-49b5-9719-57857e262839-logs\") pod \"nova-metadata-0\" (UID: \"dfcf83f3-b3be-49b5-9719-57857e262839\") " pod="openstack/nova-metadata-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.276081 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-ovsdbserver-sb\") pod \"dnsmasq-dns-968fc5485-pmcmj\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " pod="openstack/dnsmasq-dns-968fc5485-pmcmj" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.276117 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d725w\" (UniqueName: \"kubernetes.io/projected/f5fd96ac-8e63-438e-8f66-8435513697c9-kube-api-access-d725w\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5fd96ac-8e63-438e-8f66-8435513697c9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.276141 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfcf83f3-b3be-49b5-9719-57857e262839-config-data\") pod \"nova-metadata-0\" (UID: \"dfcf83f3-b3be-49b5-9719-57857e262839\") " pod="openstack/nova-metadata-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.277599 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfcf83f3-b3be-49b5-9719-57857e262839-logs\") pod \"nova-metadata-0\" (UID: \"dfcf83f3-b3be-49b5-9719-57857e262839\") " pod="openstack/nova-metadata-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.277711 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-config\") pod \"dnsmasq-dns-968fc5485-pmcmj\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " pod="openstack/dnsmasq-dns-968fc5485-pmcmj" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.277806 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5fd96ac-8e63-438e-8f66-8435513697c9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5fd96ac-8e63-438e-8f66-8435513697c9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 
Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.277988 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf2tf\" (UniqueName: \"kubernetes.io/projected/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-kube-api-access-tf2tf\") pod \"dnsmasq-dns-968fc5485-pmcmj\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " pod="openstack/dnsmasq-dns-968fc5485-pmcmj"
Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.278111 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn5x8\" (UniqueName: \"kubernetes.io/projected/dfcf83f3-b3be-49b5-9719-57857e262839-kube-api-access-vn5x8\") pod \"nova-metadata-0\" (UID: \"dfcf83f3-b3be-49b5-9719-57857e262839\") " pod="openstack/nova-metadata-0"
Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.278135 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-ovsdbserver-nb\") pod \"dnsmasq-dns-968fc5485-pmcmj\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " pod="openstack/dnsmasq-dns-968fc5485-pmcmj"
Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.278187 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5fd96ac-8e63-438e-8f66-8435513697c9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5fd96ac-8e63-438e-8f66-8435513697c9\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.278206 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-dns-svc\") pod \"dnsmasq-dns-968fc5485-pmcmj\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " pod="openstack/dnsmasq-dns-968fc5485-pmcmj"
Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.283206 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfcf83f3-b3be-49b5-9719-57857e262839-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dfcf83f3-b3be-49b5-9719-57857e262839\") " pod="openstack/nova-metadata-0"
Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.283913 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5fd96ac-8e63-438e-8f66-8435513697c9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5fd96ac-8e63-438e-8f66-8435513697c9\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.290014 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5fd96ac-8e63-438e-8f66-8435513697c9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5fd96ac-8e63-438e-8f66-8435513697c9\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.293511 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfcf83f3-b3be-49b5-9719-57857e262839-config-data\") pod \"nova-metadata-0\" (UID: \"dfcf83f3-b3be-49b5-9719-57857e262839\") " pod="openstack/nova-metadata-0"
Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.297359 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn5x8\" (UniqueName:
\"kubernetes.io/projected/dfcf83f3-b3be-49b5-9719-57857e262839-kube-api-access-vn5x8\") pod \"nova-metadata-0\" (UID: \"dfcf83f3-b3be-49b5-9719-57857e262839\") " pod="openstack/nova-metadata-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.298771 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d725w\" (UniqueName: \"kubernetes.io/projected/f5fd96ac-8e63-438e-8f66-8435513697c9-kube-api-access-d725w\") pod \"nova-cell1-novncproxy-0\" (UID: \"f5fd96ac-8e63-438e-8f66-8435513697c9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.380325 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dxcs\" (UniqueName: \"kubernetes.io/projected/12a786d1-2169-4b70-a6df-781fa8fad94c-kube-api-access-7dxcs\") pod \"nova-scheduler-0\" (UID: \"12a786d1-2169-4b70-a6df-781fa8fad94c\") " pod="openstack/nova-scheduler-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.380407 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-ovsdbserver-sb\") pod \"dnsmasq-dns-968fc5485-pmcmj\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " pod="openstack/dnsmasq-dns-968fc5485-pmcmj" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.380465 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a786d1-2169-4b70-a6df-781fa8fad94c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"12a786d1-2169-4b70-a6df-781fa8fad94c\") " pod="openstack/nova-scheduler-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.380499 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a786d1-2169-4b70-a6df-781fa8fad94c-config-data\") pod \"nova-scheduler-0\" (UID: \"12a786d1-2169-4b70-a6df-781fa8fad94c\") " pod="openstack/nova-scheduler-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.380536 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-config\") pod \"dnsmasq-dns-968fc5485-pmcmj\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " pod="openstack/dnsmasq-dns-968fc5485-pmcmj" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.380588 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf2tf\" (UniqueName: \"kubernetes.io/projected/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-kube-api-access-tf2tf\") pod \"dnsmasq-dns-968fc5485-pmcmj\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " pod="openstack/dnsmasq-dns-968fc5485-pmcmj" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.380638 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-ovsdbserver-nb\") pod \"dnsmasq-dns-968fc5485-pmcmj\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " pod="openstack/dnsmasq-dns-968fc5485-pmcmj" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.380670 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-dns-svc\") pod 
\"dnsmasq-dns-968fc5485-pmcmj\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " pod="openstack/dnsmasq-dns-968fc5485-pmcmj" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.382292 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-config\") pod \"dnsmasq-dns-968fc5485-pmcmj\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " pod="openstack/dnsmasq-dns-968fc5485-pmcmj" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.382489 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-ovsdbserver-nb\") pod \"dnsmasq-dns-968fc5485-pmcmj\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " pod="openstack/dnsmasq-dns-968fc5485-pmcmj" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.383284 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-ovsdbserver-sb\") pod \"dnsmasq-dns-968fc5485-pmcmj\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " pod="openstack/dnsmasq-dns-968fc5485-pmcmj" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.385181 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-dns-svc\") pod \"dnsmasq-dns-968fc5485-pmcmj\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " pod="openstack/dnsmasq-dns-968fc5485-pmcmj" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.403933 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf2tf\" (UniqueName: \"kubernetes.io/projected/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-kube-api-access-tf2tf\") pod \"dnsmasq-dns-968fc5485-pmcmj\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " pod="openstack/dnsmasq-dns-968fc5485-pmcmj" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.456974 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gmblw"] Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.481979 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dxcs\" (UniqueName: \"kubernetes.io/projected/12a786d1-2169-4b70-a6df-781fa8fad94c-kube-api-access-7dxcs\") pod \"nova-scheduler-0\" (UID: \"12a786d1-2169-4b70-a6df-781fa8fad94c\") " pod="openstack/nova-scheduler-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.482057 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a786d1-2169-4b70-a6df-781fa8fad94c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"12a786d1-2169-4b70-a6df-781fa8fad94c\") " pod="openstack/nova-scheduler-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.482087 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a786d1-2169-4b70-a6df-781fa8fad94c-config-data\") pod \"nova-scheduler-0\" (UID: \"12a786d1-2169-4b70-a6df-781fa8fad94c\") " pod="openstack/nova-scheduler-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.485682 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a786d1-2169-4b70-a6df-781fa8fad94c-config-data\") pod \"nova-scheduler-0\" (UID: 
\"12a786d1-2169-4b70-a6df-781fa8fad94c\") " pod="openstack/nova-scheduler-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.487568 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a786d1-2169-4b70-a6df-781fa8fad94c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"12a786d1-2169-4b70-a6df-781fa8fad94c\") " pod="openstack/nova-scheduler-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.499730 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.522754 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.525831 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dxcs\" (UniqueName: \"kubernetes.io/projected/12a786d1-2169-4b70-a6df-781fa8fad94c-kube-api-access-7dxcs\") pod \"nova-scheduler-0\" (UID: \"12a786d1-2169-4b70-a6df-781fa8fad94c\") " pod="openstack/nova-scheduler-0" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.548404 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-968fc5485-pmcmj" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.559620 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.565302 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 11:40:24 crc kubenswrapper[4958]: W1201 11:40:24.605395 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22070023_b108_4113_91e1_51c7cde2c3d9.slice/crio-d6ca57e38864b4fbf9451e01165d303195c0998e41660e4729aab2435cd93e87 WatchSource:0}: Error finding container d6ca57e38864b4fbf9451e01165d303195c0998e41660e4729aab2435cd93e87: Status 404 returned error can't find the container with id d6ca57e38864b4fbf9451e01165d303195c0998e41660e4729aab2435cd93e87 Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.709193 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22070023-b108-4113-91e1-51c7cde2c3d9","Type":"ContainerStarted","Data":"d6ca57e38864b4fbf9451e01165d303195c0998e41660e4729aab2435cd93e87"} Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.720220 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gmblw" event={"ID":"0b385e57-4fa1-4d10-afef-2cd942607c11","Type":"ContainerStarted","Data":"215f0616431a98c2cb2fc8d89ae596aab07c48861a10b2ea35a33058e8dcc2ab"} Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.720262 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gmblw" event={"ID":"0b385e57-4fa1-4d10-afef-2cd942607c11","Type":"ContainerStarted","Data":"8ca1307c04062d66045c09ddafc34d6829b6b3f3777f099f3bbba151af1bc62d"} Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.751386 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-gmblw" podStartSLOduration=1.751366231 podStartE2EDuration="1.751366231s" podCreationTimestamp="2025-12-01 11:40:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-01 11:40:24.739819493 +0000 UTC m=+6072.248608540" watchObservedRunningTime="2025-12-01 11:40:24.751366231 +0000 UTC m=+6072.260155268" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.945398 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pdwfr"] Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.947435 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pdwfr" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.950257 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.950274 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 01 11:40:24 crc kubenswrapper[4958]: I1201 11:40:24.956157 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pdwfr"] Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.040825 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.119694 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4hnn\" (UniqueName: \"kubernetes.io/projected/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-kube-api-access-f4hnn\") pod \"nova-cell1-conductor-db-sync-pdwfr\" (UID: \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\") " pod="openstack/nova-cell1-conductor-db-sync-pdwfr" Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.120029 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-pdwfr\" (UID: \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\") " pod="openstack/nova-cell1-conductor-db-sync-pdwfr" Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.120052 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-config-data\") pod \"nova-cell1-conductor-db-sync-pdwfr\" (UID: \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\") " pod="openstack/nova-cell1-conductor-db-sync-pdwfr" Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.120142 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-scripts\") pod \"nova-cell1-conductor-db-sync-pdwfr\" (UID: \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\") " pod="openstack/nova-cell1-conductor-db-sync-pdwfr" Dec 01 11:40:25 crc kubenswrapper[4958]: W1201 11:40:25.149687 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5fd96ac_8e63_438e_8f66_8435513697c9.slice/crio-3858d62464f0b1fbbeb2455752165250fe1dd8c46ae34ad79da272e77cca4ada WatchSource:0}: Error finding container 3858d62464f0b1fbbeb2455752165250fe1dd8c46ae34ad79da272e77cca4ada: Status 404 returned error can't find the container with id 3858d62464f0b1fbbeb2455752165250fe1dd8c46ae34ad79da272e77cca4ada Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.152694 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.221739 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-scripts\") pod \"nova-cell1-conductor-db-sync-pdwfr\" (UID: \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\") " pod="openstack/nova-cell1-conductor-db-sync-pdwfr" Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.223133 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4hnn\" (UniqueName: \"kubernetes.io/projected/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-kube-api-access-f4hnn\") pod \"nova-cell1-conductor-db-sync-pdwfr\" (UID: \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\") " pod="openstack/nova-cell1-conductor-db-sync-pdwfr" Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.223216 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-pdwfr\" (UID: \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\") " pod="openstack/nova-cell1-conductor-db-sync-pdwfr" Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.223251 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-config-data\") pod \"nova-cell1-conductor-db-sync-pdwfr\" (UID: \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\") " pod="openstack/nova-cell1-conductor-db-sync-pdwfr" Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.229593 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-scripts\") pod \"nova-cell1-conductor-db-sync-pdwfr\" (UID: \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\") " pod="openstack/nova-cell1-conductor-db-sync-pdwfr" Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.229893 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-config-data\") pod \"nova-cell1-conductor-db-sync-pdwfr\" (UID: \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\") " pod="openstack/nova-cell1-conductor-db-sync-pdwfr" Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.235955 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-pdwfr\" (UID: \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\") " pod="openstack/nova-cell1-conductor-db-sync-pdwfr" Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.245176 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4hnn\" (UniqueName: \"kubernetes.io/projected/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-kube-api-access-f4hnn\") pod \"nova-cell1-conductor-db-sync-pdwfr\" (UID: \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\") " pod="openstack/nova-cell1-conductor-db-sync-pdwfr" Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.269136 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 11:40:25 crc kubenswrapper[4958]: W1201 11:40:25.270813 4958 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12a786d1_2169_4b70_a6df_781fa8fad94c.slice/crio-91677e17c6220831819723342c8c51b64331427499c978325bd7ec0fd804a192 WatchSource:0}: Error finding container 91677e17c6220831819723342c8c51b64331427499c978325bd7ec0fd804a192: Status 404 returned error can't find the container with id 91677e17c6220831819723342c8c51b64331427499c978325bd7ec0fd804a192 Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.276623 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pdwfr" Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.295594 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-968fc5485-pmcmj"] Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.742578 4958 generic.go:334] "Generic (PLEG): container finished" podID="1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb" containerID="1ce18d6c4974ac1fc3ebf61900ee041e32d6b9c9487c5ddc8f3f6cc61d321859" exitCode=0 Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.742671 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-968fc5485-pmcmj" event={"ID":"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb","Type":"ContainerDied","Data":"1ce18d6c4974ac1fc3ebf61900ee041e32d6b9c9487c5ddc8f3f6cc61d321859"} Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.743571 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-968fc5485-pmcmj" event={"ID":"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb","Type":"ContainerStarted","Data":"cc825789a70c692c03f17ba320f7129616f409d5599296a849d52fb6094f5b3c"} Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.748835 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f5fd96ac-8e63-438e-8f66-8435513697c9","Type":"ContainerStarted","Data":"ca832a8e91d28ac78ad28b166b656a88ef9ac2b9b6f8713ff67ab107f0e7be2d"} Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.748907 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f5fd96ac-8e63-438e-8f66-8435513697c9","Type":"ContainerStarted","Data":"3858d62464f0b1fbbeb2455752165250fe1dd8c46ae34ad79da272e77cca4ada"} Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.754985 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dfcf83f3-b3be-49b5-9719-57857e262839","Type":"ContainerStarted","Data":"1e500f0a201702a17ba41eeb89e67c8c2595904fb6e62d951249be09ddfd7e8b"} Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.755023 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dfcf83f3-b3be-49b5-9719-57857e262839","Type":"ContainerStarted","Data":"4ced5af5e99dddf95cfa0aaa7beb3d926800fadfea550d2a4ecee8500048b267"} Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.755032 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dfcf83f3-b3be-49b5-9719-57857e262839","Type":"ContainerStarted","Data":"f527802b71817c780436eeb025faf315c97de48279003b2f143d9f15c119e828"} Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.757147 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12a786d1-2169-4b70-a6df-781fa8fad94c","Type":"ContainerStarted","Data":"f4ff471616460010d5ec1bd3e4c436c28c313f042e73c5171dd9fdfdfe1f81d6"} Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.757173 4958 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12a786d1-2169-4b70-a6df-781fa8fad94c","Type":"ContainerStarted","Data":"91677e17c6220831819723342c8c51b64331427499c978325bd7ec0fd804a192"} Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.760439 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22070023-b108-4113-91e1-51c7cde2c3d9","Type":"ContainerStarted","Data":"51361860643e78b1cd0a3beee0083e3c5698c39bc27341d1077da9e4880c58c8"} Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.760464 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22070023-b108-4113-91e1-51c7cde2c3d9","Type":"ContainerStarted","Data":"7ea04966a94413987547a7d4a34f7bfa8cc95ff41a53016b44880ffe87c70403"} Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.773164 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pdwfr"] Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.803108 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.803090996 podStartE2EDuration="2.803090996s" podCreationTimestamp="2025-12-01 11:40:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:40:25.783873241 +0000 UTC m=+6073.292662278" watchObservedRunningTime="2025-12-01 11:40:25.803090996 +0000 UTC m=+6073.311880033" Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.807002 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.806990697 podStartE2EDuration="1.806990697s" podCreationTimestamp="2025-12-01 11:40:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:40:25.797393085 +0000 UTC m=+6073.306182132" watchObservedRunningTime="2025-12-01 11:40:25.806990697 +0000 UTC m=+6073.315779734" Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.822831 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.8228036359999997 podStartE2EDuration="2.822803636s" podCreationTimestamp="2025-12-01 11:40:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:40:25.817174896 +0000 UTC m=+6073.325963963" watchObservedRunningTime="2025-12-01 11:40:25.822803636 +0000 UTC m=+6073.331592663" Dec 01 11:40:25 crc kubenswrapper[4958]: I1201 11:40:25.855442 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.855420982 podStartE2EDuration="2.855420982s" podCreationTimestamp="2025-12-01 11:40:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:40:25.832238844 +0000 UTC m=+6073.341027881" watchObservedRunningTime="2025-12-01 11:40:25.855420982 +0000 UTC m=+6073.364210019" Dec 01 11:40:26 crc kubenswrapper[4958]: I1201 11:40:26.774871 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pdwfr" 
event={"ID":"69f0027d-f2a8-48b3-8e17-cb404a33a2d6","Type":"ContainerStarted","Data":"f3f7f03420186ef17424c6d5486d41ed7a7f0baa119081e8674280b9b7f9c82f"} Dec 01 11:40:26 crc kubenswrapper[4958]: I1201 11:40:26.775123 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pdwfr" event={"ID":"69f0027d-f2a8-48b3-8e17-cb404a33a2d6","Type":"ContainerStarted","Data":"5c75bf213c77365732a06736ff4522b843431214cd129ad88fe7197802231553"} Dec 01 11:40:26 crc kubenswrapper[4958]: I1201 11:40:26.780727 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-968fc5485-pmcmj" event={"ID":"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb","Type":"ContainerStarted","Data":"3857b8805d12d606d1985dc4bf5d26ed238125b19ad706c5c17738db92615797"} Dec 01 11:40:26 crc kubenswrapper[4958]: I1201 11:40:26.780894 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-968fc5485-pmcmj" Dec 01 11:40:26 crc kubenswrapper[4958]: I1201 11:40:26.793354 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-pdwfr" podStartSLOduration=2.793335067 podStartE2EDuration="2.793335067s" podCreationTimestamp="2025-12-01 11:40:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:40:26.789149538 +0000 UTC m=+6074.297938575" watchObservedRunningTime="2025-12-01 11:40:26.793335067 +0000 UTC m=+6074.302124104" Dec 01 11:40:26 crc kubenswrapper[4958]: I1201 11:40:26.818411 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-968fc5485-pmcmj" podStartSLOduration=2.818392589 podStartE2EDuration="2.818392589s" podCreationTimestamp="2025-12-01 11:40:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:40:26.807211731 +0000 UTC m=+6074.316000768" watchObservedRunningTime="2025-12-01 11:40:26.818392589 +0000 UTC m=+6074.327181626" Dec 01 11:40:29 crc kubenswrapper[4958]: I1201 11:40:29.501613 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 11:40:29 crc kubenswrapper[4958]: I1201 11:40:29.502035 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 11:40:29 crc kubenswrapper[4958]: I1201 11:40:29.523819 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:40:29 crc kubenswrapper[4958]: I1201 11:40:29.566227 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 11:40:29 crc kubenswrapper[4958]: I1201 11:40:29.821597 4958 generic.go:334] "Generic (PLEG): container finished" podID="69f0027d-f2a8-48b3-8e17-cb404a33a2d6" containerID="f3f7f03420186ef17424c6d5486d41ed7a7f0baa119081e8674280b9b7f9c82f" exitCode=0 Dec 01 11:40:29 crc kubenswrapper[4958]: I1201 11:40:29.822181 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pdwfr" event={"ID":"69f0027d-f2a8-48b3-8e17-cb404a33a2d6","Type":"ContainerDied","Data":"f3f7f03420186ef17424c6d5486d41ed7a7f0baa119081e8674280b9b7f9c82f"} Dec 01 11:40:30 crc kubenswrapper[4958]: I1201 11:40:30.837183 4958 generic.go:334] "Generic (PLEG): container finished" podID="0b385e57-4fa1-4d10-afef-2cd942607c11" 
containerID="215f0616431a98c2cb2fc8d89ae596aab07c48861a10b2ea35a33058e8dcc2ab" exitCode=0 Dec 01 11:40:30 crc kubenswrapper[4958]: I1201 11:40:30.837302 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gmblw" event={"ID":"0b385e57-4fa1-4d10-afef-2cd942607c11","Type":"ContainerDied","Data":"215f0616431a98c2cb2fc8d89ae596aab07c48861a10b2ea35a33058e8dcc2ab"} Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.325448 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pdwfr" Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.423102 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4hnn\" (UniqueName: \"kubernetes.io/projected/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-kube-api-access-f4hnn\") pod \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\" (UID: \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\") " Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.423514 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-config-data\") pod \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\" (UID: \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\") " Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.423617 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-combined-ca-bundle\") pod \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\" (UID: \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\") " Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.423828 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-scripts\") pod \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\" (UID: \"69f0027d-f2a8-48b3-8e17-cb404a33a2d6\") " Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.431095 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-scripts" (OuterVolumeSpecName: "scripts") pod "69f0027d-f2a8-48b3-8e17-cb404a33a2d6" (UID: "69f0027d-f2a8-48b3-8e17-cb404a33a2d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.431922 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-kube-api-access-f4hnn" (OuterVolumeSpecName: "kube-api-access-f4hnn") pod "69f0027d-f2a8-48b3-8e17-cb404a33a2d6" (UID: "69f0027d-f2a8-48b3-8e17-cb404a33a2d6"). InnerVolumeSpecName "kube-api-access-f4hnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.450957 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69f0027d-f2a8-48b3-8e17-cb404a33a2d6" (UID: "69f0027d-f2a8-48b3-8e17-cb404a33a2d6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.481619 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-config-data" (OuterVolumeSpecName: "config-data") pod "69f0027d-f2a8-48b3-8e17-cb404a33a2d6" (UID: "69f0027d-f2a8-48b3-8e17-cb404a33a2d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.526045 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.526087 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.526100 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4hnn\" (UniqueName: \"kubernetes.io/projected/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-kube-api-access-f4hnn\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.526113 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f0027d-f2a8-48b3-8e17-cb404a33a2d6-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.887054 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pdwfr" Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.888693 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pdwfr" event={"ID":"69f0027d-f2a8-48b3-8e17-cb404a33a2d6","Type":"ContainerDied","Data":"5c75bf213c77365732a06736ff4522b843431214cd129ad88fe7197802231553"} Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.888771 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c75bf213c77365732a06736ff4522b843431214cd129ad88fe7197802231553" Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.953329 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 11:40:31 crc kubenswrapper[4958]: E1201 11:40:31.953978 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f0027d-f2a8-48b3-8e17-cb404a33a2d6" containerName="nova-cell1-conductor-db-sync" Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.954005 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f0027d-f2a8-48b3-8e17-cb404a33a2d6" containerName="nova-cell1-conductor-db-sync" Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.954247 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="69f0027d-f2a8-48b3-8e17-cb404a33a2d6" containerName="nova-cell1-conductor-db-sync" Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.955265 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.960859 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 11:40:31 crc kubenswrapper[4958]: I1201 11:40:31.964586 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.037533 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ccd90-5b0b-43a2-a599-992fa27dd517-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"515ccd90-5b0b-43a2-a599-992fa27dd517\") " pod="openstack/nova-cell1-conductor-0" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.037948 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mw9z\" (UniqueName: \"kubernetes.io/projected/515ccd90-5b0b-43a2-a599-992fa27dd517-kube-api-access-9mw9z\") pod \"nova-cell1-conductor-0\" (UID: \"515ccd90-5b0b-43a2-a599-992fa27dd517\") " pod="openstack/nova-cell1-conductor-0" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.038302 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ccd90-5b0b-43a2-a599-992fa27dd517-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"515ccd90-5b0b-43a2-a599-992fa27dd517\") " pod="openstack/nova-cell1-conductor-0" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.140433 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mw9z\" (UniqueName: \"kubernetes.io/projected/515ccd90-5b0b-43a2-a599-992fa27dd517-kube-api-access-9mw9z\") pod \"nova-cell1-conductor-0\" (UID: \"515ccd90-5b0b-43a2-a599-992fa27dd517\") " pod="openstack/nova-cell1-conductor-0" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.140629 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ccd90-5b0b-43a2-a599-992fa27dd517-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"515ccd90-5b0b-43a2-a599-992fa27dd517\") " pod="openstack/nova-cell1-conductor-0" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.140709 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ccd90-5b0b-43a2-a599-992fa27dd517-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"515ccd90-5b0b-43a2-a599-992fa27dd517\") " pod="openstack/nova-cell1-conductor-0" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.147118 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ccd90-5b0b-43a2-a599-992fa27dd517-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"515ccd90-5b0b-43a2-a599-992fa27dd517\") " pod="openstack/nova-cell1-conductor-0" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.147872 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ccd90-5b0b-43a2-a599-992fa27dd517-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"515ccd90-5b0b-43a2-a599-992fa27dd517\") " pod="openstack/nova-cell1-conductor-0" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.158564 4958 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mw9z\" (UniqueName: \"kubernetes.io/projected/515ccd90-5b0b-43a2-a599-992fa27dd517-kube-api-access-9mw9z\") pod \"nova-cell1-conductor-0\" (UID: \"515ccd90-5b0b-43a2-a599-992fa27dd517\") " pod="openstack/nova-cell1-conductor-0" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.255341 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gmblw" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.288979 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.344728 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b385e57-4fa1-4d10-afef-2cd942607c11-scripts\") pod \"0b385e57-4fa1-4d10-afef-2cd942607c11\" (UID: \"0b385e57-4fa1-4d10-afef-2cd942607c11\") " Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.345137 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b385e57-4fa1-4d10-afef-2cd942607c11-combined-ca-bundle\") pod \"0b385e57-4fa1-4d10-afef-2cd942607c11\" (UID: \"0b385e57-4fa1-4d10-afef-2cd942607c11\") " Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.345276 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b385e57-4fa1-4d10-afef-2cd942607c11-config-data\") pod \"0b385e57-4fa1-4d10-afef-2cd942607c11\" (UID: \"0b385e57-4fa1-4d10-afef-2cd942607c11\") " Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.345447 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9txf\" (UniqueName: \"kubernetes.io/projected/0b385e57-4fa1-4d10-afef-2cd942607c11-kube-api-access-p9txf\") pod \"0b385e57-4fa1-4d10-afef-2cd942607c11\" (UID: \"0b385e57-4fa1-4d10-afef-2cd942607c11\") " Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.352265 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b385e57-4fa1-4d10-afef-2cd942607c11-kube-api-access-p9txf" (OuterVolumeSpecName: "kube-api-access-p9txf") pod "0b385e57-4fa1-4d10-afef-2cd942607c11" (UID: "0b385e57-4fa1-4d10-afef-2cd942607c11"). InnerVolumeSpecName "kube-api-access-p9txf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.352293 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b385e57-4fa1-4d10-afef-2cd942607c11-scripts" (OuterVolumeSpecName: "scripts") pod "0b385e57-4fa1-4d10-afef-2cd942607c11" (UID: "0b385e57-4fa1-4d10-afef-2cd942607c11"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.384593 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b385e57-4fa1-4d10-afef-2cd942607c11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b385e57-4fa1-4d10-afef-2cd942607c11" (UID: "0b385e57-4fa1-4d10-afef-2cd942607c11"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.403703 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b385e57-4fa1-4d10-afef-2cd942607c11-config-data" (OuterVolumeSpecName: "config-data") pod "0b385e57-4fa1-4d10-afef-2cd942607c11" (UID: "0b385e57-4fa1-4d10-afef-2cd942607c11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.447443 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b385e57-4fa1-4d10-afef-2cd942607c11-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.447496 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b385e57-4fa1-4d10-afef-2cd942607c11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.447511 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b385e57-4fa1-4d10-afef-2cd942607c11-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.447525 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9txf\" (UniqueName: \"kubernetes.io/projected/0b385e57-4fa1-4d10-afef-2cd942607c11-kube-api-access-p9txf\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.774963 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.914343 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gmblw" event={"ID":"0b385e57-4fa1-4d10-afef-2cd942607c11","Type":"ContainerDied","Data":"8ca1307c04062d66045c09ddafc34d6829b6b3f3777f099f3bbba151af1bc62d"} Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.914466 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ca1307c04062d66045c09ddafc34d6829b6b3f3777f099f3bbba151af1bc62d" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.914396 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gmblw" Dec 01 11:40:32 crc kubenswrapper[4958]: I1201 11:40:32.917154 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"515ccd90-5b0b-43a2-a599-992fa27dd517","Type":"ContainerStarted","Data":"5f48329c3f61fd98aabcafabe1935cec448256ccd9263ec7bb48a724e33ecb17"} Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.183314 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.183591 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="22070023-b108-4113-91e1-51c7cde2c3d9" containerName="nova-api-log" containerID="cri-o://7ea04966a94413987547a7d4a34f7bfa8cc95ff41a53016b44880ffe87c70403" gracePeriod=30 Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.184156 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="22070023-b108-4113-91e1-51c7cde2c3d9" containerName="nova-api-api" containerID="cri-o://51361860643e78b1cd0a3beee0083e3c5698c39bc27341d1077da9e4880c58c8" gracePeriod=30 Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.214143 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.214543 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="12a786d1-2169-4b70-a6df-781fa8fad94c" containerName="nova-scheduler-scheduler" containerID="cri-o://f4ff471616460010d5ec1bd3e4c436c28c313f042e73c5171dd9fdfdfe1f81d6" gracePeriod=30 Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.248053 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.248335 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dfcf83f3-b3be-49b5-9719-57857e262839" containerName="nova-metadata-log" containerID="cri-o://4ced5af5e99dddf95cfa0aaa7beb3d926800fadfea550d2a4ecee8500048b267" gracePeriod=30 Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.248583 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dfcf83f3-b3be-49b5-9719-57857e262839" containerName="nova-metadata-metadata" containerID="cri-o://1e500f0a201702a17ba41eeb89e67c8c2595904fb6e62d951249be09ddfd7e8b" gracePeriod=30 Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.838635 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.853118 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.882124 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22070023-b108-4113-91e1-51c7cde2c3d9-combined-ca-bundle\") pod \"22070023-b108-4113-91e1-51c7cde2c3d9\" (UID: \"22070023-b108-4113-91e1-51c7cde2c3d9\") " Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.882242 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfcf83f3-b3be-49b5-9719-57857e262839-logs\") pod \"dfcf83f3-b3be-49b5-9719-57857e262839\" (UID: \"dfcf83f3-b3be-49b5-9719-57857e262839\") " Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.882313 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn5x8\" (UniqueName: \"kubernetes.io/projected/dfcf83f3-b3be-49b5-9719-57857e262839-kube-api-access-vn5x8\") pod \"dfcf83f3-b3be-49b5-9719-57857e262839\" (UID: \"dfcf83f3-b3be-49b5-9719-57857e262839\") " Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.882355 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22070023-b108-4113-91e1-51c7cde2c3d9-config-data\") pod \"22070023-b108-4113-91e1-51c7cde2c3d9\" (UID: \"22070023-b108-4113-91e1-51c7cde2c3d9\") " Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.882388 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfcf83f3-b3be-49b5-9719-57857e262839-combined-ca-bundle\") pod \"dfcf83f3-b3be-49b5-9719-57857e262839\" (UID: \"dfcf83f3-b3be-49b5-9719-57857e262839\") " Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.882456 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22070023-b108-4113-91e1-51c7cde2c3d9-logs\") pod \"22070023-b108-4113-91e1-51c7cde2c3d9\" (UID: \"22070023-b108-4113-91e1-51c7cde2c3d9\") " Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.882497 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfcf83f3-b3be-49b5-9719-57857e262839-config-data\") pod \"dfcf83f3-b3be-49b5-9719-57857e262839\" (UID: \"dfcf83f3-b3be-49b5-9719-57857e262839\") " Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.882535 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szb74\" (UniqueName: \"kubernetes.io/projected/22070023-b108-4113-91e1-51c7cde2c3d9-kube-api-access-szb74\") pod \"22070023-b108-4113-91e1-51c7cde2c3d9\" (UID: \"22070023-b108-4113-91e1-51c7cde2c3d9\") " Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.888307 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22070023-b108-4113-91e1-51c7cde2c3d9-kube-api-access-szb74" (OuterVolumeSpecName: "kube-api-access-szb74") pod "22070023-b108-4113-91e1-51c7cde2c3d9" (UID: "22070023-b108-4113-91e1-51c7cde2c3d9"). InnerVolumeSpecName "kube-api-access-szb74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.895437 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22070023-b108-4113-91e1-51c7cde2c3d9-logs" (OuterVolumeSpecName: "logs") pod "22070023-b108-4113-91e1-51c7cde2c3d9" (UID: "22070023-b108-4113-91e1-51c7cde2c3d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.895562 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfcf83f3-b3be-49b5-9719-57857e262839-logs" (OuterVolumeSpecName: "logs") pod "dfcf83f3-b3be-49b5-9719-57857e262839" (UID: "dfcf83f3-b3be-49b5-9719-57857e262839"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.909103 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfcf83f3-b3be-49b5-9719-57857e262839-kube-api-access-vn5x8" (OuterVolumeSpecName: "kube-api-access-vn5x8") pod "dfcf83f3-b3be-49b5-9719-57857e262839" (UID: "dfcf83f3-b3be-49b5-9719-57857e262839"). InnerVolumeSpecName "kube-api-access-vn5x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.924664 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfcf83f3-b3be-49b5-9719-57857e262839-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfcf83f3-b3be-49b5-9719-57857e262839" (UID: "dfcf83f3-b3be-49b5-9719-57857e262839"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.932620 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfcf83f3-b3be-49b5-9719-57857e262839-config-data" (OuterVolumeSpecName: "config-data") pod "dfcf83f3-b3be-49b5-9719-57857e262839" (UID: "dfcf83f3-b3be-49b5-9719-57857e262839"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.933232 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22070023-b108-4113-91e1-51c7cde2c3d9-config-data" (OuterVolumeSpecName: "config-data") pod "22070023-b108-4113-91e1-51c7cde2c3d9" (UID: "22070023-b108-4113-91e1-51c7cde2c3d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.934477 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22070023-b108-4113-91e1-51c7cde2c3d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22070023-b108-4113-91e1-51c7cde2c3d9" (UID: "22070023-b108-4113-91e1-51c7cde2c3d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.934493 4958 generic.go:334] "Generic (PLEG): container finished" podID="22070023-b108-4113-91e1-51c7cde2c3d9" containerID="51361860643e78b1cd0a3beee0083e3c5698c39bc27341d1077da9e4880c58c8" exitCode=0 Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.934515 4958 generic.go:334] "Generic (PLEG): container finished" podID="22070023-b108-4113-91e1-51c7cde2c3d9" containerID="7ea04966a94413987547a7d4a34f7bfa8cc95ff41a53016b44880ffe87c70403" exitCode=143 Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.934570 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.934580 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22070023-b108-4113-91e1-51c7cde2c3d9","Type":"ContainerDied","Data":"51361860643e78b1cd0a3beee0083e3c5698c39bc27341d1077da9e4880c58c8"} Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.934612 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22070023-b108-4113-91e1-51c7cde2c3d9","Type":"ContainerDied","Data":"7ea04966a94413987547a7d4a34f7bfa8cc95ff41a53016b44880ffe87c70403"} Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.934621 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22070023-b108-4113-91e1-51c7cde2c3d9","Type":"ContainerDied","Data":"d6ca57e38864b4fbf9451e01165d303195c0998e41660e4729aab2435cd93e87"} Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.934637 4958 scope.go:117] "RemoveContainer" containerID="51361860643e78b1cd0a3beee0083e3c5698c39bc27341d1077da9e4880c58c8" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.939497 4958 generic.go:334] "Generic (PLEG): container finished" podID="dfcf83f3-b3be-49b5-9719-57857e262839" containerID="1e500f0a201702a17ba41eeb89e67c8c2595904fb6e62d951249be09ddfd7e8b" exitCode=0 Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.939521 4958 generic.go:334] "Generic (PLEG): container finished" podID="dfcf83f3-b3be-49b5-9719-57857e262839" containerID="4ced5af5e99dddf95cfa0aaa7beb3d926800fadfea550d2a4ecee8500048b267" exitCode=143 Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.939554 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dfcf83f3-b3be-49b5-9719-57857e262839","Type":"ContainerDied","Data":"1e500f0a201702a17ba41eeb89e67c8c2595904fb6e62d951249be09ddfd7e8b"} Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.939574 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dfcf83f3-b3be-49b5-9719-57857e262839","Type":"ContainerDied","Data":"4ced5af5e99dddf95cfa0aaa7beb3d926800fadfea550d2a4ecee8500048b267"} Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.939586 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dfcf83f3-b3be-49b5-9719-57857e262839","Type":"ContainerDied","Data":"f527802b71817c780436eeb025faf315c97de48279003b2f143d9f15c119e828"} Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.939624 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.944884 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"515ccd90-5b0b-43a2-a599-992fa27dd517","Type":"ContainerStarted","Data":"216ff4695a779d2e511563f338216e8e2c7a5f140ae275dc3a4b40f2d0c556a5"} Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.946546 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.966938 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.966919448 podStartE2EDuration="2.966919448s" podCreationTimestamp="2025-12-01 11:40:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:40:33.963588083 +0000 UTC m=+6081.472377120" watchObservedRunningTime="2025-12-01 11:40:33.966919448 +0000 UTC m=+6081.475708475" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.969254 4958 scope.go:117] "RemoveContainer" containerID="7ea04966a94413987547a7d4a34f7bfa8cc95ff41a53016b44880ffe87c70403" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.984119 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22070023-b108-4113-91e1-51c7cde2c3d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.984145 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfcf83f3-b3be-49b5-9719-57857e262839-logs\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.984155 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn5x8\" (UniqueName: \"kubernetes.io/projected/dfcf83f3-b3be-49b5-9719-57857e262839-kube-api-access-vn5x8\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.984167 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22070023-b108-4113-91e1-51c7cde2c3d9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.984178 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfcf83f3-b3be-49b5-9719-57857e262839-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.984187 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22070023-b108-4113-91e1-51c7cde2c3d9-logs\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.984195 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfcf83f3-b3be-49b5-9719-57857e262839-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:33 crc kubenswrapper[4958]: I1201 11:40:33.984204 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szb74\" (UniqueName: \"kubernetes.io/projected/22070023-b108-4113-91e1-51c7cde2c3d9-kube-api-access-szb74\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.012210 4958 scope.go:117] "RemoveContainer" 
containerID="51361860643e78b1cd0a3beee0083e3c5698c39bc27341d1077da9e4880c58c8" Dec 01 11:40:34 crc kubenswrapper[4958]: E1201 11:40:34.012920 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51361860643e78b1cd0a3beee0083e3c5698c39bc27341d1077da9e4880c58c8\": container with ID starting with 51361860643e78b1cd0a3beee0083e3c5698c39bc27341d1077da9e4880c58c8 not found: ID does not exist" containerID="51361860643e78b1cd0a3beee0083e3c5698c39bc27341d1077da9e4880c58c8" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.012952 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51361860643e78b1cd0a3beee0083e3c5698c39bc27341d1077da9e4880c58c8"} err="failed to get container status \"51361860643e78b1cd0a3beee0083e3c5698c39bc27341d1077da9e4880c58c8\": rpc error: code = NotFound desc = could not find container \"51361860643e78b1cd0a3beee0083e3c5698c39bc27341d1077da9e4880c58c8\": container with ID starting with 51361860643e78b1cd0a3beee0083e3c5698c39bc27341d1077da9e4880c58c8 not found: ID does not exist" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.012975 4958 scope.go:117] "RemoveContainer" containerID="7ea04966a94413987547a7d4a34f7bfa8cc95ff41a53016b44880ffe87c70403" Dec 01 11:40:34 crc kubenswrapper[4958]: E1201 11:40:34.013267 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ea04966a94413987547a7d4a34f7bfa8cc95ff41a53016b44880ffe87c70403\": container with ID starting with 7ea04966a94413987547a7d4a34f7bfa8cc95ff41a53016b44880ffe87c70403 not found: ID does not exist" containerID="7ea04966a94413987547a7d4a34f7bfa8cc95ff41a53016b44880ffe87c70403" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.013288 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ea04966a94413987547a7d4a34f7bfa8cc95ff41a53016b44880ffe87c70403"} err="failed to get container status \"7ea04966a94413987547a7d4a34f7bfa8cc95ff41a53016b44880ffe87c70403\": rpc error: code = NotFound desc = could not find container \"7ea04966a94413987547a7d4a34f7bfa8cc95ff41a53016b44880ffe87c70403\": container with ID starting with 7ea04966a94413987547a7d4a34f7bfa8cc95ff41a53016b44880ffe87c70403 not found: ID does not exist" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.013301 4958 scope.go:117] "RemoveContainer" containerID="51361860643e78b1cd0a3beee0083e3c5698c39bc27341d1077da9e4880c58c8" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.013590 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.015616 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51361860643e78b1cd0a3beee0083e3c5698c39bc27341d1077da9e4880c58c8"} err="failed to get container status \"51361860643e78b1cd0a3beee0083e3c5698c39bc27341d1077da9e4880c58c8\": rpc error: code = NotFound desc = could not find container \"51361860643e78b1cd0a3beee0083e3c5698c39bc27341d1077da9e4880c58c8\": container with ID starting with 51361860643e78b1cd0a3beee0083e3c5698c39bc27341d1077da9e4880c58c8 not found: ID does not exist" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.015633 4958 scope.go:117] "RemoveContainer" containerID="7ea04966a94413987547a7d4a34f7bfa8cc95ff41a53016b44880ffe87c70403" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.015865 4958 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ea04966a94413987547a7d4a34f7bfa8cc95ff41a53016b44880ffe87c70403"} err="failed to get container status \"7ea04966a94413987547a7d4a34f7bfa8cc95ff41a53016b44880ffe87c70403\": rpc error: code = NotFound desc = could not find container \"7ea04966a94413987547a7d4a34f7bfa8cc95ff41a53016b44880ffe87c70403\": container with ID starting with 7ea04966a94413987547a7d4a34f7bfa8cc95ff41a53016b44880ffe87c70403 not found: ID does not exist" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.015880 4958 scope.go:117] "RemoveContainer" containerID="1e500f0a201702a17ba41eeb89e67c8c2595904fb6e62d951249be09ddfd7e8b" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.036100 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.049066 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.049179 4958 scope.go:117] "RemoveContainer" containerID="4ced5af5e99dddf95cfa0aaa7beb3d926800fadfea550d2a4ecee8500048b267" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.053923 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.064511 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 11:40:34 crc kubenswrapper[4958]: E1201 11:40:34.064919 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b385e57-4fa1-4d10-afef-2cd942607c11" containerName="nova-manage" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.064932 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b385e57-4fa1-4d10-afef-2cd942607c11" containerName="nova-manage" Dec 01 11:40:34 crc kubenswrapper[4958]: E1201 11:40:34.064945 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22070023-b108-4113-91e1-51c7cde2c3d9" containerName="nova-api-api" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.064951 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="22070023-b108-4113-91e1-51c7cde2c3d9" containerName="nova-api-api" Dec 01 11:40:34 crc kubenswrapper[4958]: E1201 11:40:34.064977 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfcf83f3-b3be-49b5-9719-57857e262839" containerName="nova-metadata-log" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.064983 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfcf83f3-b3be-49b5-9719-57857e262839" containerName="nova-metadata-log" Dec 01 11:40:34 crc kubenswrapper[4958]: E1201 11:40:34.064993 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22070023-b108-4113-91e1-51c7cde2c3d9" containerName="nova-api-log" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.065000 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="22070023-b108-4113-91e1-51c7cde2c3d9" containerName="nova-api-log" Dec 01 11:40:34 crc kubenswrapper[4958]: E1201 11:40:34.065020 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfcf83f3-b3be-49b5-9719-57857e262839" containerName="nova-metadata-metadata" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.065027 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfcf83f3-b3be-49b5-9719-57857e262839" containerName="nova-metadata-metadata" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.065196 4958 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="22070023-b108-4113-91e1-51c7cde2c3d9" containerName="nova-api-log" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.065213 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="22070023-b108-4113-91e1-51c7cde2c3d9" containerName="nova-api-api" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.065227 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfcf83f3-b3be-49b5-9719-57857e262839" containerName="nova-metadata-metadata" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.065242 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfcf83f3-b3be-49b5-9719-57857e262839" containerName="nova-metadata-log" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.065253 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b385e57-4fa1-4d10-afef-2cd942607c11" containerName="nova-manage" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.066344 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.071065 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.094917 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.097988 4958 scope.go:117] "RemoveContainer" containerID="1e500f0a201702a17ba41eeb89e67c8c2595904fb6e62d951249be09ddfd7e8b" Dec 01 11:40:34 crc kubenswrapper[4958]: E1201 11:40:34.105004 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e500f0a201702a17ba41eeb89e67c8c2595904fb6e62d951249be09ddfd7e8b\": container with ID starting with 1e500f0a201702a17ba41eeb89e67c8c2595904fb6e62d951249be09ddfd7e8b not found: ID does not exist" containerID="1e500f0a201702a17ba41eeb89e67c8c2595904fb6e62d951249be09ddfd7e8b" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.105083 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e500f0a201702a17ba41eeb89e67c8c2595904fb6e62d951249be09ddfd7e8b"} err="failed to get container status \"1e500f0a201702a17ba41eeb89e67c8c2595904fb6e62d951249be09ddfd7e8b\": rpc error: code = NotFound desc = could not find container \"1e500f0a201702a17ba41eeb89e67c8c2595904fb6e62d951249be09ddfd7e8b\": container with ID starting with 1e500f0a201702a17ba41eeb89e67c8c2595904fb6e62d951249be09ddfd7e8b not found: ID does not exist" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.105119 4958 scope.go:117] "RemoveContainer" containerID="4ced5af5e99dddf95cfa0aaa7beb3d926800fadfea550d2a4ecee8500048b267" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.106438 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 11:40:34 crc kubenswrapper[4958]: E1201 11:40:34.110299 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ced5af5e99dddf95cfa0aaa7beb3d926800fadfea550d2a4ecee8500048b267\": container with ID starting with 4ced5af5e99dddf95cfa0aaa7beb3d926800fadfea550d2a4ecee8500048b267 not found: ID does not exist" containerID="4ced5af5e99dddf95cfa0aaa7beb3d926800fadfea550d2a4ecee8500048b267" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.110367 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ced5af5e99dddf95cfa0aaa7beb3d926800fadfea550d2a4ecee8500048b267"} err="failed to get container status \"4ced5af5e99dddf95cfa0aaa7beb3d926800fadfea550d2a4ecee8500048b267\": rpc error: code = NotFound desc = could not find container \"4ced5af5e99dddf95cfa0aaa7beb3d926800fadfea550d2a4ecee8500048b267\": container with ID starting with 4ced5af5e99dddf95cfa0aaa7beb3d926800fadfea550d2a4ecee8500048b267 not found: ID does not exist" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.110439 4958 scope.go:117] "RemoveContainer" containerID="1e500f0a201702a17ba41eeb89e67c8c2595904fb6e62d951249be09ddfd7e8b" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.112829 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e500f0a201702a17ba41eeb89e67c8c2595904fb6e62d951249be09ddfd7e8b"} err="failed to get container status \"1e500f0a201702a17ba41eeb89e67c8c2595904fb6e62d951249be09ddfd7e8b\": rpc error: code = NotFound desc = could not find container \"1e500f0a201702a17ba41eeb89e67c8c2595904fb6e62d951249be09ddfd7e8b\": container with ID starting with 1e500f0a201702a17ba41eeb89e67c8c2595904fb6e62d951249be09ddfd7e8b not found: ID does not exist" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.112873 4958 scope.go:117] "RemoveContainer" containerID="4ced5af5e99dddf95cfa0aaa7beb3d926800fadfea550d2a4ecee8500048b267" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.113449 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ced5af5e99dddf95cfa0aaa7beb3d926800fadfea550d2a4ecee8500048b267"} err="failed to get container status \"4ced5af5e99dddf95cfa0aaa7beb3d926800fadfea550d2a4ecee8500048b267\": rpc error: code = NotFound desc = could not find container \"4ced5af5e99dddf95cfa0aaa7beb3d926800fadfea550d2a4ecee8500048b267\": container with ID starting with 4ced5af5e99dddf95cfa0aaa7beb3d926800fadfea550d2a4ecee8500048b267 not found: ID does not exist" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.113884 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.114378 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.135058 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.208101 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699fc36c-be0e-490d-ae04-867adb807761-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"699fc36c-be0e-490d-ae04-867adb807761\") " pod="openstack/nova-api-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 
11:40:34.208270 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/699fc36c-be0e-490d-ae04-867adb807761-logs\") pod \"nova-api-0\" (UID: \"699fc36c-be0e-490d-ae04-867adb807761\") " pod="openstack/nova-api-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.208309 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwlg2\" (UniqueName: \"kubernetes.io/projected/699fc36c-be0e-490d-ae04-867adb807761-kube-api-access-zwlg2\") pod \"nova-api-0\" (UID: \"699fc36c-be0e-490d-ae04-867adb807761\") " pod="openstack/nova-api-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.208337 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-logs\") pod \"nova-metadata-0\" (UID: \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\") " pod="openstack/nova-metadata-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.208375 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-config-data\") pod \"nova-metadata-0\" (UID: \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\") " pod="openstack/nova-metadata-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.208606 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699fc36c-be0e-490d-ae04-867adb807761-config-data\") pod \"nova-api-0\" (UID: \"699fc36c-be0e-490d-ae04-867adb807761\") " pod="openstack/nova-api-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.209176 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\") " pod="openstack/nova-metadata-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.209271 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4fck\" (UniqueName: \"kubernetes.io/projected/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-kube-api-access-p4fck\") pod \"nova-metadata-0\" (UID: \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\") " pod="openstack/nova-metadata-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.311743 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\") " pod="openstack/nova-metadata-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.311835 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4fck\" (UniqueName: \"kubernetes.io/projected/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-kube-api-access-p4fck\") pod \"nova-metadata-0\" (UID: \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\") " pod="openstack/nova-metadata-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.313203 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/699fc36c-be0e-490d-ae04-867adb807761-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"699fc36c-be0e-490d-ae04-867adb807761\") " pod="openstack/nova-api-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.313291 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/699fc36c-be0e-490d-ae04-867adb807761-logs\") pod \"nova-api-0\" (UID: \"699fc36c-be0e-490d-ae04-867adb807761\") " pod="openstack/nova-api-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.313334 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwlg2\" (UniqueName: \"kubernetes.io/projected/699fc36c-be0e-490d-ae04-867adb807761-kube-api-access-zwlg2\") pod \"nova-api-0\" (UID: \"699fc36c-be0e-490d-ae04-867adb807761\") " pod="openstack/nova-api-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.313365 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-logs\") pod \"nova-metadata-0\" (UID: \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\") " pod="openstack/nova-metadata-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.313403 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-config-data\") pod \"nova-metadata-0\" (UID: \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\") " pod="openstack/nova-metadata-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.313457 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699fc36c-be0e-490d-ae04-867adb807761-config-data\") pod \"nova-api-0\" (UID: \"699fc36c-be0e-490d-ae04-867adb807761\") " pod="openstack/nova-api-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.314172 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-logs\") pod \"nova-metadata-0\" (UID: \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\") " pod="openstack/nova-metadata-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.314739 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/699fc36c-be0e-490d-ae04-867adb807761-logs\") pod \"nova-api-0\" (UID: \"699fc36c-be0e-490d-ae04-867adb807761\") " pod="openstack/nova-api-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.318556 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-config-data\") pod \"nova-metadata-0\" (UID: \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\") " pod="openstack/nova-metadata-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.320199 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699fc36c-be0e-490d-ae04-867adb807761-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"699fc36c-be0e-490d-ae04-867adb807761\") " pod="openstack/nova-api-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.323776 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699fc36c-be0e-490d-ae04-867adb807761-config-data\") pod \"nova-api-0\" (UID: 
\"699fc36c-be0e-490d-ae04-867adb807761\") " pod="openstack/nova-api-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.324049 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\") " pod="openstack/nova-metadata-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.335564 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4fck\" (UniqueName: \"kubernetes.io/projected/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-kube-api-access-p4fck\") pod \"nova-metadata-0\" (UID: \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\") " pod="openstack/nova-metadata-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.344753 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwlg2\" (UniqueName: \"kubernetes.io/projected/699fc36c-be0e-490d-ae04-867adb807761-kube-api-access-zwlg2\") pod \"nova-api-0\" (UID: \"699fc36c-be0e-490d-ae04-867adb807761\") " pod="openstack/nova-api-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.391959 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.432349 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.526501 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.550115 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-968fc5485-pmcmj" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.554762 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.680113 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f96cb89d9-pz99v"] Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.680738 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" podUID="5ce26e97-9a7c-43b5-8355-6787ece6d948" containerName="dnsmasq-dns" containerID="cri-o://b5dbc0379892a37fd9c04b78af98b7de12b1aa42e5bfcd7452e07c49b66b0033" gracePeriod=10 Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.962656 4958 generic.go:334] "Generic (PLEG): container finished" podID="5ce26e97-9a7c-43b5-8355-6787ece6d948" containerID="b5dbc0379892a37fd9c04b78af98b7de12b1aa42e5bfcd7452e07c49b66b0033" exitCode=0 Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.963126 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" event={"ID":"5ce26e97-9a7c-43b5-8355-6787ece6d948","Type":"ContainerDied","Data":"b5dbc0379892a37fd9c04b78af98b7de12b1aa42e5bfcd7452e07c49b66b0033"} Dec 01 11:40:34 crc kubenswrapper[4958]: I1201 11:40:34.983678 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.101998 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.160487 4958 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-api-0"] Dec 01 11:40:35 crc kubenswrapper[4958]: W1201 11:40:35.169096 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod699fc36c_be0e_490d_ae04_867adb807761.slice/crio-5d3a4c24bcf3afa732a4ce425256ddca4f9e56d05d6bad6dac68da3761c975ca WatchSource:0}: Error finding container 5d3a4c24bcf3afa732a4ce425256ddca4f9e56d05d6bad6dac68da3761c975ca: Status 404 returned error can't find the container with id 5d3a4c24bcf3afa732a4ce425256ddca4f9e56d05d6bad6dac68da3761c975ca Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.669912 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.746785 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pz9s\" (UniqueName: \"kubernetes.io/projected/5ce26e97-9a7c-43b5-8355-6787ece6d948-kube-api-access-8pz9s\") pod \"5ce26e97-9a7c-43b5-8355-6787ece6d948\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.746876 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-ovsdbserver-sb\") pod \"5ce26e97-9a7c-43b5-8355-6787ece6d948\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.746991 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-config\") pod \"5ce26e97-9a7c-43b5-8355-6787ece6d948\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.747708 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-dns-svc\") pod \"5ce26e97-9a7c-43b5-8355-6787ece6d948\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.747756 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-ovsdbserver-nb\") pod \"5ce26e97-9a7c-43b5-8355-6787ece6d948\" (UID: \"5ce26e97-9a7c-43b5-8355-6787ece6d948\") " Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.759933 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ce26e97-9a7c-43b5-8355-6787ece6d948-kube-api-access-8pz9s" (OuterVolumeSpecName: "kube-api-access-8pz9s") pod "5ce26e97-9a7c-43b5-8355-6787ece6d948" (UID: "5ce26e97-9a7c-43b5-8355-6787ece6d948"). InnerVolumeSpecName "kube-api-access-8pz9s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.812796 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22070023-b108-4113-91e1-51c7cde2c3d9" path="/var/lib/kubelet/pods/22070023-b108-4113-91e1-51c7cde2c3d9/volumes" Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.815134 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfcf83f3-b3be-49b5-9719-57857e262839" path="/var/lib/kubelet/pods/dfcf83f3-b3be-49b5-9719-57857e262839/volumes" Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.838719 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ce26e97-9a7c-43b5-8355-6787ece6d948" (UID: "5ce26e97-9a7c-43b5-8355-6787ece6d948"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.842760 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-config" (OuterVolumeSpecName: "config") pod "5ce26e97-9a7c-43b5-8355-6787ece6d948" (UID: "5ce26e97-9a7c-43b5-8355-6787ece6d948"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.846076 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5ce26e97-9a7c-43b5-8355-6787ece6d948" (UID: "5ce26e97-9a7c-43b5-8355-6787ece6d948"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.850572 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-config\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.850597 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.850607 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.850619 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pz9s\" (UniqueName: \"kubernetes.io/projected/5ce26e97-9a7c-43b5-8355-6787ece6d948-kube-api-access-8pz9s\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.866576 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5ce26e97-9a7c-43b5-8355-6787ece6d948" (UID: "5ce26e97-9a7c-43b5-8355-6787ece6d948"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.953493 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ce26e97-9a7c-43b5-8355-6787ece6d948-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.986462 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"699fc36c-be0e-490d-ae04-867adb807761","Type":"ContainerStarted","Data":"ebea4dab3b9d08e7f1f0d0c26e51fad33c5cabe9c155e714b4f2f94bd3ddac68"} Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.986534 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"699fc36c-be0e-490d-ae04-867adb807761","Type":"ContainerStarted","Data":"cb1defa6ca844a6c29c824d2ed264baecab3ed749827722189a72eb8cc5e9317"} Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.986560 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"699fc36c-be0e-490d-ae04-867adb807761","Type":"ContainerStarted","Data":"5d3a4c24bcf3afa732a4ce425256ddca4f9e56d05d6bad6dac68da3761c975ca"} Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.993441 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0","Type":"ContainerStarted","Data":"5e2c653883026b215cdf9538213435ef1fc498a14de21d36ab4f2e0fd745539c"} Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.993473 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0","Type":"ContainerStarted","Data":"d9b246cc1ab051ae21e36aa41cee94c6bf5431ba340f3a46b33b6a20498311c9"} Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.993483 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0","Type":"ContainerStarted","Data":"b8dbb21c2dc5a63a62092e0cdc94b7491402402bd7b7e647b1fb948f44b289bc"} Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.996044 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" event={"ID":"5ce26e97-9a7c-43b5-8355-6787ece6d948","Type":"ContainerDied","Data":"78390a0e7b021f1def7af9410689cff8b01866a8ffda0f8be532d3751b178689"} Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.996084 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f96cb89d9-pz99v" Dec 01 11:40:35 crc kubenswrapper[4958]: I1201 11:40:35.996117 4958 scope.go:117] "RemoveContainer" containerID="b5dbc0379892a37fd9c04b78af98b7de12b1aa42e5bfcd7452e07c49b66b0033" Dec 01 11:40:36 crc kubenswrapper[4958]: I1201 11:40:36.009029 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.009006078 podStartE2EDuration="3.009006078s" podCreationTimestamp="2025-12-01 11:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:40:36.00449358 +0000 UTC m=+6083.513282617" watchObservedRunningTime="2025-12-01 11:40:36.009006078 +0000 UTC m=+6083.517795115" Dec 01 11:40:36 crc kubenswrapper[4958]: I1201 11:40:36.024588 4958 scope.go:117] "RemoveContainer" containerID="4660dcac7a423c49314b7edccb0bc5ff621cc8bcc1f6c5463f92fddb1ee83b57" Dec 01 11:40:36 crc kubenswrapper[4958]: I1201 11:40:36.031462 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.031443255 podStartE2EDuration="3.031443255s" podCreationTimestamp="2025-12-01 11:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:40:36.027963916 +0000 UTC m=+6083.536752993" watchObservedRunningTime="2025-12-01 11:40:36.031443255 +0000 UTC m=+6083.540232292" Dec 01 11:40:36 crc kubenswrapper[4958]: I1201 11:40:36.056121 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f96cb89d9-pz99v"] Dec 01 11:40:36 crc kubenswrapper[4958]: I1201 11:40:36.065054 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f96cb89d9-pz99v"] Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.332481 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.585408 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.719705 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a786d1-2169-4b70-a6df-781fa8fad94c-combined-ca-bundle\") pod \"12a786d1-2169-4b70-a6df-781fa8fad94c\" (UID: \"12a786d1-2169-4b70-a6df-781fa8fad94c\") " Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.719828 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dxcs\" (UniqueName: \"kubernetes.io/projected/12a786d1-2169-4b70-a6df-781fa8fad94c-kube-api-access-7dxcs\") pod \"12a786d1-2169-4b70-a6df-781fa8fad94c\" (UID: \"12a786d1-2169-4b70-a6df-781fa8fad94c\") " Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.719916 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a786d1-2169-4b70-a6df-781fa8fad94c-config-data\") pod \"12a786d1-2169-4b70-a6df-781fa8fad94c\" (UID: \"12a786d1-2169-4b70-a6df-781fa8fad94c\") " Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.738484 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12a786d1-2169-4b70-a6df-781fa8fad94c-kube-api-access-7dxcs" (OuterVolumeSpecName: "kube-api-access-7dxcs") pod "12a786d1-2169-4b70-a6df-781fa8fad94c" (UID: "12a786d1-2169-4b70-a6df-781fa8fad94c"). InnerVolumeSpecName "kube-api-access-7dxcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.755369 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12a786d1-2169-4b70-a6df-781fa8fad94c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12a786d1-2169-4b70-a6df-781fa8fad94c" (UID: "12a786d1-2169-4b70-a6df-781fa8fad94c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.769998 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12a786d1-2169-4b70-a6df-781fa8fad94c-config-data" (OuterVolumeSpecName: "config-data") pod "12a786d1-2169-4b70-a6df-781fa8fad94c" (UID: "12a786d1-2169-4b70-a6df-781fa8fad94c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.811419 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ce26e97-9a7c-43b5-8355-6787ece6d948" path="/var/lib/kubelet/pods/5ce26e97-9a7c-43b5-8355-6787ece6d948/volumes" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.821471 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-v5jcd"] Dec 01 11:40:37 crc kubenswrapper[4958]: E1201 11:40:37.821819 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce26e97-9a7c-43b5-8355-6787ece6d948" containerName="dnsmasq-dns" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.821837 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce26e97-9a7c-43b5-8355-6787ece6d948" containerName="dnsmasq-dns" Dec 01 11:40:37 crc kubenswrapper[4958]: E1201 11:40:37.821870 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a786d1-2169-4b70-a6df-781fa8fad94c" containerName="nova-scheduler-scheduler" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.821880 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a786d1-2169-4b70-a6df-781fa8fad94c" containerName="nova-scheduler-scheduler" Dec 01 11:40:37 crc kubenswrapper[4958]: E1201 11:40:37.821904 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce26e97-9a7c-43b5-8355-6787ece6d948" containerName="init" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.821910 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce26e97-9a7c-43b5-8355-6787ece6d948" containerName="init" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.822081 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a786d1-2169-4b70-a6df-781fa8fad94c" containerName="nova-scheduler-scheduler" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.822096 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce26e97-9a7c-43b5-8355-6787ece6d948" containerName="dnsmasq-dns" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.822831 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v5jcd" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.823047 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a786d1-2169-4b70-a6df-781fa8fad94c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.823080 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dxcs\" (UniqueName: \"kubernetes.io/projected/12a786d1-2169-4b70-a6df-781fa8fad94c-kube-api-access-7dxcs\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.823091 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a786d1-2169-4b70-a6df-781fa8fad94c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.825928 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.829342 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.834629 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-v5jcd"] Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.924635 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13e7610-d97c-4701-bfbc-a7e97d3f3909-config-data\") pod \"nova-cell1-cell-mapping-v5jcd\" (UID: \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\") " pod="openstack/nova-cell1-cell-mapping-v5jcd" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.925579 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd4hs\" (UniqueName: \"kubernetes.io/projected/c13e7610-d97c-4701-bfbc-a7e97d3f3909-kube-api-access-gd4hs\") pod \"nova-cell1-cell-mapping-v5jcd\" (UID: \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\") " pod="openstack/nova-cell1-cell-mapping-v5jcd" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.928896 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c13e7610-d97c-4701-bfbc-a7e97d3f3909-scripts\") pod \"nova-cell1-cell-mapping-v5jcd\" (UID: \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\") " pod="openstack/nova-cell1-cell-mapping-v5jcd" Dec 01 11:40:37 crc kubenswrapper[4958]: I1201 11:40:37.929020 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13e7610-d97c-4701-bfbc-a7e97d3f3909-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-v5jcd\" (UID: \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\") " pod="openstack/nova-cell1-cell-mapping-v5jcd" Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.023714 4958 generic.go:334] "Generic (PLEG): container finished" podID="12a786d1-2169-4b70-a6df-781fa8fad94c" containerID="f4ff471616460010d5ec1bd3e4c436c28c313f042e73c5171dd9fdfdfe1f81d6" exitCode=0 Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.023764 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.023773 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12a786d1-2169-4b70-a6df-781fa8fad94c","Type":"ContainerDied","Data":"f4ff471616460010d5ec1bd3e4c436c28c313f042e73c5171dd9fdfdfe1f81d6"} Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.023810 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12a786d1-2169-4b70-a6df-781fa8fad94c","Type":"ContainerDied","Data":"91677e17c6220831819723342c8c51b64331427499c978325bd7ec0fd804a192"} Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.023829 4958 scope.go:117] "RemoveContainer" containerID="f4ff471616460010d5ec1bd3e4c436c28c313f042e73c5171dd9fdfdfe1f81d6" Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.030798 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13e7610-d97c-4701-bfbc-a7e97d3f3909-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-v5jcd\" (UID: \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\") " pod="openstack/nova-cell1-cell-mapping-v5jcd" Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.030989 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13e7610-d97c-4701-bfbc-a7e97d3f3909-config-data\") pod \"nova-cell1-cell-mapping-v5jcd\" (UID: \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\") " pod="openstack/nova-cell1-cell-mapping-v5jcd" Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.031069 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd4hs\" (UniqueName: \"kubernetes.io/projected/c13e7610-d97c-4701-bfbc-a7e97d3f3909-kube-api-access-gd4hs\") pod \"nova-cell1-cell-mapping-v5jcd\" (UID: \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\") " pod="openstack/nova-cell1-cell-mapping-v5jcd" Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.031146 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c13e7610-d97c-4701-bfbc-a7e97d3f3909-scripts\") pod \"nova-cell1-cell-mapping-v5jcd\" (UID: \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\") " pod="openstack/nova-cell1-cell-mapping-v5jcd" Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.034784 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c13e7610-d97c-4701-bfbc-a7e97d3f3909-scripts\") pod \"nova-cell1-cell-mapping-v5jcd\" (UID: \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\") " pod="openstack/nova-cell1-cell-mapping-v5jcd" Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.035463 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13e7610-d97c-4701-bfbc-a7e97d3f3909-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-v5jcd\" (UID: \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\") " pod="openstack/nova-cell1-cell-mapping-v5jcd" Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.045144 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13e7610-d97c-4701-bfbc-a7e97d3f3909-config-data\") pod \"nova-cell1-cell-mapping-v5jcd\" (UID: \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\") " pod="openstack/nova-cell1-cell-mapping-v5jcd" Dec 01 11:40:38 crc 
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.061051 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.070494 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd4hs\" (UniqueName: \"kubernetes.io/projected/c13e7610-d97c-4701-bfbc-a7e97d3f3909-kube-api-access-gd4hs\") pod \"nova-cell1-cell-mapping-v5jcd\" (UID: \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\") " pod="openstack/nova-cell1-cell-mapping-v5jcd"
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.071026 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.085759 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.086973 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.089592 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.102152 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.135093 4958 scope.go:117] "RemoveContainer" containerID="f4ff471616460010d5ec1bd3e4c436c28c313f042e73c5171dd9fdfdfe1f81d6"
Dec 01 11:40:38 crc kubenswrapper[4958]: E1201 11:40:38.135529 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4ff471616460010d5ec1bd3e4c436c28c313f042e73c5171dd9fdfdfe1f81d6\": container with ID starting with f4ff471616460010d5ec1bd3e4c436c28c313f042e73c5171dd9fdfdfe1f81d6 not found: ID does not exist" containerID="f4ff471616460010d5ec1bd3e4c436c28c313f042e73c5171dd9fdfdfe1f81d6"
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.135588 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4ff471616460010d5ec1bd3e4c436c28c313f042e73c5171dd9fdfdfe1f81d6"} err="failed to get container status \"f4ff471616460010d5ec1bd3e4c436c28c313f042e73c5171dd9fdfdfe1f81d6\": rpc error: code = NotFound desc = could not find container \"f4ff471616460010d5ec1bd3e4c436c28c313f042e73c5171dd9fdfdfe1f81d6\": container with ID starting with f4ff471616460010d5ec1bd3e4c436c28c313f042e73c5171dd9fdfdfe1f81d6 not found: ID does not exist"
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.179054 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v5jcd"
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.235269 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00e2cbe-314e-44e4-9c18-125827bb9e9a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e00e2cbe-314e-44e4-9c18-125827bb9e9a\") " pod="openstack/nova-scheduler-0"
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.235416 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttxcb\" (UniqueName: \"kubernetes.io/projected/e00e2cbe-314e-44e4-9c18-125827bb9e9a-kube-api-access-ttxcb\") pod \"nova-scheduler-0\" (UID: \"e00e2cbe-314e-44e4-9c18-125827bb9e9a\") " pod="openstack/nova-scheduler-0"
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.235463 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00e2cbe-314e-44e4-9c18-125827bb9e9a-config-data\") pod \"nova-scheduler-0\" (UID: \"e00e2cbe-314e-44e4-9c18-125827bb9e9a\") " pod="openstack/nova-scheduler-0"
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.337493 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttxcb\" (UniqueName: \"kubernetes.io/projected/e00e2cbe-314e-44e4-9c18-125827bb9e9a-kube-api-access-ttxcb\") pod \"nova-scheduler-0\" (UID: \"e00e2cbe-314e-44e4-9c18-125827bb9e9a\") " pod="openstack/nova-scheduler-0"
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.337985 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00e2cbe-314e-44e4-9c18-125827bb9e9a-config-data\") pod \"nova-scheduler-0\" (UID: \"e00e2cbe-314e-44e4-9c18-125827bb9e9a\") " pod="openstack/nova-scheduler-0"
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.338110 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00e2cbe-314e-44e4-9c18-125827bb9e9a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e00e2cbe-314e-44e4-9c18-125827bb9e9a\") " pod="openstack/nova-scheduler-0"
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.345881 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00e2cbe-314e-44e4-9c18-125827bb9e9a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e00e2cbe-314e-44e4-9c18-125827bb9e9a\") " pod="openstack/nova-scheduler-0"
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.347056 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00e2cbe-314e-44e4-9c18-125827bb9e9a-config-data\") pod \"nova-scheduler-0\" (UID: \"e00e2cbe-314e-44e4-9c18-125827bb9e9a\") " pod="openstack/nova-scheduler-0"
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.361335 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttxcb\" (UniqueName: \"kubernetes.io/projected/e00e2cbe-314e-44e4-9c18-125827bb9e9a-kube-api-access-ttxcb\") pod \"nova-scheduler-0\" (UID: \"e00e2cbe-314e-44e4-9c18-125827bb9e9a\") " pod="openstack/nova-scheduler-0"
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.405444 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 01 11:40:38 crc kubenswrapper[4958]: I1201 11:40:38.745221 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-v5jcd"]
Dec 01 11:40:39 crc kubenswrapper[4958]: I1201 11:40:39.041770 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v5jcd" event={"ID":"c13e7610-d97c-4701-bfbc-a7e97d3f3909","Type":"ContainerStarted","Data":"8c77e170d2f2c67044e51f7b7194ad32c914e0e63dcea5f8bd28b548ecfaa6e1"}
Dec 01 11:40:39 crc kubenswrapper[4958]: W1201 11:40:39.067166 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode00e2cbe_314e_44e4_9c18_125827bb9e9a.slice/crio-99ba867bf5feb143b271475ae111b536b655eaed127886613615e8e146041b55 WatchSource:0}: Error finding container 99ba867bf5feb143b271475ae111b536b655eaed127886613615e8e146041b55: Status 404 returned error can't find the container with id 99ba867bf5feb143b271475ae111b536b655eaed127886613615e8e146041b55
Dec 01 11:40:39 crc kubenswrapper[4958]: I1201 11:40:39.071192 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 01 11:40:39 crc kubenswrapper[4958]: I1201 11:40:39.432977 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 01 11:40:39 crc kubenswrapper[4958]: I1201 11:40:39.433053 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 01 11:40:39 crc kubenswrapper[4958]: I1201 11:40:39.816036 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12a786d1-2169-4b70-a6df-781fa8fad94c" path="/var/lib/kubelet/pods/12a786d1-2169-4b70-a6df-781fa8fad94c/volumes"
Dec 01 11:40:40 crc kubenswrapper[4958]: I1201 11:40:40.069120 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v5jcd" event={"ID":"c13e7610-d97c-4701-bfbc-a7e97d3f3909","Type":"ContainerStarted","Data":"66ebdcb1705867395e3ae0597e83a9c008d70ada8f0c9c6a59369677354b3847"}
Dec 01 11:40:40 crc kubenswrapper[4958]: I1201 11:40:40.075061 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e00e2cbe-314e-44e4-9c18-125827bb9e9a","Type":"ContainerStarted","Data":"7608f6c70b84e24ec35cecde04d8ecdd3c9d0a1dd859b911be51859ed06f7f73"}
Dec 01 11:40:40 crc kubenswrapper[4958]: I1201 11:40:40.075125 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e00e2cbe-314e-44e4-9c18-125827bb9e9a","Type":"ContainerStarted","Data":"99ba867bf5feb143b271475ae111b536b655eaed127886613615e8e146041b55"}
Dec 01 11:40:40 crc kubenswrapper[4958]: I1201 11:40:40.099479 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-v5jcd" podStartSLOduration=3.099429596 podStartE2EDuration="3.099429596s" podCreationTimestamp="2025-12-01 11:40:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:40:40.091631424 +0000 UTC m=+6087.600420531" watchObservedRunningTime="2025-12-01 11:40:40.099429596 +0000 UTC m=+6087.608218633"
podCreationTimestamp="2025-12-01 11:40:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:40:40.119635079 +0000 UTC m=+6087.628424166" watchObservedRunningTime="2025-12-01 11:40:40.121041769 +0000 UTC m=+6087.629830816" Dec 01 11:40:43 crc kubenswrapper[4958]: I1201 11:40:43.406041 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 11:40:44 crc kubenswrapper[4958]: I1201 11:40:44.129624 4958 generic.go:334] "Generic (PLEG): container finished" podID="c13e7610-d97c-4701-bfbc-a7e97d3f3909" containerID="66ebdcb1705867395e3ae0597e83a9c008d70ada8f0c9c6a59369677354b3847" exitCode=0 Dec 01 11:40:44 crc kubenswrapper[4958]: I1201 11:40:44.129691 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v5jcd" event={"ID":"c13e7610-d97c-4701-bfbc-a7e97d3f3909","Type":"ContainerDied","Data":"66ebdcb1705867395e3ae0597e83a9c008d70ada8f0c9c6a59369677354b3847"} Dec 01 11:40:44 crc kubenswrapper[4958]: I1201 11:40:44.393441 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 11:40:44 crc kubenswrapper[4958]: I1201 11:40:44.393811 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 11:40:44 crc kubenswrapper[4958]: I1201 11:40:44.433411 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 11:40:44 crc kubenswrapper[4958]: I1201 11:40:44.433545 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 11:40:45 crc kubenswrapper[4958]: I1201 11:40:45.476091 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="699fc36c-be0e-490d-ae04-867adb807761" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.66:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 11:40:45 crc kubenswrapper[4958]: I1201 11:40:45.560038 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="699fc36c-be0e-490d-ae04-867adb807761" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.66:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 11:40:45 crc kubenswrapper[4958]: I1201 11:40:45.560388 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.67:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 11:40:45 crc kubenswrapper[4958]: I1201 11:40:45.560809 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.67:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 11:40:45 crc kubenswrapper[4958]: I1201 11:40:45.653676 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v5jcd" Dec 01 11:40:45 crc kubenswrapper[4958]: I1201 11:40:45.805553 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13e7610-d97c-4701-bfbc-a7e97d3f3909-combined-ca-bundle\") pod \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\" (UID: \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\") " Dec 01 11:40:45 crc kubenswrapper[4958]: I1201 11:40:45.806556 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13e7610-d97c-4701-bfbc-a7e97d3f3909-config-data\") pod \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\" (UID: \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\") " Dec 01 11:40:45 crc kubenswrapper[4958]: I1201 11:40:45.807261 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd4hs\" (UniqueName: \"kubernetes.io/projected/c13e7610-d97c-4701-bfbc-a7e97d3f3909-kube-api-access-gd4hs\") pod \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\" (UID: \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\") " Dec 01 11:40:45 crc kubenswrapper[4958]: I1201 11:40:45.807434 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c13e7610-d97c-4701-bfbc-a7e97d3f3909-scripts\") pod \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\" (UID: \"c13e7610-d97c-4701-bfbc-a7e97d3f3909\") " Dec 01 11:40:45 crc kubenswrapper[4958]: I1201 11:40:45.818191 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c13e7610-d97c-4701-bfbc-a7e97d3f3909-scripts" (OuterVolumeSpecName: "scripts") pod "c13e7610-d97c-4701-bfbc-a7e97d3f3909" (UID: "c13e7610-d97c-4701-bfbc-a7e97d3f3909"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:45 crc kubenswrapper[4958]: I1201 11:40:45.818692 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c13e7610-d97c-4701-bfbc-a7e97d3f3909-kube-api-access-gd4hs" (OuterVolumeSpecName: "kube-api-access-gd4hs") pod "c13e7610-d97c-4701-bfbc-a7e97d3f3909" (UID: "c13e7610-d97c-4701-bfbc-a7e97d3f3909"). InnerVolumeSpecName "kube-api-access-gd4hs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:40:45 crc kubenswrapper[4958]: I1201 11:40:45.867985 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c13e7610-d97c-4701-bfbc-a7e97d3f3909-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c13e7610-d97c-4701-bfbc-a7e97d3f3909" (UID: "c13e7610-d97c-4701-bfbc-a7e97d3f3909"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:45 crc kubenswrapper[4958]: I1201 11:40:45.889147 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c13e7610-d97c-4701-bfbc-a7e97d3f3909-config-data" (OuterVolumeSpecName: "config-data") pod "c13e7610-d97c-4701-bfbc-a7e97d3f3909" (UID: "c13e7610-d97c-4701-bfbc-a7e97d3f3909"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:45 crc kubenswrapper[4958]: I1201 11:40:45.910101 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c13e7610-d97c-4701-bfbc-a7e97d3f3909-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:45 crc kubenswrapper[4958]: I1201 11:40:45.910143 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13e7610-d97c-4701-bfbc-a7e97d3f3909-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:45 crc kubenswrapper[4958]: I1201 11:40:45.910153 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13e7610-d97c-4701-bfbc-a7e97d3f3909-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:45 crc kubenswrapper[4958]: I1201 11:40:45.910163 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd4hs\" (UniqueName: \"kubernetes.io/projected/c13e7610-d97c-4701-bfbc-a7e97d3f3909-kube-api-access-gd4hs\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:46 crc kubenswrapper[4958]: I1201 11:40:46.156834 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v5jcd" event={"ID":"c13e7610-d97c-4701-bfbc-a7e97d3f3909","Type":"ContainerDied","Data":"8c77e170d2f2c67044e51f7b7194ad32c914e0e63dcea5f8bd28b548ecfaa6e1"} Dec 01 11:40:46 crc kubenswrapper[4958]: I1201 11:40:46.156926 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c77e170d2f2c67044e51f7b7194ad32c914e0e63dcea5f8bd28b548ecfaa6e1" Dec 01 11:40:46 crc kubenswrapper[4958]: I1201 11:40:46.157002 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v5jcd" Dec 01 11:40:46 crc kubenswrapper[4958]: I1201 11:40:46.419549 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 11:40:46 crc kubenswrapper[4958]: I1201 11:40:46.419765 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e00e2cbe-314e-44e4-9c18-125827bb9e9a" containerName="nova-scheduler-scheduler" containerID="cri-o://7608f6c70b84e24ec35cecde04d8ecdd3c9d0a1dd859b911be51859ed06f7f73" gracePeriod=30 Dec 01 11:40:46 crc kubenswrapper[4958]: I1201 11:40:46.429792 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 11:40:46 crc kubenswrapper[4958]: I1201 11:40:46.430571 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="699fc36c-be0e-490d-ae04-867adb807761" containerName="nova-api-log" containerID="cri-o://cb1defa6ca844a6c29c824d2ed264baecab3ed749827722189a72eb8cc5e9317" gracePeriod=30 Dec 01 11:40:46 crc kubenswrapper[4958]: I1201 11:40:46.431105 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="699fc36c-be0e-490d-ae04-867adb807761" containerName="nova-api-api" containerID="cri-o://ebea4dab3b9d08e7f1f0d0c26e51fad33c5cabe9c155e714b4f2f94bd3ddac68" gracePeriod=30 Dec 01 11:40:46 crc kubenswrapper[4958]: I1201 11:40:46.453188 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:40:46 crc kubenswrapper[4958]: I1201 11:40:46.453427 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0" 
containerName="nova-metadata-log" containerID="cri-o://d9b246cc1ab051ae21e36aa41cee94c6bf5431ba340f3a46b33b6a20498311c9" gracePeriod=30 Dec 01 11:40:46 crc kubenswrapper[4958]: I1201 11:40:46.453583 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0" containerName="nova-metadata-metadata" containerID="cri-o://5e2c653883026b215cdf9538213435ef1fc498a14de21d36ab4f2e0fd745539c" gracePeriod=30 Dec 01 11:40:47 crc kubenswrapper[4958]: I1201 11:40:47.171318 4958 generic.go:334] "Generic (PLEG): container finished" podID="699fc36c-be0e-490d-ae04-867adb807761" containerID="cb1defa6ca844a6c29c824d2ed264baecab3ed749827722189a72eb8cc5e9317" exitCode=143 Dec 01 11:40:47 crc kubenswrapper[4958]: I1201 11:40:47.171397 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"699fc36c-be0e-490d-ae04-867adb807761","Type":"ContainerDied","Data":"cb1defa6ca844a6c29c824d2ed264baecab3ed749827722189a72eb8cc5e9317"} Dec 01 11:40:47 crc kubenswrapper[4958]: I1201 11:40:47.173945 4958 generic.go:334] "Generic (PLEG): container finished" podID="d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0" containerID="d9b246cc1ab051ae21e36aa41cee94c6bf5431ba340f3a46b33b6a20498311c9" exitCode=143 Dec 01 11:40:47 crc kubenswrapper[4958]: I1201 11:40:47.173980 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0","Type":"ContainerDied","Data":"d9b246cc1ab051ae21e36aa41cee94c6bf5431ba340f3a46b33b6a20498311c9"} Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.061895 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.220600 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.224109 4958 generic.go:334] "Generic (PLEG): container finished" podID="d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0" containerID="5e2c653883026b215cdf9538213435ef1fc498a14de21d36ab4f2e0fd745539c" exitCode=0 Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.224185 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0","Type":"ContainerDied","Data":"5e2c653883026b215cdf9538213435ef1fc498a14de21d36ab4f2e0fd745539c"} Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.226430 4958 generic.go:334] "Generic (PLEG): container finished" podID="e00e2cbe-314e-44e4-9c18-125827bb9e9a" containerID="7608f6c70b84e24ec35cecde04d8ecdd3c9d0a1dd859b911be51859ed06f7f73" exitCode=0 Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.226489 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e00e2cbe-314e-44e4-9c18-125827bb9e9a","Type":"ContainerDied","Data":"7608f6c70b84e24ec35cecde04d8ecdd3c9d0a1dd859b911be51859ed06f7f73"} Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.226514 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e00e2cbe-314e-44e4-9c18-125827bb9e9a","Type":"ContainerDied","Data":"99ba867bf5feb143b271475ae111b536b655eaed127886613615e8e146041b55"} Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.226537 4958 scope.go:117] "RemoveContainer" containerID="7608f6c70b84e24ec35cecde04d8ecdd3c9d0a1dd859b911be51859ed06f7f73" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.226676 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.230190 4958 generic.go:334] "Generic (PLEG): container finished" podID="699fc36c-be0e-490d-ae04-867adb807761" containerID="ebea4dab3b9d08e7f1f0d0c26e51fad33c5cabe9c155e714b4f2f94bd3ddac68" exitCode=0 Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.230224 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"699fc36c-be0e-490d-ae04-867adb807761","Type":"ContainerDied","Data":"ebea4dab3b9d08e7f1f0d0c26e51fad33c5cabe9c155e714b4f2f94bd3ddac68"} Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.230248 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"699fc36c-be0e-490d-ae04-867adb807761","Type":"ContainerDied","Data":"5d3a4c24bcf3afa732a4ce425256ddca4f9e56d05d6bad6dac68da3761c975ca"} Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.230289 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.254561 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttxcb\" (UniqueName: \"kubernetes.io/projected/e00e2cbe-314e-44e4-9c18-125827bb9e9a-kube-api-access-ttxcb\") pod \"e00e2cbe-314e-44e4-9c18-125827bb9e9a\" (UID: \"e00e2cbe-314e-44e4-9c18-125827bb9e9a\") " Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.254670 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00e2cbe-314e-44e4-9c18-125827bb9e9a-config-data\") pod \"e00e2cbe-314e-44e4-9c18-125827bb9e9a\" (UID: \"e00e2cbe-314e-44e4-9c18-125827bb9e9a\") " Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.254780 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e00e2cbe-314e-44e4-9c18-125827bb9e9a-combined-ca-bundle\") pod \"e00e2cbe-314e-44e4-9c18-125827bb9e9a\" (UID: \"e00e2cbe-314e-44e4-9c18-125827bb9e9a\") " Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.258392 4958 scope.go:117] "RemoveContainer" containerID="7608f6c70b84e24ec35cecde04d8ecdd3c9d0a1dd859b911be51859ed06f7f73" Dec 01 11:40:51 crc kubenswrapper[4958]: E1201 11:40:51.258834 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7608f6c70b84e24ec35cecde04d8ecdd3c9d0a1dd859b911be51859ed06f7f73\": container with ID starting with 7608f6c70b84e24ec35cecde04d8ecdd3c9d0a1dd859b911be51859ed06f7f73 not found: ID does not exist" containerID="7608f6c70b84e24ec35cecde04d8ecdd3c9d0a1dd859b911be51859ed06f7f73" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.258878 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7608f6c70b84e24ec35cecde04d8ecdd3c9d0a1dd859b911be51859ed06f7f73"} err="failed to get container status \"7608f6c70b84e24ec35cecde04d8ecdd3c9d0a1dd859b911be51859ed06f7f73\": rpc error: code = NotFound desc = could not find container \"7608f6c70b84e24ec35cecde04d8ecdd3c9d0a1dd859b911be51859ed06f7f73\": container with ID starting with 7608f6c70b84e24ec35cecde04d8ecdd3c9d0a1dd859b911be51859ed06f7f73 not found: ID does not exist" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.258899 4958 scope.go:117] "RemoveContainer" containerID="ebea4dab3b9d08e7f1f0d0c26e51fad33c5cabe9c155e714b4f2f94bd3ddac68" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.261327 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e00e2cbe-314e-44e4-9c18-125827bb9e9a-kube-api-access-ttxcb" (OuterVolumeSpecName: "kube-api-access-ttxcb") pod "e00e2cbe-314e-44e4-9c18-125827bb9e9a" (UID: "e00e2cbe-314e-44e4-9c18-125827bb9e9a"). InnerVolumeSpecName "kube-api-access-ttxcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.293196 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00e2cbe-314e-44e4-9c18-125827bb9e9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e00e2cbe-314e-44e4-9c18-125827bb9e9a" (UID: "e00e2cbe-314e-44e4-9c18-125827bb9e9a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.300188 4958 scope.go:117] "RemoveContainer" containerID="cb1defa6ca844a6c29c824d2ed264baecab3ed749827722189a72eb8cc5e9317" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.302051 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00e2cbe-314e-44e4-9c18-125827bb9e9a-config-data" (OuterVolumeSpecName: "config-data") pod "e00e2cbe-314e-44e4-9c18-125827bb9e9a" (UID: "e00e2cbe-314e-44e4-9c18-125827bb9e9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.325020 4958 scope.go:117] "RemoveContainer" containerID="ebea4dab3b9d08e7f1f0d0c26e51fad33c5cabe9c155e714b4f2f94bd3ddac68" Dec 01 11:40:51 crc kubenswrapper[4958]: E1201 11:40:51.325640 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebea4dab3b9d08e7f1f0d0c26e51fad33c5cabe9c155e714b4f2f94bd3ddac68\": container with ID starting with ebea4dab3b9d08e7f1f0d0c26e51fad33c5cabe9c155e714b4f2f94bd3ddac68 not found: ID does not exist" containerID="ebea4dab3b9d08e7f1f0d0c26e51fad33c5cabe9c155e714b4f2f94bd3ddac68" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.325710 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebea4dab3b9d08e7f1f0d0c26e51fad33c5cabe9c155e714b4f2f94bd3ddac68"} err="failed to get container status \"ebea4dab3b9d08e7f1f0d0c26e51fad33c5cabe9c155e714b4f2f94bd3ddac68\": rpc error: code = NotFound desc = could not find container \"ebea4dab3b9d08e7f1f0d0c26e51fad33c5cabe9c155e714b4f2f94bd3ddac68\": container with ID starting with ebea4dab3b9d08e7f1f0d0c26e51fad33c5cabe9c155e714b4f2f94bd3ddac68 not found: ID does not exist" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.325744 4958 scope.go:117] "RemoveContainer" containerID="cb1defa6ca844a6c29c824d2ed264baecab3ed749827722189a72eb8cc5e9317" Dec 01 11:40:51 crc kubenswrapper[4958]: E1201 11:40:51.326195 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb1defa6ca844a6c29c824d2ed264baecab3ed749827722189a72eb8cc5e9317\": container with ID starting with cb1defa6ca844a6c29c824d2ed264baecab3ed749827722189a72eb8cc5e9317 not found: ID does not exist" containerID="cb1defa6ca844a6c29c824d2ed264baecab3ed749827722189a72eb8cc5e9317" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.326251 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1defa6ca844a6c29c824d2ed264baecab3ed749827722189a72eb8cc5e9317"} err="failed to get container status \"cb1defa6ca844a6c29c824d2ed264baecab3ed749827722189a72eb8cc5e9317\": rpc error: code = NotFound desc = could not find container \"cb1defa6ca844a6c29c824d2ed264baecab3ed749827722189a72eb8cc5e9317\": container with ID starting with cb1defa6ca844a6c29c824d2ed264baecab3ed749827722189a72eb8cc5e9317 not found: ID does not exist" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.343241 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.357235 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699fc36c-be0e-490d-ae04-867adb807761-config-data\") pod \"699fc36c-be0e-490d-ae04-867adb807761\" (UID: \"699fc36c-be0e-490d-ae04-867adb807761\") " Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.357306 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699fc36c-be0e-490d-ae04-867adb807761-combined-ca-bundle\") pod \"699fc36c-be0e-490d-ae04-867adb807761\" (UID: \"699fc36c-be0e-490d-ae04-867adb807761\") " Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.357361 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4fck\" (UniqueName: \"kubernetes.io/projected/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-kube-api-access-p4fck\") pod \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\" (UID: \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\") " Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.357387 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwlg2\" (UniqueName: \"kubernetes.io/projected/699fc36c-be0e-490d-ae04-867adb807761-kube-api-access-zwlg2\") pod \"699fc36c-be0e-490d-ae04-867adb807761\" (UID: \"699fc36c-be0e-490d-ae04-867adb807761\") " Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.357406 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/699fc36c-be0e-490d-ae04-867adb807761-logs\") pod \"699fc36c-be0e-490d-ae04-867adb807761\" (UID: \"699fc36c-be0e-490d-ae04-867adb807761\") " Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.357430 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-logs\") pod \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\" (UID: \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\") " Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.357456 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-combined-ca-bundle\") pod \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\" (UID: \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\") " Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.357492 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-config-data\") pod \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\" (UID: \"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0\") " Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.357786 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttxcb\" (UniqueName: \"kubernetes.io/projected/e00e2cbe-314e-44e4-9c18-125827bb9e9a-kube-api-access-ttxcb\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.357801 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e00e2cbe-314e-44e4-9c18-125827bb9e9a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.357810 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e00e2cbe-314e-44e4-9c18-125827bb9e9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.359561 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/699fc36c-be0e-490d-ae04-867adb807761-logs" (OuterVolumeSpecName: "logs") pod "699fc36c-be0e-490d-ae04-867adb807761" (UID: "699fc36c-be0e-490d-ae04-867adb807761"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.360014 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-logs" (OuterVolumeSpecName: "logs") pod "d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0" (UID: "d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.362164 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/699fc36c-be0e-490d-ae04-867adb807761-kube-api-access-zwlg2" (OuterVolumeSpecName: "kube-api-access-zwlg2") pod "699fc36c-be0e-490d-ae04-867adb807761" (UID: "699fc36c-be0e-490d-ae04-867adb807761"). InnerVolumeSpecName "kube-api-access-zwlg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.362824 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-kube-api-access-p4fck" (OuterVolumeSpecName: "kube-api-access-p4fck") pod "d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0" (UID: "d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0"). InnerVolumeSpecName "kube-api-access-p4fck". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.390112 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/699fc36c-be0e-490d-ae04-867adb807761-config-data" (OuterVolumeSpecName: "config-data") pod "699fc36c-be0e-490d-ae04-867adb807761" (UID: "699fc36c-be0e-490d-ae04-867adb807761"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.393950 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/699fc36c-be0e-490d-ae04-867adb807761-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "699fc36c-be0e-490d-ae04-867adb807761" (UID: "699fc36c-be0e-490d-ae04-867adb807761"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.394042 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0" (UID: "d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.405328 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-config-data" (OuterVolumeSpecName: "config-data") pod "d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0" (UID: "d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.459335 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.459384 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.459395 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699fc36c-be0e-490d-ae04-867adb807761-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.459404 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699fc36c-be0e-490d-ae04-867adb807761-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.459413 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4fck\" (UniqueName: \"kubernetes.io/projected/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-kube-api-access-p4fck\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.459424 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwlg2\" (UniqueName: \"kubernetes.io/projected/699fc36c-be0e-490d-ae04-867adb807761-kube-api-access-zwlg2\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.459433 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/699fc36c-be0e-490d-ae04-867adb807761-logs\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.459441 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0-logs\") on node \"crc\" DevicePath \"\"" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.575388 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.589628 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.606839 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.621206 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.629067 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 11:40:51 crc kubenswrapper[4958]: E1201 11:40:51.629836 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00e2cbe-314e-44e4-9c18-125827bb9e9a" containerName="nova-scheduler-scheduler" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.629885 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00e2cbe-314e-44e4-9c18-125827bb9e9a" containerName="nova-scheduler-scheduler" Dec 01 11:40:51 crc kubenswrapper[4958]: E1201 11:40:51.629926 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699fc36c-be0e-490d-ae04-867adb807761" 
containerName="nova-api-log" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.629936 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="699fc36c-be0e-490d-ae04-867adb807761" containerName="nova-api-log" Dec 01 11:40:51 crc kubenswrapper[4958]: E1201 11:40:51.629957 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0" containerName="nova-metadata-log" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.629967 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0" containerName="nova-metadata-log" Dec 01 11:40:51 crc kubenswrapper[4958]: E1201 11:40:51.629991 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0" containerName="nova-metadata-metadata" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.630000 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0" containerName="nova-metadata-metadata" Dec 01 11:40:51 crc kubenswrapper[4958]: E1201 11:40:51.630015 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699fc36c-be0e-490d-ae04-867adb807761" containerName="nova-api-api" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.630023 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="699fc36c-be0e-490d-ae04-867adb807761" containerName="nova-api-api" Dec 01 11:40:51 crc kubenswrapper[4958]: E1201 11:40:51.630041 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13e7610-d97c-4701-bfbc-a7e97d3f3909" containerName="nova-manage" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.630049 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13e7610-d97c-4701-bfbc-a7e97d3f3909" containerName="nova-manage" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.630270 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="e00e2cbe-314e-44e4-9c18-125827bb9e9a" containerName="nova-scheduler-scheduler" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.630306 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0" containerName="nova-metadata-metadata" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.630320 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="699fc36c-be0e-490d-ae04-867adb807761" containerName="nova-api-api" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.630334 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="699fc36c-be0e-490d-ae04-867adb807761" containerName="nova-api-log" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.630345 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0" containerName="nova-metadata-log" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.630370 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c13e7610-d97c-4701-bfbc-a7e97d3f3909" containerName="nova-manage" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.631151 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.637627 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.639188 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.645126 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.647273 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.651379 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.711460 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.713827 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\") " pod="openstack/nova-api-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.713883 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrsf6\" (UniqueName: \"kubernetes.io/projected/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-kube-api-access-vrsf6\") pod \"nova-api-0\" (UID: \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\") " pod="openstack/nova-api-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.713909 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-config-data\") pod \"nova-api-0\" (UID: \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\") " pod="openstack/nova-api-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.713965 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-logs\") pod \"nova-api-0\" (UID: \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\") " pod="openstack/nova-api-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.810719 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="699fc36c-be0e-490d-ae04-867adb807761" path="/var/lib/kubelet/pods/699fc36c-be0e-490d-ae04-867adb807761/volumes" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.812086 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e00e2cbe-314e-44e4-9c18-125827bb9e9a" path="/var/lib/kubelet/pods/e00e2cbe-314e-44e4-9c18-125827bb9e9a/volumes" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.815378 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47jxk\" (UniqueName: \"kubernetes.io/projected/88a7ef3f-1cd5-4e71-b574-a52956739b72-kube-api-access-47jxk\") pod \"nova-scheduler-0\" (UID: \"88a7ef3f-1cd5-4e71-b574-a52956739b72\") " pod="openstack/nova-scheduler-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.815443 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a7ef3f-1cd5-4e71-b574-a52956739b72-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"88a7ef3f-1cd5-4e71-b574-a52956739b72\") " pod="openstack/nova-scheduler-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.815494 4958 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\") " pod="openstack/nova-api-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.815522 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrsf6\" (UniqueName: \"kubernetes.io/projected/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-kube-api-access-vrsf6\") pod \"nova-api-0\" (UID: \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\") " pod="openstack/nova-api-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.815543 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-config-data\") pod \"nova-api-0\" (UID: \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\") " pod="openstack/nova-api-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.815573 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a7ef3f-1cd5-4e71-b574-a52956739b72-config-data\") pod \"nova-scheduler-0\" (UID: \"88a7ef3f-1cd5-4e71-b574-a52956739b72\") " pod="openstack/nova-scheduler-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.815623 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-logs\") pod \"nova-api-0\" (UID: \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\") " pod="openstack/nova-api-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.816255 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-logs\") pod \"nova-api-0\" (UID: \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\") " pod="openstack/nova-api-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.820038 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\") " pod="openstack/nova-api-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.826547 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-config-data\") pod \"nova-api-0\" (UID: \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\") " pod="openstack/nova-api-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.842668 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrsf6\" (UniqueName: \"kubernetes.io/projected/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-kube-api-access-vrsf6\") pod \"nova-api-0\" (UID: \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\") " pod="openstack/nova-api-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.918264 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a7ef3f-1cd5-4e71-b574-a52956739b72-config-data\") pod \"nova-scheduler-0\" (UID: \"88a7ef3f-1cd5-4e71-b574-a52956739b72\") " pod="openstack/nova-scheduler-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.918431 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-47jxk\" (UniqueName: \"kubernetes.io/projected/88a7ef3f-1cd5-4e71-b574-a52956739b72-kube-api-access-47jxk\") pod \"nova-scheduler-0\" (UID: \"88a7ef3f-1cd5-4e71-b574-a52956739b72\") " pod="openstack/nova-scheduler-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.918493 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a7ef3f-1cd5-4e71-b574-a52956739b72-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"88a7ef3f-1cd5-4e71-b574-a52956739b72\") " pod="openstack/nova-scheduler-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.922426 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a7ef3f-1cd5-4e71-b574-a52956739b72-config-data\") pod \"nova-scheduler-0\" (UID: \"88a7ef3f-1cd5-4e71-b574-a52956739b72\") " pod="openstack/nova-scheduler-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.923810 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a7ef3f-1cd5-4e71-b574-a52956739b72-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"88a7ef3f-1cd5-4e71-b574-a52956739b72\") " pod="openstack/nova-scheduler-0" Dec 01 11:40:51 crc kubenswrapper[4958]: I1201 11:40:51.937134 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47jxk\" (UniqueName: \"kubernetes.io/projected/88a7ef3f-1cd5-4e71-b574-a52956739b72-kube-api-access-47jxk\") pod \"nova-scheduler-0\" (UID: \"88a7ef3f-1cd5-4e71-b574-a52956739b72\") " pod="openstack/nova-scheduler-0" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.071134 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.084264 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.247918 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0","Type":"ContainerDied","Data":"b8dbb21c2dc5a63a62092e0cdc94b7491402402bd7b7e647b1fb948f44b289bc"} Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.248203 4958 scope.go:117] "RemoveContainer" containerID="5e2c653883026b215cdf9538213435ef1fc498a14de21d36ab4f2e0fd745539c" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.248036 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.302417 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.310168 4958 scope.go:117] "RemoveContainer" containerID="d9b246cc1ab051ae21e36aa41cee94c6bf5431ba340f3a46b33b6a20498311c9" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.315410 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.341566 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.343290 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.345491 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.352702 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.428564 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\") " pod="openstack/nova-metadata-0" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.428657 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-logs\") pod \"nova-metadata-0\" (UID: \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\") " pod="openstack/nova-metadata-0" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.428686 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8shl8\" (UniqueName: \"kubernetes.io/projected/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-kube-api-access-8shl8\") pod \"nova-metadata-0\" (UID: \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\") " pod="openstack/nova-metadata-0" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.428713 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-config-data\") pod \"nova-metadata-0\" (UID: \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\") " pod="openstack/nova-metadata-0" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.529997 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8shl8\" (UniqueName: \"kubernetes.io/projected/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-kube-api-access-8shl8\") pod \"nova-metadata-0\" (UID: \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\") " pod="openstack/nova-metadata-0" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.530065 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-config-data\") pod \"nova-metadata-0\" (UID: \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\") " pod="openstack/nova-metadata-0" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.530151 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\") " pod="openstack/nova-metadata-0" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.530240 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-logs\") pod \"nova-metadata-0\" (UID: \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\") " pod="openstack/nova-metadata-0" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.530775 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-logs\") pod \"nova-metadata-0\" (UID: \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\") " pod="openstack/nova-metadata-0" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.537039 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\") " pod="openstack/nova-metadata-0" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.539829 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-config-data\") pod \"nova-metadata-0\" (UID: \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\") " pod="openstack/nova-metadata-0" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.548658 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8shl8\" (UniqueName: \"kubernetes.io/projected/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-kube-api-access-8shl8\") pod \"nova-metadata-0\" (UID: \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\") " pod="openstack/nova-metadata-0" Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.594141 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.614701 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 11:40:52 crc kubenswrapper[4958]: W1201 11:40:52.632315 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88a7ef3f_1cd5_4e71_b574_a52956739b72.slice/crio-54fea45e70c00875ed28f4106a361d015482ae0420bd432c73890e787b998e42 WatchSource:0}: Error finding container 54fea45e70c00875ed28f4106a361d015482ae0420bd432c73890e787b998e42: Status 404 returned error can't find the container with id 54fea45e70c00875ed28f4106a361d015482ae0420bd432c73890e787b998e42 Dec 01 11:40:52 crc kubenswrapper[4958]: I1201 11:40:52.665159 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 11:40:53 crc kubenswrapper[4958]: I1201 11:40:53.203605 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:40:53 crc kubenswrapper[4958]: I1201 11:40:53.257943 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"88a7ef3f-1cd5-4e71-b574-a52956739b72","Type":"ContainerStarted","Data":"574c005c50fa3d85420d1633534ce487a7593d9b086f3ddcf98dce630755c61d"} Dec 01 11:40:53 crc kubenswrapper[4958]: I1201 11:40:53.258069 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"88a7ef3f-1cd5-4e71-b574-a52956739b72","Type":"ContainerStarted","Data":"54fea45e70c00875ed28f4106a361d015482ae0420bd432c73890e787b998e42"} Dec 01 11:40:53 crc kubenswrapper[4958]: I1201 11:40:53.259777 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd","Type":"ContainerStarted","Data":"7f14125726b8ca99cf292abcd74f599978723763ecfc74232135eb55c773d30a"} Dec 01 11:40:53 crc kubenswrapper[4958]: I1201 11:40:53.259827 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd","Type":"ContainerStarted","Data":"842283e5e5f49bb2ac2f33f476cdbce5bc7fb7997b589ef7a0b73666959a232a"} Dec 01 11:40:53 crc kubenswrapper[4958]: I1201 11:40:53.259838 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd","Type":"ContainerStarted","Data":"3c9e4515bc24531058efa64351adf380efead8dd322192c7cc411b982ae22934"} Dec 01 11:40:53 crc kubenswrapper[4958]: I1201 11:40:53.261264 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c5b597e-d210-46e3-8115-63c9e1a0d3f8","Type":"ContainerStarted","Data":"9bfb0f233f2702622e8a70624e26420f78190dffcac1bbefaaeff573ab71f078"} Dec 01 11:40:53 crc kubenswrapper[4958]: I1201 11:40:53.283379 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.283350816 podStartE2EDuration="2.283350816s" podCreationTimestamp="2025-12-01 11:40:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:40:53.278315923 +0000 UTC m=+6100.787104960" watchObservedRunningTime="2025-12-01 11:40:53.283350816 +0000 UTC m=+6100.792139853" Dec 01 11:40:53 crc kubenswrapper[4958]: I1201 11:40:53.305923 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.305903026 podStartE2EDuration="2.305903026s" podCreationTimestamp="2025-12-01 11:40:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:40:53.295985334 +0000 UTC m=+6100.804774371" watchObservedRunningTime="2025-12-01 11:40:53.305903026 +0000 UTC m=+6100.814692063" Dec 01 11:40:53 crc kubenswrapper[4958]: I1201 11:40:53.849612 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0" path="/var/lib/kubelet/pods/d3d9370a-9d7d-450f-a7cf-fdedc29c8ff0/volumes" Dec 01 11:40:54 crc kubenswrapper[4958]: I1201 11:40:54.326202 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"8c5b597e-d210-46e3-8115-63c9e1a0d3f8","Type":"ContainerStarted","Data":"a4f04ca1d5ddee31d8f3725f7f1ca800825217c26ce9021d82223ec233bc0491"} Dec 01 11:40:54 crc kubenswrapper[4958]: I1201 11:40:54.326280 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c5b597e-d210-46e3-8115-63c9e1a0d3f8","Type":"ContainerStarted","Data":"54f6cb4420ff9a9bd0758ea58648caec67bf1c96b04f154404936cf4141a0324"} Dec 01 11:40:54 crc kubenswrapper[4958]: I1201 11:40:54.354972 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.354946025 podStartE2EDuration="2.354946025s" podCreationTimestamp="2025-12-01 11:40:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:40:54.349707166 +0000 UTC m=+6101.858496223" watchObservedRunningTime="2025-12-01 11:40:54.354946025 +0000 UTC m=+6101.863735072" Dec 01 11:40:57 crc kubenswrapper[4958]: I1201 11:40:57.071645 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 11:40:57 crc kubenswrapper[4958]: I1201 11:40:57.666666 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 11:40:57 crc kubenswrapper[4958]: I1201 11:40:57.666934 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 11:41:02 crc kubenswrapper[4958]: I1201 11:41:02.072052 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 11:41:02 crc kubenswrapper[4958]: I1201 11:41:02.085521 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 11:41:02 crc kubenswrapper[4958]: I1201 11:41:02.085599 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 11:41:02 crc kubenswrapper[4958]: I1201 11:41:02.123393 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 11:41:02 crc kubenswrapper[4958]: I1201 11:41:02.449303 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 11:41:02 crc kubenswrapper[4958]: I1201 11:41:02.701104 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 11:41:02 crc kubenswrapper[4958]: I1201 11:41:02.701509 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 11:41:03 crc kubenswrapper[4958]: I1201 11:41:03.128060 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b25b61a8-00a3-4d87-a5a6-9e1824ec58cd" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 11:41:03 crc kubenswrapper[4958]: I1201 11:41:03.169067 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b25b61a8-00a3-4d87-a5a6-9e1824ec58cd" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 11:41:03 crc kubenswrapper[4958]: I1201 11:41:03.783062 4958 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-metadata-0" podUID="8c5b597e-d210-46e3-8115-63c9e1a0d3f8" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.72:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 11:41:03 crc kubenswrapper[4958]: I1201 11:41:03.783484 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8c5b597e-d210-46e3-8115-63c9e1a0d3f8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.72:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.091262 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.094090 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.094564 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.099801 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.517519 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.523044 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.687484 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.689351 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.702355 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.723723 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f8549f5b7-c7xt4"] Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.725451 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.759474 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f8549f5b7-c7xt4"] Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.810725 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8549f5b7-c7xt4\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.810897 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8549f5b7-c7xt4\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.810984 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-config\") pod \"dnsmasq-dns-6f8549f5b7-c7xt4\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.811167 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-748n4\" (UniqueName: \"kubernetes.io/projected/e3ab235d-21c4-409f-b09e-4531a1236742-kube-api-access-748n4\") pod \"dnsmasq-dns-6f8549f5b7-c7xt4\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.811245 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-dns-svc\") pod \"dnsmasq-dns-6f8549f5b7-c7xt4\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.912932 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8549f5b7-c7xt4\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.912995 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-config\") pod \"dnsmasq-dns-6f8549f5b7-c7xt4\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.913036 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-748n4\" (UniqueName: \"kubernetes.io/projected/e3ab235d-21c4-409f-b09e-4531a1236742-kube-api-access-748n4\") pod \"dnsmasq-dns-6f8549f5b7-c7xt4\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.913073 4958 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-dns-svc\") pod \"dnsmasq-dns-6f8549f5b7-c7xt4\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.913123 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8549f5b7-c7xt4\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.913972 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-ovsdbserver-nb\") pod \"dnsmasq-dns-6f8549f5b7-c7xt4\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.914275 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-config\") pod \"dnsmasq-dns-6f8549f5b7-c7xt4\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.914351 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-dns-svc\") pod \"dnsmasq-dns-6f8549f5b7-c7xt4\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.914621 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-ovsdbserver-sb\") pod \"dnsmasq-dns-6f8549f5b7-c7xt4\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:12 crc kubenswrapper[4958]: I1201 11:41:12.932481 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-748n4\" (UniqueName: \"kubernetes.io/projected/e3ab235d-21c4-409f-b09e-4531a1236742-kube-api-access-748n4\") pod \"dnsmasq-dns-6f8549f5b7-c7xt4\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:13 crc kubenswrapper[4958]: I1201 11:41:13.054540 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:13 crc kubenswrapper[4958]: I1201 11:41:13.530712 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 11:41:13 crc kubenswrapper[4958]: I1201 11:41:13.629432 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f8549f5b7-c7xt4"] Dec 01 11:41:13 crc kubenswrapper[4958]: W1201 11:41:13.637133 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3ab235d_21c4_409f_b09e_4531a1236742.slice/crio-5c586ab7748ac2edf5cea76a1518ccad49689147233f9021cb46c19f79ad3d80 WatchSource:0}: Error finding container 5c586ab7748ac2edf5cea76a1518ccad49689147233f9021cb46c19f79ad3d80: Status 404 returned error can't find the container with id 5c586ab7748ac2edf5cea76a1518ccad49689147233f9021cb46c19f79ad3d80 Dec 01 11:41:14 crc kubenswrapper[4958]: I1201 11:41:14.542490 4958 generic.go:334] "Generic (PLEG): container finished" podID="e3ab235d-21c4-409f-b09e-4531a1236742" containerID="f1cbbb4809bfb9f22786b421bb92392c8f8fd1de595c6de23dacee186e6a9bc3" exitCode=0 Dec 01 11:41:14 crc kubenswrapper[4958]: I1201 11:41:14.542567 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" event={"ID":"e3ab235d-21c4-409f-b09e-4531a1236742","Type":"ContainerDied","Data":"f1cbbb4809bfb9f22786b421bb92392c8f8fd1de595c6de23dacee186e6a9bc3"} Dec 01 11:41:14 crc kubenswrapper[4958]: I1201 11:41:14.543133 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" event={"ID":"e3ab235d-21c4-409f-b09e-4531a1236742","Type":"ContainerStarted","Data":"5c586ab7748ac2edf5cea76a1518ccad49689147233f9021cb46c19f79ad3d80"} Dec 01 11:41:15 crc kubenswrapper[4958]: I1201 11:41:15.579500 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" event={"ID":"e3ab235d-21c4-409f-b09e-4531a1236742","Type":"ContainerStarted","Data":"efb65b4b9e3e3a2e790377a11e20e78922f8ac4906fe9e5ed5abce2baeb626ec"} Dec 01 11:41:15 crc kubenswrapper[4958]: I1201 11:41:15.581222 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:15 crc kubenswrapper[4958]: I1201 11:41:15.607075 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" podStartSLOduration=3.607049451 podStartE2EDuration="3.607049451s" podCreationTimestamp="2025-12-01 11:41:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:41:15.604361385 +0000 UTC m=+6123.113150422" watchObservedRunningTime="2025-12-01 11:41:15.607049451 +0000 UTC m=+6123.115838528" Dec 01 11:41:21 crc kubenswrapper[4958]: I1201 11:41:21.346046 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cj4xh"] Dec 01 11:41:21 crc kubenswrapper[4958]: I1201 11:41:21.349513 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cj4xh" Dec 01 11:41:21 crc kubenswrapper[4958]: I1201 11:41:21.364191 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cj4xh"] Dec 01 11:41:21 crc kubenswrapper[4958]: I1201 11:41:21.513275 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6php7\" (UniqueName: \"kubernetes.io/projected/60942b38-3086-4839-93cf-97c57da84450-kube-api-access-6php7\") pod \"certified-operators-cj4xh\" (UID: \"60942b38-3086-4839-93cf-97c57da84450\") " pod="openshift-marketplace/certified-operators-cj4xh" Dec 01 11:41:21 crc kubenswrapper[4958]: I1201 11:41:21.513350 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60942b38-3086-4839-93cf-97c57da84450-catalog-content\") pod \"certified-operators-cj4xh\" (UID: \"60942b38-3086-4839-93cf-97c57da84450\") " pod="openshift-marketplace/certified-operators-cj4xh" Dec 01 11:41:21 crc kubenswrapper[4958]: I1201 11:41:21.513415 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60942b38-3086-4839-93cf-97c57da84450-utilities\") pod \"certified-operators-cj4xh\" (UID: \"60942b38-3086-4839-93cf-97c57da84450\") " pod="openshift-marketplace/certified-operators-cj4xh" Dec 01 11:41:21 crc kubenswrapper[4958]: I1201 11:41:21.615084 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6php7\" (UniqueName: \"kubernetes.io/projected/60942b38-3086-4839-93cf-97c57da84450-kube-api-access-6php7\") pod \"certified-operators-cj4xh\" (UID: \"60942b38-3086-4839-93cf-97c57da84450\") " pod="openshift-marketplace/certified-operators-cj4xh" Dec 01 11:41:21 crc kubenswrapper[4958]: I1201 11:41:21.615169 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60942b38-3086-4839-93cf-97c57da84450-catalog-content\") pod \"certified-operators-cj4xh\" (UID: \"60942b38-3086-4839-93cf-97c57da84450\") " pod="openshift-marketplace/certified-operators-cj4xh" Dec 01 11:41:21 crc kubenswrapper[4958]: I1201 11:41:21.615264 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60942b38-3086-4839-93cf-97c57da84450-utilities\") pod \"certified-operators-cj4xh\" (UID: \"60942b38-3086-4839-93cf-97c57da84450\") " pod="openshift-marketplace/certified-operators-cj4xh" Dec 01 11:41:21 crc kubenswrapper[4958]: I1201 11:41:21.615764 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60942b38-3086-4839-93cf-97c57da84450-catalog-content\") pod \"certified-operators-cj4xh\" (UID: \"60942b38-3086-4839-93cf-97c57da84450\") " pod="openshift-marketplace/certified-operators-cj4xh" Dec 01 11:41:21 crc kubenswrapper[4958]: I1201 11:41:21.615791 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60942b38-3086-4839-93cf-97c57da84450-utilities\") pod \"certified-operators-cj4xh\" (UID: \"60942b38-3086-4839-93cf-97c57da84450\") " pod="openshift-marketplace/certified-operators-cj4xh" Dec 01 11:41:21 crc kubenswrapper[4958]: I1201 11:41:21.635491 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6php7\" (UniqueName: \"kubernetes.io/projected/60942b38-3086-4839-93cf-97c57da84450-kube-api-access-6php7\") pod \"certified-operators-cj4xh\" (UID: \"60942b38-3086-4839-93cf-97c57da84450\") " pod="openshift-marketplace/certified-operators-cj4xh" Dec 01 11:41:21 crc kubenswrapper[4958]: I1201 11:41:21.692775 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cj4xh" Dec 01 11:41:22 crc kubenswrapper[4958]: I1201 11:41:22.103964 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cj4xh"] Dec 01 11:41:22 crc kubenswrapper[4958]: I1201 11:41:22.667429 4958 generic.go:334] "Generic (PLEG): container finished" podID="60942b38-3086-4839-93cf-97c57da84450" containerID="b0caa24f1bbc835139a0cf782e492e2ea1bd93e015083ffa9a2a321fdc06c958" exitCode=0 Dec 01 11:41:22 crc kubenswrapper[4958]: I1201 11:41:22.667477 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cj4xh" event={"ID":"60942b38-3086-4839-93cf-97c57da84450","Type":"ContainerDied","Data":"b0caa24f1bbc835139a0cf782e492e2ea1bd93e015083ffa9a2a321fdc06c958"} Dec 01 11:41:22 crc kubenswrapper[4958]: I1201 11:41:22.668632 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cj4xh" event={"ID":"60942b38-3086-4839-93cf-97c57da84450","Type":"ContainerStarted","Data":"b3210ab0cf48058adec9fa57a48bd0d51ee527e771593021c631a1dba4ef37a9"} Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.056150 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.147154 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-968fc5485-pmcmj"] Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.147810 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-968fc5485-pmcmj" podUID="1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb" containerName="dnsmasq-dns" containerID="cri-o://3857b8805d12d606d1985dc4bf5d26ed238125b19ad706c5c17738db92615797" gracePeriod=10 Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.698031 4958 generic.go:334] "Generic (PLEG): container finished" podID="1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb" containerID="3857b8805d12d606d1985dc4bf5d26ed238125b19ad706c5c17738db92615797" exitCode=0 Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.698529 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-968fc5485-pmcmj" event={"ID":"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb","Type":"ContainerDied","Data":"3857b8805d12d606d1985dc4bf5d26ed238125b19ad706c5c17738db92615797"} Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.698559 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-968fc5485-pmcmj" event={"ID":"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb","Type":"ContainerDied","Data":"cc825789a70c692c03f17ba320f7129616f409d5599296a849d52fb6094f5b3c"} Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.698573 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc825789a70c692c03f17ba320f7129616f409d5599296a849d52fb6094f5b3c" Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.700714 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-968fc5485-pmcmj" Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.705489 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cj4xh" event={"ID":"60942b38-3086-4839-93cf-97c57da84450","Type":"ContainerStarted","Data":"5ee5bcc5201e8a03efff3df929e016802b44c3c73568b9e3d457d5fef9b9820c"} Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.879065 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-ovsdbserver-nb\") pod \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.879121 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-ovsdbserver-sb\") pod \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.879199 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-config\") pod \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.879269 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf2tf\" (UniqueName: \"kubernetes.io/projected/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-kube-api-access-tf2tf\") pod \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.879344 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-dns-svc\") pod \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\" (UID: \"1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb\") " Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.900208 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-kube-api-access-tf2tf" (OuterVolumeSpecName: "kube-api-access-tf2tf") pod "1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb" (UID: "1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb"). InnerVolumeSpecName "kube-api-access-tf2tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.931298 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb" (UID: "1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.949371 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb" (UID: "1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.953334 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-config" (OuterVolumeSpecName: "config") pod "1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb" (UID: "1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.969210 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb" (UID: "1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.981823 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.981941 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.981959 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.981973 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-config\") on node \"crc\" DevicePath \"\"" Dec 01 11:41:23 crc kubenswrapper[4958]: I1201 11:41:23.981988 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf2tf\" (UniqueName: \"kubernetes.io/projected/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb-kube-api-access-tf2tf\") on node \"crc\" DevicePath \"\"" Dec 01 11:41:24 crc kubenswrapper[4958]: I1201 11:41:24.721737 4958 generic.go:334] "Generic (PLEG): container finished" podID="60942b38-3086-4839-93cf-97c57da84450" containerID="5ee5bcc5201e8a03efff3df929e016802b44c3c73568b9e3d457d5fef9b9820c" exitCode=0 Dec 01 11:41:24 crc kubenswrapper[4958]: I1201 11:41:24.722143 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-968fc5485-pmcmj" Dec 01 11:41:24 crc kubenswrapper[4958]: I1201 11:41:24.723711 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cj4xh" event={"ID":"60942b38-3086-4839-93cf-97c57da84450","Type":"ContainerDied","Data":"5ee5bcc5201e8a03efff3df929e016802b44c3c73568b9e3d457d5fef9b9820c"} Dec 01 11:41:24 crc kubenswrapper[4958]: I1201 11:41:24.802806 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-968fc5485-pmcmj"] Dec 01 11:41:24 crc kubenswrapper[4958]: I1201 11:41:24.814469 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-968fc5485-pmcmj"] Dec 01 11:41:25 crc kubenswrapper[4958]: I1201 11:41:25.732655 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cj4xh" event={"ID":"60942b38-3086-4839-93cf-97c57da84450","Type":"ContainerStarted","Data":"dbaa71be5c680b9947ce5f6e4b0c646b176ae7bc7ae331e3c3b8503ffd38dcc9"} Dec 01 11:41:25 crc kubenswrapper[4958]: I1201 11:41:25.761668 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cj4xh" podStartSLOduration=2.264029324 podStartE2EDuration="4.761645626s" podCreationTimestamp="2025-12-01 11:41:21 +0000 UTC" firstStartedPulling="2025-12-01 11:41:22.669137836 +0000 UTC m=+6130.177926873" lastFinishedPulling="2025-12-01 11:41:25.166754128 +0000 UTC m=+6132.675543175" observedRunningTime="2025-12-01 11:41:25.752960249 +0000 UTC m=+6133.261749296" watchObservedRunningTime="2025-12-01 11:41:25.761645626 +0000 UTC m=+6133.270434663" Dec 01 11:41:25 crc kubenswrapper[4958]: I1201 11:41:25.824696 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb" path="/var/lib/kubelet/pods/1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb/volumes" Dec 01 11:41:27 crc kubenswrapper[4958]: I1201 11:41:27.229004 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-ttq72"] Dec 01 11:41:27 crc kubenswrapper[4958]: E1201 11:41:27.229634 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb" containerName="dnsmasq-dns" Dec 01 11:41:27 crc kubenswrapper[4958]: I1201 11:41:27.229647 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb" containerName="dnsmasq-dns" Dec 01 11:41:27 crc kubenswrapper[4958]: E1201 11:41:27.229668 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb" containerName="init" Dec 01 11:41:27 crc kubenswrapper[4958]: I1201 11:41:27.229674 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb" containerName="init" Dec 01 11:41:27 crc kubenswrapper[4958]: I1201 11:41:27.229976 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fdc13e8-dc8e-44e1-9a88-70f5cae7c6cb" containerName="dnsmasq-dns" Dec 01 11:41:27 crc kubenswrapper[4958]: I1201 11:41:27.230674 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-ttq72" Dec 01 11:41:27 crc kubenswrapper[4958]: I1201 11:41:27.248469 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ttq72"] Dec 01 11:41:27 crc kubenswrapper[4958]: I1201 11:41:27.344813 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blsg6\" (UniqueName: \"kubernetes.io/projected/492cddee-1cc4-49b1-8be6-078adc4fa108-kube-api-access-blsg6\") pod \"cinder-db-create-ttq72\" (UID: \"492cddee-1cc4-49b1-8be6-078adc4fa108\") " pod="openstack/cinder-db-create-ttq72" Dec 01 11:41:27 crc kubenswrapper[4958]: I1201 11:41:27.446607 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blsg6\" (UniqueName: \"kubernetes.io/projected/492cddee-1cc4-49b1-8be6-078adc4fa108-kube-api-access-blsg6\") pod \"cinder-db-create-ttq72\" (UID: \"492cddee-1cc4-49b1-8be6-078adc4fa108\") " pod="openstack/cinder-db-create-ttq72" Dec 01 11:41:27 crc kubenswrapper[4958]: I1201 11:41:27.466940 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blsg6\" (UniqueName: \"kubernetes.io/projected/492cddee-1cc4-49b1-8be6-078adc4fa108-kube-api-access-blsg6\") pod \"cinder-db-create-ttq72\" (UID: \"492cddee-1cc4-49b1-8be6-078adc4fa108\") " pod="openstack/cinder-db-create-ttq72" Dec 01 11:41:27 crc kubenswrapper[4958]: I1201 11:41:27.549630 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ttq72" Dec 01 11:41:28 crc kubenswrapper[4958]: I1201 11:41:28.038471 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ttq72"] Dec 01 11:41:28 crc kubenswrapper[4958]: I1201 11:41:28.761295 4958 generic.go:334] "Generic (PLEG): container finished" podID="492cddee-1cc4-49b1-8be6-078adc4fa108" containerID="253a387d6e2a43ba415e346df479b409acb30bdb5936a2e10b5acc64f6bac87c" exitCode=0 Dec 01 11:41:28 crc kubenswrapper[4958]: I1201 11:41:28.761368 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ttq72" event={"ID":"492cddee-1cc4-49b1-8be6-078adc4fa108","Type":"ContainerDied","Data":"253a387d6e2a43ba415e346df479b409acb30bdb5936a2e10b5acc64f6bac87c"} Dec 01 11:41:28 crc kubenswrapper[4958]: I1201 11:41:28.761426 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ttq72" event={"ID":"492cddee-1cc4-49b1-8be6-078adc4fa108","Type":"ContainerStarted","Data":"55d454c282a60333548b4b8fe9607082fee8af2049dbef48c753bd3e3c0771ac"} Dec 01 11:41:30 crc kubenswrapper[4958]: I1201 11:41:30.144365 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ttq72" Dec 01 11:41:30 crc kubenswrapper[4958]: I1201 11:41:30.321334 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blsg6\" (UniqueName: \"kubernetes.io/projected/492cddee-1cc4-49b1-8be6-078adc4fa108-kube-api-access-blsg6\") pod \"492cddee-1cc4-49b1-8be6-078adc4fa108\" (UID: \"492cddee-1cc4-49b1-8be6-078adc4fa108\") " Dec 01 11:41:30 crc kubenswrapper[4958]: I1201 11:41:30.326510 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/492cddee-1cc4-49b1-8be6-078adc4fa108-kube-api-access-blsg6" (OuterVolumeSpecName: "kube-api-access-blsg6") pod "492cddee-1cc4-49b1-8be6-078adc4fa108" (UID: "492cddee-1cc4-49b1-8be6-078adc4fa108"). 
InnerVolumeSpecName "kube-api-access-blsg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:41:30 crc kubenswrapper[4958]: I1201 11:41:30.423510 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blsg6\" (UniqueName: \"kubernetes.io/projected/492cddee-1cc4-49b1-8be6-078adc4fa108-kube-api-access-blsg6\") on node \"crc\" DevicePath \"\"" Dec 01 11:41:30 crc kubenswrapper[4958]: I1201 11:41:30.782698 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ttq72" event={"ID":"492cddee-1cc4-49b1-8be6-078adc4fa108","Type":"ContainerDied","Data":"55d454c282a60333548b4b8fe9607082fee8af2049dbef48c753bd3e3c0771ac"} Dec 01 11:41:30 crc kubenswrapper[4958]: I1201 11:41:30.782767 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ttq72" Dec 01 11:41:30 crc kubenswrapper[4958]: I1201 11:41:30.782783 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55d454c282a60333548b4b8fe9607082fee8af2049dbef48c753bd3e3c0771ac" Dec 01 11:41:31 crc kubenswrapper[4958]: I1201 11:41:31.694506 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cj4xh" Dec 01 11:41:31 crc kubenswrapper[4958]: I1201 11:41:31.695212 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cj4xh" Dec 01 11:41:31 crc kubenswrapper[4958]: I1201 11:41:31.777790 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cj4xh" Dec 01 11:41:31 crc kubenswrapper[4958]: I1201 11:41:31.865713 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cj4xh" Dec 01 11:41:32 crc kubenswrapper[4958]: I1201 11:41:32.019606 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cj4xh"] Dec 01 11:41:33 crc kubenswrapper[4958]: I1201 11:41:33.849152 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cj4xh" podUID="60942b38-3086-4839-93cf-97c57da84450" containerName="registry-server" containerID="cri-o://dbaa71be5c680b9947ce5f6e4b0c646b176ae7bc7ae331e3c3b8503ffd38dcc9" gracePeriod=2 Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.413746 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cj4xh" Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.553797 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60942b38-3086-4839-93cf-97c57da84450-utilities\") pod \"60942b38-3086-4839-93cf-97c57da84450\" (UID: \"60942b38-3086-4839-93cf-97c57da84450\") " Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.554194 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6php7\" (UniqueName: \"kubernetes.io/projected/60942b38-3086-4839-93cf-97c57da84450-kube-api-access-6php7\") pod \"60942b38-3086-4839-93cf-97c57da84450\" (UID: \"60942b38-3086-4839-93cf-97c57da84450\") " Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.554300 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60942b38-3086-4839-93cf-97c57da84450-catalog-content\") pod \"60942b38-3086-4839-93cf-97c57da84450\" (UID: \"60942b38-3086-4839-93cf-97c57da84450\") " Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.555102 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60942b38-3086-4839-93cf-97c57da84450-utilities" (OuterVolumeSpecName: "utilities") pod "60942b38-3086-4839-93cf-97c57da84450" (UID: "60942b38-3086-4839-93cf-97c57da84450"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.562116 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60942b38-3086-4839-93cf-97c57da84450-kube-api-access-6php7" (OuterVolumeSpecName: "kube-api-access-6php7") pod "60942b38-3086-4839-93cf-97c57da84450" (UID: "60942b38-3086-4839-93cf-97c57da84450"). InnerVolumeSpecName "kube-api-access-6php7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.625376 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60942b38-3086-4839-93cf-97c57da84450-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60942b38-3086-4839-93cf-97c57da84450" (UID: "60942b38-3086-4839-93cf-97c57da84450"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.655937 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60942b38-3086-4839-93cf-97c57da84450-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.655981 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60942b38-3086-4839-93cf-97c57da84450-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.655998 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6php7\" (UniqueName: \"kubernetes.io/projected/60942b38-3086-4839-93cf-97c57da84450-kube-api-access-6php7\") on node \"crc\" DevicePath \"\"" Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.861724 4958 generic.go:334] "Generic (PLEG): container finished" podID="60942b38-3086-4839-93cf-97c57da84450" containerID="dbaa71be5c680b9947ce5f6e4b0c646b176ae7bc7ae331e3c3b8503ffd38dcc9" exitCode=0 Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.861765 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cj4xh" event={"ID":"60942b38-3086-4839-93cf-97c57da84450","Type":"ContainerDied","Data":"dbaa71be5c680b9947ce5f6e4b0c646b176ae7bc7ae331e3c3b8503ffd38dcc9"} Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.861790 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cj4xh" event={"ID":"60942b38-3086-4839-93cf-97c57da84450","Type":"ContainerDied","Data":"b3210ab0cf48058adec9fa57a48bd0d51ee527e771593021c631a1dba4ef37a9"} Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.861807 4958 scope.go:117] "RemoveContainer" containerID="dbaa71be5c680b9947ce5f6e4b0c646b176ae7bc7ae331e3c3b8503ffd38dcc9" Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.861918 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cj4xh" Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.888821 4958 scope.go:117] "RemoveContainer" containerID="5ee5bcc5201e8a03efff3df929e016802b44c3c73568b9e3d457d5fef9b9820c" Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.900803 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cj4xh"] Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.907858 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cj4xh"] Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.914286 4958 scope.go:117] "RemoveContainer" containerID="b0caa24f1bbc835139a0cf782e492e2ea1bd93e015083ffa9a2a321fdc06c958" Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.947634 4958 scope.go:117] "RemoveContainer" containerID="dbaa71be5c680b9947ce5f6e4b0c646b176ae7bc7ae331e3c3b8503ffd38dcc9" Dec 01 11:41:34 crc kubenswrapper[4958]: E1201 11:41:34.948095 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbaa71be5c680b9947ce5f6e4b0c646b176ae7bc7ae331e3c3b8503ffd38dcc9\": container with ID starting with dbaa71be5c680b9947ce5f6e4b0c646b176ae7bc7ae331e3c3b8503ffd38dcc9 not found: ID does not exist" containerID="dbaa71be5c680b9947ce5f6e4b0c646b176ae7bc7ae331e3c3b8503ffd38dcc9" Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.948154 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbaa71be5c680b9947ce5f6e4b0c646b176ae7bc7ae331e3c3b8503ffd38dcc9"} err="failed to get container status \"dbaa71be5c680b9947ce5f6e4b0c646b176ae7bc7ae331e3c3b8503ffd38dcc9\": rpc error: code = NotFound desc = could not find container \"dbaa71be5c680b9947ce5f6e4b0c646b176ae7bc7ae331e3c3b8503ffd38dcc9\": container with ID starting with dbaa71be5c680b9947ce5f6e4b0c646b176ae7bc7ae331e3c3b8503ffd38dcc9 not found: ID does not exist" Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.948184 4958 scope.go:117] "RemoveContainer" containerID="5ee5bcc5201e8a03efff3df929e016802b44c3c73568b9e3d457d5fef9b9820c" Dec 01 11:41:34 crc kubenswrapper[4958]: E1201 11:41:34.948740 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ee5bcc5201e8a03efff3df929e016802b44c3c73568b9e3d457d5fef9b9820c\": container with ID starting with 5ee5bcc5201e8a03efff3df929e016802b44c3c73568b9e3d457d5fef9b9820c not found: ID does not exist" containerID="5ee5bcc5201e8a03efff3df929e016802b44c3c73568b9e3d457d5fef9b9820c" Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.948786 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee5bcc5201e8a03efff3df929e016802b44c3c73568b9e3d457d5fef9b9820c"} err="failed to get container status \"5ee5bcc5201e8a03efff3df929e016802b44c3c73568b9e3d457d5fef9b9820c\": rpc error: code = NotFound desc = could not find container \"5ee5bcc5201e8a03efff3df929e016802b44c3c73568b9e3d457d5fef9b9820c\": container with ID starting with 5ee5bcc5201e8a03efff3df929e016802b44c3c73568b9e3d457d5fef9b9820c not found: ID does not exist" Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.948816 4958 scope.go:117] "RemoveContainer" containerID="b0caa24f1bbc835139a0cf782e492e2ea1bd93e015083ffa9a2a321fdc06c958" Dec 01 11:41:34 crc kubenswrapper[4958]: E1201 11:41:34.949243 4958 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b0caa24f1bbc835139a0cf782e492e2ea1bd93e015083ffa9a2a321fdc06c958\": container with ID starting with b0caa24f1bbc835139a0cf782e492e2ea1bd93e015083ffa9a2a321fdc06c958 not found: ID does not exist" containerID="b0caa24f1bbc835139a0cf782e492e2ea1bd93e015083ffa9a2a321fdc06c958" Dec 01 11:41:34 crc kubenswrapper[4958]: I1201 11:41:34.949332 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0caa24f1bbc835139a0cf782e492e2ea1bd93e015083ffa9a2a321fdc06c958"} err="failed to get container status \"b0caa24f1bbc835139a0cf782e492e2ea1bd93e015083ffa9a2a321fdc06c958\": rpc error: code = NotFound desc = could not find container \"b0caa24f1bbc835139a0cf782e492e2ea1bd93e015083ffa9a2a321fdc06c958\": container with ID starting with b0caa24f1bbc835139a0cf782e492e2ea1bd93e015083ffa9a2a321fdc06c958 not found: ID does not exist" Dec 01 11:41:35 crc kubenswrapper[4958]: I1201 11:41:35.812006 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60942b38-3086-4839-93cf-97c57da84450" path="/var/lib/kubelet/pods/60942b38-3086-4839-93cf-97c57da84450/volumes" Dec 01 11:41:37 crc kubenswrapper[4958]: I1201 11:41:37.314653 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-728c-account-create-d65k4"] Dec 01 11:41:37 crc kubenswrapper[4958]: E1201 11:41:37.315208 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60942b38-3086-4839-93cf-97c57da84450" containerName="extract-utilities" Dec 01 11:41:37 crc kubenswrapper[4958]: I1201 11:41:37.315226 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="60942b38-3086-4839-93cf-97c57da84450" containerName="extract-utilities" Dec 01 11:41:37 crc kubenswrapper[4958]: E1201 11:41:37.315251 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="492cddee-1cc4-49b1-8be6-078adc4fa108" containerName="mariadb-database-create" Dec 01 11:41:37 crc kubenswrapper[4958]: I1201 11:41:37.315270 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="492cddee-1cc4-49b1-8be6-078adc4fa108" containerName="mariadb-database-create" Dec 01 11:41:37 crc kubenswrapper[4958]: E1201 11:41:37.315289 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60942b38-3086-4839-93cf-97c57da84450" containerName="extract-content" Dec 01 11:41:37 crc kubenswrapper[4958]: I1201 11:41:37.315298 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="60942b38-3086-4839-93cf-97c57da84450" containerName="extract-content" Dec 01 11:41:37 crc kubenswrapper[4958]: E1201 11:41:37.315324 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60942b38-3086-4839-93cf-97c57da84450" containerName="registry-server" Dec 01 11:41:37 crc kubenswrapper[4958]: I1201 11:41:37.315331 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="60942b38-3086-4839-93cf-97c57da84450" containerName="registry-server" Dec 01 11:41:37 crc kubenswrapper[4958]: I1201 11:41:37.315560 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="492cddee-1cc4-49b1-8be6-078adc4fa108" containerName="mariadb-database-create" Dec 01 11:41:37 crc kubenswrapper[4958]: I1201 11:41:37.315584 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="60942b38-3086-4839-93cf-97c57da84450" containerName="registry-server" Dec 01 11:41:37 crc kubenswrapper[4958]: I1201 11:41:37.316407 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-728c-account-create-d65k4" Dec 01 11:41:37 crc kubenswrapper[4958]: I1201 11:41:37.319889 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 01 11:41:37 crc kubenswrapper[4958]: I1201 11:41:37.325055 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-728c-account-create-d65k4"] Dec 01 11:41:37 crc kubenswrapper[4958]: I1201 11:41:37.422452 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f6cb\" (UniqueName: \"kubernetes.io/projected/ce588a72-84ad-4d78-a641-4bd415c7ff03-kube-api-access-5f6cb\") pod \"cinder-728c-account-create-d65k4\" (UID: \"ce588a72-84ad-4d78-a641-4bd415c7ff03\") " pod="openstack/cinder-728c-account-create-d65k4" Dec 01 11:41:37 crc kubenswrapper[4958]: I1201 11:41:37.524676 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f6cb\" (UniqueName: \"kubernetes.io/projected/ce588a72-84ad-4d78-a641-4bd415c7ff03-kube-api-access-5f6cb\") pod \"cinder-728c-account-create-d65k4\" (UID: \"ce588a72-84ad-4d78-a641-4bd415c7ff03\") " pod="openstack/cinder-728c-account-create-d65k4" Dec 01 11:41:37 crc kubenswrapper[4958]: I1201 11:41:37.557623 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f6cb\" (UniqueName: \"kubernetes.io/projected/ce588a72-84ad-4d78-a641-4bd415c7ff03-kube-api-access-5f6cb\") pod \"cinder-728c-account-create-d65k4\" (UID: \"ce588a72-84ad-4d78-a641-4bd415c7ff03\") " pod="openstack/cinder-728c-account-create-d65k4" Dec 01 11:41:37 crc kubenswrapper[4958]: I1201 11:41:37.640741 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-728c-account-create-d65k4" Dec 01 11:41:38 crc kubenswrapper[4958]: I1201 11:41:38.229817 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-728c-account-create-d65k4"] Dec 01 11:41:38 crc kubenswrapper[4958]: I1201 11:41:38.930825 4958 generic.go:334] "Generic (PLEG): container finished" podID="ce588a72-84ad-4d78-a641-4bd415c7ff03" containerID="433ee3aa21c9e05d2b5fc308fd14344d8440ba40d7125e0c83738b7900e9418e" exitCode=0 Dec 01 11:41:38 crc kubenswrapper[4958]: I1201 11:41:38.930988 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-728c-account-create-d65k4" event={"ID":"ce588a72-84ad-4d78-a641-4bd415c7ff03","Type":"ContainerDied","Data":"433ee3aa21c9e05d2b5fc308fd14344d8440ba40d7125e0c83738b7900e9418e"} Dec 01 11:41:38 crc kubenswrapper[4958]: I1201 11:41:38.931139 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-728c-account-create-d65k4" event={"ID":"ce588a72-84ad-4d78-a641-4bd415c7ff03","Type":"ContainerStarted","Data":"5c1acaf5f2564c54432b9502e77d212ae0ed9f89b52f8317ec0ac89689da27ea"} Dec 01 11:41:40 crc kubenswrapper[4958]: I1201 11:41:40.436452 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-728c-account-create-d65k4" Dec 01 11:41:40 crc kubenswrapper[4958]: I1201 11:41:40.498489 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f6cb\" (UniqueName: \"kubernetes.io/projected/ce588a72-84ad-4d78-a641-4bd415c7ff03-kube-api-access-5f6cb\") pod \"ce588a72-84ad-4d78-a641-4bd415c7ff03\" (UID: \"ce588a72-84ad-4d78-a641-4bd415c7ff03\") " Dec 01 11:41:40 crc kubenswrapper[4958]: I1201 11:41:40.508332 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce588a72-84ad-4d78-a641-4bd415c7ff03-kube-api-access-5f6cb" (OuterVolumeSpecName: "kube-api-access-5f6cb") pod "ce588a72-84ad-4d78-a641-4bd415c7ff03" (UID: "ce588a72-84ad-4d78-a641-4bd415c7ff03"). InnerVolumeSpecName "kube-api-access-5f6cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:41:40 crc kubenswrapper[4958]: I1201 11:41:40.599548 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f6cb\" (UniqueName: \"kubernetes.io/projected/ce588a72-84ad-4d78-a641-4bd415c7ff03-kube-api-access-5f6cb\") on node \"crc\" DevicePath \"\"" Dec 01 11:41:40 crc kubenswrapper[4958]: I1201 11:41:40.954030 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-728c-account-create-d65k4" event={"ID":"ce588a72-84ad-4d78-a641-4bd415c7ff03","Type":"ContainerDied","Data":"5c1acaf5f2564c54432b9502e77d212ae0ed9f89b52f8317ec0ac89689da27ea"} Dec 01 11:41:40 crc kubenswrapper[4958]: I1201 11:41:40.954122 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c1acaf5f2564c54432b9502e77d212ae0ed9f89b52f8317ec0ac89689da27ea" Dec 01 11:41:40 crc kubenswrapper[4958]: I1201 11:41:40.954193 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-728c-account-create-d65k4" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.462788 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-pbb4g"] Dec 01 11:41:42 crc kubenswrapper[4958]: E1201 11:41:42.463449 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce588a72-84ad-4d78-a641-4bd415c7ff03" containerName="mariadb-account-create" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.463463 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce588a72-84ad-4d78-a641-4bd415c7ff03" containerName="mariadb-account-create" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.463662 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce588a72-84ad-4d78-a641-4bd415c7ff03" containerName="mariadb-account-create" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.464369 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.466972 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.467136 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.467736 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-b864r" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.477170 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-pbb4g"] Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.638452 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-combined-ca-bundle\") pod \"cinder-db-sync-pbb4g\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.638553 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-config-data\") pod \"cinder-db-sync-pbb4g\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.638860 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tjlj\" (UniqueName: \"kubernetes.io/projected/38f0913e-7c66-4a70-866c-6fa041c5a169-kube-api-access-4tjlj\") pod \"cinder-db-sync-pbb4g\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.639006 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38f0913e-7c66-4a70-866c-6fa041c5a169-etc-machine-id\") pod \"cinder-db-sync-pbb4g\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.639093 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-scripts\") pod \"cinder-db-sync-pbb4g\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.639341 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-db-sync-config-data\") pod \"cinder-db-sync-pbb4g\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.741352 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-db-sync-config-data\") pod \"cinder-db-sync-pbb4g\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.741420 4958 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-combined-ca-bundle\") pod \"cinder-db-sync-pbb4g\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.741474 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-config-data\") pod \"cinder-db-sync-pbb4g\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.741507 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tjlj\" (UniqueName: \"kubernetes.io/projected/38f0913e-7c66-4a70-866c-6fa041c5a169-kube-api-access-4tjlj\") pod \"cinder-db-sync-pbb4g\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.741546 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38f0913e-7c66-4a70-866c-6fa041c5a169-etc-machine-id\") pod \"cinder-db-sync-pbb4g\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.741575 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-scripts\") pod \"cinder-db-sync-pbb4g\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.742540 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38f0913e-7c66-4a70-866c-6fa041c5a169-etc-machine-id\") pod \"cinder-db-sync-pbb4g\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.748468 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-db-sync-config-data\") pod \"cinder-db-sync-pbb4g\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.748500 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-scripts\") pod \"cinder-db-sync-pbb4g\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.748828 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-config-data\") pod \"cinder-db-sync-pbb4g\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.753447 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-combined-ca-bundle\") pod \"cinder-db-sync-pbb4g\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " 
pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.769399 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tjlj\" (UniqueName: \"kubernetes.io/projected/38f0913e-7c66-4a70-866c-6fa041c5a169-kube-api-access-4tjlj\") pod \"cinder-db-sync-pbb4g\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:42 crc kubenswrapper[4958]: I1201 11:41:42.787982 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:43 crc kubenswrapper[4958]: I1201 11:41:43.339801 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-pbb4g"] Dec 01 11:41:43 crc kubenswrapper[4958]: W1201 11:41:43.347208 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38f0913e_7c66_4a70_866c_6fa041c5a169.slice/crio-6ba5aca0ccdba27df09efc73c772bab2ace4307ccead76f73a8fdcde4d9ae65b WatchSource:0}: Error finding container 6ba5aca0ccdba27df09efc73c772bab2ace4307ccead76f73a8fdcde4d9ae65b: Status 404 returned error can't find the container with id 6ba5aca0ccdba27df09efc73c772bab2ace4307ccead76f73a8fdcde4d9ae65b Dec 01 11:41:43 crc kubenswrapper[4958]: I1201 11:41:43.997462 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-pbb4g" event={"ID":"38f0913e-7c66-4a70-866c-6fa041c5a169","Type":"ContainerStarted","Data":"6ba5aca0ccdba27df09efc73c772bab2ace4307ccead76f73a8fdcde4d9ae65b"} Dec 01 11:41:45 crc kubenswrapper[4958]: I1201 11:41:45.007036 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-pbb4g" event={"ID":"38f0913e-7c66-4a70-866c-6fa041c5a169","Type":"ContainerStarted","Data":"91afaee0e12431a85fdc4e2a2e3fe78638367e7086d12a1eaeb966d9c084fdfc"} Dec 01 11:41:45 crc kubenswrapper[4958]: I1201 11:41:45.039091 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-pbb4g" podStartSLOduration=3.039062044 podStartE2EDuration="3.039062044s" podCreationTimestamp="2025-12-01 11:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:41:45.03117137 +0000 UTC m=+6152.539960407" watchObservedRunningTime="2025-12-01 11:41:45.039062044 +0000 UTC m=+6152.547851101" Dec 01 11:41:48 crc kubenswrapper[4958]: I1201 11:41:48.083448 4958 generic.go:334] "Generic (PLEG): container finished" podID="38f0913e-7c66-4a70-866c-6fa041c5a169" containerID="91afaee0e12431a85fdc4e2a2e3fe78638367e7086d12a1eaeb966d9c084fdfc" exitCode=0 Dec 01 11:41:48 crc kubenswrapper[4958]: I1201 11:41:48.083828 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-pbb4g" event={"ID":"38f0913e-7c66-4a70-866c-6fa041c5a169","Type":"ContainerDied","Data":"91afaee0e12431a85fdc4e2a2e3fe78638367e7086d12a1eaeb966d9c084fdfc"} Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.552514 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.693278 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-db-sync-config-data\") pod \"38f0913e-7c66-4a70-866c-6fa041c5a169\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.693412 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38f0913e-7c66-4a70-866c-6fa041c5a169-etc-machine-id\") pod \"38f0913e-7c66-4a70-866c-6fa041c5a169\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.693463 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tjlj\" (UniqueName: \"kubernetes.io/projected/38f0913e-7c66-4a70-866c-6fa041c5a169-kube-api-access-4tjlj\") pod \"38f0913e-7c66-4a70-866c-6fa041c5a169\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.693544 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38f0913e-7c66-4a70-866c-6fa041c5a169-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "38f0913e-7c66-4a70-866c-6fa041c5a169" (UID: "38f0913e-7c66-4a70-866c-6fa041c5a169"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.693578 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-scripts\") pod \"38f0913e-7c66-4a70-866c-6fa041c5a169\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.693640 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-combined-ca-bundle\") pod \"38f0913e-7c66-4a70-866c-6fa041c5a169\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.693683 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-config-data\") pod \"38f0913e-7c66-4a70-866c-6fa041c5a169\" (UID: \"38f0913e-7c66-4a70-866c-6fa041c5a169\") " Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.694145 4958 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38f0913e-7c66-4a70-866c-6fa041c5a169-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.699329 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-scripts" (OuterVolumeSpecName: "scripts") pod "38f0913e-7c66-4a70-866c-6fa041c5a169" (UID: "38f0913e-7c66-4a70-866c-6fa041c5a169"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.700191 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "38f0913e-7c66-4a70-866c-6fa041c5a169" (UID: "38f0913e-7c66-4a70-866c-6fa041c5a169"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.701351 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38f0913e-7c66-4a70-866c-6fa041c5a169-kube-api-access-4tjlj" (OuterVolumeSpecName: "kube-api-access-4tjlj") pod "38f0913e-7c66-4a70-866c-6fa041c5a169" (UID: "38f0913e-7c66-4a70-866c-6fa041c5a169"). InnerVolumeSpecName "kube-api-access-4tjlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.725631 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38f0913e-7c66-4a70-866c-6fa041c5a169" (UID: "38f0913e-7c66-4a70-866c-6fa041c5a169"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.770065 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-config-data" (OuterVolumeSpecName: "config-data") pod "38f0913e-7c66-4a70-866c-6fa041c5a169" (UID: "38f0913e-7c66-4a70-866c-6fa041c5a169"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.795760 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.795820 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.795831 4958 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.795861 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tjlj\" (UniqueName: \"kubernetes.io/projected/38f0913e-7c66-4a70-866c-6fa041c5a169-kube-api-access-4tjlj\") on node \"crc\" DevicePath \"\"" Dec 01 11:41:49 crc kubenswrapper[4958]: I1201 11:41:49.795876 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f0913e-7c66-4a70-866c-6fa041c5a169-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.112713 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-pbb4g" event={"ID":"38f0913e-7c66-4a70-866c-6fa041c5a169","Type":"ContainerDied","Data":"6ba5aca0ccdba27df09efc73c772bab2ace4307ccead76f73a8fdcde4d9ae65b"} Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.112761 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ba5aca0ccdba27df09efc73c772bab2ace4307ccead76f73a8fdcde4d9ae65b" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.112782 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-pbb4g" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.508071 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c5cb9b577-kfwp4"] Dec 01 11:41:50 crc kubenswrapper[4958]: E1201 11:41:50.508580 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f0913e-7c66-4a70-866c-6fa041c5a169" containerName="cinder-db-sync" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.508602 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f0913e-7c66-4a70-866c-6fa041c5a169" containerName="cinder-db-sync" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.508870 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f0913e-7c66-4a70-866c-6fa041c5a169" containerName="cinder-db-sync" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.510100 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.518478 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c5cb9b577-kfwp4"] Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.519450 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-dns-svc\") pod \"dnsmasq-dns-7c5cb9b577-kfwp4\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.519527 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-ovsdbserver-nb\") pod \"dnsmasq-dns-7c5cb9b577-kfwp4\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.519590 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-ovsdbserver-sb\") pod \"dnsmasq-dns-7c5cb9b577-kfwp4\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.519654 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-config\") pod \"dnsmasq-dns-7c5cb9b577-kfwp4\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.519683 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt6jz\" (UniqueName: \"kubernetes.io/projected/aebcbc29-628a-4001-aa42-8434eec96ebc-kube-api-access-jt6jz\") pod \"dnsmasq-dns-7c5cb9b577-kfwp4\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.624641 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-dns-svc\") pod \"dnsmasq-dns-7c5cb9b577-kfwp4\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.625021 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-ovsdbserver-nb\") pod \"dnsmasq-dns-7c5cb9b577-kfwp4\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.625107 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-ovsdbserver-sb\") pod \"dnsmasq-dns-7c5cb9b577-kfwp4\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.625198 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-config\") pod \"dnsmasq-dns-7c5cb9b577-kfwp4\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.625249 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt6jz\" (UniqueName: \"kubernetes.io/projected/aebcbc29-628a-4001-aa42-8434eec96ebc-kube-api-access-jt6jz\") pod \"dnsmasq-dns-7c5cb9b577-kfwp4\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.626680 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-dns-svc\") pod \"dnsmasq-dns-7c5cb9b577-kfwp4\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.627664 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-ovsdbserver-nb\") pod \"dnsmasq-dns-7c5cb9b577-kfwp4\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.627938 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-ovsdbserver-sb\") pod \"dnsmasq-dns-7c5cb9b577-kfwp4\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.627987 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-config\") pod \"dnsmasq-dns-7c5cb9b577-kfwp4\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.644699 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.646273 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt6jz\" (UniqueName: \"kubernetes.io/projected/aebcbc29-628a-4001-aa42-8434eec96ebc-kube-api-access-jt6jz\") pod \"dnsmasq-dns-7c5cb9b577-kfwp4\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.646299 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.653268 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.653575 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.653667 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.653756 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-b864r" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.656532 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.837516 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.894369 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-config-data\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.894411 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-scripts\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.894432 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-config-data-custom\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.894470 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7607cce4-9386-41b0-aa60-4993250abc80-logs\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.894643 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.894729 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7607cce4-9386-41b0-aa60-4993250abc80-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.895009 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjh9v\" (UniqueName: 
\"kubernetes.io/projected/7607cce4-9386-41b0-aa60-4993250abc80-kube-api-access-tjh9v\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.996999 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-config-data-custom\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.997317 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7607cce4-9386-41b0-aa60-4993250abc80-logs\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.997349 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.997372 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7607cce4-9386-41b0-aa60-4993250abc80-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.997436 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjh9v\" (UniqueName: \"kubernetes.io/projected/7607cce4-9386-41b0-aa60-4993250abc80-kube-api-access-tjh9v\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.997520 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-config-data\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.997541 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-scripts\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:50 crc kubenswrapper[4958]: I1201 11:41:50.997900 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7607cce4-9386-41b0-aa60-4993250abc80-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:51 crc kubenswrapper[4958]: I1201 11:41:51.000121 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7607cce4-9386-41b0-aa60-4993250abc80-logs\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:51 crc kubenswrapper[4958]: I1201 11:41:51.005186 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:51 crc kubenswrapper[4958]: I1201 11:41:51.005644 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-config-data\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:51 crc kubenswrapper[4958]: I1201 11:41:51.005882 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-config-data-custom\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:51 crc kubenswrapper[4958]: I1201 11:41:51.009976 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-scripts\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:51 crc kubenswrapper[4958]: I1201 11:41:51.037406 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjh9v\" (UniqueName: \"kubernetes.io/projected/7607cce4-9386-41b0-aa60-4993250abc80-kube-api-access-tjh9v\") pod \"cinder-api-0\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " pod="openstack/cinder-api-0" Dec 01 11:41:51 crc kubenswrapper[4958]: I1201 11:41:51.109452 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 11:41:51 crc kubenswrapper[4958]: I1201 11:41:51.357782 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c5cb9b577-kfwp4"] Dec 01 11:41:51 crc kubenswrapper[4958]: W1201 11:41:51.362096 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaebcbc29_628a_4001_aa42_8434eec96ebc.slice/crio-2e2828b0eb574efaf9c1ce9f71a05559e7c7a2043fafb1c7db32de64d5fe9da4 WatchSource:0}: Error finding container 2e2828b0eb574efaf9c1ce9f71a05559e7c7a2043fafb1c7db32de64d5fe9da4: Status 404 returned error can't find the container with id 2e2828b0eb574efaf9c1ce9f71a05559e7c7a2043fafb1c7db32de64d5fe9da4 Dec 01 11:41:51 crc kubenswrapper[4958]: I1201 11:41:51.663114 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 11:41:51 crc kubenswrapper[4958]: W1201 11:41:51.668411 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7607cce4_9386_41b0_aa60_4993250abc80.slice/crio-abe13bff091b3e72f4f69bc38e4b67e3cf984e66376d2929de02ce351a8e915b WatchSource:0}: Error finding container abe13bff091b3e72f4f69bc38e4b67e3cf984e66376d2929de02ce351a8e915b: Status 404 returned error can't find the container with id abe13bff091b3e72f4f69bc38e4b67e3cf984e66376d2929de02ce351a8e915b Dec 01 11:41:52 crc kubenswrapper[4958]: I1201 11:41:52.189807 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7607cce4-9386-41b0-aa60-4993250abc80","Type":"ContainerStarted","Data":"abe13bff091b3e72f4f69bc38e4b67e3cf984e66376d2929de02ce351a8e915b"} Dec 01 11:41:52 crc kubenswrapper[4958]: I1201 11:41:52.205174 4958 generic.go:334] "Generic (PLEG): container finished" 
podID="aebcbc29-628a-4001-aa42-8434eec96ebc" containerID="c5af407014dadd8ae4487e03a9c35ac6cea1a92816885311f998510384e331b6" exitCode=0 Dec 01 11:41:52 crc kubenswrapper[4958]: I1201 11:41:52.205251 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" event={"ID":"aebcbc29-628a-4001-aa42-8434eec96ebc","Type":"ContainerDied","Data":"c5af407014dadd8ae4487e03a9c35ac6cea1a92816885311f998510384e331b6"} Dec 01 11:41:52 crc kubenswrapper[4958]: I1201 11:41:52.205306 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" event={"ID":"aebcbc29-628a-4001-aa42-8434eec96ebc","Type":"ContainerStarted","Data":"2e2828b0eb574efaf9c1ce9f71a05559e7c7a2043fafb1c7db32de64d5fe9da4"} Dec 01 11:41:53 crc kubenswrapper[4958]: I1201 11:41:53.230807 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7607cce4-9386-41b0-aa60-4993250abc80","Type":"ContainerStarted","Data":"91a9f05c28b8c841cd918ba669a942a507ad7cd5f755e3e0a916d9ae82816934"} Dec 01 11:41:53 crc kubenswrapper[4958]: I1201 11:41:53.248579 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" event={"ID":"aebcbc29-628a-4001-aa42-8434eec96ebc","Type":"ContainerStarted","Data":"2f099fd612b4b532b2f319f5b523be15b81d5eff9b8750f149bfdabba83f98f6"} Dec 01 11:41:53 crc kubenswrapper[4958]: I1201 11:41:53.249475 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:41:53 crc kubenswrapper[4958]: I1201 11:41:53.277282 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" podStartSLOduration=3.277260107 podStartE2EDuration="3.277260107s" podCreationTimestamp="2025-12-01 11:41:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:41:53.275546328 +0000 UTC m=+6160.784335365" watchObservedRunningTime="2025-12-01 11:41:53.277260107 +0000 UTC m=+6160.786049144" Dec 01 11:41:54 crc kubenswrapper[4958]: I1201 11:41:54.276168 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7607cce4-9386-41b0-aa60-4993250abc80","Type":"ContainerStarted","Data":"3d5f046d24182d8829b81adb9591424bc44352883247f53fb1573a4de2b87699"} Dec 01 11:41:54 crc kubenswrapper[4958]: I1201 11:41:54.310463 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.310447257 podStartE2EDuration="4.310447257s" podCreationTimestamp="2025-12-01 11:41:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:41:54.307451191 +0000 UTC m=+6161.816240229" watchObservedRunningTime="2025-12-01 11:41:54.310447257 +0000 UTC m=+6161.819236284" Dec 01 11:41:55 crc kubenswrapper[4958]: I1201 11:41:55.290108 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 11:41:58 crc kubenswrapper[4958]: I1201 11:41:58.210602 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:41:58 crc kubenswrapper[4958]: I1201 11:41:58.211073 
4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:42:00 crc kubenswrapper[4958]: I1201 11:42:00.840089 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:42:00 crc kubenswrapper[4958]: I1201 11:42:00.937770 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8549f5b7-c7xt4"] Dec 01 11:42:00 crc kubenswrapper[4958]: I1201 11:42:00.938348 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" podUID="e3ab235d-21c4-409f-b09e-4531a1236742" containerName="dnsmasq-dns" containerID="cri-o://efb65b4b9e3e3a2e790377a11e20e78922f8ac4906fe9e5ed5abce2baeb626ec" gracePeriod=10 Dec 01 11:42:01 crc kubenswrapper[4958]: I1201 11:42:01.354572 4958 generic.go:334] "Generic (PLEG): container finished" podID="e3ab235d-21c4-409f-b09e-4531a1236742" containerID="efb65b4b9e3e3a2e790377a11e20e78922f8ac4906fe9e5ed5abce2baeb626ec" exitCode=0 Dec 01 11:42:01 crc kubenswrapper[4958]: I1201 11:42:01.354628 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" event={"ID":"e3ab235d-21c4-409f-b09e-4531a1236742","Type":"ContainerDied","Data":"efb65b4b9e3e3a2e790377a11e20e78922f8ac4906fe9e5ed5abce2baeb626ec"} Dec 01 11:42:01 crc kubenswrapper[4958]: I1201 11:42:01.656169 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:42:01 crc kubenswrapper[4958]: I1201 11:42:01.740301 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-ovsdbserver-nb\") pod \"e3ab235d-21c4-409f-b09e-4531a1236742\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " Dec 01 11:42:01 crc kubenswrapper[4958]: I1201 11:42:01.740427 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-dns-svc\") pod \"e3ab235d-21c4-409f-b09e-4531a1236742\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " Dec 01 11:42:01 crc kubenswrapper[4958]: I1201 11:42:01.740462 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-config\") pod \"e3ab235d-21c4-409f-b09e-4531a1236742\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " Dec 01 11:42:01 crc kubenswrapper[4958]: I1201 11:42:01.740543 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-ovsdbserver-sb\") pod \"e3ab235d-21c4-409f-b09e-4531a1236742\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " Dec 01 11:42:01 crc kubenswrapper[4958]: I1201 11:42:01.740609 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-748n4\" (UniqueName: \"kubernetes.io/projected/e3ab235d-21c4-409f-b09e-4531a1236742-kube-api-access-748n4\") pod \"e3ab235d-21c4-409f-b09e-4531a1236742\" (UID: \"e3ab235d-21c4-409f-b09e-4531a1236742\") " Dec 01 
11:42:01 crc kubenswrapper[4958]: I1201 11:42:01.748156 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3ab235d-21c4-409f-b09e-4531a1236742-kube-api-access-748n4" (OuterVolumeSpecName: "kube-api-access-748n4") pod "e3ab235d-21c4-409f-b09e-4531a1236742" (UID: "e3ab235d-21c4-409f-b09e-4531a1236742"). InnerVolumeSpecName "kube-api-access-748n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:42:01 crc kubenswrapper[4958]: I1201 11:42:01.790813 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e3ab235d-21c4-409f-b09e-4531a1236742" (UID: "e3ab235d-21c4-409f-b09e-4531a1236742"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:42:01 crc kubenswrapper[4958]: I1201 11:42:01.808659 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-config" (OuterVolumeSpecName: "config") pod "e3ab235d-21c4-409f-b09e-4531a1236742" (UID: "e3ab235d-21c4-409f-b09e-4531a1236742"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:42:01 crc kubenswrapper[4958]: I1201 11:42:01.813202 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e3ab235d-21c4-409f-b09e-4531a1236742" (UID: "e3ab235d-21c4-409f-b09e-4531a1236742"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:42:01 crc kubenswrapper[4958]: I1201 11:42:01.842569 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:01 crc kubenswrapper[4958]: I1201 11:42:01.843412 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-config\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:01 crc kubenswrapper[4958]: I1201 11:42:01.843511 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:01 crc kubenswrapper[4958]: I1201 11:42:01.843595 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-748n4\" (UniqueName: \"kubernetes.io/projected/e3ab235d-21c4-409f-b09e-4531a1236742-kube-api-access-748n4\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:01 crc kubenswrapper[4958]: I1201 11:42:01.851218 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e3ab235d-21c4-409f-b09e-4531a1236742" (UID: "e3ab235d-21c4-409f-b09e-4531a1236742"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:42:01 crc kubenswrapper[4958]: I1201 11:42:01.945656 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3ab235d-21c4-409f-b09e-4531a1236742-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.366649 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" event={"ID":"e3ab235d-21c4-409f-b09e-4531a1236742","Type":"ContainerDied","Data":"5c586ab7748ac2edf5cea76a1518ccad49689147233f9021cb46c19f79ad3d80"} Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.366958 4958 scope.go:117] "RemoveContainer" containerID="efb65b4b9e3e3a2e790377a11e20e78922f8ac4906fe9e5ed5abce2baeb626ec" Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.366771 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8549f5b7-c7xt4" Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.415390 4958 scope.go:117] "RemoveContainer" containerID="f1cbbb4809bfb9f22786b421bb92392c8f8fd1de595c6de23dacee186e6a9bc3" Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.417122 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8549f5b7-c7xt4"] Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.433676 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f8549f5b7-c7xt4"] Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.774661 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.775006 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="88a7ef3f-1cd5-4e71-b574-a52956739b72" containerName="nova-scheduler-scheduler" containerID="cri-o://574c005c50fa3d85420d1633534ce487a7593d9b086f3ddcf98dce630755c61d" gracePeriod=30 Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.786927 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.787191 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="e20bdad3-ef47-4629-8812-d4c6f7345279" containerName="nova-cell0-conductor-conductor" containerID="cri-o://9a352cd761547656d6731f89ba6379abaaa9c5ab39d60fcc184689cb07ba9a91" gracePeriod=30 Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.792562 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.792891 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8c5b597e-d210-46e3-8115-63c9e1a0d3f8" containerName="nova-metadata-log" containerID="cri-o://54f6cb4420ff9a9bd0758ea58648caec67bf1c96b04f154404936cf4141a0324" gracePeriod=30 Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.793096 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8c5b597e-d210-46e3-8115-63c9e1a0d3f8" containerName="nova-metadata-metadata" containerID="cri-o://a4f04ca1d5ddee31d8f3725f7f1ca800825217c26ce9021d82223ec233bc0491" gracePeriod=30 Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.805923 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.806152 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="f5fd96ac-8e63-438e-8f66-8435513697c9" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ca832a8e91d28ac78ad28b166b656a88ef9ac2b9b6f8713ff67ab107f0e7be2d" gracePeriod=30 Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.819806 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.820077 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b25b61a8-00a3-4d87-a5a6-9e1824ec58cd" containerName="nova-api-log" containerID="cri-o://842283e5e5f49bb2ac2f33f476cdbce5bc7fb7997b589ef7a0b73666959a232a" gracePeriod=30 Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.820236 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b25b61a8-00a3-4d87-a5a6-9e1824ec58cd" containerName="nova-api-api" containerID="cri-o://7f14125726b8ca99cf292abcd74f599978723763ecfc74232135eb55c773d30a" gracePeriod=30 Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.866595 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 11:42:02 crc kubenswrapper[4958]: I1201 11:42:02.866883 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="515ccd90-5b0b-43a2-a599-992fa27dd517" containerName="nova-cell1-conductor-conductor" containerID="cri-o://216ff4695a779d2e511563f338216e8e2c7a5f140ae275dc3a4b40f2d0c556a5" gracePeriod=30 Dec 01 11:42:03 crc kubenswrapper[4958]: E1201 11:42:03.137604 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a352cd761547656d6731f89ba6379abaaa9c5ab39d60fcc184689cb07ba9a91" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 11:42:03 crc kubenswrapper[4958]: E1201 11:42:03.143294 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a352cd761547656d6731f89ba6379abaaa9c5ab39d60fcc184689cb07ba9a91" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 11:42:03 crc kubenswrapper[4958]: E1201 11:42:03.145065 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a352cd761547656d6731f89ba6379abaaa9c5ab39d60fcc184689cb07ba9a91" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 11:42:03 crc kubenswrapper[4958]: E1201 11:42:03.145122 4958 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="e20bdad3-ef47-4629-8812-d4c6f7345279" containerName="nova-cell0-conductor-conductor" Dec 01 11:42:03 crc kubenswrapper[4958]: I1201 11:42:03.362924 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 01 11:42:03 crc kubenswrapper[4958]: 
I1201 11:42:03.378887 4958 generic.go:334] "Generic (PLEG): container finished" podID="f5fd96ac-8e63-438e-8f66-8435513697c9" containerID="ca832a8e91d28ac78ad28b166b656a88ef9ac2b9b6f8713ff67ab107f0e7be2d" exitCode=0 Dec 01 11:42:03 crc kubenswrapper[4958]: I1201 11:42:03.378975 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f5fd96ac-8e63-438e-8f66-8435513697c9","Type":"ContainerDied","Data":"ca832a8e91d28ac78ad28b166b656a88ef9ac2b9b6f8713ff67ab107f0e7be2d"} Dec 01 11:42:03 crc kubenswrapper[4958]: I1201 11:42:03.390516 4958 generic.go:334] "Generic (PLEG): container finished" podID="b25b61a8-00a3-4d87-a5a6-9e1824ec58cd" containerID="842283e5e5f49bb2ac2f33f476cdbce5bc7fb7997b589ef7a0b73666959a232a" exitCode=143 Dec 01 11:42:03 crc kubenswrapper[4958]: I1201 11:42:03.390629 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd","Type":"ContainerDied","Data":"842283e5e5f49bb2ac2f33f476cdbce5bc7fb7997b589ef7a0b73666959a232a"} Dec 01 11:42:03 crc kubenswrapper[4958]: I1201 11:42:03.407289 4958 generic.go:334] "Generic (PLEG): container finished" podID="8c5b597e-d210-46e3-8115-63c9e1a0d3f8" containerID="54f6cb4420ff9a9bd0758ea58648caec67bf1c96b04f154404936cf4141a0324" exitCode=143 Dec 01 11:42:03 crc kubenswrapper[4958]: I1201 11:42:03.407339 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c5b597e-d210-46e3-8115-63c9e1a0d3f8","Type":"ContainerDied","Data":"54f6cb4420ff9a9bd0758ea58648caec67bf1c96b04f154404936cf4141a0324"} Dec 01 11:42:03 crc kubenswrapper[4958]: I1201 11:42:03.752488 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:42:03 crc kubenswrapper[4958]: I1201 11:42:03.808584 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3ab235d-21c4-409f-b09e-4531a1236742" path="/var/lib/kubelet/pods/e3ab235d-21c4-409f-b09e-4531a1236742/volumes" Dec 01 11:42:03 crc kubenswrapper[4958]: I1201 11:42:03.844139 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5fd96ac-8e63-438e-8f66-8435513697c9-config-data\") pod \"f5fd96ac-8e63-438e-8f66-8435513697c9\" (UID: \"f5fd96ac-8e63-438e-8f66-8435513697c9\") " Dec 01 11:42:03 crc kubenswrapper[4958]: I1201 11:42:03.889515 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5fd96ac-8e63-438e-8f66-8435513697c9-config-data" (OuterVolumeSpecName: "config-data") pod "f5fd96ac-8e63-438e-8f66-8435513697c9" (UID: "f5fd96ac-8e63-438e-8f66-8435513697c9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:03 crc kubenswrapper[4958]: I1201 11:42:03.945390 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5fd96ac-8e63-438e-8f66-8435513697c9-combined-ca-bundle\") pod \"f5fd96ac-8e63-438e-8f66-8435513697c9\" (UID: \"f5fd96ac-8e63-438e-8f66-8435513697c9\") " Dec 01 11:42:03 crc kubenswrapper[4958]: I1201 11:42:03.945472 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d725w\" (UniqueName: \"kubernetes.io/projected/f5fd96ac-8e63-438e-8f66-8435513697c9-kube-api-access-d725w\") pod \"f5fd96ac-8e63-438e-8f66-8435513697c9\" (UID: \"f5fd96ac-8e63-438e-8f66-8435513697c9\") " Dec 01 11:42:03 crc kubenswrapper[4958]: I1201 11:42:03.945802 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5fd96ac-8e63-438e-8f66-8435513697c9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:03 crc kubenswrapper[4958]: I1201 11:42:03.953219 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5fd96ac-8e63-438e-8f66-8435513697c9-kube-api-access-d725w" (OuterVolumeSpecName: "kube-api-access-d725w") pod "f5fd96ac-8e63-438e-8f66-8435513697c9" (UID: "f5fd96ac-8e63-438e-8f66-8435513697c9"). InnerVolumeSpecName "kube-api-access-d725w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:42:03 crc kubenswrapper[4958]: I1201 11:42:03.974027 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5fd96ac-8e63-438e-8f66-8435513697c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5fd96ac-8e63-438e-8f66-8435513697c9" (UID: "f5fd96ac-8e63-438e-8f66-8435513697c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.047285 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5fd96ac-8e63-438e-8f66-8435513697c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.047348 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d725w\" (UniqueName: \"kubernetes.io/projected/f5fd96ac-8e63-438e-8f66-8435513697c9-kube-api-access-d725w\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.416646 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f5fd96ac-8e63-438e-8f66-8435513697c9","Type":"ContainerDied","Data":"3858d62464f0b1fbbeb2455752165250fe1dd8c46ae34ad79da272e77cca4ada"} Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.416701 4958 scope.go:117] "RemoveContainer" containerID="ca832a8e91d28ac78ad28b166b656a88ef9ac2b9b6f8713ff67ab107f0e7be2d" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.416730 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.470452 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.490640 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.500101 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 11:42:04 crc kubenswrapper[4958]: E1201 11:42:04.500603 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ab235d-21c4-409f-b09e-4531a1236742" containerName="init" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.500622 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ab235d-21c4-409f-b09e-4531a1236742" containerName="init" Dec 01 11:42:04 crc kubenswrapper[4958]: E1201 11:42:04.500632 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5fd96ac-8e63-438e-8f66-8435513697c9" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.500640 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5fd96ac-8e63-438e-8f66-8435513697c9" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 11:42:04 crc kubenswrapper[4958]: E1201 11:42:04.500676 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ab235d-21c4-409f-b09e-4531a1236742" containerName="dnsmasq-dns" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.500683 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ab235d-21c4-409f-b09e-4531a1236742" containerName="dnsmasq-dns" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.500878 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5fd96ac-8e63-438e-8f66-8435513697c9" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.500891 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ab235d-21c4-409f-b09e-4531a1236742" containerName="dnsmasq-dns" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.501649 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.504628 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.509756 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.733874 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a49ee43-f7d8-41b0-b8e6-273f3acdba79-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8a49ee43-f7d8-41b0-b8e6-273f3acdba79\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.734003 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a49ee43-f7d8-41b0-b8e6-273f3acdba79-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8a49ee43-f7d8-41b0-b8e6-273f3acdba79\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.734037 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvflr\" (UniqueName: \"kubernetes.io/projected/8a49ee43-f7d8-41b0-b8e6-273f3acdba79-kube-api-access-xvflr\") pod \"nova-cell1-novncproxy-0\" (UID: \"8a49ee43-f7d8-41b0-b8e6-273f3acdba79\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.835611 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a49ee43-f7d8-41b0-b8e6-273f3acdba79-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8a49ee43-f7d8-41b0-b8e6-273f3acdba79\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.835718 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a49ee43-f7d8-41b0-b8e6-273f3acdba79-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8a49ee43-f7d8-41b0-b8e6-273f3acdba79\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.835752 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvflr\" (UniqueName: \"kubernetes.io/projected/8a49ee43-f7d8-41b0-b8e6-273f3acdba79-kube-api-access-xvflr\") pod \"nova-cell1-novncproxy-0\" (UID: \"8a49ee43-f7d8-41b0-b8e6-273f3acdba79\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.841242 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a49ee43-f7d8-41b0-b8e6-273f3acdba79-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8a49ee43-f7d8-41b0-b8e6-273f3acdba79\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.844463 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a49ee43-f7d8-41b0-b8e6-273f3acdba79-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8a49ee43-f7d8-41b0-b8e6-273f3acdba79\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:42:04 crc kubenswrapper[4958]: I1201 11:42:04.855571 
4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvflr\" (UniqueName: \"kubernetes.io/projected/8a49ee43-f7d8-41b0-b8e6-273f3acdba79-kube-api-access-xvflr\") pod \"nova-cell1-novncproxy-0\" (UID: \"8a49ee43-f7d8-41b0-b8e6-273f3acdba79\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:42:05 crc kubenswrapper[4958]: I1201 11:42:05.153057 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:42:05 crc kubenswrapper[4958]: I1201 11:42:05.684994 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 11:42:05 crc kubenswrapper[4958]: W1201 11:42:05.698391 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a49ee43_f7d8_41b0_b8e6_273f3acdba79.slice/crio-07759c94f73fa1673107be3eea5ff375bd0fa55efb98e4fdbb346688fa3d43e0 WatchSource:0}: Error finding container 07759c94f73fa1673107be3eea5ff375bd0fa55efb98e4fdbb346688fa3d43e0: Status 404 returned error can't find the container with id 07759c94f73fa1673107be3eea5ff375bd0fa55efb98e4fdbb346688fa3d43e0 Dec 01 11:42:05 crc kubenswrapper[4958]: I1201 11:42:05.812283 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5fd96ac-8e63-438e-8f66-8435513697c9" path="/var/lib/kubelet/pods/f5fd96ac-8e63-438e-8f66-8435513697c9/volumes" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.396257 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.413518 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-config-data\") pod \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\" (UID: \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\") " Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.413626 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-logs\") pod \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\" (UID: \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\") " Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.413682 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-combined-ca-bundle\") pod \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\" (UID: \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\") " Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.413776 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrsf6\" (UniqueName: \"kubernetes.io/projected/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-kube-api-access-vrsf6\") pod \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\" (UID: \"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd\") " Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.414395 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-logs" (OuterVolumeSpecName: "logs") pod "b25b61a8-00a3-4d87-a5a6-9e1824ec58cd" (UID: "b25b61a8-00a3-4d87-a5a6-9e1824ec58cd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.425100 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-kube-api-access-vrsf6" (OuterVolumeSpecName: "kube-api-access-vrsf6") pod "b25b61a8-00a3-4d87-a5a6-9e1824ec58cd" (UID: "b25b61a8-00a3-4d87-a5a6-9e1824ec58cd"). InnerVolumeSpecName "kube-api-access-vrsf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.470164 4958 generic.go:334] "Generic (PLEG): container finished" podID="88a7ef3f-1cd5-4e71-b574-a52956739b72" containerID="574c005c50fa3d85420d1633534ce487a7593d9b086f3ddcf98dce630755c61d" exitCode=0 Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.470279 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"88a7ef3f-1cd5-4e71-b574-a52956739b72","Type":"ContainerDied","Data":"574c005c50fa3d85420d1633534ce487a7593d9b086f3ddcf98dce630755c61d"} Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.474580 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8a49ee43-f7d8-41b0-b8e6-273f3acdba79","Type":"ContainerStarted","Data":"6111a31f4224f5ccb5a59e29b8f8055f93af8879ac3cecb294a731b0748bba3f"} Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.474644 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8a49ee43-f7d8-41b0-b8e6-273f3acdba79","Type":"ContainerStarted","Data":"07759c94f73fa1673107be3eea5ff375bd0fa55efb98e4fdbb346688fa3d43e0"} Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.488016 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-config-data" (OuterVolumeSpecName: "config-data") pod "b25b61a8-00a3-4d87-a5a6-9e1824ec58cd" (UID: "b25b61a8-00a3-4d87-a5a6-9e1824ec58cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.490619 4958 generic.go:334] "Generic (PLEG): container finished" podID="b25b61a8-00a3-4d87-a5a6-9e1824ec58cd" containerID="7f14125726b8ca99cf292abcd74f599978723763ecfc74232135eb55c773d30a" exitCode=0 Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.490694 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd","Type":"ContainerDied","Data":"7f14125726b8ca99cf292abcd74f599978723763ecfc74232135eb55c773d30a"} Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.490699 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.490736 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b25b61a8-00a3-4d87-a5a6-9e1824ec58cd","Type":"ContainerDied","Data":"3c9e4515bc24531058efa64351adf380efead8dd322192c7cc411b982ae22934"} Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.490929 4958 scope.go:117] "RemoveContainer" containerID="7f14125726b8ca99cf292abcd74f599978723763ecfc74232135eb55c773d30a" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.498898 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.498875446 podStartE2EDuration="2.498875446s" podCreationTimestamp="2025-12-01 11:42:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:42:06.495176181 +0000 UTC m=+6174.003965218" watchObservedRunningTime="2025-12-01 11:42:06.498875446 +0000 UTC m=+6174.007664483" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.500286 4958 generic.go:334] "Generic (PLEG): container finished" podID="8c5b597e-d210-46e3-8115-63c9e1a0d3f8" containerID="a4f04ca1d5ddee31d8f3725f7f1ca800825217c26ce9021d82223ec233bc0491" exitCode=0 Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.500333 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c5b597e-d210-46e3-8115-63c9e1a0d3f8","Type":"ContainerDied","Data":"a4f04ca1d5ddee31d8f3725f7f1ca800825217c26ce9021d82223ec233bc0491"} Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.523163 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-logs\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.523212 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrsf6\" (UniqueName: \"kubernetes.io/projected/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-kube-api-access-vrsf6\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.523227 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.530262 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b25b61a8-00a3-4d87-a5a6-9e1824ec58cd" (UID: "b25b61a8-00a3-4d87-a5a6-9e1824ec58cd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.545636 4958 scope.go:117] "RemoveContainer" containerID="842283e5e5f49bb2ac2f33f476cdbce5bc7fb7997b589ef7a0b73666959a232a" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.586740 4958 scope.go:117] "RemoveContainer" containerID="7f14125726b8ca99cf292abcd74f599978723763ecfc74232135eb55c773d30a" Dec 01 11:42:06 crc kubenswrapper[4958]: E1201 11:42:06.590556 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f14125726b8ca99cf292abcd74f599978723763ecfc74232135eb55c773d30a\": container with ID starting with 7f14125726b8ca99cf292abcd74f599978723763ecfc74232135eb55c773d30a not found: ID does not exist" containerID="7f14125726b8ca99cf292abcd74f599978723763ecfc74232135eb55c773d30a" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.590591 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f14125726b8ca99cf292abcd74f599978723763ecfc74232135eb55c773d30a"} err="failed to get container status \"7f14125726b8ca99cf292abcd74f599978723763ecfc74232135eb55c773d30a\": rpc error: code = NotFound desc = could not find container \"7f14125726b8ca99cf292abcd74f599978723763ecfc74232135eb55c773d30a\": container with ID starting with 7f14125726b8ca99cf292abcd74f599978723763ecfc74232135eb55c773d30a not found: ID does not exist" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.590614 4958 scope.go:117] "RemoveContainer" containerID="842283e5e5f49bb2ac2f33f476cdbce5bc7fb7997b589ef7a0b73666959a232a" Dec 01 11:42:06 crc kubenswrapper[4958]: E1201 11:42:06.596377 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"842283e5e5f49bb2ac2f33f476cdbce5bc7fb7997b589ef7a0b73666959a232a\": container with ID starting with 842283e5e5f49bb2ac2f33f476cdbce5bc7fb7997b589ef7a0b73666959a232a not found: ID does not exist" containerID="842283e5e5f49bb2ac2f33f476cdbce5bc7fb7997b589ef7a0b73666959a232a" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.596423 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"842283e5e5f49bb2ac2f33f476cdbce5bc7fb7997b589ef7a0b73666959a232a"} err="failed to get container status \"842283e5e5f49bb2ac2f33f476cdbce5bc7fb7997b589ef7a0b73666959a232a\": rpc error: code = NotFound desc = could not find container \"842283e5e5f49bb2ac2f33f476cdbce5bc7fb7997b589ef7a0b73666959a232a\": container with ID starting with 842283e5e5f49bb2ac2f33f476cdbce5bc7fb7997b589ef7a0b73666959a232a not found: ID does not exist" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.596609 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.624786 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.669082 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.726386 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-combined-ca-bundle\") pod \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\" (UID: \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\") " Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.726520 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-logs\") pod \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\" (UID: \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\") " Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.726635 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8shl8\" (UniqueName: \"kubernetes.io/projected/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-kube-api-access-8shl8\") pod \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\" (UID: \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\") " Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.726699 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-config-data\") pod \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\" (UID: \"8c5b597e-d210-46e3-8115-63c9e1a0d3f8\") " Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.728286 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-logs" (OuterVolumeSpecName: "logs") pod "8c5b597e-d210-46e3-8115-63c9e1a0d3f8" (UID: "8c5b597e-d210-46e3-8115-63c9e1a0d3f8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.750400 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-kube-api-access-8shl8" (OuterVolumeSpecName: "kube-api-access-8shl8") pod "8c5b597e-d210-46e3-8115-63c9e1a0d3f8" (UID: "8c5b597e-d210-46e3-8115-63c9e1a0d3f8"). InnerVolumeSpecName "kube-api-access-8shl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.758899 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c5b597e-d210-46e3-8115-63c9e1a0d3f8" (UID: "8c5b597e-d210-46e3-8115-63c9e1a0d3f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.764251 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-config-data" (OuterVolumeSpecName: "config-data") pod "8c5b597e-d210-46e3-8115-63c9e1a0d3f8" (UID: "8c5b597e-d210-46e3-8115-63c9e1a0d3f8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.828742 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a7ef3f-1cd5-4e71-b574-a52956739b72-combined-ca-bundle\") pod \"88a7ef3f-1cd5-4e71-b574-a52956739b72\" (UID: \"88a7ef3f-1cd5-4e71-b574-a52956739b72\") " Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.828877 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a7ef3f-1cd5-4e71-b574-a52956739b72-config-data\") pod \"88a7ef3f-1cd5-4e71-b574-a52956739b72\" (UID: \"88a7ef3f-1cd5-4e71-b574-a52956739b72\") " Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.828972 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47jxk\" (UniqueName: \"kubernetes.io/projected/88a7ef3f-1cd5-4e71-b574-a52956739b72-kube-api-access-47jxk\") pod \"88a7ef3f-1cd5-4e71-b574-a52956739b72\" (UID: \"88a7ef3f-1cd5-4e71-b574-a52956739b72\") " Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.829336 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.829358 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.829370 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-logs\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.829379 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8shl8\" (UniqueName: \"kubernetes.io/projected/8c5b597e-d210-46e3-8115-63c9e1a0d3f8-kube-api-access-8shl8\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.838045 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88a7ef3f-1cd5-4e71-b574-a52956739b72-kube-api-access-47jxk" (OuterVolumeSpecName: "kube-api-access-47jxk") pod "88a7ef3f-1cd5-4e71-b574-a52956739b72" (UID: "88a7ef3f-1cd5-4e71-b574-a52956739b72"). InnerVolumeSpecName "kube-api-access-47jxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.850952 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.859701 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a7ef3f-1cd5-4e71-b574-a52956739b72-config-data" (OuterVolumeSpecName: "config-data") pod "88a7ef3f-1cd5-4e71-b574-a52956739b72" (UID: "88a7ef3f-1cd5-4e71-b574-a52956739b72"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.898524 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.923746 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 11:42:06 crc kubenswrapper[4958]: E1201 11:42:06.924215 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a7ef3f-1cd5-4e71-b574-a52956739b72" containerName="nova-scheduler-scheduler" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.924235 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a7ef3f-1cd5-4e71-b574-a52956739b72" containerName="nova-scheduler-scheduler" Dec 01 11:42:06 crc kubenswrapper[4958]: E1201 11:42:06.924255 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b25b61a8-00a3-4d87-a5a6-9e1824ec58cd" containerName="nova-api-log" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.924263 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b25b61a8-00a3-4d87-a5a6-9e1824ec58cd" containerName="nova-api-log" Dec 01 11:42:06 crc kubenswrapper[4958]: E1201 11:42:06.924284 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5b597e-d210-46e3-8115-63c9e1a0d3f8" containerName="nova-metadata-metadata" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.924291 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5b597e-d210-46e3-8115-63c9e1a0d3f8" containerName="nova-metadata-metadata" Dec 01 11:42:06 crc kubenswrapper[4958]: E1201 11:42:06.924304 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5b597e-d210-46e3-8115-63c9e1a0d3f8" containerName="nova-metadata-log" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.924309 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5b597e-d210-46e3-8115-63c9e1a0d3f8" containerName="nova-metadata-log" Dec 01 11:42:06 crc kubenswrapper[4958]: E1201 11:42:06.924321 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b25b61a8-00a3-4d87-a5a6-9e1824ec58cd" containerName="nova-api-api" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.924327 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b25b61a8-00a3-4d87-a5a6-9e1824ec58cd" containerName="nova-api-api" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.924543 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="b25b61a8-00a3-4d87-a5a6-9e1824ec58cd" containerName="nova-api-api" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.924560 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c5b597e-d210-46e3-8115-63c9e1a0d3f8" containerName="nova-metadata-metadata" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.924570 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="88a7ef3f-1cd5-4e71-b574-a52956739b72" containerName="nova-scheduler-scheduler" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.924592 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="b25b61a8-00a3-4d87-a5a6-9e1824ec58cd" containerName="nova-api-log" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.924598 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c5b597e-d210-46e3-8115-63c9e1a0d3f8" containerName="nova-metadata-log" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.924618 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/88a7ef3f-1cd5-4e71-b574-a52956739b72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88a7ef3f-1cd5-4e71-b574-a52956739b72" (UID: "88a7ef3f-1cd5-4e71-b574-a52956739b72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.925707 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.929115 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.930512 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a7ef3f-1cd5-4e71-b574-a52956739b72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.930545 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a7ef3f-1cd5-4e71-b574-a52956739b72-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.930554 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47jxk\" (UniqueName: \"kubernetes.io/projected/88a7ef3f-1cd5-4e71-b574-a52956739b72-kube-api-access-47jxk\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:06 crc kubenswrapper[4958]: I1201 11:42:06.948246 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.032568 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c1fa21c-37dc-47a2-82f3-b68e89651d04-logs\") pod \"nova-api-0\" (UID: \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\") " pod="openstack/nova-api-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.032633 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1fa21c-37dc-47a2-82f3-b68e89651d04-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\") " pod="openstack/nova-api-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.032714 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqnmc\" (UniqueName: \"kubernetes.io/projected/9c1fa21c-37dc-47a2-82f3-b68e89651d04-kube-api-access-nqnmc\") pod \"nova-api-0\" (UID: \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\") " pod="openstack/nova-api-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.032825 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1fa21c-37dc-47a2-82f3-b68e89651d04-config-data\") pod \"nova-api-0\" (UID: \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\") " pod="openstack/nova-api-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.134761 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1fa21c-37dc-47a2-82f3-b68e89651d04-config-data\") pod \"nova-api-0\" (UID: \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\") " pod="openstack/nova-api-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.135156 4958 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c1fa21c-37dc-47a2-82f3-b68e89651d04-logs\") pod \"nova-api-0\" (UID: \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\") " pod="openstack/nova-api-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.135189 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1fa21c-37dc-47a2-82f3-b68e89651d04-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\") " pod="openstack/nova-api-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.135268 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqnmc\" (UniqueName: \"kubernetes.io/projected/9c1fa21c-37dc-47a2-82f3-b68e89651d04-kube-api-access-nqnmc\") pod \"nova-api-0\" (UID: \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\") " pod="openstack/nova-api-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.137807 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c1fa21c-37dc-47a2-82f3-b68e89651d04-logs\") pod \"nova-api-0\" (UID: \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\") " pod="openstack/nova-api-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.139250 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1fa21c-37dc-47a2-82f3-b68e89651d04-config-data\") pod \"nova-api-0\" (UID: \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\") " pod="openstack/nova-api-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.140029 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1fa21c-37dc-47a2-82f3-b68e89651d04-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\") " pod="openstack/nova-api-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.163191 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqnmc\" (UniqueName: \"kubernetes.io/projected/9c1fa21c-37dc-47a2-82f3-b68e89651d04-kube-api-access-nqnmc\") pod \"nova-api-0\" (UID: \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\") " pod="openstack/nova-api-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.257165 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 11:42:07 crc kubenswrapper[4958]: E1201 11:42:07.294167 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="216ff4695a779d2e511563f338216e8e2c7a5f140ae275dc3a4b40f2d0c556a5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 11:42:07 crc kubenswrapper[4958]: E1201 11:42:07.296952 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="216ff4695a779d2e511563f338216e8e2c7a5f140ae275dc3a4b40f2d0c556a5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 11:42:07 crc kubenswrapper[4958]: E1201 11:42:07.299204 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="216ff4695a779d2e511563f338216e8e2c7a5f140ae275dc3a4b40f2d0c556a5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 11:42:07 crc kubenswrapper[4958]: E1201 11:42:07.299292 4958 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="515ccd90-5b0b-43a2-a599-992fa27dd517" containerName="nova-cell1-conductor-conductor" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.552558 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c5b597e-d210-46e3-8115-63c9e1a0d3f8","Type":"ContainerDied","Data":"9bfb0f233f2702622e8a70624e26420f78190dffcac1bbefaaeff573ab71f078"} Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.552787 4958 scope.go:117] "RemoveContainer" containerID="a4f04ca1d5ddee31d8f3725f7f1ca800825217c26ce9021d82223ec233bc0491" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.552922 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.577779 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"88a7ef3f-1cd5-4e71-b574-a52956739b72","Type":"ContainerDied","Data":"54fea45e70c00875ed28f4106a361d015482ae0420bd432c73890e787b998e42"} Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.577897 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.599033 4958 generic.go:334] "Generic (PLEG): container finished" podID="e20bdad3-ef47-4629-8812-d4c6f7345279" containerID="9a352cd761547656d6731f89ba6379abaaa9c5ab39d60fcc184689cb07ba9a91" exitCode=0 Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.599097 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e20bdad3-ef47-4629-8812-d4c6f7345279","Type":"ContainerDied","Data":"9a352cd761547656d6731f89ba6379abaaa9c5ab39d60fcc184689cb07ba9a91"} Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.607366 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.648217 4958 scope.go:117] "RemoveContainer" containerID="54f6cb4420ff9a9bd0758ea58648caec67bf1c96b04f154404936cf4141a0324" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.754403 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.754544 4958 scope.go:117] "RemoveContainer" containerID="574c005c50fa3d85420d1633534ce487a7593d9b086f3ddcf98dce630755c61d" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.755587 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20bdad3-ef47-4629-8812-d4c6f7345279-combined-ca-bundle\") pod \"e20bdad3-ef47-4629-8812-d4c6f7345279\" (UID: \"e20bdad3-ef47-4629-8812-d4c6f7345279\") " Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.755643 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsjsr\" (UniqueName: \"kubernetes.io/projected/e20bdad3-ef47-4629-8812-d4c6f7345279-kube-api-access-zsjsr\") pod \"e20bdad3-ef47-4629-8812-d4c6f7345279\" (UID: \"e20bdad3-ef47-4629-8812-d4c6f7345279\") " Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.755854 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20bdad3-ef47-4629-8812-d4c6f7345279-config-data\") pod \"e20bdad3-ef47-4629-8812-d4c6f7345279\" (UID: \"e20bdad3-ef47-4629-8812-d4c6f7345279\") " Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.789049 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e20bdad3-ef47-4629-8812-d4c6f7345279-kube-api-access-zsjsr" (OuterVolumeSpecName: "kube-api-access-zsjsr") pod "e20bdad3-ef47-4629-8812-d4c6f7345279" (UID: "e20bdad3-ef47-4629-8812-d4c6f7345279"). InnerVolumeSpecName "kube-api-access-zsjsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.790867 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.835009 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e20bdad3-ef47-4629-8812-d4c6f7345279-config-data" (OuterVolumeSpecName: "config-data") pod "e20bdad3-ef47-4629-8812-d4c6f7345279" (UID: "e20bdad3-ef47-4629-8812-d4c6f7345279"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.835837 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c5b597e-d210-46e3-8115-63c9e1a0d3f8" path="/var/lib/kubelet/pods/8c5b597e-d210-46e3-8115-63c9e1a0d3f8/volumes" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.836497 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b25b61a8-00a3-4d87-a5a6-9e1824ec58cd" path="/var/lib/kubelet/pods/b25b61a8-00a3-4d87-a5a6-9e1824ec58cd/volumes" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.841050 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e20bdad3-ef47-4629-8812-d4c6f7345279-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e20bdad3-ef47-4629-8812-d4c6f7345279" (UID: "e20bdad3-ef47-4629-8812-d4c6f7345279"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.861103 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20bdad3-ef47-4629-8812-d4c6f7345279-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.861133 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20bdad3-ef47-4629-8812-d4c6f7345279-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.861145 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsjsr\" (UniqueName: \"kubernetes.io/projected/e20bdad3-ef47-4629-8812-d4c6f7345279-kube-api-access-zsjsr\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.917360 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.917416 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.933686 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:42:07 crc kubenswrapper[4958]: E1201 11:42:07.934416 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e20bdad3-ef47-4629-8812-d4c6f7345279" containerName="nova-cell0-conductor-conductor" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.934439 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e20bdad3-ef47-4629-8812-d4c6f7345279" containerName="nova-cell0-conductor-conductor" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.934758 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="e20bdad3-ef47-4629-8812-d4c6f7345279" containerName="nova-cell0-conductor-conductor" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.937743 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.940557 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.949074 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.963487 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac026d41-87e6-44c4-8597-c6fa860ba9e5-logs\") pod \"nova-metadata-0\" (UID: \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\") " pod="openstack/nova-metadata-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.963552 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac026d41-87e6-44c4-8597-c6fa860ba9e5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\") " pod="openstack/nova-metadata-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.963577 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac026d41-87e6-44c4-8597-c6fa860ba9e5-config-data\") pod \"nova-metadata-0\" (UID: \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\") " pod="openstack/nova-metadata-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.963698 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmdmk\" (UniqueName: \"kubernetes.io/projected/ac026d41-87e6-44c4-8597-c6fa860ba9e5-kube-api-access-jmdmk\") pod \"nova-metadata-0\" (UID: \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\") " pod="openstack/nova-metadata-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.968521 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.969773 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.971858 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.976570 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 11:42:07 crc kubenswrapper[4958]: I1201 11:42:07.985480 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.064385 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac026d41-87e6-44c4-8597-c6fa860ba9e5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\") " pod="openstack/nova-metadata-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.064714 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac026d41-87e6-44c4-8597-c6fa860ba9e5-config-data\") pod \"nova-metadata-0\" (UID: \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\") " pod="openstack/nova-metadata-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.064749 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq77f\" (UniqueName: \"kubernetes.io/projected/9db9b6d2-f515-4f0e-83b6-d56ee091734f-kube-api-access-bq77f\") pod \"nova-scheduler-0\" (UID: \"9db9b6d2-f515-4f0e-83b6-d56ee091734f\") " pod="openstack/nova-scheduler-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.064777 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db9b6d2-f515-4f0e-83b6-d56ee091734f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9db9b6d2-f515-4f0e-83b6-d56ee091734f\") " pod="openstack/nova-scheduler-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.064795 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db9b6d2-f515-4f0e-83b6-d56ee091734f-config-data\") pod \"nova-scheduler-0\" (UID: \"9db9b6d2-f515-4f0e-83b6-d56ee091734f\") " pod="openstack/nova-scheduler-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.064892 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmdmk\" (UniqueName: \"kubernetes.io/projected/ac026d41-87e6-44c4-8597-c6fa860ba9e5-kube-api-access-jmdmk\") pod \"nova-metadata-0\" (UID: \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\") " pod="openstack/nova-metadata-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.064935 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac026d41-87e6-44c4-8597-c6fa860ba9e5-logs\") pod \"nova-metadata-0\" (UID: \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\") " pod="openstack/nova-metadata-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.066583 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac026d41-87e6-44c4-8597-c6fa860ba9e5-logs\") pod \"nova-metadata-0\" (UID: \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\") " pod="openstack/nova-metadata-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.070683 4958 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac026d41-87e6-44c4-8597-c6fa860ba9e5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\") " pod="openstack/nova-metadata-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.071614 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac026d41-87e6-44c4-8597-c6fa860ba9e5-config-data\") pod \"nova-metadata-0\" (UID: \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\") " pod="openstack/nova-metadata-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.088649 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmdmk\" (UniqueName: \"kubernetes.io/projected/ac026d41-87e6-44c4-8597-c6fa860ba9e5-kube-api-access-jmdmk\") pod \"nova-metadata-0\" (UID: \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\") " pod="openstack/nova-metadata-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.166129 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq77f\" (UniqueName: \"kubernetes.io/projected/9db9b6d2-f515-4f0e-83b6-d56ee091734f-kube-api-access-bq77f\") pod \"nova-scheduler-0\" (UID: \"9db9b6d2-f515-4f0e-83b6-d56ee091734f\") " pod="openstack/nova-scheduler-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.166183 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db9b6d2-f515-4f0e-83b6-d56ee091734f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9db9b6d2-f515-4f0e-83b6-d56ee091734f\") " pod="openstack/nova-scheduler-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.166214 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db9b6d2-f515-4f0e-83b6-d56ee091734f-config-data\") pod \"nova-scheduler-0\" (UID: \"9db9b6d2-f515-4f0e-83b6-d56ee091734f\") " pod="openstack/nova-scheduler-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.170675 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db9b6d2-f515-4f0e-83b6-d56ee091734f-config-data\") pod \"nova-scheduler-0\" (UID: \"9db9b6d2-f515-4f0e-83b6-d56ee091734f\") " pod="openstack/nova-scheduler-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.172228 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db9b6d2-f515-4f0e-83b6-d56ee091734f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9db9b6d2-f515-4f0e-83b6-d56ee091734f\") " pod="openstack/nova-scheduler-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.185972 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq77f\" (UniqueName: \"kubernetes.io/projected/9db9b6d2-f515-4f0e-83b6-d56ee091734f-kube-api-access-bq77f\") pod \"nova-scheduler-0\" (UID: \"9db9b6d2-f515-4f0e-83b6-d56ee091734f\") " pod="openstack/nova-scheduler-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.313922 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.351173 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.650100 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9c1fa21c-37dc-47a2-82f3-b68e89651d04","Type":"ContainerStarted","Data":"4bf99aa149eda710cb7021082862fddf9f2cf8d9b17be7348e3ff673d78da1c3"} Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.650366 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9c1fa21c-37dc-47a2-82f3-b68e89651d04","Type":"ContainerStarted","Data":"731e0c8b065268e582ff1a8837a47e21d2a077c794197ba25fd8eec684d2123a"} Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.650377 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9c1fa21c-37dc-47a2-82f3-b68e89651d04","Type":"ContainerStarted","Data":"abfe2b9a78e77af371aefa1e69c1ffd1d964acd44d51154e4eb9102554c4557f"} Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.658345 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.662348 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e20bdad3-ef47-4629-8812-d4c6f7345279","Type":"ContainerDied","Data":"3cf5c54a2baf442ba63f2a6cfe0f27e05dcb1ed809cd0aa401c54454d1c76a68"} Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.662404 4958 scope.go:117] "RemoveContainer" containerID="9a352cd761547656d6731f89ba6379abaaa9c5ab39d60fcc184689cb07ba9a91" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.673248 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6732260009999997 podStartE2EDuration="2.673226001s" podCreationTimestamp="2025-12-01 11:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:42:08.667044025 +0000 UTC m=+6176.175833062" watchObservedRunningTime="2025-12-01 11:42:08.673226001 +0000 UTC m=+6176.182015038" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.715645 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.730949 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.749121 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.750698 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.753396 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.761084 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.817552 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 11:42:08 crc kubenswrapper[4958]: W1201 11:42:08.821498 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9db9b6d2_f515_4f0e_83b6_d56ee091734f.slice/crio-1d7a9095126bbcc0f1a74be4cb6c78d610983a19f7e319dd4ee83432842ba2f3 WatchSource:0}: Error finding container 1d7a9095126bbcc0f1a74be4cb6c78d610983a19f7e319dd4ee83432842ba2f3: Status 404 returned error can't find the container with id 1d7a9095126bbcc0f1a74be4cb6c78d610983a19f7e319dd4ee83432842ba2f3 Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.825563 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.878287 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596\") " pod="openstack/nova-cell0-conductor-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.878366 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596\") " pod="openstack/nova-cell0-conductor-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.878421 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjk29\" (UniqueName: \"kubernetes.io/projected/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-kube-api-access-pjk29\") pod \"nova-cell0-conductor-0\" (UID: \"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596\") " pod="openstack/nova-cell0-conductor-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.997060 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596\") " pod="openstack/nova-cell0-conductor-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.997135 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596\") " pod="openstack/nova-cell0-conductor-0" Dec 01 11:42:08 crc kubenswrapper[4958]: I1201 11:42:08.997189 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjk29\" (UniqueName: \"kubernetes.io/projected/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-kube-api-access-pjk29\") pod \"nova-cell0-conductor-0\" (UID: \"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596\") " 
pod="openstack/nova-cell0-conductor-0" Dec 01 11:42:09 crc kubenswrapper[4958]: I1201 11:42:09.001342 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596\") " pod="openstack/nova-cell0-conductor-0" Dec 01 11:42:09 crc kubenswrapper[4958]: I1201 11:42:09.013593 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596\") " pod="openstack/nova-cell0-conductor-0" Dec 01 11:42:09 crc kubenswrapper[4958]: I1201 11:42:09.030739 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjk29\" (UniqueName: \"kubernetes.io/projected/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-kube-api-access-pjk29\") pod \"nova-cell0-conductor-0\" (UID: \"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596\") " pod="openstack/nova-cell0-conductor-0" Dec 01 11:42:09 crc kubenswrapper[4958]: I1201 11:42:09.076274 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 11:42:09 crc kubenswrapper[4958]: W1201 11:42:09.619682 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcfdd8c8_b0e9_44ef_92b7_4968f37f1596.slice/crio-17e9c98cdbee80cebdac9656339b9544e746246b1fc759748f53184e00f1201c WatchSource:0}: Error finding container 17e9c98cdbee80cebdac9656339b9544e746246b1fc759748f53184e00f1201c: Status 404 returned error can't find the container with id 17e9c98cdbee80cebdac9656339b9544e746246b1fc759748f53184e00f1201c Dec 01 11:42:09 crc kubenswrapper[4958]: I1201 11:42:09.622236 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 11:42:09 crc kubenswrapper[4958]: I1201 11:42:09.688141 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9db9b6d2-f515-4f0e-83b6-d56ee091734f","Type":"ContainerStarted","Data":"79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed"} Dec 01 11:42:09 crc kubenswrapper[4958]: I1201 11:42:09.688490 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9db9b6d2-f515-4f0e-83b6-d56ee091734f","Type":"ContainerStarted","Data":"1d7a9095126bbcc0f1a74be4cb6c78d610983a19f7e319dd4ee83432842ba2f3"} Dec 01 11:42:09 crc kubenswrapper[4958]: I1201 11:42:09.690597 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac026d41-87e6-44c4-8597-c6fa860ba9e5","Type":"ContainerStarted","Data":"efeada4674b238b839f1fde3677f6586324a55faf7a12bd837940b174a83292e"} Dec 01 11:42:09 crc kubenswrapper[4958]: I1201 11:42:09.690650 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac026d41-87e6-44c4-8597-c6fa860ba9e5","Type":"ContainerStarted","Data":"45f06c383e3a65dd81ce4d39ebb9144c078e834b1aa7f964cf0ccc36a9db9cc0"} Dec 01 11:42:09 crc kubenswrapper[4958]: I1201 11:42:09.690664 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac026d41-87e6-44c4-8597-c6fa860ba9e5","Type":"ContainerStarted","Data":"fa6cd2dcbc00c73af19ca2beb1abea78fa9f747a9091126a81988dfca31631ab"} Dec 01 
Dec 01 11:42:09 crc kubenswrapper[4958]: I1201 11:42:09.701165 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596","Type":"ContainerStarted","Data":"17e9c98cdbee80cebdac9656339b9544e746246b1fc759748f53184e00f1201c"}
Dec 01 11:42:09 crc kubenswrapper[4958]: I1201 11:42:09.708340 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.708324024 podStartE2EDuration="2.708324024s" podCreationTimestamp="2025-12-01 11:42:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:42:09.703274721 +0000 UTC m=+6177.212063758" watchObservedRunningTime="2025-12-01 11:42:09.708324024 +0000 UTC m=+6177.217113061"
Dec 01 11:42:09 crc kubenswrapper[4958]: I1201 11:42:09.742023 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.74200156 podStartE2EDuration="2.74200156s" podCreationTimestamp="2025-12-01 11:42:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:42:09.726213042 +0000 UTC m=+6177.235002099" watchObservedRunningTime="2025-12-01 11:42:09.74200156 +0000 UTC m=+6177.250790587"
Dec 01 11:42:09 crc kubenswrapper[4958]: I1201 11:42:09.807742 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88a7ef3f-1cd5-4e71-b574-a52956739b72" path="/var/lib/kubelet/pods/88a7ef3f-1cd5-4e71-b574-a52956739b72/volumes"
Dec 01 11:42:09 crc kubenswrapper[4958]: I1201 11:42:09.808507 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e20bdad3-ef47-4629-8812-d4c6f7345279" path="/var/lib/kubelet/pods/e20bdad3-ef47-4629-8812-d4c6f7345279/volumes"
Dec 01 11:42:10 crc kubenswrapper[4958]: I1201 11:42:10.154266 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Dec 01 11:42:10 crc kubenswrapper[4958]: I1201 11:42:10.723715 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596","Type":"ContainerStarted","Data":"3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc"}
Dec 01 11:42:10 crc kubenswrapper[4958]: I1201 11:42:10.724154 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Dec 01 11:42:10 crc kubenswrapper[4958]: I1201 11:42:10.755158 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.755131931 podStartE2EDuration="2.755131931s" podCreationTimestamp="2025-12-01 11:42:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:42:10.743811019 +0000 UTC m=+6178.252600056" watchObservedRunningTime="2025-12-01 11:42:10.755131931 +0000 UTC m=+6178.263920988"
Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.288803 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.488674 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ccd90-5b0b-43a2-a599-992fa27dd517-combined-ca-bundle\") pod \"515ccd90-5b0b-43a2-a599-992fa27dd517\" (UID: \"515ccd90-5b0b-43a2-a599-992fa27dd517\") "
Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.488729 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mw9z\" (UniqueName: \"kubernetes.io/projected/515ccd90-5b0b-43a2-a599-992fa27dd517-kube-api-access-9mw9z\") pod \"515ccd90-5b0b-43a2-a599-992fa27dd517\" (UID: \"515ccd90-5b0b-43a2-a599-992fa27dd517\") "
Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.488860 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ccd90-5b0b-43a2-a599-992fa27dd517-config-data\") pod \"515ccd90-5b0b-43a2-a599-992fa27dd517\" (UID: \"515ccd90-5b0b-43a2-a599-992fa27dd517\") "
Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.496185 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/515ccd90-5b0b-43a2-a599-992fa27dd517-kube-api-access-9mw9z" (OuterVolumeSpecName: "kube-api-access-9mw9z") pod "515ccd90-5b0b-43a2-a599-992fa27dd517" (UID: "515ccd90-5b0b-43a2-a599-992fa27dd517"). InnerVolumeSpecName "kube-api-access-9mw9z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.521019 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515ccd90-5b0b-43a2-a599-992fa27dd517-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "515ccd90-5b0b-43a2-a599-992fa27dd517" (UID: "515ccd90-5b0b-43a2-a599-992fa27dd517"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.526275 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515ccd90-5b0b-43a2-a599-992fa27dd517-config-data" (OuterVolumeSpecName: "config-data") pod "515ccd90-5b0b-43a2-a599-992fa27dd517" (UID: "515ccd90-5b0b-43a2-a599-992fa27dd517"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.590727 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ccd90-5b0b-43a2-a599-992fa27dd517-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.590775 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mw9z\" (UniqueName: \"kubernetes.io/projected/515ccd90-5b0b-43a2-a599-992fa27dd517-kube-api-access-9mw9z\") on node \"crc\" DevicePath \"\""
Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.590790 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ccd90-5b0b-43a2-a599-992fa27dd517-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.746695 4958 generic.go:334] "Generic (PLEG): container finished" podID="515ccd90-5b0b-43a2-a599-992fa27dd517" containerID="216ff4695a779d2e511563f338216e8e2c7a5f140ae275dc3a4b40f2d0c556a5" exitCode=0
Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.746755 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"515ccd90-5b0b-43a2-a599-992fa27dd517","Type":"ContainerDied","Data":"216ff4695a779d2e511563f338216e8e2c7a5f140ae275dc3a4b40f2d0c556a5"}
Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.746838 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"515ccd90-5b0b-43a2-a599-992fa27dd517","Type":"ContainerDied","Data":"5f48329c3f61fd98aabcafabe1935cec448256ccd9263ec7bb48a724e33ecb17"}
Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.746898 4958 scope.go:117] "RemoveContainer" containerID="216ff4695a779d2e511563f338216e8e2c7a5f140ae275dc3a4b40f2d0c556a5"
Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.746831 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.807945 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.821179 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.822131 4958 scope.go:117] "RemoveContainer" containerID="216ff4695a779d2e511563f338216e8e2c7a5f140ae275dc3a4b40f2d0c556a5" Dec 01 11:42:12 crc kubenswrapper[4958]: E1201 11:42:12.825385 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"216ff4695a779d2e511563f338216e8e2c7a5f140ae275dc3a4b40f2d0c556a5\": container with ID starting with 216ff4695a779d2e511563f338216e8e2c7a5f140ae275dc3a4b40f2d0c556a5 not found: ID does not exist" containerID="216ff4695a779d2e511563f338216e8e2c7a5f140ae275dc3a4b40f2d0c556a5" Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.825453 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"216ff4695a779d2e511563f338216e8e2c7a5f140ae275dc3a4b40f2d0c556a5"} err="failed to get container status \"216ff4695a779d2e511563f338216e8e2c7a5f140ae275dc3a4b40f2d0c556a5\": rpc error: code = NotFound desc = could not find container \"216ff4695a779d2e511563f338216e8e2c7a5f140ae275dc3a4b40f2d0c556a5\": container with ID starting with 216ff4695a779d2e511563f338216e8e2c7a5f140ae275dc3a4b40f2d0c556a5 not found: ID does not exist" Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.854994 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 11:42:12 crc kubenswrapper[4958]: E1201 11:42:12.855597 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="515ccd90-5b0b-43a2-a599-992fa27dd517" containerName="nova-cell1-conductor-conductor" Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.855621 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="515ccd90-5b0b-43a2-a599-992fa27dd517" containerName="nova-cell1-conductor-conductor" Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.855902 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="515ccd90-5b0b-43a2-a599-992fa27dd517" containerName="nova-cell1-conductor-conductor" Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.856722 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.860410 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.871904 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.902792 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d20bd7-d045-43e1-a954-a7ef8d11b3d7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"01d20bd7-d045-43e1-a954-a7ef8d11b3d7\") " pod="openstack/nova-cell1-conductor-0" Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.902918 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxzc4\" (UniqueName: \"kubernetes.io/projected/01d20bd7-d045-43e1-a954-a7ef8d11b3d7-kube-api-access-lxzc4\") pod \"nova-cell1-conductor-0\" (UID: \"01d20bd7-d045-43e1-a954-a7ef8d11b3d7\") " pod="openstack/nova-cell1-conductor-0" Dec 01 11:42:12 crc kubenswrapper[4958]: I1201 11:42:12.902964 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d20bd7-d045-43e1-a954-a7ef8d11b3d7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"01d20bd7-d045-43e1-a954-a7ef8d11b3d7\") " pod="openstack/nova-cell1-conductor-0" Dec 01 11:42:13 crc kubenswrapper[4958]: I1201 11:42:13.004881 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d20bd7-d045-43e1-a954-a7ef8d11b3d7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"01d20bd7-d045-43e1-a954-a7ef8d11b3d7\") " pod="openstack/nova-cell1-conductor-0" Dec 01 11:42:13 crc kubenswrapper[4958]: I1201 11:42:13.004951 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxzc4\" (UniqueName: \"kubernetes.io/projected/01d20bd7-d045-43e1-a954-a7ef8d11b3d7-kube-api-access-lxzc4\") pod \"nova-cell1-conductor-0\" (UID: \"01d20bd7-d045-43e1-a954-a7ef8d11b3d7\") " pod="openstack/nova-cell1-conductor-0" Dec 01 11:42:13 crc kubenswrapper[4958]: I1201 11:42:13.004988 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d20bd7-d045-43e1-a954-a7ef8d11b3d7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"01d20bd7-d045-43e1-a954-a7ef8d11b3d7\") " pod="openstack/nova-cell1-conductor-0" Dec 01 11:42:13 crc kubenswrapper[4958]: I1201 11:42:13.011807 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d20bd7-d045-43e1-a954-a7ef8d11b3d7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"01d20bd7-d045-43e1-a954-a7ef8d11b3d7\") " pod="openstack/nova-cell1-conductor-0" Dec 01 11:42:13 crc kubenswrapper[4958]: I1201 11:42:13.012186 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d20bd7-d045-43e1-a954-a7ef8d11b3d7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"01d20bd7-d045-43e1-a954-a7ef8d11b3d7\") " pod="openstack/nova-cell1-conductor-0" Dec 01 11:42:13 crc kubenswrapper[4958]: I1201 11:42:13.024162 4958 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxzc4\" (UniqueName: \"kubernetes.io/projected/01d20bd7-d045-43e1-a954-a7ef8d11b3d7-kube-api-access-lxzc4\") pod \"nova-cell1-conductor-0\" (UID: \"01d20bd7-d045-43e1-a954-a7ef8d11b3d7\") " pod="openstack/nova-cell1-conductor-0" Dec 01 11:42:13 crc kubenswrapper[4958]: I1201 11:42:13.214284 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 11:42:13 crc kubenswrapper[4958]: I1201 11:42:13.314761 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 11:42:13 crc kubenswrapper[4958]: I1201 11:42:13.314874 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 11:42:13 crc kubenswrapper[4958]: I1201 11:42:13.351857 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 11:42:13 crc kubenswrapper[4958]: W1201 11:42:13.740788 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01d20bd7_d045_43e1_a954_a7ef8d11b3d7.slice/crio-a572204080969969f9ea29bbd103e34181aeded0137f718627735ccc51093619 WatchSource:0}: Error finding container a572204080969969f9ea29bbd103e34181aeded0137f718627735ccc51093619: Status 404 returned error can't find the container with id a572204080969969f9ea29bbd103e34181aeded0137f718627735ccc51093619 Dec 01 11:42:13 crc kubenswrapper[4958]: I1201 11:42:13.747819 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 11:42:13 crc kubenswrapper[4958]: I1201 11:42:13.760046 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"01d20bd7-d045-43e1-a954-a7ef8d11b3d7","Type":"ContainerStarted","Data":"a572204080969969f9ea29bbd103e34181aeded0137f718627735ccc51093619"} Dec 01 11:42:13 crc kubenswrapper[4958]: I1201 11:42:13.816778 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="515ccd90-5b0b-43a2-a599-992fa27dd517" path="/var/lib/kubelet/pods/515ccd90-5b0b-43a2-a599-992fa27dd517/volumes" Dec 01 11:42:14 crc kubenswrapper[4958]: I1201 11:42:14.115369 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 01 11:42:14 crc kubenswrapper[4958]: I1201 11:42:14.773039 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"01d20bd7-d045-43e1-a954-a7ef8d11b3d7","Type":"ContainerStarted","Data":"45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b"} Dec 01 11:42:14 crc kubenswrapper[4958]: I1201 11:42:14.773492 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 01 11:42:14 crc kubenswrapper[4958]: I1201 11:42:14.795880 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.7958366569999997 podStartE2EDuration="2.795836657s" podCreationTimestamp="2025-12-01 11:42:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:42:14.787718006 +0000 UTC m=+6182.296507053" watchObservedRunningTime="2025-12-01 11:42:14.795836657 +0000 UTC m=+6182.304625694" Dec 01 11:42:15 crc kubenswrapper[4958]: I1201 11:42:15.153293 4958 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:42:15 crc kubenswrapper[4958]: I1201 11:42:15.170994 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:42:15 crc kubenswrapper[4958]: I1201 11:42:15.793459 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 01 11:42:17 crc kubenswrapper[4958]: I1201 11:42:17.257912 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 11:42:17 crc kubenswrapper[4958]: I1201 11:42:17.258267 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 11:42:18 crc kubenswrapper[4958]: I1201 11:42:18.248641 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 01 11:42:18 crc kubenswrapper[4958]: I1201 11:42:18.314805 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 11:42:18 crc kubenswrapper[4958]: I1201 11:42:18.314880 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 11:42:18 crc kubenswrapper[4958]: I1201 11:42:18.339150 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9c1fa21c-37dc-47a2-82f3-b68e89651d04" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.81:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 11:42:18 crc kubenswrapper[4958]: I1201 11:42:18.339579 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9c1fa21c-37dc-47a2-82f3-b68e89651d04" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.81:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 11:42:18 crc kubenswrapper[4958]: I1201 11:42:18.352087 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 11:42:18 crc kubenswrapper[4958]: I1201 11:42:18.394772 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 11:42:18 crc kubenswrapper[4958]: I1201 11:42:18.851314 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 11:42:19 crc kubenswrapper[4958]: I1201 11:42:19.398141 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ac026d41-87e6-44c4-8597-c6fa860ba9e5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 11:42:19 crc kubenswrapper[4958]: I1201 11:42:19.398174 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ac026d41-87e6-44c4-8597-c6fa860ba9e5" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.618976 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.622055 4958 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.628758 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.634632 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.770236 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-scripts\") pod \"cinder-scheduler-0\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.770313 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.770338 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-config-data\") pod \"cinder-scheduler-0\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.770404 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.770449 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f6a6f1b-c5d6-426e-bed7-78a47963612a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.770466 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr275\" (UniqueName: \"kubernetes.io/projected/0f6a6f1b-c5d6-426e-bed7-78a47963612a-kube-api-access-kr275\") pod \"cinder-scheduler-0\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.871648 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f6a6f1b-c5d6-426e-bed7-78a47963612a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.871725 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr275\" (UniqueName: \"kubernetes.io/projected/0f6a6f1b-c5d6-426e-bed7-78a47963612a-kube-api-access-kr275\") pod \"cinder-scheduler-0\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.871788 4958 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-scripts\") pod \"cinder-scheduler-0\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.871864 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f6a6f1b-c5d6-426e-bed7-78a47963612a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.871897 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.872117 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-config-data\") pod \"cinder-scheduler-0\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.872408 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.878450 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.878490 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-scripts\") pod \"cinder-scheduler-0\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.879751 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-config-data\") pod \"cinder-scheduler-0\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.881555 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.907020 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr275\" (UniqueName: \"kubernetes.io/projected/0f6a6f1b-c5d6-426e-bed7-78a47963612a-kube-api-access-kr275\") pod \"cinder-scheduler-0\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") " 
pod="openstack/cinder-scheduler-0" Dec 01 11:42:22 crc kubenswrapper[4958]: I1201 11:42:22.960966 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 11:42:23 crc kubenswrapper[4958]: I1201 11:42:23.465494 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 11:42:23 crc kubenswrapper[4958]: W1201 11:42:23.469143 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f6a6f1b_c5d6_426e_bed7_78a47963612a.slice/crio-2eec3019f75096a846063ee43124c3291ffc9791e9be1e116575d8a5cf9156d3 WatchSource:0}: Error finding container 2eec3019f75096a846063ee43124c3291ffc9791e9be1e116575d8a5cf9156d3: Status 404 returned error can't find the container with id 2eec3019f75096a846063ee43124c3291ffc9791e9be1e116575d8a5cf9156d3 Dec 01 11:42:23 crc kubenswrapper[4958]: I1201 11:42:23.877658 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0f6a6f1b-c5d6-426e-bed7-78a47963612a","Type":"ContainerStarted","Data":"2eec3019f75096a846063ee43124c3291ffc9791e9be1e116575d8a5cf9156d3"} Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.525003 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.526065 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7607cce4-9386-41b0-aa60-4993250abc80" containerName="cinder-api" containerID="cri-o://3d5f046d24182d8829b81adb9591424bc44352883247f53fb1573a4de2b87699" gracePeriod=30 Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.525994 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7607cce4-9386-41b0-aa60-4993250abc80" containerName="cinder-api-log" containerID="cri-o://91a9f05c28b8c841cd918ba669a942a507ad7cd5f755e3e0a916d9ae82816934" gracePeriod=30 Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.893713 4958 generic.go:334] "Generic (PLEG): container finished" podID="7607cce4-9386-41b0-aa60-4993250abc80" containerID="91a9f05c28b8c841cd918ba669a942a507ad7cd5f755e3e0a916d9ae82816934" exitCode=143 Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.893756 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7607cce4-9386-41b0-aa60-4993250abc80","Type":"ContainerDied","Data":"91a9f05c28b8c841cd918ba669a942a507ad7cd5f755e3e0a916d9ae82816934"} Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.895147 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0f6a6f1b-c5d6-426e-bed7-78a47963612a","Type":"ContainerStarted","Data":"d021302fb9743c4a153ff96567f09c5c758bef57956703aa07e5f1c3d48d0594"} Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.936752 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.938392 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.940975 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.963015 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.982173 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98090413-9736-47f9-9192-06174c47c3b0-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.982759 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98090413-9736-47f9-9192-06174c47c3b0-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.982910 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.982993 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.983067 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/98090413-9736-47f9-9192-06174c47c3b0-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.983147 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-sys\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.983227 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98090413-9736-47f9-9192-06174c47c3b0-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.983304 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:24 crc kubenswrapper[4958]: 
I1201 11:42:24.983401 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-dev\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.983479 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-run\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.983593 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.983720 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6grc2\" (UniqueName: \"kubernetes.io/projected/98090413-9736-47f9-9192-06174c47c3b0-kube-api-access-6grc2\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.983878 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.983958 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.984116 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98090413-9736-47f9-9192-06174c47c3b0-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:24 crc kubenswrapper[4958]: I1201 11:42:24.984241 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.086927 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98090413-9736-47f9-9192-06174c47c3b0-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.087277 
4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98090413-9736-47f9-9192-06174c47c3b0-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.087365 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.087460 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.087514 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.087610 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.087614 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/98090413-9736-47f9-9192-06174c47c3b0-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.087784 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-sys\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.088009 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98090413-9736-47f9-9192-06174c47c3b0-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.088089 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.088169 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-dev\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " 
pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.088252 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-run\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.088318 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-run\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.087922 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-sys\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.088407 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.088282 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-dev\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.088517 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.088685 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6grc2\" (UniqueName: \"kubernetes.io/projected/98090413-9736-47f9-9192-06174c47c3b0-kube-api-access-6grc2\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.088802 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.088715 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.088923 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.088921 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.089136 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.089303 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98090413-9736-47f9-9192-06174c47c3b0-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.089456 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.089563 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/98090413-9736-47f9-9192-06174c47c3b0-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.092536 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98090413-9736-47f9-9192-06174c47c3b0-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.094136 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98090413-9736-47f9-9192-06174c47c3b0-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.098176 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98090413-9736-47f9-9192-06174c47c3b0-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.105667 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/98090413-9736-47f9-9192-06174c47c3b0-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: 
I1201 11:42:25.110023 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98090413-9736-47f9-9192-06174c47c3b0-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.171635 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6grc2\" (UniqueName: \"kubernetes.io/projected/98090413-9736-47f9-9192-06174c47c3b0-kube-api-access-6grc2\") pod \"cinder-volume-volume1-0\" (UID: \"98090413-9736-47f9-9192-06174c47c3b0\") " pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.281507 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.741570 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.744261 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.746901 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.754283 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.801672 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.801729 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-lib-modules\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.801755 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.801784 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-sys\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.801817 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.801876 4958 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-etc-nvme\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.801901 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f555a33-3448-4f8a-be75-3bbfc99cbf02-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.801964 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-dev\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.801991 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.802181 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f555a33-3448-4f8a-be75-3bbfc99cbf02-config-data\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.802253 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.802361 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9f555a33-3448-4f8a-be75-3bbfc99cbf02-ceph\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.802425 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f555a33-3448-4f8a-be75-3bbfc99cbf02-scripts\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.802531 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-run\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.802569 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zl85\" (UniqueName: \"kubernetes.io/projected/9f555a33-3448-4f8a-be75-3bbfc99cbf02-kube-api-access-6zl85\") pod 
\"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.802596 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f555a33-3448-4f8a-be75-3bbfc99cbf02-config-data-custom\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.903240 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f555a33-3448-4f8a-be75-3bbfc99cbf02-scripts\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.903336 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-run\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.903392 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zl85\" (UniqueName: \"kubernetes.io/projected/9f555a33-3448-4f8a-be75-3bbfc99cbf02-kube-api-access-6zl85\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.903418 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f555a33-3448-4f8a-be75-3bbfc99cbf02-config-data-custom\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.903482 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.903485 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-run\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.903550 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-lib-modules\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.903508 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-lib-modules\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.903599 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.903630 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-sys\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.903662 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.903713 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-etc-nvme\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.903737 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f555a33-3448-4f8a-be75-3bbfc99cbf02-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.903825 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-dev\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.904173 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.904207 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f555a33-3448-4f8a-be75-3bbfc99cbf02-config-data\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.904235 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.904278 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9f555a33-3448-4f8a-be75-3bbfc99cbf02-ceph\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.903972 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-dev\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.904656 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-etc-nvme\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.904677 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.904691 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-sys\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.904717 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.904742 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.905072 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.905122 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9f555a33-3448-4f8a-be75-3bbfc99cbf02-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.909458 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9f555a33-3448-4f8a-be75-3bbfc99cbf02-ceph\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.909629 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f555a33-3448-4f8a-be75-3bbfc99cbf02-config-data-custom\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.911411 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f555a33-3448-4f8a-be75-3bbfc99cbf02-scripts\") pod 
\"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.911459 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f555a33-3448-4f8a-be75-3bbfc99cbf02-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.913149 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0f6a6f1b-c5d6-426e-bed7-78a47963612a","Type":"ContainerStarted","Data":"8ebbe6f7368ec7d9a07c7f896eaf0a334e36dce907146c5891206ea44afc7c07"} Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.929993 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f555a33-3448-4f8a-be75-3bbfc99cbf02-config-data\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.932768 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zl85\" (UniqueName: \"kubernetes.io/projected/9f555a33-3448-4f8a-be75-3bbfc99cbf02-kube-api-access-6zl85\") pod \"cinder-backup-0\" (UID: \"9f555a33-3448-4f8a-be75-3bbfc99cbf02\") " pod="openstack/cinder-backup-0" Dec 01 11:42:25 crc kubenswrapper[4958]: I1201 11:42:25.942772 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 01 11:42:25 crc kubenswrapper[4958]: W1201 11:42:25.946501 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98090413_9736_47f9_9192_06174c47c3b0.slice/crio-c531ea64c89a7eb28da8962e6ba043e9ee4bcc1da431a9a5236663d762fafbb6 WatchSource:0}: Error finding container c531ea64c89a7eb28da8962e6ba043e9ee4bcc1da431a9a5236663d762fafbb6: Status 404 returned error can't find the container with id c531ea64c89a7eb28da8962e6ba043e9ee4bcc1da431a9a5236663d762fafbb6 Dec 01 11:42:26 crc kubenswrapper[4958]: I1201 11:42:26.074480 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Dec 01 11:42:26 crc kubenswrapper[4958]: I1201 11:42:26.881983 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.881960221 podStartE2EDuration="4.881960221s" podCreationTimestamp="2025-12-01 11:42:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:42:25.960332369 +0000 UTC m=+6193.469121406" watchObservedRunningTime="2025-12-01 11:42:26.881960221 +0000 UTC m=+6194.390749278" Dec 01 11:42:26 crc kubenswrapper[4958]: I1201 11:42:26.886535 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 01 11:42:26 crc kubenswrapper[4958]: I1201 11:42:26.924675 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"98090413-9736-47f9-9192-06174c47c3b0","Type":"ContainerStarted","Data":"c531ea64c89a7eb28da8962e6ba043e9ee4bcc1da431a9a5236663d762fafbb6"} Dec 01 11:42:26 crc kubenswrapper[4958]: W1201 11:42:26.964227 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f555a33_3448_4f8a_be75_3bbfc99cbf02.slice/crio-6e43827b7d0791e5f22bfd3faad3f6427eb7eac47a70b4a5b66a3662bc48230e WatchSource:0}: Error finding container 6e43827b7d0791e5f22bfd3faad3f6427eb7eac47a70b4a5b66a3662bc48230e: Status 404 returned error can't find the container with id 6e43827b7d0791e5f22bfd3faad3f6427eb7eac47a70b4a5b66a3662bc48230e Dec 01 11:42:27 crc kubenswrapper[4958]: I1201 11:42:27.261868 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 11:42:27 crc kubenswrapper[4958]: I1201 11:42:27.262836 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 11:42:27 crc kubenswrapper[4958]: I1201 11:42:27.263307 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 11:42:27 crc kubenswrapper[4958]: I1201 11:42:27.281122 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 11:42:27 crc kubenswrapper[4958]: I1201 11:42:27.720813 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="7607cce4-9386-41b0-aa60-4993250abc80" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.79:8776/healthcheck\": read tcp 10.217.0.2:59984->10.217.1.79:8776: read: connection reset by peer" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.007169 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.025621 4958 generic.go:334] "Generic (PLEG): container finished" podID="7607cce4-9386-41b0-aa60-4993250abc80" containerID="3d5f046d24182d8829b81adb9591424bc44352883247f53fb1573a4de2b87699" exitCode=0 Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.025704 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7607cce4-9386-41b0-aa60-4993250abc80","Type":"ContainerDied","Data":"3d5f046d24182d8829b81adb9591424bc44352883247f53fb1573a4de2b87699"} Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.037217 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" 
event={"ID":"98090413-9736-47f9-9192-06174c47c3b0","Type":"ContainerStarted","Data":"a0da4411ef8fb89985fa3e763b0cdb41b4af9b610e5f3f0f23c80e4b86c90859"} Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.037264 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"98090413-9736-47f9-9192-06174c47c3b0","Type":"ContainerStarted","Data":"db274b5720e031375b5511c93fa20ddb5df77073120ae17d43f1432c4aea775c"} Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.038951 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"9f555a33-3448-4f8a-be75-3bbfc99cbf02","Type":"ContainerStarted","Data":"6e43827b7d0791e5f22bfd3faad3f6427eb7eac47a70b4a5b66a3662bc48230e"} Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.039439 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.053487 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.126800 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.072191452 podStartE2EDuration="4.126772999s" podCreationTimestamp="2025-12-01 11:42:24 +0000 UTC" firstStartedPulling="2025-12-01 11:42:25.949198433 +0000 UTC m=+6193.457987470" lastFinishedPulling="2025-12-01 11:42:27.00377998 +0000 UTC m=+6194.512569017" observedRunningTime="2025-12-01 11:42:28.062172865 +0000 UTC m=+6195.570961912" watchObservedRunningTime="2025-12-01 11:42:28.126772999 +0000 UTC m=+6195.635562026" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.211273 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.211338 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.331211 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.335689 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.340364 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.389328 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.536019 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7607cce4-9386-41b0-aa60-4993250abc80-etc-machine-id\") pod \"7607cce4-9386-41b0-aa60-4993250abc80\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.536097 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-scripts\") pod \"7607cce4-9386-41b0-aa60-4993250abc80\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.536130 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7607cce4-9386-41b0-aa60-4993250abc80-logs\") pod \"7607cce4-9386-41b0-aa60-4993250abc80\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.536142 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7607cce4-9386-41b0-aa60-4993250abc80-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7607cce4-9386-41b0-aa60-4993250abc80" (UID: "7607cce4-9386-41b0-aa60-4993250abc80"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.536219 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-combined-ca-bundle\") pod \"7607cce4-9386-41b0-aa60-4993250abc80\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.536283 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-config-data-custom\") pod \"7607cce4-9386-41b0-aa60-4993250abc80\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.536311 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjh9v\" (UniqueName: \"kubernetes.io/projected/7607cce4-9386-41b0-aa60-4993250abc80-kube-api-access-tjh9v\") pod \"7607cce4-9386-41b0-aa60-4993250abc80\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.536350 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-config-data\") pod \"7607cce4-9386-41b0-aa60-4993250abc80\" (UID: \"7607cce4-9386-41b0-aa60-4993250abc80\") " Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.536742 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7607cce4-9386-41b0-aa60-4993250abc80-logs" (OuterVolumeSpecName: "logs") pod "7607cce4-9386-41b0-aa60-4993250abc80" (UID: "7607cce4-9386-41b0-aa60-4993250abc80"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.537248 4958 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7607cce4-9386-41b0-aa60-4993250abc80-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.537267 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7607cce4-9386-41b0-aa60-4993250abc80-logs\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.542935 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7607cce4-9386-41b0-aa60-4993250abc80" (UID: "7607cce4-9386-41b0-aa60-4993250abc80"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.543137 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7607cce4-9386-41b0-aa60-4993250abc80-kube-api-access-tjh9v" (OuterVolumeSpecName: "kube-api-access-tjh9v") pod "7607cce4-9386-41b0-aa60-4993250abc80" (UID: "7607cce4-9386-41b0-aa60-4993250abc80"). InnerVolumeSpecName "kube-api-access-tjh9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.545124 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-scripts" (OuterVolumeSpecName: "scripts") pod "7607cce4-9386-41b0-aa60-4993250abc80" (UID: "7607cce4-9386-41b0-aa60-4993250abc80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.588381 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7607cce4-9386-41b0-aa60-4993250abc80" (UID: "7607cce4-9386-41b0-aa60-4993250abc80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.606078 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-config-data" (OuterVolumeSpecName: "config-data") pod "7607cce4-9386-41b0-aa60-4993250abc80" (UID: "7607cce4-9386-41b0-aa60-4993250abc80"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.653775 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.654130 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.654146 4958 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.654159 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjh9v\" (UniqueName: \"kubernetes.io/projected/7607cce4-9386-41b0-aa60-4993250abc80-kube-api-access-tjh9v\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:28 crc kubenswrapper[4958]: I1201 11:42:28.654173 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7607cce4-9386-41b0-aa60-4993250abc80-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.049173 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7607cce4-9386-41b0-aa60-4993250abc80","Type":"ContainerDied","Data":"abe13bff091b3e72f4f69bc38e4b67e3cf984e66376d2929de02ce351a8e915b"} Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.049271 4958 scope.go:117] "RemoveContainer" containerID="3d5f046d24182d8829b81adb9591424bc44352883247f53fb1573a4de2b87699" Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.049442 4958 util.go:48] "No ready sandbox for pod can be found. 
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.070607 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"9f555a33-3448-4f8a-be75-3bbfc99cbf02","Type":"ContainerStarted","Data":"1e3b61255b790e3e3df99784c1a82fd2d9d1d31a5fc7c34d6a323b5a1e83a4c3"}
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.070660 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"9f555a33-3448-4f8a-be75-3bbfc99cbf02","Type":"ContainerStarted","Data":"d61ef782becbe59c05d13baf6fe0a360a99bb8f4003d307426e41e65e6263f05"}
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.122695 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.793886398 podStartE2EDuration="4.12265477s" podCreationTimestamp="2025-12-01 11:42:25 +0000 UTC" firstStartedPulling="2025-12-01 11:42:26.967703425 +0000 UTC m=+6194.476492462" lastFinishedPulling="2025-12-01 11:42:28.296471797 +0000 UTC m=+6195.805260834" observedRunningTime="2025-12-01 11:42:29.109934829 +0000 UTC m=+6196.618723866" watchObservedRunningTime="2025-12-01 11:42:29.12265477 +0000 UTC m=+6196.631443807"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.147151 4958 scope.go:117] "RemoveContainer" containerID="91a9f05c28b8c841cd918ba669a942a507ad7cd5f755e3e0a916d9ae82816934"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.163202 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.211165 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.258958 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Dec 01 11:42:29 crc kubenswrapper[4958]: E1201 11:42:29.259423 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7607cce4-9386-41b0-aa60-4993250abc80" containerName="cinder-api"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.259439 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="7607cce4-9386-41b0-aa60-4993250abc80" containerName="cinder-api"
Dec 01 11:42:29 crc kubenswrapper[4958]: E1201 11:42:29.259456 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7607cce4-9386-41b0-aa60-4993250abc80" containerName="cinder-api-log"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.259464 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="7607cce4-9386-41b0-aa60-4993250abc80" containerName="cinder-api-log"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.259645 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="7607cce4-9386-41b0-aa60-4993250abc80" containerName="cinder-api-log"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.259669 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="7607cce4-9386-41b0-aa60-4993250abc80" containerName="cinder-api"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.260701 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.272059 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.281769 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.284278 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.286375 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57707c75-4120-483e-bd16-ef587392f6b7-scripts\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.286478 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57707c75-4120-483e-bd16-ef587392f6b7-logs\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.286526 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57707c75-4120-483e-bd16-ef587392f6b7-config-data\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.286580 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrjvl\" (UniqueName: \"kubernetes.io/projected/57707c75-4120-483e-bd16-ef587392f6b7-kube-api-access-rrjvl\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.286610 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57707c75-4120-483e-bd16-ef587392f6b7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.286635 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57707c75-4120-483e-bd16-ef587392f6b7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.286685 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57707c75-4120-483e-bd16-ef587392f6b7-config-data-custom\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.387956 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57707c75-4120-483e-bd16-ef587392f6b7-logs\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.388033 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57707c75-4120-483e-bd16-ef587392f6b7-config-data\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.388100 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrjvl\" (UniqueName: \"kubernetes.io/projected/57707c75-4120-483e-bd16-ef587392f6b7-kube-api-access-rrjvl\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.388133 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57707c75-4120-483e-bd16-ef587392f6b7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.388163 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57707c75-4120-483e-bd16-ef587392f6b7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.388206 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57707c75-4120-483e-bd16-ef587392f6b7-config-data-custom\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.388236 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57707c75-4120-483e-bd16-ef587392f6b7-scripts\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.389128 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57707c75-4120-483e-bd16-ef587392f6b7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.389773 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57707c75-4120-483e-bd16-ef587392f6b7-logs\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.396598 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57707c75-4120-483e-bd16-ef587392f6b7-config-data-custom\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.397432 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57707c75-4120-483e-bd16-ef587392f6b7-config-data\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.400122 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57707c75-4120-483e-bd16-ef587392f6b7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0"
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57707c75-4120-483e-bd16-ef587392f6b7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0" Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.403330 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57707c75-4120-483e-bd16-ef587392f6b7-scripts\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0" Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.410338 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrjvl\" (UniqueName: \"kubernetes.io/projected/57707c75-4120-483e-bd16-ef587392f6b7-kube-api-access-rrjvl\") pod \"cinder-api-0\" (UID: \"57707c75-4120-483e-bd16-ef587392f6b7\") " pod="openstack/cinder-api-0" Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.608353 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 11:42:29 crc kubenswrapper[4958]: I1201 11:42:29.810094 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7607cce4-9386-41b0-aa60-4993250abc80" path="/var/lib/kubelet/pods/7607cce4-9386-41b0-aa60-4993250abc80/volumes" Dec 01 11:42:30 crc kubenswrapper[4958]: I1201 11:42:30.158496 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 11:42:30 crc kubenswrapper[4958]: I1201 11:42:30.283038 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:31 crc kubenswrapper[4958]: I1201 11:42:31.074816 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 01 11:42:31 crc kubenswrapper[4958]: I1201 11:42:31.104391 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"57707c75-4120-483e-bd16-ef587392f6b7","Type":"ContainerStarted","Data":"ffb47aec02828b187db32608cba7c01ca173b51668411776ffae1dfe41a81533"} Dec 01 11:42:31 crc kubenswrapper[4958]: I1201 11:42:31.104459 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"57707c75-4120-483e-bd16-ef587392f6b7","Type":"ContainerStarted","Data":"f472d790349ec71b64e9e6b7ac957a7c462a46d3ce1c550170e262e7c0f1700e"} Dec 01 11:42:32 crc kubenswrapper[4958]: I1201 11:42:32.115549 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"57707c75-4120-483e-bd16-ef587392f6b7","Type":"ContainerStarted","Data":"d6688ab674340920da2407ef1e63762550ffa4279a85a1a947acddfb1fc9e7a7"} Dec 01 11:42:32 crc kubenswrapper[4958]: I1201 11:42:32.115940 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 11:42:32 crc kubenswrapper[4958]: I1201 11:42:32.150302 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.150272507 podStartE2EDuration="3.150272507s" podCreationTimestamp="2025-12-01 11:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:42:32.137060192 +0000 UTC m=+6199.645849219" watchObservedRunningTime="2025-12-01 11:42:32.150272507 +0000 UTC m=+6199.659061564" Dec 01 11:42:33 crc kubenswrapper[4958]: I1201 11:42:33.158713 4958 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 01 11:42:33 crc kubenswrapper[4958]: I1201 11:42:33.221765 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 11:42:34 crc kubenswrapper[4958]: I1201 11:42:34.134011 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0f6a6f1b-c5d6-426e-bed7-78a47963612a" containerName="cinder-scheduler" containerID="cri-o://d021302fb9743c4a153ff96567f09c5c758bef57956703aa07e5f1c3d48d0594" gracePeriod=30 Dec 01 11:42:34 crc kubenswrapper[4958]: I1201 11:42:34.134555 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0f6a6f1b-c5d6-426e-bed7-78a47963612a" containerName="probe" containerID="cri-o://8ebbe6f7368ec7d9a07c7f896eaf0a334e36dce907146c5891206ea44afc7c07" gracePeriod=30 Dec 01 11:42:35 crc kubenswrapper[4958]: I1201 11:42:35.146157 4958 generic.go:334] "Generic (PLEG): container finished" podID="0f6a6f1b-c5d6-426e-bed7-78a47963612a" containerID="8ebbe6f7368ec7d9a07c7f896eaf0a334e36dce907146c5891206ea44afc7c07" exitCode=0 Dec 01 11:42:35 crc kubenswrapper[4958]: I1201 11:42:35.146209 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0f6a6f1b-c5d6-426e-bed7-78a47963612a","Type":"ContainerDied","Data":"8ebbe6f7368ec7d9a07c7f896eaf0a334e36dce907146c5891206ea44afc7c07"} Dec 01 11:42:35 crc kubenswrapper[4958]: I1201 11:42:35.583554 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Dec 01 11:42:36 crc kubenswrapper[4958]: I1201 11:42:36.333566 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 01 11:42:36 crc kubenswrapper[4958]: I1201 11:42:36.986652 4958 util.go:48] "No ready sandbox for pod can be found. 
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.101019 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-scripts\") pod \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") "
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.101069 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-config-data-custom\") pod \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") "
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.101095 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-config-data\") pod \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") "
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.101260 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-combined-ca-bundle\") pod \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") "
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.101277 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f6a6f1b-c5d6-426e-bed7-78a47963612a-etc-machine-id\") pod \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") "
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.101331 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr275\" (UniqueName: \"kubernetes.io/projected/0f6a6f1b-c5d6-426e-bed7-78a47963612a-kube-api-access-kr275\") pod \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\" (UID: \"0f6a6f1b-c5d6-426e-bed7-78a47963612a\") "
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.102651 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f6a6f1b-c5d6-426e-bed7-78a47963612a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0f6a6f1b-c5d6-426e-bed7-78a47963612a" (UID: "0f6a6f1b-c5d6-426e-bed7-78a47963612a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.107112 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-scripts" (OuterVolumeSpecName: "scripts") pod "0f6a6f1b-c5d6-426e-bed7-78a47963612a" (UID: "0f6a6f1b-c5d6-426e-bed7-78a47963612a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.121781 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0f6a6f1b-c5d6-426e-bed7-78a47963612a" (UID: "0f6a6f1b-c5d6-426e-bed7-78a47963612a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.121808 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f6a6f1b-c5d6-426e-bed7-78a47963612a-kube-api-access-kr275" (OuterVolumeSpecName: "kube-api-access-kr275") pod "0f6a6f1b-c5d6-426e-bed7-78a47963612a" (UID: "0f6a6f1b-c5d6-426e-bed7-78a47963612a"). InnerVolumeSpecName "kube-api-access-kr275". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.149682 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f6a6f1b-c5d6-426e-bed7-78a47963612a" (UID: "0f6a6f1b-c5d6-426e-bed7-78a47963612a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.186094 4958 generic.go:334] "Generic (PLEG): container finished" podID="0f6a6f1b-c5d6-426e-bed7-78a47963612a" containerID="d021302fb9743c4a153ff96567f09c5c758bef57956703aa07e5f1c3d48d0594" exitCode=0 Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.186143 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0f6a6f1b-c5d6-426e-bed7-78a47963612a","Type":"ContainerDied","Data":"d021302fb9743c4a153ff96567f09c5c758bef57956703aa07e5f1c3d48d0594"} Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.186173 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0f6a6f1b-c5d6-426e-bed7-78a47963612a","Type":"ContainerDied","Data":"2eec3019f75096a846063ee43124c3291ffc9791e9be1e116575d8a5cf9156d3"} Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.186189 4958 scope.go:117] "RemoveContainer" containerID="8ebbe6f7368ec7d9a07c7f896eaf0a334e36dce907146c5891206ea44afc7c07" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.186325 4958 util.go:48] "No ready sandbox for pod can be found. 
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.205244 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr275\" (UniqueName: \"kubernetes.io/projected/0f6a6f1b-c5d6-426e-bed7-78a47963612a-kube-api-access-kr275\") on node \"crc\" DevicePath \"\""
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.205288 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.205298 4958 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.205307 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.205316 4958 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f6a6f1b-c5d6-426e-bed7-78a47963612a-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.322221 4958 scope.go:117] "RemoveContainer" containerID="d021302fb9743c4a153ff96567f09c5c758bef57956703aa07e5f1c3d48d0594"
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.328426 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-config-data" (OuterVolumeSpecName: "config-data") pod "0f6a6f1b-c5d6-426e-bed7-78a47963612a" (UID: "0f6a6f1b-c5d6-426e-bed7-78a47963612a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.376094 4958 scope.go:117] "RemoveContainer" containerID="8ebbe6f7368ec7d9a07c7f896eaf0a334e36dce907146c5891206ea44afc7c07" Dec 01 11:42:37 crc kubenswrapper[4958]: E1201 11:42:37.379406 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ebbe6f7368ec7d9a07c7f896eaf0a334e36dce907146c5891206ea44afc7c07\": container with ID starting with 8ebbe6f7368ec7d9a07c7f896eaf0a334e36dce907146c5891206ea44afc7c07 not found: ID does not exist" containerID="8ebbe6f7368ec7d9a07c7f896eaf0a334e36dce907146c5891206ea44afc7c07" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.379464 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ebbe6f7368ec7d9a07c7f896eaf0a334e36dce907146c5891206ea44afc7c07"} err="failed to get container status \"8ebbe6f7368ec7d9a07c7f896eaf0a334e36dce907146c5891206ea44afc7c07\": rpc error: code = NotFound desc = could not find container \"8ebbe6f7368ec7d9a07c7f896eaf0a334e36dce907146c5891206ea44afc7c07\": container with ID starting with 8ebbe6f7368ec7d9a07c7f896eaf0a334e36dce907146c5891206ea44afc7c07 not found: ID does not exist" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.379497 4958 scope.go:117] "RemoveContainer" containerID="d021302fb9743c4a153ff96567f09c5c758bef57956703aa07e5f1c3d48d0594" Dec 01 11:42:37 crc kubenswrapper[4958]: E1201 11:42:37.381307 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d021302fb9743c4a153ff96567f09c5c758bef57956703aa07e5f1c3d48d0594\": container with ID starting with d021302fb9743c4a153ff96567f09c5c758bef57956703aa07e5f1c3d48d0594 not found: ID does not exist" containerID="d021302fb9743c4a153ff96567f09c5c758bef57956703aa07e5f1c3d48d0594" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.381352 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d021302fb9743c4a153ff96567f09c5c758bef57956703aa07e5f1c3d48d0594"} err="failed to get container status \"d021302fb9743c4a153ff96567f09c5c758bef57956703aa07e5f1c3d48d0594\": rpc error: code = NotFound desc = could not find container \"d021302fb9743c4a153ff96567f09c5c758bef57956703aa07e5f1c3d48d0594\": container with ID starting with d021302fb9743c4a153ff96567f09c5c758bef57956703aa07e5f1c3d48d0594 not found: ID does not exist" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.408350 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6a6f1b-c5d6-426e-bed7-78a47963612a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.539298 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.548163 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.560811 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 11:42:37 crc kubenswrapper[4958]: E1201 11:42:37.561300 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6a6f1b-c5d6-426e-bed7-78a47963612a" containerName="cinder-scheduler" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.561326 4958 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0f6a6f1b-c5d6-426e-bed7-78a47963612a" containerName="cinder-scheduler" Dec 01 11:42:37 crc kubenswrapper[4958]: E1201 11:42:37.561348 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6a6f1b-c5d6-426e-bed7-78a47963612a" containerName="probe" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.561355 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6a6f1b-c5d6-426e-bed7-78a47963612a" containerName="probe" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.561555 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f6a6f1b-c5d6-426e-bed7-78a47963612a" containerName="cinder-scheduler" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.561585 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f6a6f1b-c5d6-426e-bed7-78a47963612a" containerName="probe" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.562765 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.573498 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.576638 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.612420 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116efa6e-e426-4550-a0e5-c36fbc3f4198-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"116efa6e-e426-4550-a0e5-c36fbc3f4198\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.612472 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116efa6e-e426-4550-a0e5-c36fbc3f4198-scripts\") pod \"cinder-scheduler-0\" (UID: \"116efa6e-e426-4550-a0e5-c36fbc3f4198\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.612509 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/116efa6e-e426-4550-a0e5-c36fbc3f4198-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"116efa6e-e426-4550-a0e5-c36fbc3f4198\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.612588 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/116efa6e-e426-4550-a0e5-c36fbc3f4198-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"116efa6e-e426-4550-a0e5-c36fbc3f4198\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.612609 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtbrb\" (UniqueName: \"kubernetes.io/projected/116efa6e-e426-4550-a0e5-c36fbc3f4198-kube-api-access-vtbrb\") pod \"cinder-scheduler-0\" (UID: \"116efa6e-e426-4550-a0e5-c36fbc3f4198\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.612674 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/116efa6e-e426-4550-a0e5-c36fbc3f4198-config-data\") pod \"cinder-scheduler-0\" (UID: \"116efa6e-e426-4550-a0e5-c36fbc3f4198\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.714318 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116efa6e-e426-4550-a0e5-c36fbc3f4198-scripts\") pod \"cinder-scheduler-0\" (UID: \"116efa6e-e426-4550-a0e5-c36fbc3f4198\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.714748 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116efa6e-e426-4550-a0e5-c36fbc3f4198-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"116efa6e-e426-4550-a0e5-c36fbc3f4198\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.714915 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/116efa6e-e426-4550-a0e5-c36fbc3f4198-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"116efa6e-e426-4550-a0e5-c36fbc3f4198\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.715105 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/116efa6e-e426-4550-a0e5-c36fbc3f4198-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"116efa6e-e426-4550-a0e5-c36fbc3f4198\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.715220 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtbrb\" (UniqueName: \"kubernetes.io/projected/116efa6e-e426-4550-a0e5-c36fbc3f4198-kube-api-access-vtbrb\") pod \"cinder-scheduler-0\" (UID: \"116efa6e-e426-4550-a0e5-c36fbc3f4198\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.715365 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116efa6e-e426-4550-a0e5-c36fbc3f4198-config-data\") pod \"cinder-scheduler-0\" (UID: \"116efa6e-e426-4550-a0e5-c36fbc3f4198\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.715181 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/116efa6e-e426-4550-a0e5-c36fbc3f4198-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"116efa6e-e426-4550-a0e5-c36fbc3f4198\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.718096 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116efa6e-e426-4550-a0e5-c36fbc3f4198-scripts\") pod \"cinder-scheduler-0\" (UID: \"116efa6e-e426-4550-a0e5-c36fbc3f4198\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.718247 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116efa6e-e426-4550-a0e5-c36fbc3f4198-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"116efa6e-e426-4550-a0e5-c36fbc3f4198\") " pod="openstack/cinder-scheduler-0" Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.719868 4958 
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.720486 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/116efa6e-e426-4550-a0e5-c36fbc3f4198-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"116efa6e-e426-4550-a0e5-c36fbc3f4198\") " pod="openstack/cinder-scheduler-0"
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.732227 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtbrb\" (UniqueName: \"kubernetes.io/projected/116efa6e-e426-4550-a0e5-c36fbc3f4198-kube-api-access-vtbrb\") pod \"cinder-scheduler-0\" (UID: \"116efa6e-e426-4550-a0e5-c36fbc3f4198\") " pod="openstack/cinder-scheduler-0"
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.808322 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f6a6f1b-c5d6-426e-bed7-78a47963612a" path="/var/lib/kubelet/pods/0f6a6f1b-c5d6-426e-bed7-78a47963612a/volumes"
Dec 01 11:42:37 crc kubenswrapper[4958]: I1201 11:42:37.939655 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 01 11:42:38 crc kubenswrapper[4958]: W1201 11:42:38.417983 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod116efa6e_e426_4550_a0e5_c36fbc3f4198.slice/crio-fd3e85eae5adc956c493afe13fd4da7fe0050530d9171e4f966b7dd96285ad54 WatchSource:0}: Error finding container fd3e85eae5adc956c493afe13fd4da7fe0050530d9171e4f966b7dd96285ad54: Status 404 returned error can't find the container with id fd3e85eae5adc956c493afe13fd4da7fe0050530d9171e4f966b7dd96285ad54
Dec 01 11:42:38 crc kubenswrapper[4958]: I1201 11:42:38.418082 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 01 11:42:39 crc kubenswrapper[4958]: I1201 11:42:39.222657 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"116efa6e-e426-4550-a0e5-c36fbc3f4198","Type":"ContainerStarted","Data":"119963a5c2553afd076fa80e4299b10a6fa2ce1b3d7b40bda4ff3e6461e3ea56"}
Dec 01 11:42:39 crc kubenswrapper[4958]: I1201 11:42:39.222971 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"116efa6e-e426-4550-a0e5-c36fbc3f4198","Type":"ContainerStarted","Data":"fd3e85eae5adc956c493afe13fd4da7fe0050530d9171e4f966b7dd96285ad54"}
Dec 01 11:42:40 crc kubenswrapper[4958]: I1201 11:42:40.234389 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"116efa6e-e426-4550-a0e5-c36fbc3f4198","Type":"ContainerStarted","Data":"19229c848e7e9de2cd2f2f59694886661d65aa050262047ca33951171d4c3100"}
Dec 01 11:42:40 crc kubenswrapper[4958]: I1201 11:42:40.264385 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.264366736 podStartE2EDuration="3.264366736s" podCreationTimestamp="2025-12-01 11:42:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:42:40.259455707 +0000 UTC m=+6207.768244744" watchObservedRunningTime="2025-12-01 11:42:40.264366736 +0000 UTC m=+6207.773155773"
Dec 01 11:42:41 crc kubenswrapper[4958]: I1201 11:42:41.722271 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Dec 01 11:42:42 crc kubenswrapper[4958]: I1201 11:42:42.942005 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Dec 01 11:42:48 crc kubenswrapper[4958]: I1201 11:42:48.179687 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Dec 01 11:42:58 crc kubenswrapper[4958]: I1201 11:42:58.210768 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 11:42:58 crc kubenswrapper[4958]: I1201 11:42:58.211519 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 11:42:58 crc kubenswrapper[4958]: I1201 11:42:58.211608 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7"
Dec 01 11:42:58 crc kubenswrapper[4958]: I1201 11:42:58.212895 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13cb6cc4ee40481ad64b2a05cd4d173653f521017a8a82d0d6cdc46228d66885"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 11:42:58 crc kubenswrapper[4958]: I1201 11:42:58.213014 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://13cb6cc4ee40481ad64b2a05cd4d173653f521017a8a82d0d6cdc46228d66885" gracePeriod=600
Dec 01 11:42:58 crc kubenswrapper[4958]: I1201 11:42:58.455303 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="13cb6cc4ee40481ad64b2a05cd4d173653f521017a8a82d0d6cdc46228d66885" exitCode=0
Dec 01 11:42:58 crc kubenswrapper[4958]: I1201 11:42:58.455369 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"13cb6cc4ee40481ad64b2a05cd4d173653f521017a8a82d0d6cdc46228d66885"}
Dec 01 11:42:58 crc kubenswrapper[4958]: I1201 11:42:58.455405 4958 scope.go:117] "RemoveContainer" containerID="81d2c11cffc93130bd3faacc5b56e064fd5d14be67e26852c8ff0a818fc768dc"
Dec 01 11:42:59 crc kubenswrapper[4958]: I1201 11:42:59.472648 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28"}
Dec 01 11:43:15 crc kubenswrapper[4958]: I1201 11:43:15.077013 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-lz786"]
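The block above is the complete liveness-restart cycle for machine-config-daemon-prmw7: patch_prober records the raw HTTP failure against 127.0.0.1:8798/health, prober.go marks the Liveness probe failed, kuberuntime decides "will be restarted" and kills the container with the pod's 600s grace period, and PLEG then reports ContainerDied followed by ContainerStarted for the replacement. A minimal standalone loop with the same shape; the period and threshold here are kubelet's documented defaults rather than values read from this pod's spec, and kubelet counts HTTP 200-399 as success:

package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	const (
		url              = "http://127.0.0.1:8798/health" // endpoint from the entries above
		period           = 10 * time.Second               // kubelet default periodSeconds
		failureThreshold = 3                              // kubelet default failureThreshold
	)
	client := &http.Client{Timeout: time.Second}
	probe := func() bool {
		resp, err := client.Get(url)
		if err != nil {
			return false // e.g. "connect: connection refused", as logged above
		}
		defer resp.Body.Close()
		return resp.StatusCode >= 200 && resp.StatusCode < 400
	}
	failures := 0
	for {
		if probe() {
			failures = 0 // a single success resets the counter
		} else {
			failures++
			fmt.Printf("probe failed (%d/%d)\n", failures, failureThreshold)
			if failures >= failureThreshold {
				fmt.Println("threshold crossed: kubelet would kill and restart the container here")
				return
			}
		}
		time.Sleep(period)
	}
}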
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-lz786"] Dec 01 11:43:15 crc kubenswrapper[4958]: I1201 11:43:15.087833 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-lz786"] Dec 01 11:43:15 crc kubenswrapper[4958]: I1201 11:43:15.822598 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6bd4921-dbde-4dc2-9396-c9892bba83cc" path="/var/lib/kubelet/pods/c6bd4921-dbde-4dc2-9396-c9892bba83cc/volumes" Dec 01 11:43:24 crc kubenswrapper[4958]: I1201 11:43:24.675495 4958 scope.go:117] "RemoveContainer" containerID="7832d518f10a8ce29b134e30b4e0bac59d73d3813683aa11fac3457f0effadaa" Dec 01 11:43:24 crc kubenswrapper[4958]: I1201 11:43:24.710255 4958 scope.go:117] "RemoveContainer" containerID="e398332e9682cf3fe35ab5e59943ed9934ba5cbc0b9d60d5b61a949e68e77e6e" Dec 01 11:43:24 crc kubenswrapper[4958]: I1201 11:43:24.771407 4958 scope.go:117] "RemoveContainer" containerID="d623411e61fe5a380262bc7dbb07f939e024bb06503a30af78500a00bd17d4a9" Dec 01 11:43:25 crc kubenswrapper[4958]: I1201 11:43:25.057417 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4ace-account-create-gpn2t"] Dec 01 11:43:25 crc kubenswrapper[4958]: I1201 11:43:25.068070 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4ace-account-create-gpn2t"] Dec 01 11:43:25 crc kubenswrapper[4958]: I1201 11:43:25.818440 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d392c2fc-42de-4446-98e7-9f6cd36b696d" path="/var/lib/kubelet/pods/d392c2fc-42de-4446-98e7-9f6cd36b696d/volumes" Dec 01 11:43:32 crc kubenswrapper[4958]: I1201 11:43:32.052302 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-7g42p"] Dec 01 11:43:32 crc kubenswrapper[4958]: I1201 11:43:32.067987 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-7g42p"] Dec 01 11:43:33 crc kubenswrapper[4958]: I1201 11:43:33.812407 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d28e16e-7406-4b48-be5b-8afb83ac7e5b" path="/var/lib/kubelet/pods/7d28e16e-7406-4b48-be5b-8afb83ac7e5b/volumes" Dec 01 11:43:46 crc kubenswrapper[4958]: I1201 11:43:46.053557 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-flqzt"] Dec 01 11:43:46 crc kubenswrapper[4958]: I1201 11:43:46.065244 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-flqzt"] Dec 01 11:43:47 crc kubenswrapper[4958]: I1201 11:43:47.813933 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda" path="/var/lib/kubelet/pods/9777bf20-0ee7-4ac3-bd6e-3e21fdf7bbda/volumes" Dec 01 11:44:24 crc kubenswrapper[4958]: I1201 11:44:24.912685 4958 scope.go:117] "RemoveContainer" containerID="c8c7f98ebcd23e516dd019572137279d96813d84f44c473e20c231f4f1764345" Dec 01 11:44:24 crc kubenswrapper[4958]: I1201 11:44:24.958277 4958 scope.go:117] "RemoveContainer" containerID="eac53bcf8d62ebfe124c5752f43fa899bbf88da6dfb347a889ef8083c7649fff" Dec 01 11:44:25 crc kubenswrapper[4958]: I1201 11:44:25.052946 4958 scope.go:117] "RemoveContainer" containerID="0c01c075269e1b45d04c4992417487ffb1df8638ff12160ee591c14a222803fb" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.248624 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fnvkx"] Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.251157 4958 util.go:30] "No sandbox for pod 
Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.253169 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-p9v5q"
Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.255104 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.258792 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-dzs42"]
Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.263592 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-dzs42"
Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.274264 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fnvkx"]
Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.285905 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dzs42"]
Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.364727 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8wmc\" (UniqueName: \"kubernetes.io/projected/2fca0010-c346-4525-8552-489b7bfc0942-kube-api-access-f8wmc\") pod \"ovn-controller-fnvkx\" (UID: \"2fca0010-c346-4525-8552-489b7bfc0942\") " pod="openstack/ovn-controller-fnvkx"
Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.364834 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2fca0010-c346-4525-8552-489b7bfc0942-var-log-ovn\") pod \"ovn-controller-fnvkx\" (UID: \"2fca0010-c346-4525-8552-489b7bfc0942\") " pod="openstack/ovn-controller-fnvkx"
Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.365056 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-var-log\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42"
Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.365111 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fca0010-c346-4525-8552-489b7bfc0942-scripts\") pod \"ovn-controller-fnvkx\" (UID: \"2fca0010-c346-4525-8552-489b7bfc0942\") " pod="openstack/ovn-controller-fnvkx"
Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.365166 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m5v4\" (UniqueName: \"kubernetes.io/projected/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-kube-api-access-8m5v4\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42"
Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.365363 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2fca0010-c346-4525-8552-489b7bfc0942-var-run-ovn\") pod \"ovn-controller-fnvkx\" (UID: \"2fca0010-c346-4525-8552-489b7bfc0942\") " pod="openstack/ovn-controller-fnvkx"
Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.365537 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-scripts\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42"
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-scripts\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.365595 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-var-lib\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.365653 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-var-run\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.365676 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2fca0010-c346-4525-8552-489b7bfc0942-var-run\") pod \"ovn-controller-fnvkx\" (UID: \"2fca0010-c346-4525-8552-489b7bfc0942\") " pod="openstack/ovn-controller-fnvkx" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.365734 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-etc-ovs\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.467722 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2fca0010-c346-4525-8552-489b7bfc0942-var-log-ovn\") pod \"ovn-controller-fnvkx\" (UID: \"2fca0010-c346-4525-8552-489b7bfc0942\") " pod="openstack/ovn-controller-fnvkx" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.467775 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-var-log\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.468231 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-var-log\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.468270 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2fca0010-c346-4525-8552-489b7bfc0942-var-log-ovn\") pod \"ovn-controller-fnvkx\" (UID: \"2fca0010-c346-4525-8552-489b7bfc0942\") " pod="openstack/ovn-controller-fnvkx" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.468331 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fca0010-c346-4525-8552-489b7bfc0942-scripts\") pod \"ovn-controller-fnvkx\" 
(UID: \"2fca0010-c346-4525-8552-489b7bfc0942\") " pod="openstack/ovn-controller-fnvkx" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.468437 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m5v4\" (UniqueName: \"kubernetes.io/projected/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-kube-api-access-8m5v4\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.468812 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2fca0010-c346-4525-8552-489b7bfc0942-var-run-ovn\") pod \"ovn-controller-fnvkx\" (UID: \"2fca0010-c346-4525-8552-489b7bfc0942\") " pod="openstack/ovn-controller-fnvkx" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.468924 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2fca0010-c346-4525-8552-489b7bfc0942-var-run-ovn\") pod \"ovn-controller-fnvkx\" (UID: \"2fca0010-c346-4525-8552-489b7bfc0942\") " pod="openstack/ovn-controller-fnvkx" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.469020 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-scripts\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.471391 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fca0010-c346-4525-8552-489b7bfc0942-scripts\") pod \"ovn-controller-fnvkx\" (UID: \"2fca0010-c346-4525-8552-489b7bfc0942\") " pod="openstack/ovn-controller-fnvkx" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.471644 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-scripts\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.471729 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-var-lib\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.471860 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-var-lib\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.471756 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-var-run\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.471933 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/2fca0010-c346-4525-8552-489b7bfc0942-var-run\") pod \"ovn-controller-fnvkx\" (UID: \"2fca0010-c346-4525-8552-489b7bfc0942\") " pod="openstack/ovn-controller-fnvkx" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.472024 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-var-run\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.472065 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2fca0010-c346-4525-8552-489b7bfc0942-var-run\") pod \"ovn-controller-fnvkx\" (UID: \"2fca0010-c346-4525-8552-489b7bfc0942\") " pod="openstack/ovn-controller-fnvkx" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.472185 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-etc-ovs\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.472295 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-etc-ovs\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.472423 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8wmc\" (UniqueName: \"kubernetes.io/projected/2fca0010-c346-4525-8552-489b7bfc0942-kube-api-access-f8wmc\") pod \"ovn-controller-fnvkx\" (UID: \"2fca0010-c346-4525-8552-489b7bfc0942\") " pod="openstack/ovn-controller-fnvkx" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.493725 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m5v4\" (UniqueName: \"kubernetes.io/projected/68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce-kube-api-access-8m5v4\") pod \"ovn-controller-ovs-dzs42\" (UID: \"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce\") " pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.496415 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8wmc\" (UniqueName: \"kubernetes.io/projected/2fca0010-c346-4525-8552-489b7bfc0942-kube-api-access-f8wmc\") pod \"ovn-controller-fnvkx\" (UID: \"2fca0010-c346-4525-8552-489b7bfc0942\") " pod="openstack/ovn-controller-fnvkx" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.578718 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fnvkx" Dec 01 11:44:30 crc kubenswrapper[4958]: I1201 11:44:30.594490 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:31 crc kubenswrapper[4958]: I1201 11:44:31.088877 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fnvkx"] Dec 01 11:44:31 crc kubenswrapper[4958]: I1201 11:44:31.607332 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fnvkx" event={"ID":"2fca0010-c346-4525-8552-489b7bfc0942","Type":"ContainerStarted","Data":"fbe171b8c8b6dcc74ccdb2686be3ef173768a4d2c5400d5d467e6d65f13378de"} Dec 01 11:44:31 crc kubenswrapper[4958]: I1201 11:44:31.607619 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-fnvkx" Dec 01 11:44:31 crc kubenswrapper[4958]: I1201 11:44:31.607632 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fnvkx" event={"ID":"2fca0010-c346-4525-8552-489b7bfc0942","Type":"ContainerStarted","Data":"240d18747eb6526a7a4fd062593130d1012df7daa613158dd0790eb7f803a0da"} Dec 01 11:44:31 crc kubenswrapper[4958]: W1201 11:44:31.639081 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68a40eb9_b08b_46ae_b1ec_ae7f1a1801ce.slice/crio-8defb25d7374d703a71b8a50fa9cd7623141646837d7660baf72e271daaf50c8 WatchSource:0}: Error finding container 8defb25d7374d703a71b8a50fa9cd7623141646837d7660baf72e271daaf50c8: Status 404 returned error can't find the container with id 8defb25d7374d703a71b8a50fa9cd7623141646837d7660baf72e271daaf50c8 Dec 01 11:44:31 crc kubenswrapper[4958]: I1201 11:44:31.639166 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dzs42"] Dec 01 11:44:31 crc kubenswrapper[4958]: I1201 11:44:31.639766 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-fnvkx" podStartSLOduration=1.639749359 podStartE2EDuration="1.639749359s" podCreationTimestamp="2025-12-01 11:44:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:44:31.626320718 +0000 UTC m=+6319.135109755" watchObservedRunningTime="2025-12-01 11:44:31.639749359 +0000 UTC m=+6319.148538406" Dec 01 11:44:31 crc kubenswrapper[4958]: I1201 11:44:31.792827 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-rzhwg"] Dec 01 11:44:31 crc kubenswrapper[4958]: I1201 11:44:31.794395 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-rzhwg" Dec 01 11:44:31 crc kubenswrapper[4958]: I1201 11:44:31.799066 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 01 11:44:31 crc kubenswrapper[4958]: I1201 11:44:31.822202 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rzhwg"] Dec 01 11:44:31 crc kubenswrapper[4958]: I1201 11:44:31.900623 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9ded0be-0f8e-4c43-9a78-25bc81069adb-config\") pod \"ovn-controller-metrics-rzhwg\" (UID: \"d9ded0be-0f8e-4c43-9a78-25bc81069adb\") " pod="openstack/ovn-controller-metrics-rzhwg" Dec 01 11:44:31 crc kubenswrapper[4958]: I1201 11:44:31.900749 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d9ded0be-0f8e-4c43-9a78-25bc81069adb-ovn-rundir\") pod \"ovn-controller-metrics-rzhwg\" (UID: \"d9ded0be-0f8e-4c43-9a78-25bc81069adb\") " pod="openstack/ovn-controller-metrics-rzhwg" Dec 01 11:44:31 crc kubenswrapper[4958]: I1201 11:44:31.900893 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhk6s\" (UniqueName: \"kubernetes.io/projected/d9ded0be-0f8e-4c43-9a78-25bc81069adb-kube-api-access-rhk6s\") pod \"ovn-controller-metrics-rzhwg\" (UID: \"d9ded0be-0f8e-4c43-9a78-25bc81069adb\") " pod="openstack/ovn-controller-metrics-rzhwg" Dec 01 11:44:31 crc kubenswrapper[4958]: I1201 11:44:31.901002 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d9ded0be-0f8e-4c43-9a78-25bc81069adb-ovs-rundir\") pod \"ovn-controller-metrics-rzhwg\" (UID: \"d9ded0be-0f8e-4c43-9a78-25bc81069adb\") " pod="openstack/ovn-controller-metrics-rzhwg" Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.002671 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d9ded0be-0f8e-4c43-9a78-25bc81069adb-ovn-rundir\") pod \"ovn-controller-metrics-rzhwg\" (UID: \"d9ded0be-0f8e-4c43-9a78-25bc81069adb\") " pod="openstack/ovn-controller-metrics-rzhwg" Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.002782 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhk6s\" (UniqueName: \"kubernetes.io/projected/d9ded0be-0f8e-4c43-9a78-25bc81069adb-kube-api-access-rhk6s\") pod \"ovn-controller-metrics-rzhwg\" (UID: \"d9ded0be-0f8e-4c43-9a78-25bc81069adb\") " pod="openstack/ovn-controller-metrics-rzhwg" Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.002829 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d9ded0be-0f8e-4c43-9a78-25bc81069adb-ovs-rundir\") pod \"ovn-controller-metrics-rzhwg\" (UID: \"d9ded0be-0f8e-4c43-9a78-25bc81069adb\") " pod="openstack/ovn-controller-metrics-rzhwg" Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.002896 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9ded0be-0f8e-4c43-9a78-25bc81069adb-config\") pod \"ovn-controller-metrics-rzhwg\" (UID: \"d9ded0be-0f8e-4c43-9a78-25bc81069adb\") " 
pod="openstack/ovn-controller-metrics-rzhwg" Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.003101 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d9ded0be-0f8e-4c43-9a78-25bc81069adb-ovn-rundir\") pod \"ovn-controller-metrics-rzhwg\" (UID: \"d9ded0be-0f8e-4c43-9a78-25bc81069adb\") " pod="openstack/ovn-controller-metrics-rzhwg" Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.003475 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d9ded0be-0f8e-4c43-9a78-25bc81069adb-ovs-rundir\") pod \"ovn-controller-metrics-rzhwg\" (UID: \"d9ded0be-0f8e-4c43-9a78-25bc81069adb\") " pod="openstack/ovn-controller-metrics-rzhwg" Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.003563 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9ded0be-0f8e-4c43-9a78-25bc81069adb-config\") pod \"ovn-controller-metrics-rzhwg\" (UID: \"d9ded0be-0f8e-4c43-9a78-25bc81069adb\") " pod="openstack/ovn-controller-metrics-rzhwg" Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.030454 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhk6s\" (UniqueName: \"kubernetes.io/projected/d9ded0be-0f8e-4c43-9a78-25bc81069adb-kube-api-access-rhk6s\") pod \"ovn-controller-metrics-rzhwg\" (UID: \"d9ded0be-0f8e-4c43-9a78-25bc81069adb\") " pod="openstack/ovn-controller-metrics-rzhwg" Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.126401 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-rzhwg" Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.272334 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-frscf"] Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.280359 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-frscf" Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.322408 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-frscf"] Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.419419 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pjf8\" (UniqueName: \"kubernetes.io/projected/de61ffcf-b1db-4649-8522-eccbdf42869f-kube-api-access-2pjf8\") pod \"octavia-db-create-frscf\" (UID: \"de61ffcf-b1db-4649-8522-eccbdf42869f\") " pod="openstack/octavia-db-create-frscf" Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.521365 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pjf8\" (UniqueName: \"kubernetes.io/projected/de61ffcf-b1db-4649-8522-eccbdf42869f-kube-api-access-2pjf8\") pod \"octavia-db-create-frscf\" (UID: \"de61ffcf-b1db-4649-8522-eccbdf42869f\") " pod="openstack/octavia-db-create-frscf" Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.541259 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pjf8\" (UniqueName: \"kubernetes.io/projected/de61ffcf-b1db-4649-8522-eccbdf42869f-kube-api-access-2pjf8\") pod \"octavia-db-create-frscf\" (UID: \"de61ffcf-b1db-4649-8522-eccbdf42869f\") " pod="openstack/octavia-db-create-frscf" Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.587590 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rzhwg"] Dec 01 11:44:32 crc kubenswrapper[4958]: W1201 11:44:32.590349 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9ded0be_0f8e_4c43_9a78_25bc81069adb.slice/crio-9d3babcb28462908fed45d13332693906c9f619efa467b173543026198db01cb WatchSource:0}: Error finding container 9d3babcb28462908fed45d13332693906c9f619efa467b173543026198db01cb: Status 404 returned error can't find the container with id 9d3babcb28462908fed45d13332693906c9f619efa467b173543026198db01cb Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.606607 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-frscf" Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.639999 4958 generic.go:334] "Generic (PLEG): container finished" podID="68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce" containerID="47fcabc8412415a4489bf608640bf8341afc01695e1fc5560a3f249540ff10af" exitCode=0 Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.640067 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dzs42" event={"ID":"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce","Type":"ContainerDied","Data":"47fcabc8412415a4489bf608640bf8341afc01695e1fc5560a3f249540ff10af"} Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.640097 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dzs42" event={"ID":"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce","Type":"ContainerStarted","Data":"8defb25d7374d703a71b8a50fa9cd7623141646837d7660baf72e271daaf50c8"} Dec 01 11:44:32 crc kubenswrapper[4958]: I1201 11:44:32.646470 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rzhwg" event={"ID":"d9ded0be-0f8e-4c43-9a78-25bc81069adb","Type":"ContainerStarted","Data":"9d3babcb28462908fed45d13332693906c9f619efa467b173543026198db01cb"} Dec 01 11:44:33 crc kubenswrapper[4958]: I1201 11:44:33.108329 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-frscf"] Dec 01 11:44:33 crc kubenswrapper[4958]: W1201 11:44:33.122015 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde61ffcf_b1db_4649_8522_eccbdf42869f.slice/crio-a7d5eec3add852d0fce9d8d6dc7bc4dffc2c579e755b4061deead5a66f5c57a4 WatchSource:0}: Error finding container a7d5eec3add852d0fce9d8d6dc7bc4dffc2c579e755b4061deead5a66f5c57a4: Status 404 returned error can't find the container with id a7d5eec3add852d0fce9d8d6dc7bc4dffc2c579e755b4061deead5a66f5c57a4 Dec 01 11:44:33 crc kubenswrapper[4958]: I1201 11:44:33.660431 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dzs42" event={"ID":"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce","Type":"ContainerStarted","Data":"60c77a234a77177404b3faeb5bf3bab851a873b635c7384673c8a03a97427821"} Dec 01 11:44:33 crc kubenswrapper[4958]: I1201 11:44:33.660496 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dzs42" event={"ID":"68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce","Type":"ContainerStarted","Data":"cfd100760654ca6e4974b597463d64420598538a10b162e2522e12bd8c02c37a"} Dec 01 11:44:33 crc kubenswrapper[4958]: I1201 11:44:33.661082 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:33 crc kubenswrapper[4958]: I1201 11:44:33.661182 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dzs42" Dec 01 11:44:33 crc kubenswrapper[4958]: I1201 11:44:33.663715 4958 generic.go:334] "Generic (PLEG): container finished" podID="de61ffcf-b1db-4649-8522-eccbdf42869f" containerID="5da01f982817a71d01781cea60a1a8e7a109f0bbdf407734abd63cc7b68723e6" exitCode=0 Dec 01 11:44:33 crc kubenswrapper[4958]: I1201 11:44:33.663784 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-frscf" event={"ID":"de61ffcf-b1db-4649-8522-eccbdf42869f","Type":"ContainerDied","Data":"5da01f982817a71d01781cea60a1a8e7a109f0bbdf407734abd63cc7b68723e6"} Dec 01 11:44:33 crc kubenswrapper[4958]: I1201 11:44:33.663803 4958 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-frscf" event={"ID":"de61ffcf-b1db-4649-8522-eccbdf42869f","Type":"ContainerStarted","Data":"a7d5eec3add852d0fce9d8d6dc7bc4dffc2c579e755b4061deead5a66f5c57a4"} Dec 01 11:44:33 crc kubenswrapper[4958]: I1201 11:44:33.665821 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rzhwg" event={"ID":"d9ded0be-0f8e-4c43-9a78-25bc81069adb","Type":"ContainerStarted","Data":"9ae430948560c8b5124c9772a540608858000da3059532a0dfc10957ffdfefe8"} Dec 01 11:44:33 crc kubenswrapper[4958]: I1201 11:44:33.689386 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-dzs42" podStartSLOduration=3.689360252 podStartE2EDuration="3.689360252s" podCreationTimestamp="2025-12-01 11:44:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:44:33.686879781 +0000 UTC m=+6321.195668838" watchObservedRunningTime="2025-12-01 11:44:33.689360252 +0000 UTC m=+6321.198149309" Dec 01 11:44:33 crc kubenswrapper[4958]: I1201 11:44:33.733609 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-rzhwg" podStartSLOduration=2.733589117 podStartE2EDuration="2.733589117s" podCreationTimestamp="2025-12-01 11:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:44:33.726430844 +0000 UTC m=+6321.235219881" watchObservedRunningTime="2025-12-01 11:44:33.733589117 +0000 UTC m=+6321.242378154" Dec 01 11:44:35 crc kubenswrapper[4958]: I1201 11:44:35.136401 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-frscf" Dec 01 11:44:35 crc kubenswrapper[4958]: I1201 11:44:35.178745 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pjf8\" (UniqueName: \"kubernetes.io/projected/de61ffcf-b1db-4649-8522-eccbdf42869f-kube-api-access-2pjf8\") pod \"de61ffcf-b1db-4649-8522-eccbdf42869f\" (UID: \"de61ffcf-b1db-4649-8522-eccbdf42869f\") " Dec 01 11:44:35 crc kubenswrapper[4958]: I1201 11:44:35.189253 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de61ffcf-b1db-4649-8522-eccbdf42869f-kube-api-access-2pjf8" (OuterVolumeSpecName: "kube-api-access-2pjf8") pod "de61ffcf-b1db-4649-8522-eccbdf42869f" (UID: "de61ffcf-b1db-4649-8522-eccbdf42869f"). InnerVolumeSpecName "kube-api-access-2pjf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:44:35 crc kubenswrapper[4958]: I1201 11:44:35.280661 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pjf8\" (UniqueName: \"kubernetes.io/projected/de61ffcf-b1db-4649-8522-eccbdf42869f-kube-api-access-2pjf8\") on node \"crc\" DevicePath \"\"" Dec 01 11:44:35 crc kubenswrapper[4958]: I1201 11:44:35.686437 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-frscf" event={"ID":"de61ffcf-b1db-4649-8522-eccbdf42869f","Type":"ContainerDied","Data":"a7d5eec3add852d0fce9d8d6dc7bc4dffc2c579e755b4061deead5a66f5c57a4"} Dec 01 11:44:35 crc kubenswrapper[4958]: I1201 11:44:35.686897 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7d5eec3add852d0fce9d8d6dc7bc4dffc2c579e755b4061deead5a66f5c57a4" Dec 01 11:44:35 crc kubenswrapper[4958]: I1201 11:44:35.686673 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-frscf" Dec 01 11:44:40 crc kubenswrapper[4958]: I1201 11:44:40.301797 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k5f9n"] Dec 01 11:44:40 crc kubenswrapper[4958]: E1201 11:44:40.303472 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de61ffcf-b1db-4649-8522-eccbdf42869f" containerName="mariadb-database-create" Dec 01 11:44:40 crc kubenswrapper[4958]: I1201 11:44:40.303517 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="de61ffcf-b1db-4649-8522-eccbdf42869f" containerName="mariadb-database-create" Dec 01 11:44:40 crc kubenswrapper[4958]: I1201 11:44:40.304305 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="de61ffcf-b1db-4649-8522-eccbdf42869f" containerName="mariadb-database-create" Dec 01 11:44:40 crc kubenswrapper[4958]: I1201 11:44:40.344779 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5f9n"] Dec 01 11:44:40 crc kubenswrapper[4958]: I1201 11:44:40.344969 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5f9n" Dec 01 11:44:40 crc kubenswrapper[4958]: I1201 11:44:40.450443 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv867\" (UniqueName: \"kubernetes.io/projected/12642132-efd1-4489-970b-0a188dd93df9-kube-api-access-qv867\") pod \"redhat-marketplace-k5f9n\" (UID: \"12642132-efd1-4489-970b-0a188dd93df9\") " pod="openshift-marketplace/redhat-marketplace-k5f9n" Dec 01 11:44:40 crc kubenswrapper[4958]: I1201 11:44:40.450679 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12642132-efd1-4489-970b-0a188dd93df9-catalog-content\") pod \"redhat-marketplace-k5f9n\" (UID: \"12642132-efd1-4489-970b-0a188dd93df9\") " pod="openshift-marketplace/redhat-marketplace-k5f9n" Dec 01 11:44:40 crc kubenswrapper[4958]: I1201 11:44:40.450897 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12642132-efd1-4489-970b-0a188dd93df9-utilities\") pod \"redhat-marketplace-k5f9n\" (UID: \"12642132-efd1-4489-970b-0a188dd93df9\") " pod="openshift-marketplace/redhat-marketplace-k5f9n" Dec 01 11:44:40 crc kubenswrapper[4958]: I1201 11:44:40.553204 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12642132-efd1-4489-970b-0a188dd93df9-catalog-content\") pod \"redhat-marketplace-k5f9n\" (UID: \"12642132-efd1-4489-970b-0a188dd93df9\") " pod="openshift-marketplace/redhat-marketplace-k5f9n" Dec 01 11:44:40 crc kubenswrapper[4958]: I1201 11:44:40.553277 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12642132-efd1-4489-970b-0a188dd93df9-utilities\") pod \"redhat-marketplace-k5f9n\" (UID: \"12642132-efd1-4489-970b-0a188dd93df9\") " pod="openshift-marketplace/redhat-marketplace-k5f9n" Dec 01 11:44:40 crc kubenswrapper[4958]: I1201 11:44:40.553413 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv867\" (UniqueName: \"kubernetes.io/projected/12642132-efd1-4489-970b-0a188dd93df9-kube-api-access-qv867\") pod \"redhat-marketplace-k5f9n\" (UID: \"12642132-efd1-4489-970b-0a188dd93df9\") " pod="openshift-marketplace/redhat-marketplace-k5f9n" Dec 01 11:44:40 crc kubenswrapper[4958]: I1201 11:44:40.554048 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12642132-efd1-4489-970b-0a188dd93df9-utilities\") pod \"redhat-marketplace-k5f9n\" (UID: \"12642132-efd1-4489-970b-0a188dd93df9\") " pod="openshift-marketplace/redhat-marketplace-k5f9n" Dec 01 11:44:40 crc kubenswrapper[4958]: I1201 11:44:40.554078 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12642132-efd1-4489-970b-0a188dd93df9-catalog-content\") pod \"redhat-marketplace-k5f9n\" (UID: \"12642132-efd1-4489-970b-0a188dd93df9\") " pod="openshift-marketplace/redhat-marketplace-k5f9n" Dec 01 11:44:40 crc kubenswrapper[4958]: I1201 11:44:40.577901 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv867\" (UniqueName: \"kubernetes.io/projected/12642132-efd1-4489-970b-0a188dd93df9-kube-api-access-qv867\") pod 
\"redhat-marketplace-k5f9n\" (UID: \"12642132-efd1-4489-970b-0a188dd93df9\") " pod="openshift-marketplace/redhat-marketplace-k5f9n" Dec 01 11:44:40 crc kubenswrapper[4958]: I1201 11:44:40.678311 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5f9n" Dec 01 11:44:41 crc kubenswrapper[4958]: I1201 11:44:41.213211 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5f9n"] Dec 01 11:44:41 crc kubenswrapper[4958]: I1201 11:44:41.783626 4958 generic.go:334] "Generic (PLEG): container finished" podID="12642132-efd1-4489-970b-0a188dd93df9" containerID="841f301e0e691edc3effbadde8e5e936ef049c6176d3103871c03518caf84c4a" exitCode=0 Dec 01 11:44:41 crc kubenswrapper[4958]: I1201 11:44:41.783712 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5f9n" event={"ID":"12642132-efd1-4489-970b-0a188dd93df9","Type":"ContainerDied","Data":"841f301e0e691edc3effbadde8e5e936ef049c6176d3103871c03518caf84c4a"} Dec 01 11:44:41 crc kubenswrapper[4958]: I1201 11:44:41.784091 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5f9n" event={"ID":"12642132-efd1-4489-970b-0a188dd93df9","Type":"ContainerStarted","Data":"1983fe0229e2cdac35a3d672cf0e718b04ece2b5e4badba49a173b6cbd66ef98"} Dec 01 11:44:41 crc kubenswrapper[4958]: I1201 11:44:41.791611 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 11:44:43 crc kubenswrapper[4958]: I1201 11:44:43.457663 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-00e0-account-create-27vdq"] Dec 01 11:44:43 crc kubenswrapper[4958]: I1201 11:44:43.459927 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-00e0-account-create-27vdq" Dec 01 11:44:43 crc kubenswrapper[4958]: I1201 11:44:43.464195 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Dec 01 11:44:43 crc kubenswrapper[4958]: I1201 11:44:43.473751 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-00e0-account-create-27vdq"] Dec 01 11:44:43 crc kubenswrapper[4958]: I1201 11:44:43.523997 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss9mc\" (UniqueName: \"kubernetes.io/projected/54ef053b-018e-437a-845e-b8b7737e4ff1-kube-api-access-ss9mc\") pod \"octavia-00e0-account-create-27vdq\" (UID: \"54ef053b-018e-437a-845e-b8b7737e4ff1\") " pod="openstack/octavia-00e0-account-create-27vdq" Dec 01 11:44:43 crc kubenswrapper[4958]: I1201 11:44:43.626916 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss9mc\" (UniqueName: \"kubernetes.io/projected/54ef053b-018e-437a-845e-b8b7737e4ff1-kube-api-access-ss9mc\") pod \"octavia-00e0-account-create-27vdq\" (UID: \"54ef053b-018e-437a-845e-b8b7737e4ff1\") " pod="openstack/octavia-00e0-account-create-27vdq" Dec 01 11:44:43 crc kubenswrapper[4958]: I1201 11:44:43.656819 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss9mc\" (UniqueName: \"kubernetes.io/projected/54ef053b-018e-437a-845e-b8b7737e4ff1-kube-api-access-ss9mc\") pod \"octavia-00e0-account-create-27vdq\" (UID: \"54ef053b-018e-437a-845e-b8b7737e4ff1\") " pod="openstack/octavia-00e0-account-create-27vdq" Dec 01 11:44:43 crc kubenswrapper[4958]: I1201 11:44:43.797083 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-00e0-account-create-27vdq" Dec 01 11:44:43 crc kubenswrapper[4958]: I1201 11:44:43.812429 4958 generic.go:334] "Generic (PLEG): container finished" podID="12642132-efd1-4489-970b-0a188dd93df9" containerID="b6e5247efc58fb4ec80e7ce40d6b01fca9105fd99780d3ff0afa94a4439e09a6" exitCode=0 Dec 01 11:44:43 crc kubenswrapper[4958]: I1201 11:44:43.817219 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5f9n" event={"ID":"12642132-efd1-4489-970b-0a188dd93df9","Type":"ContainerDied","Data":"b6e5247efc58fb4ec80e7ce40d6b01fca9105fd99780d3ff0afa94a4439e09a6"} Dec 01 11:44:44 crc kubenswrapper[4958]: I1201 11:44:44.413599 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-00e0-account-create-27vdq"] Dec 01 11:44:44 crc kubenswrapper[4958]: I1201 11:44:44.825679 4958 generic.go:334] "Generic (PLEG): container finished" podID="54ef053b-018e-437a-845e-b8b7737e4ff1" containerID="3ca0a9a0ff87991e21d0dcccd5183fedb193bf38f593c5acebdc7410aefa91df" exitCode=0 Dec 01 11:44:44 crc kubenswrapper[4958]: I1201 11:44:44.825747 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-00e0-account-create-27vdq" event={"ID":"54ef053b-018e-437a-845e-b8b7737e4ff1","Type":"ContainerDied","Data":"3ca0a9a0ff87991e21d0dcccd5183fedb193bf38f593c5acebdc7410aefa91df"} Dec 01 11:44:44 crc kubenswrapper[4958]: I1201 11:44:44.826114 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-00e0-account-create-27vdq" event={"ID":"54ef053b-018e-437a-845e-b8b7737e4ff1","Type":"ContainerStarted","Data":"301f1184426745b8591b0617c96e40ec864e8a81ce37b3ef9dca74b0a8a329a3"} Dec 01 11:44:44 crc kubenswrapper[4958]: I1201 11:44:44.828925 4958 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5f9n" event={"ID":"12642132-efd1-4489-970b-0a188dd93df9","Type":"ContainerStarted","Data":"1c9c0cf9fddb0c9efa2d9ad00dd328cfe2bc5d7f82c0b46414f81cd0702200db"} Dec 01 11:44:44 crc kubenswrapper[4958]: I1201 11:44:44.868582 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k5f9n" podStartSLOduration=2.213421139 podStartE2EDuration="4.868545922s" podCreationTimestamp="2025-12-01 11:44:40 +0000 UTC" firstStartedPulling="2025-12-01 11:44:41.791165442 +0000 UTC m=+6329.299954489" lastFinishedPulling="2025-12-01 11:44:44.446290235 +0000 UTC m=+6331.955079272" observedRunningTime="2025-12-01 11:44:44.860927076 +0000 UTC m=+6332.369716143" watchObservedRunningTime="2025-12-01 11:44:44.868545922 +0000 UTC m=+6332.377334959" Dec 01 11:44:46 crc kubenswrapper[4958]: I1201 11:44:46.235455 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-00e0-account-create-27vdq" Dec 01 11:44:46 crc kubenswrapper[4958]: I1201 11:44:46.388315 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss9mc\" (UniqueName: \"kubernetes.io/projected/54ef053b-018e-437a-845e-b8b7737e4ff1-kube-api-access-ss9mc\") pod \"54ef053b-018e-437a-845e-b8b7737e4ff1\" (UID: \"54ef053b-018e-437a-845e-b8b7737e4ff1\") " Dec 01 11:44:46 crc kubenswrapper[4958]: I1201 11:44:46.398243 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ef053b-018e-437a-845e-b8b7737e4ff1-kube-api-access-ss9mc" (OuterVolumeSpecName: "kube-api-access-ss9mc") pod "54ef053b-018e-437a-845e-b8b7737e4ff1" (UID: "54ef053b-018e-437a-845e-b8b7737e4ff1"). InnerVolumeSpecName "kube-api-access-ss9mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:44:46 crc kubenswrapper[4958]: I1201 11:44:46.492290 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss9mc\" (UniqueName: \"kubernetes.io/projected/54ef053b-018e-437a-845e-b8b7737e4ff1-kube-api-access-ss9mc\") on node \"crc\" DevicePath \"\"" Dec 01 11:44:46 crc kubenswrapper[4958]: I1201 11:44:46.849987 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-00e0-account-create-27vdq" event={"ID":"54ef053b-018e-437a-845e-b8b7737e4ff1","Type":"ContainerDied","Data":"301f1184426745b8591b0617c96e40ec864e8a81ce37b3ef9dca74b0a8a329a3"} Dec 01 11:44:46 crc kubenswrapper[4958]: I1201 11:44:46.850066 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="301f1184426745b8591b0617c96e40ec864e8a81ce37b3ef9dca74b0a8a329a3" Dec 01 11:44:46 crc kubenswrapper[4958]: I1201 11:44:46.850107 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-00e0-account-create-27vdq" Dec 01 11:44:49 crc kubenswrapper[4958]: I1201 11:44:49.254596 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-9kqxk"] Dec 01 11:44:49 crc kubenswrapper[4958]: E1201 11:44:49.255668 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ef053b-018e-437a-845e-b8b7737e4ff1" containerName="mariadb-account-create" Dec 01 11:44:49 crc kubenswrapper[4958]: I1201 11:44:49.255686 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ef053b-018e-437a-845e-b8b7737e4ff1" containerName="mariadb-account-create" Dec 01 11:44:49 crc kubenswrapper[4958]: I1201 11:44:49.255914 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ef053b-018e-437a-845e-b8b7737e4ff1" containerName="mariadb-account-create" Dec 01 11:44:49 crc kubenswrapper[4958]: I1201 11:44:49.256784 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-9kqxk" Dec 01 11:44:49 crc kubenswrapper[4958]: I1201 11:44:49.295918 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-9kqxk"] Dec 01 11:44:49 crc kubenswrapper[4958]: I1201 11:44:49.379175 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7g9t\" (UniqueName: \"kubernetes.io/projected/b877e60a-c39e-4736-b596-4132c5c8eda8-kube-api-access-g7g9t\") pod \"octavia-persistence-db-create-9kqxk\" (UID: \"b877e60a-c39e-4736-b596-4132c5c8eda8\") " pod="openstack/octavia-persistence-db-create-9kqxk" Dec 01 11:44:49 crc kubenswrapper[4958]: I1201 11:44:49.480992 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7g9t\" (UniqueName: \"kubernetes.io/projected/b877e60a-c39e-4736-b596-4132c5c8eda8-kube-api-access-g7g9t\") pod \"octavia-persistence-db-create-9kqxk\" (UID: \"b877e60a-c39e-4736-b596-4132c5c8eda8\") " pod="openstack/octavia-persistence-db-create-9kqxk" Dec 01 11:44:49 crc kubenswrapper[4958]: I1201 11:44:49.517950 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7g9t\" (UniqueName: \"kubernetes.io/projected/b877e60a-c39e-4736-b596-4132c5c8eda8-kube-api-access-g7g9t\") pod \"octavia-persistence-db-create-9kqxk\" (UID: \"b877e60a-c39e-4736-b596-4132c5c8eda8\") " pod="openstack/octavia-persistence-db-create-9kqxk" Dec 01 11:44:49 crc kubenswrapper[4958]: I1201 11:44:49.578356 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-9kqxk" Dec 01 11:44:50 crc kubenswrapper[4958]: I1201 11:44:50.085091 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-9kqxk"] Dec 01 11:44:50 crc kubenswrapper[4958]: I1201 11:44:50.678759 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k5f9n" Dec 01 11:44:50 crc kubenswrapper[4958]: I1201 11:44:50.679089 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k5f9n" Dec 01 11:44:50 crc kubenswrapper[4958]: I1201 11:44:50.749508 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k5f9n" Dec 01 11:44:50 crc kubenswrapper[4958]: I1201 11:44:50.895775 4958 generic.go:334] "Generic (PLEG): container finished" podID="b877e60a-c39e-4736-b596-4132c5c8eda8" containerID="309c1f5bd8aaad2357baf8d68d146cd80a4a9e71e2ea3eec8b47fe85481cbe7d" exitCode=0 Dec 01 11:44:50 crc kubenswrapper[4958]: I1201 11:44:50.895886 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-9kqxk" event={"ID":"b877e60a-c39e-4736-b596-4132c5c8eda8","Type":"ContainerDied","Data":"309c1f5bd8aaad2357baf8d68d146cd80a4a9e71e2ea3eec8b47fe85481cbe7d"} Dec 01 11:44:50 crc kubenswrapper[4958]: I1201 11:44:50.895948 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-9kqxk" event={"ID":"b877e60a-c39e-4736-b596-4132c5c8eda8","Type":"ContainerStarted","Data":"dafd0fe46420c9ecfac2c857266fe0e46159fa837a5fcefb56dfd5a2b15c12b0"} Dec 01 11:44:50 crc kubenswrapper[4958]: I1201 11:44:50.954513 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k5f9n" Dec 01 11:44:51 crc kubenswrapper[4958]: I1201 11:44:51.005459 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5f9n"] Dec 01 11:44:52 crc kubenswrapper[4958]: I1201 11:44:52.276780 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-9kqxk" Dec 01 11:44:52 crc kubenswrapper[4958]: I1201 11:44:52.455021 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7g9t\" (UniqueName: \"kubernetes.io/projected/b877e60a-c39e-4736-b596-4132c5c8eda8-kube-api-access-g7g9t\") pod \"b877e60a-c39e-4736-b596-4132c5c8eda8\" (UID: \"b877e60a-c39e-4736-b596-4132c5c8eda8\") " Dec 01 11:44:52 crc kubenswrapper[4958]: I1201 11:44:52.464911 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b877e60a-c39e-4736-b596-4132c5c8eda8-kube-api-access-g7g9t" (OuterVolumeSpecName: "kube-api-access-g7g9t") pod "b877e60a-c39e-4736-b596-4132c5c8eda8" (UID: "b877e60a-c39e-4736-b596-4132c5c8eda8"). InnerVolumeSpecName "kube-api-access-g7g9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:44:52 crc kubenswrapper[4958]: I1201 11:44:52.557733 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7g9t\" (UniqueName: \"kubernetes.io/projected/b877e60a-c39e-4736-b596-4132c5c8eda8-kube-api-access-g7g9t\") on node \"crc\" DevicePath \"\"" Dec 01 11:44:52 crc kubenswrapper[4958]: I1201 11:44:52.924504 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k5f9n" podUID="12642132-efd1-4489-970b-0a188dd93df9" containerName="registry-server" containerID="cri-o://1c9c0cf9fddb0c9efa2d9ad00dd328cfe2bc5d7f82c0b46414f81cd0702200db" gracePeriod=2 Dec 01 11:44:52 crc kubenswrapper[4958]: I1201 11:44:52.925098 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-9kqxk" Dec 01 11:44:52 crc kubenswrapper[4958]: I1201 11:44:52.925978 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-9kqxk" event={"ID":"b877e60a-c39e-4736-b596-4132c5c8eda8","Type":"ContainerDied","Data":"dafd0fe46420c9ecfac2c857266fe0e46159fa837a5fcefb56dfd5a2b15c12b0"} Dec 01 11:44:52 crc kubenswrapper[4958]: I1201 11:44:52.926103 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dafd0fe46420c9ecfac2c857266fe0e46159fa837a5fcefb56dfd5a2b15c12b0" Dec 01 11:44:53 crc kubenswrapper[4958]: I1201 11:44:53.681152 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5f9n" Dec 01 11:44:53 crc kubenswrapper[4958]: I1201 11:44:53.828666 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv867\" (UniqueName: \"kubernetes.io/projected/12642132-efd1-4489-970b-0a188dd93df9-kube-api-access-qv867\") pod \"12642132-efd1-4489-970b-0a188dd93df9\" (UID: \"12642132-efd1-4489-970b-0a188dd93df9\") " Dec 01 11:44:53 crc kubenswrapper[4958]: I1201 11:44:53.828914 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12642132-efd1-4489-970b-0a188dd93df9-catalog-content\") pod \"12642132-efd1-4489-970b-0a188dd93df9\" (UID: \"12642132-efd1-4489-970b-0a188dd93df9\") " Dec 01 11:44:53 crc kubenswrapper[4958]: I1201 11:44:53.828983 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12642132-efd1-4489-970b-0a188dd93df9-utilities\") pod \"12642132-efd1-4489-970b-0a188dd93df9\" (UID: \"12642132-efd1-4489-970b-0a188dd93df9\") " Dec 01 11:44:53 crc kubenswrapper[4958]: I1201 11:44:53.837160 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12642132-efd1-4489-970b-0a188dd93df9-kube-api-access-qv867" (OuterVolumeSpecName: "kube-api-access-qv867") pod "12642132-efd1-4489-970b-0a188dd93df9" (UID: "12642132-efd1-4489-970b-0a188dd93df9"). InnerVolumeSpecName "kube-api-access-qv867". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:44:53 crc kubenswrapper[4958]: I1201 11:44:53.839686 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12642132-efd1-4489-970b-0a188dd93df9-utilities" (OuterVolumeSpecName: "utilities") pod "12642132-efd1-4489-970b-0a188dd93df9" (UID: "12642132-efd1-4489-970b-0a188dd93df9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:44:53 crc kubenswrapper[4958]: I1201 11:44:53.868927 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12642132-efd1-4489-970b-0a188dd93df9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12642132-efd1-4489-970b-0a188dd93df9" (UID: "12642132-efd1-4489-970b-0a188dd93df9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:44:53 crc kubenswrapper[4958]: I1201 11:44:53.931679 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12642132-efd1-4489-970b-0a188dd93df9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:44:53 crc kubenswrapper[4958]: I1201 11:44:53.931720 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12642132-efd1-4489-970b-0a188dd93df9-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:44:53 crc kubenswrapper[4958]: I1201 11:44:53.931755 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv867\" (UniqueName: \"kubernetes.io/projected/12642132-efd1-4489-970b-0a188dd93df9-kube-api-access-qv867\") on node \"crc\" DevicePath \"\"" Dec 01 11:44:53 crc kubenswrapper[4958]: I1201 11:44:53.938203 4958 generic.go:334] "Generic (PLEG): container finished" podID="12642132-efd1-4489-970b-0a188dd93df9" containerID="1c9c0cf9fddb0c9efa2d9ad00dd328cfe2bc5d7f82c0b46414f81cd0702200db" exitCode=0 Dec 01 11:44:53 crc kubenswrapper[4958]: I1201 11:44:53.938249 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5f9n" event={"ID":"12642132-efd1-4489-970b-0a188dd93df9","Type":"ContainerDied","Data":"1c9c0cf9fddb0c9efa2d9ad00dd328cfe2bc5d7f82c0b46414f81cd0702200db"} Dec 01 11:44:53 crc kubenswrapper[4958]: I1201 11:44:53.938276 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5f9n" event={"ID":"12642132-efd1-4489-970b-0a188dd93df9","Type":"ContainerDied","Data":"1983fe0229e2cdac35a3d672cf0e718b04ece2b5e4badba49a173b6cbd66ef98"} Dec 01 11:44:53 crc kubenswrapper[4958]: I1201 11:44:53.938293 4958 scope.go:117] "RemoveContainer" containerID="1c9c0cf9fddb0c9efa2d9ad00dd328cfe2bc5d7f82c0b46414f81cd0702200db" Dec 01 11:44:53 crc kubenswrapper[4958]: I1201 11:44:53.938423 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5f9n" Dec 01 11:44:53 crc kubenswrapper[4958]: I1201 11:44:53.973545 4958 scope.go:117] "RemoveContainer" containerID="b6e5247efc58fb4ec80e7ce40d6b01fca9105fd99780d3ff0afa94a4439e09a6" Dec 01 11:44:53 crc kubenswrapper[4958]: I1201 11:44:53.981460 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5f9n"] Dec 01 11:44:53 crc kubenswrapper[4958]: I1201 11:44:53.990315 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5f9n"] Dec 01 11:44:54 crc kubenswrapper[4958]: I1201 11:44:54.007165 4958 scope.go:117] "RemoveContainer" containerID="841f301e0e691edc3effbadde8e5e936ef049c6176d3103871c03518caf84c4a" Dec 01 11:44:54 crc kubenswrapper[4958]: I1201 11:44:54.055134 4958 scope.go:117] "RemoveContainer" containerID="1c9c0cf9fddb0c9efa2d9ad00dd328cfe2bc5d7f82c0b46414f81cd0702200db" Dec 01 11:44:54 crc kubenswrapper[4958]: E1201 11:44:54.055820 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c9c0cf9fddb0c9efa2d9ad00dd328cfe2bc5d7f82c0b46414f81cd0702200db\": container with ID starting with 1c9c0cf9fddb0c9efa2d9ad00dd328cfe2bc5d7f82c0b46414f81cd0702200db not found: ID does not exist" containerID="1c9c0cf9fddb0c9efa2d9ad00dd328cfe2bc5d7f82c0b46414f81cd0702200db" Dec 01 11:44:54 crc kubenswrapper[4958]: I1201 11:44:54.055940 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c9c0cf9fddb0c9efa2d9ad00dd328cfe2bc5d7f82c0b46414f81cd0702200db"} err="failed to get container status \"1c9c0cf9fddb0c9efa2d9ad00dd328cfe2bc5d7f82c0b46414f81cd0702200db\": rpc error: code = NotFound desc = could not find container \"1c9c0cf9fddb0c9efa2d9ad00dd328cfe2bc5d7f82c0b46414f81cd0702200db\": container with ID starting with 1c9c0cf9fddb0c9efa2d9ad00dd328cfe2bc5d7f82c0b46414f81cd0702200db not found: ID does not exist" Dec 01 11:44:54 crc kubenswrapper[4958]: I1201 11:44:54.055991 4958 scope.go:117] "RemoveContainer" containerID="b6e5247efc58fb4ec80e7ce40d6b01fca9105fd99780d3ff0afa94a4439e09a6" Dec 01 11:44:54 crc kubenswrapper[4958]: E1201 11:44:54.056515 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6e5247efc58fb4ec80e7ce40d6b01fca9105fd99780d3ff0afa94a4439e09a6\": container with ID starting with b6e5247efc58fb4ec80e7ce40d6b01fca9105fd99780d3ff0afa94a4439e09a6 not found: ID does not exist" containerID="b6e5247efc58fb4ec80e7ce40d6b01fca9105fd99780d3ff0afa94a4439e09a6" Dec 01 11:44:54 crc kubenswrapper[4958]: I1201 11:44:54.056559 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e5247efc58fb4ec80e7ce40d6b01fca9105fd99780d3ff0afa94a4439e09a6"} err="failed to get container status \"b6e5247efc58fb4ec80e7ce40d6b01fca9105fd99780d3ff0afa94a4439e09a6\": rpc error: code = NotFound desc = could not find container \"b6e5247efc58fb4ec80e7ce40d6b01fca9105fd99780d3ff0afa94a4439e09a6\": container with ID starting with b6e5247efc58fb4ec80e7ce40d6b01fca9105fd99780d3ff0afa94a4439e09a6 not found: ID does not exist" Dec 01 11:44:54 crc kubenswrapper[4958]: I1201 11:44:54.056589 4958 scope.go:117] "RemoveContainer" containerID="841f301e0e691edc3effbadde8e5e936ef049c6176d3103871c03518caf84c4a" Dec 01 11:44:54 crc kubenswrapper[4958]: E1201 11:44:54.057107 4958 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"841f301e0e691edc3effbadde8e5e936ef049c6176d3103871c03518caf84c4a\": container with ID starting with 841f301e0e691edc3effbadde8e5e936ef049c6176d3103871c03518caf84c4a not found: ID does not exist" containerID="841f301e0e691edc3effbadde8e5e936ef049c6176d3103871c03518caf84c4a" Dec 01 11:44:54 crc kubenswrapper[4958]: I1201 11:44:54.057183 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"841f301e0e691edc3effbadde8e5e936ef049c6176d3103871c03518caf84c4a"} err="failed to get container status \"841f301e0e691edc3effbadde8e5e936ef049c6176d3103871c03518caf84c4a\": rpc error: code = NotFound desc = could not find container \"841f301e0e691edc3effbadde8e5e936ef049c6176d3103871c03518caf84c4a\": container with ID starting with 841f301e0e691edc3effbadde8e5e936ef049c6176d3103871c03518caf84c4a not found: ID does not exist" Dec 01 11:44:55 crc kubenswrapper[4958]: I1201 11:44:55.812823 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12642132-efd1-4489-970b-0a188dd93df9" path="/var/lib/kubelet/pods/12642132-efd1-4489-970b-0a188dd93df9/volumes" Dec 01 11:44:58 crc kubenswrapper[4958]: I1201 11:44:58.210990 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:44:58 crc kubenswrapper[4958]: I1201 11:44:58.212059 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.156969 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk"] Dec 01 11:45:00 crc kubenswrapper[4958]: E1201 11:45:00.158205 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12642132-efd1-4489-970b-0a188dd93df9" containerName="extract-content" Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.158236 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="12642132-efd1-4489-970b-0a188dd93df9" containerName="extract-content" Dec 01 11:45:00 crc kubenswrapper[4958]: E1201 11:45:00.158268 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12642132-efd1-4489-970b-0a188dd93df9" containerName="registry-server" Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.158284 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="12642132-efd1-4489-970b-0a188dd93df9" containerName="registry-server" Dec 01 11:45:00 crc kubenswrapper[4958]: E1201 11:45:00.158317 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12642132-efd1-4489-970b-0a188dd93df9" containerName="extract-utilities" Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.158334 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="12642132-efd1-4489-970b-0a188dd93df9" containerName="extract-utilities" Dec 01 11:45:00 crc kubenswrapper[4958]: E1201 11:45:00.158364 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b877e60a-c39e-4736-b596-4132c5c8eda8" containerName="mariadb-database-create" Dec 01 11:45:00 
crc kubenswrapper[4958]: I1201 11:45:00.158382 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b877e60a-c39e-4736-b596-4132c5c8eda8" containerName="mariadb-database-create"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.158707 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="b877e60a-c39e-4736-b596-4132c5c8eda8" containerName="mariadb-database-create"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.158760 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="12642132-efd1-4489-970b-0a188dd93df9" containerName="registry-server"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.160335 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.164316 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.164959 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.171522 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk"]
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.210120 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27f84411-fc29-413f-bdc8-9da0ec321a21-secret-volume\") pod \"collect-profiles-29409825-v2plk\" (UID: \"27f84411-fc29-413f-bdc8-9da0ec321a21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.210231 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27f84411-fc29-413f-bdc8-9da0ec321a21-config-volume\") pod \"collect-profiles-29409825-v2plk\" (UID: \"27f84411-fc29-413f-bdc8-9da0ec321a21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.210340 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rrld\" (UniqueName: \"kubernetes.io/projected/27f84411-fc29-413f-bdc8-9da0ec321a21-kube-api-access-5rrld\") pod \"collect-profiles-29409825-v2plk\" (UID: \"27f84411-fc29-413f-bdc8-9da0ec321a21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.298678 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-a8b4-account-create-x8xc2"]
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.301179 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-a8b4-account-create-x8xc2"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.304221 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.307546 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-a8b4-account-create-x8xc2"]
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.312514 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27f84411-fc29-413f-bdc8-9da0ec321a21-config-volume\") pod \"collect-profiles-29409825-v2plk\" (UID: \"27f84411-fc29-413f-bdc8-9da0ec321a21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.313077 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rrld\" (UniqueName: \"kubernetes.io/projected/27f84411-fc29-413f-bdc8-9da0ec321a21-kube-api-access-5rrld\") pod \"collect-profiles-29409825-v2plk\" (UID: \"27f84411-fc29-413f-bdc8-9da0ec321a21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.313368 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27f84411-fc29-413f-bdc8-9da0ec321a21-secret-volume\") pod \"collect-profiles-29409825-v2plk\" (UID: \"27f84411-fc29-413f-bdc8-9da0ec321a21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.316100 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27f84411-fc29-413f-bdc8-9da0ec321a21-config-volume\") pod \"collect-profiles-29409825-v2plk\" (UID: \"27f84411-fc29-413f-bdc8-9da0ec321a21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.322575 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27f84411-fc29-413f-bdc8-9da0ec321a21-secret-volume\") pod \"collect-profiles-29409825-v2plk\" (UID: \"27f84411-fc29-413f-bdc8-9da0ec321a21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.343718 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rrld\" (UniqueName: \"kubernetes.io/projected/27f84411-fc29-413f-bdc8-9da0ec321a21-kube-api-access-5rrld\") pod \"collect-profiles-29409825-v2plk\" (UID: \"27f84411-fc29-413f-bdc8-9da0ec321a21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.415057 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5qc8\" (UniqueName: \"kubernetes.io/projected/3f02d2c8-00f2-4441-b4e4-30c870db359d-kube-api-access-s5qc8\") pod \"octavia-a8b4-account-create-x8xc2\" (UID: \"3f02d2c8-00f2-4441-b4e4-30c870db359d\") " pod="openstack/octavia-a8b4-account-create-x8xc2"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.495131 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.517961 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5qc8\" (UniqueName: \"kubernetes.io/projected/3f02d2c8-00f2-4441-b4e4-30c870db359d-kube-api-access-s5qc8\") pod \"octavia-a8b4-account-create-x8xc2\" (UID: \"3f02d2c8-00f2-4441-b4e4-30c870db359d\") " pod="openstack/octavia-a8b4-account-create-x8xc2"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.536122 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5qc8\" (UniqueName: \"kubernetes.io/projected/3f02d2c8-00f2-4441-b4e4-30c870db359d-kube-api-access-s5qc8\") pod \"octavia-a8b4-account-create-x8xc2\" (UID: \"3f02d2c8-00f2-4441-b4e4-30c870db359d\") " pod="openstack/octavia-a8b4-account-create-x8xc2"
Dec 01 11:45:00 crc kubenswrapper[4958]: I1201 11:45:00.733527 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-a8b4-account-create-x8xc2"
Dec 01 11:45:01 crc kubenswrapper[4958]: I1201 11:45:01.087358 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk"]
Dec 01 11:45:01 crc kubenswrapper[4958]: I1201 11:45:01.185129 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk" event={"ID":"27f84411-fc29-413f-bdc8-9da0ec321a21","Type":"ContainerStarted","Data":"1c8080087b70047275f52fe36d352e9164fba6e83046726c68517c0667707f04"}
Dec 01 11:45:01 crc kubenswrapper[4958]: I1201 11:45:01.216012 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-a8b4-account-create-x8xc2"]
Dec 01 11:45:01 crc kubenswrapper[4958]: W1201 11:45:01.217945 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f02d2c8_00f2_4441_b4e4_30c870db359d.slice/crio-6904f86d122e3f4a125f54588f3fbcb1e35ed6af0cb6aa7542dc826d60a65ecf WatchSource:0}: Error finding container 6904f86d122e3f4a125f54588f3fbcb1e35ed6af0cb6aa7542dc826d60a65ecf: Status 404 returned error can't find the container with id 6904f86d122e3f4a125f54588f3fbcb1e35ed6af0cb6aa7542dc826d60a65ecf
Dec 01 11:45:02 crc kubenswrapper[4958]: I1201 11:45:02.196395 4958 generic.go:334] "Generic (PLEG): container finished" podID="3f02d2c8-00f2-4441-b4e4-30c870db359d" containerID="2580d952f013912946c78e2cbbc67769e55b1ec3539fe38e19a2a54eded105b5" exitCode=0
Dec 01 11:45:02 crc kubenswrapper[4958]: I1201 11:45:02.196474 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-a8b4-account-create-x8xc2" event={"ID":"3f02d2c8-00f2-4441-b4e4-30c870db359d","Type":"ContainerDied","Data":"2580d952f013912946c78e2cbbc67769e55b1ec3539fe38e19a2a54eded105b5"}
Dec 01 11:45:02 crc kubenswrapper[4958]: I1201 11:45:02.197041 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-a8b4-account-create-x8xc2" event={"ID":"3f02d2c8-00f2-4441-b4e4-30c870db359d","Type":"ContainerStarted","Data":"6904f86d122e3f4a125f54588f3fbcb1e35ed6af0cb6aa7542dc826d60a65ecf"}
Dec 01 11:45:02 crc kubenswrapper[4958]: I1201 11:45:02.199345 4958 generic.go:334] "Generic (PLEG): container finished" podID="27f84411-fc29-413f-bdc8-9da0ec321a21" containerID="70438da6be65a8197c4540898600a3cf237b161ab7fc8c041dc9adfd5ba154b7" exitCode=0
Dec 01 11:45:02 crc kubenswrapper[4958]: I1201 11:45:02.199401 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk" event={"ID":"27f84411-fc29-413f-bdc8-9da0ec321a21","Type":"ContainerDied","Data":"70438da6be65a8197c4540898600a3cf237b161ab7fc8c041dc9adfd5ba154b7"}
Dec 01 11:45:03 crc kubenswrapper[4958]: E1201 11:45:03.511631 4958 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc"
Dec 01 11:45:03 crc kubenswrapper[4958]: I1201 11:45:03.676689 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk"
Dec 01 11:45:03 crc kubenswrapper[4958]: I1201 11:45:03.685296 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-a8b4-account-create-x8xc2"
Dec 01 11:45:03 crc kubenswrapper[4958]: I1201 11:45:03.870270 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5qc8\" (UniqueName: \"kubernetes.io/projected/3f02d2c8-00f2-4441-b4e4-30c870db359d-kube-api-access-s5qc8\") pod \"3f02d2c8-00f2-4441-b4e4-30c870db359d\" (UID: \"3f02d2c8-00f2-4441-b4e4-30c870db359d\") "
Dec 01 11:45:03 crc kubenswrapper[4958]: I1201 11:45:03.870331 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rrld\" (UniqueName: \"kubernetes.io/projected/27f84411-fc29-413f-bdc8-9da0ec321a21-kube-api-access-5rrld\") pod \"27f84411-fc29-413f-bdc8-9da0ec321a21\" (UID: \"27f84411-fc29-413f-bdc8-9da0ec321a21\") "
Dec 01 11:45:03 crc kubenswrapper[4958]: I1201 11:45:03.870536 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27f84411-fc29-413f-bdc8-9da0ec321a21-config-volume\") pod \"27f84411-fc29-413f-bdc8-9da0ec321a21\" (UID: \"27f84411-fc29-413f-bdc8-9da0ec321a21\") "
Dec 01 11:45:03 crc kubenswrapper[4958]: I1201 11:45:03.870680 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27f84411-fc29-413f-bdc8-9da0ec321a21-secret-volume\") pod \"27f84411-fc29-413f-bdc8-9da0ec321a21\" (UID: \"27f84411-fc29-413f-bdc8-9da0ec321a21\") "
Dec 01 11:45:03 crc kubenswrapper[4958]: I1201 11:45:03.871357 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27f84411-fc29-413f-bdc8-9da0ec321a21-config-volume" (OuterVolumeSpecName: "config-volume") pod "27f84411-fc29-413f-bdc8-9da0ec321a21" (UID: "27f84411-fc29-413f-bdc8-9da0ec321a21"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:45:03 crc kubenswrapper[4958]: I1201 11:45:03.872203 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27f84411-fc29-413f-bdc8-9da0ec321a21-config-volume\") on node \"crc\" DevicePath \"\""
Dec 01 11:45:03 crc kubenswrapper[4958]: I1201 11:45:03.877509 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f84411-fc29-413f-bdc8-9da0ec321a21-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "27f84411-fc29-413f-bdc8-9da0ec321a21" (UID: "27f84411-fc29-413f-bdc8-9da0ec321a21"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:45:03 crc kubenswrapper[4958]: I1201 11:45:03.878074 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f84411-fc29-413f-bdc8-9da0ec321a21-kube-api-access-5rrld" (OuterVolumeSpecName: "kube-api-access-5rrld") pod "27f84411-fc29-413f-bdc8-9da0ec321a21" (UID: "27f84411-fc29-413f-bdc8-9da0ec321a21"). InnerVolumeSpecName "kube-api-access-5rrld". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:45:03 crc kubenswrapper[4958]: I1201 11:45:03.879065 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f02d2c8-00f2-4441-b4e4-30c870db359d-kube-api-access-s5qc8" (OuterVolumeSpecName: "kube-api-access-s5qc8") pod "3f02d2c8-00f2-4441-b4e4-30c870db359d" (UID: "3f02d2c8-00f2-4441-b4e4-30c870db359d"). InnerVolumeSpecName "kube-api-access-s5qc8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:45:03 crc kubenswrapper[4958]: I1201 11:45:03.975353 4958 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27f84411-fc29-413f-bdc8-9da0ec321a21-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 01 11:45:03 crc kubenswrapper[4958]: I1201 11:45:03.975413 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5qc8\" (UniqueName: \"kubernetes.io/projected/3f02d2c8-00f2-4441-b4e4-30c870db359d-kube-api-access-s5qc8\") on node \"crc\" DevicePath \"\""
Dec 01 11:45:03 crc kubenswrapper[4958]: I1201 11:45:03.975433 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rrld\" (UniqueName: \"kubernetes.io/projected/27f84411-fc29-413f-bdc8-9da0ec321a21-kube-api-access-5rrld\") on node \"crc\" DevicePath \"\""
Dec 01 11:45:04 crc kubenswrapper[4958]: I1201 11:45:04.233975 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-a8b4-account-create-x8xc2" event={"ID":"3f02d2c8-00f2-4441-b4e4-30c870db359d","Type":"ContainerDied","Data":"6904f86d122e3f4a125f54588f3fbcb1e35ed6af0cb6aa7542dc826d60a65ecf"}
Dec 01 11:45:04 crc kubenswrapper[4958]: I1201 11:45:04.234073 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6904f86d122e3f4a125f54588f3fbcb1e35ed6af0cb6aa7542dc826d60a65ecf"
Dec 01 11:45:04 crc kubenswrapper[4958]: I1201 11:45:04.233994 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-a8b4-account-create-x8xc2"
Dec 01 11:45:04 crc kubenswrapper[4958]: I1201 11:45:04.237127 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk" event={"ID":"27f84411-fc29-413f-bdc8-9da0ec321a21","Type":"ContainerDied","Data":"1c8080087b70047275f52fe36d352e9164fba6e83046726c68517c0667707f04"}
Dec 01 11:45:04 crc kubenswrapper[4958]: I1201 11:45:04.237188 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c8080087b70047275f52fe36d352e9164fba6e83046726c68517c0667707f04"
Dec 01 11:45:04 crc kubenswrapper[4958]: I1201 11:45:04.237189 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk"
Dec 01 11:45:04 crc kubenswrapper[4958]: I1201 11:45:04.858717 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq"]
Dec 01 11:45:04 crc kubenswrapper[4958]: I1201 11:45:04.875387 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409780-csmkq"]
Dec 01 11:45:05 crc kubenswrapper[4958]: I1201 11:45:05.670909 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fnvkx" podUID="2fca0010-c346-4525-8552-489b7bfc0942" containerName="ovn-controller" probeResult="failure" output=<
Dec 01 11:45:05 crc kubenswrapper[4958]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Dec 01 11:45:05 crc kubenswrapper[4958]: >
Dec 01 11:45:05 crc kubenswrapper[4958]: I1201 11:45:05.674768 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dzs42"
Dec 01 11:45:05 crc kubenswrapper[4958]: I1201 11:45:05.705687 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dzs42"
Dec 01 11:45:05 crc kubenswrapper[4958]: I1201 11:45:05.813487 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47a4de06-5133-4aa6-8fdd-5c6f803cac93" path="/var/lib/kubelet/pods/47a4de06-5133-4aa6-8fdd-5c6f803cac93/volumes"
Dec 01 11:45:05 crc kubenswrapper[4958]: I1201 11:45:05.824006 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fnvkx-config-vfr8m"]
Dec 01 11:45:05 crc kubenswrapper[4958]: E1201 11:45:05.824640 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f02d2c8-00f2-4441-b4e4-30c870db359d" containerName="mariadb-account-create"
Dec 01 11:45:05 crc kubenswrapper[4958]: I1201 11:45:05.824674 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f02d2c8-00f2-4441-b4e4-30c870db359d" containerName="mariadb-account-create"
Dec 01 11:45:05 crc kubenswrapper[4958]: E1201 11:45:05.824714 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f84411-fc29-413f-bdc8-9da0ec321a21" containerName="collect-profiles"
Dec 01 11:45:05 crc kubenswrapper[4958]: I1201 11:45:05.824728 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f84411-fc29-413f-bdc8-9da0ec321a21" containerName="collect-profiles"
Dec 01 11:45:05 crc kubenswrapper[4958]: I1201 11:45:05.825116 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f84411-fc29-413f-bdc8-9da0ec321a21" containerName="collect-profiles"
Dec 01 11:45:05 crc kubenswrapper[4958]: I1201 11:45:05.825153 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f02d2c8-00f2-4441-b4e4-30c870db359d" containerName="mariadb-account-create"
Dec 01 11:45:05 crc kubenswrapper[4958]: I1201 11:45:05.826266 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:05 crc kubenswrapper[4958]: I1201 11:45:05.828960 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Dec 01 11:45:05 crc kubenswrapper[4958]: I1201 11:45:05.838165 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fnvkx-config-vfr8m"]
Dec 01 11:45:05 crc kubenswrapper[4958]: I1201 11:45:05.918369 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfc7h\" (UniqueName: \"kubernetes.io/projected/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-kube-api-access-rfc7h\") pod \"ovn-controller-fnvkx-config-vfr8m\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") " pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:05 crc kubenswrapper[4958]: I1201 11:45:05.918514 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-additional-scripts\") pod \"ovn-controller-fnvkx-config-vfr8m\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") " pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:05 crc kubenswrapper[4958]: I1201 11:45:05.918582 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-var-run\") pod \"ovn-controller-fnvkx-config-vfr8m\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") " pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:05 crc kubenswrapper[4958]: I1201 11:45:05.918617 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-var-log-ovn\") pod \"ovn-controller-fnvkx-config-vfr8m\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") " pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:05 crc kubenswrapper[4958]: I1201 11:45:05.919042 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-scripts\") pod \"ovn-controller-fnvkx-config-vfr8m\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") " pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:05 crc kubenswrapper[4958]: I1201 11:45:05.919197 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-var-run-ovn\") pod \"ovn-controller-fnvkx-config-vfr8m\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") " pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.021313 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfc7h\" (UniqueName: \"kubernetes.io/projected/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-kube-api-access-rfc7h\") pod \"ovn-controller-fnvkx-config-vfr8m\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") " pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.021661 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-additional-scripts\") pod \"ovn-controller-fnvkx-config-vfr8m\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") " pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.021873 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-var-run\") pod \"ovn-controller-fnvkx-config-vfr8m\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") " pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.022041 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-var-log-ovn\") pod \"ovn-controller-fnvkx-config-vfr8m\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") " pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.022225 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-scripts\") pod \"ovn-controller-fnvkx-config-vfr8m\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") " pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.022417 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-var-run-ovn\") pod \"ovn-controller-fnvkx-config-vfr8m\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") " pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.022528 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-var-run-ovn\") pod \"ovn-controller-fnvkx-config-vfr8m\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") " pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.022247 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-var-run\") pod \"ovn-controller-fnvkx-config-vfr8m\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") " pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.022271 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-var-log-ovn\") pod \"ovn-controller-fnvkx-config-vfr8m\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") " pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.022629 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-additional-scripts\") pod \"ovn-controller-fnvkx-config-vfr8m\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") " pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.025434 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-scripts\") pod \"ovn-controller-fnvkx-config-vfr8m\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") " pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.040863 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfc7h\" (UniqueName: \"kubernetes.io/projected/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-kube-api-access-rfc7h\") pod \"ovn-controller-fnvkx-config-vfr8m\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") " pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.152920 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.378048 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-f54b64c4-kbt49"]
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.380456 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.395577 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.396238 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.396373 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-v8chc"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.398942 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-f54b64c4-kbt49"]
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.434334 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feeff732-b940-4c31-963e-08785067df77-config-data\") pod \"octavia-api-f54b64c4-kbt49\" (UID: \"feeff732-b940-4c31-963e-08785067df77\") " pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.434811 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/feeff732-b940-4c31-963e-08785067df77-config-data-merged\") pod \"octavia-api-f54b64c4-kbt49\" (UID: \"feeff732-b940-4c31-963e-08785067df77\") " pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.434950 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feeff732-b940-4c31-963e-08785067df77-combined-ca-bundle\") pod \"octavia-api-f54b64c4-kbt49\" (UID: \"feeff732-b940-4c31-963e-08785067df77\") " pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.435137 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feeff732-b940-4c31-963e-08785067df77-scripts\") pod \"octavia-api-f54b64c4-kbt49\" (UID: \"feeff732-b940-4c31-963e-08785067df77\") " pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.435277 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/feeff732-b940-4c31-963e-08785067df77-octavia-run\") pod \"octavia-api-f54b64c4-kbt49\" (UID: \"feeff732-b940-4c31-963e-08785067df77\") " pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.536135 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/feeff732-b940-4c31-963e-08785067df77-config-data-merged\") pod \"octavia-api-f54b64c4-kbt49\" (UID: \"feeff732-b940-4c31-963e-08785067df77\") " pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.536200 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feeff732-b940-4c31-963e-08785067df77-combined-ca-bundle\") pod \"octavia-api-f54b64c4-kbt49\" (UID: \"feeff732-b940-4c31-963e-08785067df77\") " pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.536260 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feeff732-b940-4c31-963e-08785067df77-scripts\") pod \"octavia-api-f54b64c4-kbt49\" (UID: \"feeff732-b940-4c31-963e-08785067df77\") " pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.536306 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/feeff732-b940-4c31-963e-08785067df77-octavia-run\") pod \"octavia-api-f54b64c4-kbt49\" (UID: \"feeff732-b940-4c31-963e-08785067df77\") " pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.536341 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feeff732-b940-4c31-963e-08785067df77-config-data\") pod \"octavia-api-f54b64c4-kbt49\" (UID: \"feeff732-b940-4c31-963e-08785067df77\") " pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.546440 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/feeff732-b940-4c31-963e-08785067df77-config-data-merged\") pod \"octavia-api-f54b64c4-kbt49\" (UID: \"feeff732-b940-4c31-963e-08785067df77\") " pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.555303 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/feeff732-b940-4c31-963e-08785067df77-octavia-run\") pod \"octavia-api-f54b64c4-kbt49\" (UID: \"feeff732-b940-4c31-963e-08785067df77\") " pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.560887 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feeff732-b940-4c31-963e-08785067df77-scripts\") pod \"octavia-api-f54b64c4-kbt49\" (UID: \"feeff732-b940-4c31-963e-08785067df77\") " pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.561453 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feeff732-b940-4c31-963e-08785067df77-config-data\") pod \"octavia-api-f54b64c4-kbt49\" (UID: \"feeff732-b940-4c31-963e-08785067df77\") " pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.583139 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feeff732-b940-4c31-963e-08785067df77-combined-ca-bundle\") pod \"octavia-api-f54b64c4-kbt49\" (UID: \"feeff732-b940-4c31-963e-08785067df77\") " pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.704262 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fnvkx-config-vfr8m"]
Dec 01 11:45:06 crc kubenswrapper[4958]: I1201 11:45:06.721723 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:07 crc kubenswrapper[4958]: I1201 11:45:07.284556 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-f54b64c4-kbt49"]
Dec 01 11:45:07 crc kubenswrapper[4958]: I1201 11:45:07.285169 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fnvkx-config-vfr8m" event={"ID":"81c7a507-1a2f-4711-a2e3-dc8a7261ec69","Type":"ContainerStarted","Data":"13d722706dcbd75e260252df7ceb9ed529d78680913a5efd0945b3d898eb8b49"}
Dec 01 11:45:07 crc kubenswrapper[4958]: I1201 11:45:07.285195 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fnvkx-config-vfr8m" event={"ID":"81c7a507-1a2f-4711-a2e3-dc8a7261ec69","Type":"ContainerStarted","Data":"3dc792d3cdb1ab5a8b02114ef934fa0c16f93d377c376978d5be8c23059714be"}
Dec 01 11:45:07 crc kubenswrapper[4958]: I1201 11:45:07.304923 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-fnvkx-config-vfr8m" podStartSLOduration=2.304898596 podStartE2EDuration="2.304898596s" podCreationTimestamp="2025-12-01 11:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:45:07.302956961 +0000 UTC m=+6354.811746008" watchObservedRunningTime="2025-12-01 11:45:07.304898596 +0000 UTC m=+6354.813687643"
Dec 01 11:45:08 crc kubenswrapper[4958]: I1201 11:45:08.295656 4958 generic.go:334] "Generic (PLEG): container finished" podID="81c7a507-1a2f-4711-a2e3-dc8a7261ec69" containerID="13d722706dcbd75e260252df7ceb9ed529d78680913a5efd0945b3d898eb8b49" exitCode=0
Dec 01 11:45:08 crc kubenswrapper[4958]: I1201 11:45:08.295778 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fnvkx-config-vfr8m" event={"ID":"81c7a507-1a2f-4711-a2e3-dc8a7261ec69","Type":"ContainerDied","Data":"13d722706dcbd75e260252df7ceb9ed529d78680913a5efd0945b3d898eb8b49"}
Dec 01 11:45:08 crc kubenswrapper[4958]: I1201 11:45:08.298916 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-f54b64c4-kbt49" event={"ID":"feeff732-b940-4c31-963e-08785067df77","Type":"ContainerStarted","Data":"05675084cbed45ec7e05707329b6d09f05cd89f9116c6b0744f99683bc58cf3b"}
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.670591 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.755888 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-additional-scripts\") pod \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") "
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.755989 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-var-log-ovn\") pod \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") "
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.756195 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "81c7a507-1a2f-4711-a2e3-dc8a7261ec69" (UID: "81c7a507-1a2f-4711-a2e3-dc8a7261ec69"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.757023 4958 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-var-log-ovn\") on node \"crc\" DevicePath \"\""
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.757457 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "81c7a507-1a2f-4711-a2e3-dc8a7261ec69" (UID: "81c7a507-1a2f-4711-a2e3-dc8a7261ec69"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.858564 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-var-run\") pod \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") "
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.858628 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-var-run-ovn\") pod \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") "
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.858737 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfc7h\" (UniqueName: \"kubernetes.io/projected/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-kube-api-access-rfc7h\") pod \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") "
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.858709 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-var-run" (OuterVolumeSpecName: "var-run") pod "81c7a507-1a2f-4711-a2e3-dc8a7261ec69" (UID: "81c7a507-1a2f-4711-a2e3-dc8a7261ec69"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.858765 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-scripts\") pod \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\" (UID: \"81c7a507-1a2f-4711-a2e3-dc8a7261ec69\") "
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.859151 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "81c7a507-1a2f-4711-a2e3-dc8a7261ec69" (UID: "81c7a507-1a2f-4711-a2e3-dc8a7261ec69"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.859760 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-scripts" (OuterVolumeSpecName: "scripts") pod "81c7a507-1a2f-4711-a2e3-dc8a7261ec69" (UID: "81c7a507-1a2f-4711-a2e3-dc8a7261ec69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.860706 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.860754 4958 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-var-run\") on node \"crc\" DevicePath \"\""
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.860770 4958 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-var-run-ovn\") on node \"crc\" DevicePath \"\""
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.860780 4958 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-additional-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.867708 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-kube-api-access-rfc7h" (OuterVolumeSpecName: "kube-api-access-rfc7h") pod "81c7a507-1a2f-4711-a2e3-dc8a7261ec69" (UID: "81c7a507-1a2f-4711-a2e3-dc8a7261ec69"). InnerVolumeSpecName "kube-api-access-rfc7h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:45:09 crc kubenswrapper[4958]: I1201 11:45:09.964759 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfc7h\" (UniqueName: \"kubernetes.io/projected/81c7a507-1a2f-4711-a2e3-dc8a7261ec69-kube-api-access-rfc7h\") on node \"crc\" DevicePath \"\""
Dec 01 11:45:10 crc kubenswrapper[4958]: I1201 11:45:10.323728 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fnvkx-config-vfr8m" event={"ID":"81c7a507-1a2f-4711-a2e3-dc8a7261ec69","Type":"ContainerDied","Data":"3dc792d3cdb1ab5a8b02114ef934fa0c16f93d377c376978d5be8c23059714be"}
Dec 01 11:45:10 crc kubenswrapper[4958]: I1201 11:45:10.323754 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fnvkx-config-vfr8m"
Dec 01 11:45:10 crc kubenswrapper[4958]: I1201 11:45:10.323774 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dc792d3cdb1ab5a8b02114ef934fa0c16f93d377c376978d5be8c23059714be"
Dec 01 11:45:10 crc kubenswrapper[4958]: I1201 11:45:10.399348 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fnvkx-config-vfr8m"]
Dec 01 11:45:10 crc kubenswrapper[4958]: I1201 11:45:10.416876 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-fnvkx-config-vfr8m"]
Dec 01 11:45:10 crc kubenswrapper[4958]: I1201 11:45:10.642938 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-fnvkx"
Dec 01 11:45:11 crc kubenswrapper[4958]: I1201 11:45:11.811796 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81c7a507-1a2f-4711-a2e3-dc8a7261ec69" path="/var/lib/kubelet/pods/81c7a507-1a2f-4711-a2e3-dc8a7261ec69/volumes"
Dec 01 11:45:19 crc kubenswrapper[4958]: I1201 11:45:19.512951 4958 generic.go:334] "Generic (PLEG): container finished" podID="feeff732-b940-4c31-963e-08785067df77" containerID="9676a9d434698ad68641c219fa97d0f4b7f8ecf3d0a4081be7fb2852d07fac30" exitCode=0
Dec 01 11:45:19 crc kubenswrapper[4958]: I1201 11:45:19.513043 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-f54b64c4-kbt49" event={"ID":"feeff732-b940-4c31-963e-08785067df77","Type":"ContainerDied","Data":"9676a9d434698ad68641c219fa97d0f4b7f8ecf3d0a4081be7fb2852d07fac30"}
Dec 01 11:45:20 crc kubenswrapper[4958]: I1201 11:45:20.525673 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-f54b64c4-kbt49" event={"ID":"feeff732-b940-4c31-963e-08785067df77","Type":"ContainerStarted","Data":"292f8c9b9031b452ea2fc77461c28da3e100b729872aaaf9030d0358cadc7637"}
Dec 01 11:45:20 crc kubenswrapper[4958]: I1201 11:45:20.526254 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-f54b64c4-kbt49" event={"ID":"feeff732-b940-4c31-963e-08785067df77","Type":"ContainerStarted","Data":"fdfc37e2150c3c4c701cc15f7783296fc82fc311d00d6a0e60aa8c655b1a2686"}
Dec 01 11:45:20 crc kubenswrapper[4958]: I1201 11:45:20.527312 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:20 crc kubenswrapper[4958]: I1201 11:45:20.527372 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-f54b64c4-kbt49"
Dec 01 11:45:20 crc kubenswrapper[4958]: I1201 11:45:20.560904 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-f54b64c4-kbt49" podStartSLOduration=3.61023491 podStartE2EDuration="14.560879821s" podCreationTimestamp="2025-12-01 11:45:06 +0000 UTC" firstStartedPulling="2025-12-01 11:45:07.286664709 +0000 UTC m=+6354.795453746" lastFinishedPulling="2025-12-01 11:45:18.23730962 +0000 UTC m=+6365.746098657" observedRunningTime="2025-12-01 11:45:20.549452006 +0000 UTC m=+6368.058241043" watchObservedRunningTime="2025-12-01 11:45:20.560879821 +0000 UTC m=+6368.069668868"
Dec 01 11:45:25 crc kubenswrapper[4958]: I1201 11:45:25.203274 4958 scope.go:117] "RemoveContainer" containerID="36953d4da1f3605db037fee336308e72a645085b1702ece2a073a1527b17abba"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.210615 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.210974 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.841915 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-bldjw"]
Dec 01 11:45:28 crc kubenswrapper[4958]: E1201 11:45:28.842799 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c7a507-1a2f-4711-a2e3-dc8a7261ec69" containerName="ovn-config"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.842828 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="81c7a507-1a2f-4711-a2e3-dc8a7261ec69" containerName="ovn-config"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.843133 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="81c7a507-1a2f-4711-a2e3-dc8a7261ec69" containerName="ovn-config"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.844532 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-bldjw"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.846949 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.847705 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.847927 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.852401 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-bldjw"]
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.882140 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/5f3052f9-acb3-4c7e-b98d-773b0509ccb7-hm-ports\") pod \"octavia-rsyslog-bldjw\" (UID: \"5f3052f9-acb3-4c7e-b98d-773b0509ccb7\") " pod="openstack/octavia-rsyslog-bldjw"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.882219 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f3052f9-acb3-4c7e-b98d-773b0509ccb7-scripts\") pod \"octavia-rsyslog-bldjw\" (UID: \"5f3052f9-acb3-4c7e-b98d-773b0509ccb7\") " pod="openstack/octavia-rsyslog-bldjw"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.882326 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f3052f9-acb3-4c7e-b98d-773b0509ccb7-config-data\") pod \"octavia-rsyslog-bldjw\" (UID: \"5f3052f9-acb3-4c7e-b98d-773b0509ccb7\") " pod="openstack/octavia-rsyslog-bldjw"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.882358 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/5f3052f9-acb3-4c7e-b98d-773b0509ccb7-config-data-merged\") pod \"octavia-rsyslog-bldjw\" (UID: \"5f3052f9-acb3-4c7e-b98d-773b0509ccb7\") " pod="openstack/octavia-rsyslog-bldjw"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.983800 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/5f3052f9-acb3-4c7e-b98d-773b0509ccb7-hm-ports\") pod \"octavia-rsyslog-bldjw\" (UID: \"5f3052f9-acb3-4c7e-b98d-773b0509ccb7\") " pod="openstack/octavia-rsyslog-bldjw"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.983931 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f3052f9-acb3-4c7e-b98d-773b0509ccb7-scripts\") pod \"octavia-rsyslog-bldjw\" (UID: \"5f3052f9-acb3-4c7e-b98d-773b0509ccb7\") " pod="openstack/octavia-rsyslog-bldjw"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.984006 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f3052f9-acb3-4c7e-b98d-773b0509ccb7-config-data\") pod \"octavia-rsyslog-bldjw\" (UID: \"5f3052f9-acb3-4c7e-b98d-773b0509ccb7\") " pod="openstack/octavia-rsyslog-bldjw"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.984053 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/5f3052f9-acb3-4c7e-b98d-773b0509ccb7-config-data-merged\") pod \"octavia-rsyslog-bldjw\" (UID: \"5f3052f9-acb3-4c7e-b98d-773b0509ccb7\") " pod="openstack/octavia-rsyslog-bldjw"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.984615 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/5f3052f9-acb3-4c7e-b98d-773b0509ccb7-config-data-merged\") pod \"octavia-rsyslog-bldjw\" (UID: \"5f3052f9-acb3-4c7e-b98d-773b0509ccb7\") " pod="openstack/octavia-rsyslog-bldjw"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.984881 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/5f3052f9-acb3-4c7e-b98d-773b0509ccb7-hm-ports\") pod \"octavia-rsyslog-bldjw\" (UID: \"5f3052f9-acb3-4c7e-b98d-773b0509ccb7\") " pod="openstack/octavia-rsyslog-bldjw"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.990527 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f3052f9-acb3-4c7e-b98d-773b0509ccb7-config-data\") pod \"octavia-rsyslog-bldjw\" (UID: \"5f3052f9-acb3-4c7e-b98d-773b0509ccb7\") " pod="openstack/octavia-rsyslog-bldjw"
Dec 01 11:45:28 crc kubenswrapper[4958]: I1201 11:45:28.990814 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f3052f9-acb3-4c7e-b98d-773b0509ccb7-scripts\") pod \"octavia-rsyslog-bldjw\" (UID: \"5f3052f9-acb3-4c7e-b98d-773b0509ccb7\") " pod="openstack/octavia-rsyslog-bldjw"
Dec 01 11:45:29 crc kubenswrapper[4958]: I1201 11:45:29.164266 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-bldjw"
Dec 01 11:45:29 crc kubenswrapper[4958]: I1201 11:45:29.454368 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-hm846"]
Dec 01 11:45:29 crc kubenswrapper[4958]: I1201 11:45:29.456891 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-hm846"
Dec 01 11:45:29 crc kubenswrapper[4958]: I1201 11:45:29.459494 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data"
Dec 01 11:45:29 crc kubenswrapper[4958]: I1201 11:45:29.471159 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-hm846"]
Dec 01 11:45:29 crc kubenswrapper[4958]: I1201 11:45:29.596836 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/c3f18c40-b6d6-48e0-b976-6083ced0d56b-amphora-image\") pod \"octavia-image-upload-59f8cff499-hm846\" (UID: \"c3f18c40-b6d6-48e0-b976-6083ced0d56b\") " pod="openstack/octavia-image-upload-59f8cff499-hm846"
Dec 01 11:45:29 crc kubenswrapper[4958]: I1201 11:45:29.597503 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c3f18c40-b6d6-48e0-b976-6083ced0d56b-httpd-config\") pod \"octavia-image-upload-59f8cff499-hm846\" (UID: \"c3f18c40-b6d6-48e0-b976-6083ced0d56b\") " pod="openstack/octavia-image-upload-59f8cff499-hm846"
Dec 01 11:45:29 crc kubenswrapper[4958]: I1201 11:45:29.699441 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c3f18c40-b6d6-48e0-b976-6083ced0d56b-httpd-config\") pod \"octavia-image-upload-59f8cff499-hm846\" (UID: \"c3f18c40-b6d6-48e0-b976-6083ced0d56b\") " pod="openstack/octavia-image-upload-59f8cff499-hm846"
Dec 01 11:45:29 crc kubenswrapper[4958]: I1201 11:45:29.699884 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/c3f18c40-b6d6-48e0-b976-6083ced0d56b-amphora-image\") pod \"octavia-image-upload-59f8cff499-hm846\" (UID: \"c3f18c40-b6d6-48e0-b976-6083ced0d56b\") " pod="openstack/octavia-image-upload-59f8cff499-hm846"
Dec 01 11:45:29 crc kubenswrapper[4958]: I1201 11:45:29.700474 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/c3f18c40-b6d6-48e0-b976-6083ced0d56b-amphora-image\") pod \"octavia-image-upload-59f8cff499-hm846\" (UID: \"c3f18c40-b6d6-48e0-b976-6083ced0d56b\") " pod="openstack/octavia-image-upload-59f8cff499-hm846"
Dec 01 11:45:29 crc kubenswrapper[4958]: I1201 11:45:29.711649 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c3f18c40-b6d6-48e0-b976-6083ced0d56b-httpd-config\") pod \"octavia-image-upload-59f8cff499-hm846\" (UID: \"c3f18c40-b6d6-48e0-b976-6083ced0d56b\") " pod="openstack/octavia-image-upload-59f8cff499-hm846"
Dec 01 11:45:29 crc kubenswrapper[4958]: I1201 11:45:29.749523 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-bldjw"]
Dec 01 11:45:29 crc kubenswrapper[4958]: I1201 11:45:29.778166 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-hm846"
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.194907 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-r9jsq"]
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.197117 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-r9jsq"
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.207183 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-r9jsq"]
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.227074 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts"
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.228765 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b2c2cb-9590-403a-9833-67d909640ca6-config-data\") pod \"octavia-db-sync-r9jsq\" (UID: \"e2b2c2cb-9590-403a-9833-67d909640ca6\") " pod="openstack/octavia-db-sync-r9jsq"
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.228826 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2b2c2cb-9590-403a-9833-67d909640ca6-scripts\") pod \"octavia-db-sync-r9jsq\" (UID: \"e2b2c2cb-9590-403a-9833-67d909640ca6\") " pod="openstack/octavia-db-sync-r9jsq"
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.228894 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e2b2c2cb-9590-403a-9833-67d909640ca6-config-data-merged\") pod \"octavia-db-sync-r9jsq\" (UID: \"e2b2c2cb-9590-403a-9833-67d909640ca6\") " pod="openstack/octavia-db-sync-r9jsq"
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.229010 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b2c2cb-9590-403a-9833-67d909640ca6-combined-ca-bundle\") pod \"octavia-db-sync-r9jsq\" (UID: \"e2b2c2cb-9590-403a-9833-67d909640ca6\") " pod="openstack/octavia-db-sync-r9jsq"
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.281485 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-hm846"]
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.330357 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b2c2cb-9590-403a-9833-67d909640ca6-config-data\") pod \"octavia-db-sync-r9jsq\" (UID: \"e2b2c2cb-9590-403a-9833-67d909640ca6\") " pod="openstack/octavia-db-sync-r9jsq"
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.330443 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2b2c2cb-9590-403a-9833-67d909640ca6-scripts\") pod \"octavia-db-sync-r9jsq\" (UID: \"e2b2c2cb-9590-403a-9833-67d909640ca6\") " pod="openstack/octavia-db-sync-r9jsq"
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.330740 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e2b2c2cb-9590-403a-9833-67d909640ca6-config-data-merged\") pod \"octavia-db-sync-r9jsq\" (UID: \"e2b2c2cb-9590-403a-9833-67d909640ca6\") " pod="openstack/octavia-db-sync-r9jsq"
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.330837 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b2c2cb-9590-403a-9833-67d909640ca6-combined-ca-bundle\") pod \"octavia-db-sync-r9jsq\" (UID: \"e2b2c2cb-9590-403a-9833-67d909640ca6\") " pod="openstack/octavia-db-sync-r9jsq"
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.331956 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e2b2c2cb-9590-403a-9833-67d909640ca6-config-data-merged\") pod \"octavia-db-sync-r9jsq\" (UID: \"e2b2c2cb-9590-403a-9833-67d909640ca6\") " pod="openstack/octavia-db-sync-r9jsq"
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.335666 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b2c2cb-9590-403a-9833-67d909640ca6-combined-ca-bundle\") pod \"octavia-db-sync-r9jsq\" (UID: \"e2b2c2cb-9590-403a-9833-67d909640ca6\") " pod="openstack/octavia-db-sync-r9jsq"
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.336416 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2b2c2cb-9590-403a-9833-67d909640ca6-scripts\") pod \"octavia-db-sync-r9jsq\" (UID: \"e2b2c2cb-9590-403a-9833-67d909640ca6\") " pod="openstack/octavia-db-sync-r9jsq"
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.337272 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b2c2cb-9590-403a-9833-67d909640ca6-config-data\") pod \"octavia-db-sync-r9jsq\" (UID: \"e2b2c2cb-9590-403a-9833-67d909640ca6\") " pod="openstack/octavia-db-sync-r9jsq"
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.546952 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-r9jsq"
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.674778 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-bldjw" event={"ID":"5f3052f9-acb3-4c7e-b98d-773b0509ccb7","Type":"ContainerStarted","Data":"2d5e3a8d5de7f3cd61b61045f6f869237948b2e23623d6abbfe41ca1af8e6c31"}
Dec 01 11:45:30 crc kubenswrapper[4958]: I1201 11:45:30.679457 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-hm846" event={"ID":"c3f18c40-b6d6-48e0-b976-6083ced0d56b","Type":"ContainerStarted","Data":"e62b0a3632efb252563ff6eb376115b5529e2bafc5665d04867cf098407323a1"}
Dec 01 11:45:31 crc kubenswrapper[4958]: I1201 11:45:31.080200 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-r9jsq"]
Dec 01 11:45:31 crc kubenswrapper[4958]: W1201 11:45:31.084738 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2b2c2cb_9590_403a_9833_67d909640ca6.slice/crio-e3f83799ac1f7ea695e01fe45c805d0aae14d3f78940696114efd97381e4edeb WatchSource:0}: Error finding container e3f83799ac1f7ea695e01fe45c805d0aae14d3f78940696114efd97381e4edeb: Status 404 returned error can't find the container with id e3f83799ac1f7ea695e01fe45c805d0aae14d3f78940696114efd97381e4edeb
Dec 01 11:45:31 crc kubenswrapper[4958]: I1201 11:45:31.690596 4958 generic.go:334] "Generic (PLEG): container finished" podID="e2b2c2cb-9590-403a-9833-67d909640ca6" containerID="da1e2215781331137ea919ac2ae9b4f0b0e99d26696c917ebf7429cfa3ad8c89" exitCode=0
Dec 01 11:45:31 crc kubenswrapper[4958]: I1201 11:45:31.690687 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-r9jsq" event={"ID":"e2b2c2cb-9590-403a-9833-67d909640ca6","Type":"ContainerDied","Data":"da1e2215781331137ea919ac2ae9b4f0b0e99d26696c917ebf7429cfa3ad8c89"}
Dec 01 11:45:31 crc kubenswrapper[4958]: I1201 11:45:31.690962 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-r9jsq" event={"ID":"e2b2c2cb-9590-403a-9833-67d909640ca6","Type":"ContainerStarted","Data":"e3f83799ac1f7ea695e01fe45c805d0aae14d3f78940696114efd97381e4edeb"}
Dec 01 11:45:34 crc kubenswrapper[4958]: I1201 11:45:34.732862 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-r9jsq" event={"ID":"e2b2c2cb-9590-403a-9833-67d909640ca6","Type":"ContainerStarted","Data":"f69d09c32395908db469fb1c0ae9e4ab0af16606705ba76e3fa9838518b539c6"}
Dec 01 11:45:34 crc kubenswrapper[4958]: I1201 11:45:34.755564 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-r9jsq" podStartSLOduration=4.755483231 podStartE2EDuration="4.755483231s" podCreationTimestamp="2025-12-01 11:45:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:45:34.751282052 +0000 UTC m=+6382.260071109" watchObservedRunningTime="2025-12-01 11:45:34.755483231 +0000 UTC m=+6382.264272268"
Dec 01 11:45:35 crc kubenswrapper[4958]: I1201 11:45:35.772431 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-bldjw" event={"ID":"5f3052f9-acb3-4c7e-b98d-773b0509ccb7","Type":"ContainerStarted","Data":"766da8d5da2e0f93e69037abab2e606027ef7b78f94ea1f37630fc3c3bf3368e"}
Dec 01 11:45:37 crc kubenswrapper[4958]: I1201 11:45:37.793791 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-r9jsq" event={"ID":"e2b2c2cb-9590-403a-9833-67d909640ca6","Type":"ContainerDied","Data":"f69d09c32395908db469fb1c0ae9e4ab0af16606705ba76e3fa9838518b539c6"}
Dec 01 11:45:37 crc kubenswrapper[4958]: I1201 11:45:37.793860 4958 generic.go:334] "Generic (PLEG): container finished" podID="e2b2c2cb-9590-403a-9833-67d909640ca6" containerID="f69d09c32395908db469fb1c0ae9e4ab0af16606705ba76e3fa9838518b539c6" exitCode=0
Dec 01 11:45:37 crc kubenswrapper[4958]: I1201 11:45:37.798446 4958 generic.go:334] "Generic (PLEG): container finished" podID="5f3052f9-acb3-4c7e-b98d-773b0509ccb7" containerID="766da8d5da2e0f93e69037abab2e606027ef7b78f94ea1f37630fc3c3bf3368e" exitCode=0
Dec 01 11:45:37 crc kubenswrapper[4958]: I1201 11:45:37.827981 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-bldjw" event={"ID":"5f3052f9-acb3-4c7e-b98d-773b0509ccb7","Type":"ContainerDied","Data":"766da8d5da2e0f93e69037abab2e606027ef7b78f94ea1f37630fc3c3bf3368e"}
Dec 01 11:45:40 crc kubenswrapper[4958]: I1201 11:45:40.336270 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-r9jsq"
Dec 01 11:45:40 crc kubenswrapper[4958]: I1201 11:45:40.468400 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2b2c2cb-9590-403a-9833-67d909640ca6-scripts\") pod \"e2b2c2cb-9590-403a-9833-67d909640ca6\" (UID: \"e2b2c2cb-9590-403a-9833-67d909640ca6\") "
Dec 01 11:45:40 crc kubenswrapper[4958]: I1201 11:45:40.468682 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e2b2c2cb-9590-403a-9833-67d909640ca6-config-data-merged\") pod \"e2b2c2cb-9590-403a-9833-67d909640ca6\" (UID: \"e2b2c2cb-9590-403a-9833-67d909640ca6\") "
Dec 01 11:45:40 crc kubenswrapper[4958]: I1201 11:45:40.468706 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b2c2cb-9590-403a-9833-67d909640ca6-combined-ca-bundle\") pod \"e2b2c2cb-9590-403a-9833-67d909640ca6\" (UID: \"e2b2c2cb-9590-403a-9833-67d909640ca6\") "
Dec 01 11:45:40 crc kubenswrapper[4958]: I1201 11:45:40.468790 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b2c2cb-9590-403a-9833-67d909640ca6-config-data\") pod \"e2b2c2cb-9590-403a-9833-67d909640ca6\" (UID: \"e2b2c2cb-9590-403a-9833-67d909640ca6\") "
Dec 01 11:45:40 crc kubenswrapper[4958]: I1201 11:45:40.493142 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b2c2cb-9590-403a-9833-67d909640ca6-scripts" (OuterVolumeSpecName: "scripts") pod "e2b2c2cb-9590-403a-9833-67d909640ca6" (UID: "e2b2c2cb-9590-403a-9833-67d909640ca6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:45:40 crc kubenswrapper[4958]: I1201 11:45:40.495370 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b2c2cb-9590-403a-9833-67d909640ca6-config-data" (OuterVolumeSpecName: "config-data") pod "e2b2c2cb-9590-403a-9833-67d909640ca6" (UID: "e2b2c2cb-9590-403a-9833-67d909640ca6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:45:40 crc kubenswrapper[4958]: I1201 11:45:40.500162 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b2c2cb-9590-403a-9833-67d909640ca6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2b2c2cb-9590-403a-9833-67d909640ca6" (UID: "e2b2c2cb-9590-403a-9833-67d909640ca6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:45:40 crc kubenswrapper[4958]: I1201 11:45:40.520660 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2b2c2cb-9590-403a-9833-67d909640ca6-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "e2b2c2cb-9590-403a-9833-67d909640ca6" (UID: "e2b2c2cb-9590-403a-9833-67d909640ca6"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:45:40 crc kubenswrapper[4958]: I1201 11:45:40.617023 4958 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e2b2c2cb-9590-403a-9833-67d909640ca6-config-data-merged\") on node \"crc\" DevicePath \"\""
Dec 01 11:45:40 crc kubenswrapper[4958]: I1201 11:45:40.617150 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b2c2cb-9590-403a-9833-67d909640ca6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 11:45:40 crc kubenswrapper[4958]: I1201 11:45:40.619020 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b2c2cb-9590-403a-9833-67d909640ca6-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 11:45:40 crc kubenswrapper[4958]: I1201 11:45:40.619100 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2b2c2cb-9590-403a-9833-67d909640ca6-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 11:45:40 crc kubenswrapper[4958]: I1201 11:45:40.850442 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-r9jsq" event={"ID":"e2b2c2cb-9590-403a-9833-67d909640ca6","Type":"ContainerDied","Data":"e3f83799ac1f7ea695e01fe45c805d0aae14d3f78940696114efd97381e4edeb"}
Dec 01 11:45:40 crc kubenswrapper[4958]: I1201 11:45:40.850704 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3f83799ac1f7ea695e01fe45c805d0aae14d3f78940696114efd97381e4edeb"
Dec 01 11:45:40 crc kubenswrapper[4958]: I1201 11:45:40.850536 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-r9jsq" Dec 01 11:45:41 crc kubenswrapper[4958]: I1201 11:45:41.267992 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-f54b64c4-kbt49" Dec 01 11:45:41 crc kubenswrapper[4958]: I1201 11:45:41.579431 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-f54b64c4-kbt49" Dec 01 11:45:41 crc kubenswrapper[4958]: I1201 11:45:41.862147 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-hm846" event={"ID":"c3f18c40-b6d6-48e0-b976-6083ced0d56b","Type":"ContainerStarted","Data":"4180948f5b15f556ce8109e1236e42ccdcdc8d9b6f2b00c51c74fb18fa1958ae"} Dec 01 11:45:42 crc kubenswrapper[4958]: I1201 11:45:42.886000 4958 generic.go:334] "Generic (PLEG): container finished" podID="c3f18c40-b6d6-48e0-b976-6083ced0d56b" containerID="4180948f5b15f556ce8109e1236e42ccdcdc8d9b6f2b00c51c74fb18fa1958ae" exitCode=0 Dec 01 11:45:42 crc kubenswrapper[4958]: I1201 11:45:42.886326 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-hm846" event={"ID":"c3f18c40-b6d6-48e0-b976-6083ced0d56b","Type":"ContainerDied","Data":"4180948f5b15f556ce8109e1236e42ccdcdc8d9b6f2b00c51c74fb18fa1958ae"} Dec 01 11:45:42 crc kubenswrapper[4958]: I1201 11:45:42.890536 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-bldjw" event={"ID":"5f3052f9-acb3-4c7e-b98d-773b0509ccb7","Type":"ContainerStarted","Data":"e130670cce4993ae83d8bdb94801650addc23d8015dc36838783fa9f3cb88fcc"} Dec 01 11:45:42 crc kubenswrapper[4958]: I1201 11:45:42.891278 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-bldjw" Dec 01 11:45:42 crc kubenswrapper[4958]: I1201 11:45:42.938183 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-bldjw" podStartSLOduration=3.237984377 podStartE2EDuration="14.938166027s" podCreationTimestamp="2025-12-01 11:45:28 +0000 UTC" firstStartedPulling="2025-12-01 11:45:29.792438102 +0000 UTC m=+6377.301227139" lastFinishedPulling="2025-12-01 11:45:41.492619752 +0000 UTC m=+6389.001408789" observedRunningTime="2025-12-01 11:45:42.933033832 +0000 UTC m=+6390.441822859" watchObservedRunningTime="2025-12-01 11:45:42.938166027 +0000 UTC m=+6390.446955064" Dec 01 11:45:44 crc kubenswrapper[4958]: I1201 11:45:44.922429 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-hm846" event={"ID":"c3f18c40-b6d6-48e0-b976-6083ced0d56b","Type":"ContainerStarted","Data":"9c0d534cf014bbc598b0fe579e2c27c5353df04e4469ce0b79704bbb19bd1a15"} Dec 01 11:45:44 crc kubenswrapper[4958]: I1201 11:45:44.948305 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-hm846" podStartSLOduration=2.102325537 podStartE2EDuration="15.948279719s" podCreationTimestamp="2025-12-01 11:45:29 +0000 UTC" firstStartedPulling="2025-12-01 11:45:30.289514333 +0000 UTC m=+6377.798303360" lastFinishedPulling="2025-12-01 11:45:44.135468505 +0000 UTC m=+6391.644257542" observedRunningTime="2025-12-01 11:45:44.941327672 +0000 UTC m=+6392.450116719" watchObservedRunningTime="2025-12-01 11:45:44.948279719 +0000 UTC m=+6392.457068766" Dec 01 11:45:58 crc kubenswrapper[4958]: I1201 11:45:58.211249 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:45:58 crc kubenswrapper[4958]: I1201 11:45:58.211760 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:45:58 crc kubenswrapper[4958]: I1201 11:45:58.211818 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 11:45:58 crc kubenswrapper[4958]: I1201 11:45:58.212932 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 11:45:58 crc kubenswrapper[4958]: I1201 11:45:58.213016 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28" gracePeriod=600 Dec 01 11:45:58 crc kubenswrapper[4958]: E1201 11:45:58.342057 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:45:59 crc kubenswrapper[4958]: I1201 11:45:59.077498 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28" exitCode=0 Dec 01 11:45:59 crc kubenswrapper[4958]: I1201 11:45:59.077545 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28"} Dec 01 11:45:59 crc kubenswrapper[4958]: I1201 11:45:59.077610 4958 scope.go:117] "RemoveContainer" containerID="13cb6cc4ee40481ad64b2a05cd4d173653f521017a8a82d0d6cdc46228d66885" Dec 01 11:45:59 crc kubenswrapper[4958]: I1201 11:45:59.078378 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28" Dec 01 11:45:59 crc kubenswrapper[4958]: E1201 11:45:59.079106 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" 
Dec 01 11:45:59 crc kubenswrapper[4958]: I1201 11:45:59.359613 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-bldjw"
Dec 01 11:46:07 crc kubenswrapper[4958]: I1201 11:46:07.052066 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-q5vbv"]
Dec 01 11:46:07 crc kubenswrapper[4958]: I1201 11:46:07.065533 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-q5vbv"]
Dec 01 11:46:07 crc kubenswrapper[4958]: I1201 11:46:07.810690 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c35348c1-21cb-436d-92a2-be0b35bae865" path="/var/lib/kubelet/pods/c35348c1-21cb-436d-92a2-be0b35bae865/volumes"
Dec 01 11:46:10 crc kubenswrapper[4958]: I1201 11:46:10.380319 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-hm846"]
Dec 01 11:46:10 crc kubenswrapper[4958]: I1201 11:46:10.380909 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-hm846" podUID="c3f18c40-b6d6-48e0-b976-6083ced0d56b" containerName="octavia-amphora-httpd" containerID="cri-o://9c0d534cf014bbc598b0fe579e2c27c5353df04e4469ce0b79704bbb19bd1a15" gracePeriod=30
Dec 01 11:46:10 crc kubenswrapper[4958]: I1201 11:46:10.798614 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28"
Dec 01 11:46:10 crc kubenswrapper[4958]: E1201 11:46:10.799305 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.031624 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-hm846"
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.173827 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/c3f18c40-b6d6-48e0-b976-6083ced0d56b-amphora-image\") pod \"c3f18c40-b6d6-48e0-b976-6083ced0d56b\" (UID: \"c3f18c40-b6d6-48e0-b976-6083ced0d56b\") "
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.173903 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c3f18c40-b6d6-48e0-b976-6083ced0d56b-httpd-config\") pod \"c3f18c40-b6d6-48e0-b976-6083ced0d56b\" (UID: \"c3f18c40-b6d6-48e0-b976-6083ced0d56b\") "
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.206207 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3f18c40-b6d6-48e0-b976-6083ced0d56b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c3f18c40-b6d6-48e0-b976-6083ced0d56b" (UID: "c3f18c40-b6d6-48e0-b976-6083ced0d56b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.217920 4958 generic.go:334] "Generic (PLEG): container finished" podID="c3f18c40-b6d6-48e0-b976-6083ced0d56b" containerID="9c0d534cf014bbc598b0fe579e2c27c5353df04e4469ce0b79704bbb19bd1a15" exitCode=0
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.217982 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-hm846" event={"ID":"c3f18c40-b6d6-48e0-b976-6083ced0d56b","Type":"ContainerDied","Data":"9c0d534cf014bbc598b0fe579e2c27c5353df04e4469ce0b79704bbb19bd1a15"}
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.218018 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-hm846" event={"ID":"c3f18c40-b6d6-48e0-b976-6083ced0d56b","Type":"ContainerDied","Data":"e62b0a3632efb252563ff6eb376115b5529e2bafc5665d04867cf098407323a1"}
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.218046 4958 scope.go:117] "RemoveContainer" containerID="9c0d534cf014bbc598b0fe579e2c27c5353df04e4469ce0b79704bbb19bd1a15"
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.219162 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-hm846"
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.249059 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3f18c40-b6d6-48e0-b976-6083ced0d56b-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "c3f18c40-b6d6-48e0-b976-6083ced0d56b" (UID: "c3f18c40-b6d6-48e0-b976-6083ced0d56b"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.280641 4958 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/c3f18c40-b6d6-48e0-b976-6083ced0d56b-amphora-image\") on node \"crc\" DevicePath \"\""
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.280685 4958 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c3f18c40-b6d6-48e0-b976-6083ced0d56b-httpd-config\") on node \"crc\" DevicePath \"\""
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.350203 4958 scope.go:117] "RemoveContainer" containerID="4180948f5b15f556ce8109e1236e42ccdcdc8d9b6f2b00c51c74fb18fa1958ae"
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.410450 4958 scope.go:117] "RemoveContainer" containerID="9c0d534cf014bbc598b0fe579e2c27c5353df04e4469ce0b79704bbb19bd1a15"
Dec 01 11:46:11 crc kubenswrapper[4958]: E1201 11:46:11.412212 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c0d534cf014bbc598b0fe579e2c27c5353df04e4469ce0b79704bbb19bd1a15\": container with ID starting with 9c0d534cf014bbc598b0fe579e2c27c5353df04e4469ce0b79704bbb19bd1a15 not found: ID does not exist" containerID="9c0d534cf014bbc598b0fe579e2c27c5353df04e4469ce0b79704bbb19bd1a15"
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.412260 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c0d534cf014bbc598b0fe579e2c27c5353df04e4469ce0b79704bbb19bd1a15"} err="failed to get container status \"9c0d534cf014bbc598b0fe579e2c27c5353df04e4469ce0b79704bbb19bd1a15\": rpc error: code = NotFound desc = could not find container \"9c0d534cf014bbc598b0fe579e2c27c5353df04e4469ce0b79704bbb19bd1a15\": container with ID starting with 9c0d534cf014bbc598b0fe579e2c27c5353df04e4469ce0b79704bbb19bd1a15 not found: ID does not exist"
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.412296 4958 scope.go:117] "RemoveContainer" containerID="4180948f5b15f556ce8109e1236e42ccdcdc8d9b6f2b00c51c74fb18fa1958ae"
Dec 01 11:46:11 crc kubenswrapper[4958]: E1201 11:46:11.412963 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4180948f5b15f556ce8109e1236e42ccdcdc8d9b6f2b00c51c74fb18fa1958ae\": container with ID starting with 4180948f5b15f556ce8109e1236e42ccdcdc8d9b6f2b00c51c74fb18fa1958ae not found: ID does not exist" containerID="4180948f5b15f556ce8109e1236e42ccdcdc8d9b6f2b00c51c74fb18fa1958ae"
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.413002 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4180948f5b15f556ce8109e1236e42ccdcdc8d9b6f2b00c51c74fb18fa1958ae"} err="failed to get container status \"4180948f5b15f556ce8109e1236e42ccdcdc8d9b6f2b00c51c74fb18fa1958ae\": rpc error: code = NotFound desc = could not find container \"4180948f5b15f556ce8109e1236e42ccdcdc8d9b6f2b00c51c74fb18fa1958ae\": container with ID starting with 4180948f5b15f556ce8109e1236e42ccdcdc8d9b6f2b00c51c74fb18fa1958ae not found: ID does not exist"
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.557458 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-hm846"]
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.567870 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-hm846"]
Dec 01 11:46:11 crc kubenswrapper[4958]: I1201 11:46:11.819018 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3f18c40-b6d6-48e0-b976-6083ced0d56b" path="/var/lib/kubelet/pods/c3f18c40-b6d6-48e0-b976-6083ced0d56b/volumes"
Dec 01 11:46:17 crc kubenswrapper[4958]: I1201 11:46:17.040657 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-9a62-account-create-sqhzj"]
Dec 01 11:46:17 crc kubenswrapper[4958]: I1201 11:46:17.059026 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-9a62-account-create-sqhzj"]
Dec 01 11:46:17 crc kubenswrapper[4958]: I1201 11:46:17.813646 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96bf6ac8-2df1-48b4-a3b7-286728a26999" path="/var/lib/kubelet/pods/96bf6ac8-2df1-48b4-a3b7-286728a26999/volumes"
Dec 01 11:46:23 crc kubenswrapper[4958]: I1201 11:46:23.046074 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-mbbfx"]
Dec 01 11:46:23 crc kubenswrapper[4958]: I1201 11:46:23.056173 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-mbbfx"]
Dec 01 11:46:23 crc kubenswrapper[4958]: I1201 11:46:23.810030 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28"
Dec 01 11:46:23 crc kubenswrapper[4958]: E1201 11:46:23.810322 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:46:23 crc kubenswrapper[4958]: I1201 11:46:23.812380 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e65fdf38-96f7-4d0c-939a-46f91892bc94" path="/var/lib/kubelet/pods/e65fdf38-96f7-4d0c-939a-46f91892bc94/volumes"
Dec 01 11:46:25 crc kubenswrapper[4958]: I1201 11:46:25.907764 4958 scope.go:117] "RemoveContainer" containerID="1ce18d6c4974ac1fc3ebf61900ee041e32d6b9c9487c5ddc8f3f6cc61d321859"
Dec 01 11:46:25 crc kubenswrapper[4958]: I1201 11:46:25.933478 4958 scope.go:117] "RemoveContainer" containerID="e80b6fde790906ce111804d4fba98f011876c0d06431e1cf290d6cbd43c3e248"
Dec 01 11:46:25 crc kubenswrapper[4958]: I1201 11:46:25.978035 4958 scope.go:117] "RemoveContainer" containerID="02fe415f1ca925a8048ef4a1fb987d78f51fc8ff77b6c8445f0ae7f44cc517af"
Dec 01 11:46:26 crc kubenswrapper[4958]: I1201 11:46:26.018658 4958 scope.go:117] "RemoveContainer" containerID="0508062a0bcde9e5e856144828a1560b634ee7bc468d6ebd1bd2cd020befc620"
Dec 01 11:46:26 crc kubenswrapper[4958]: I1201 11:46:26.076950 4958 scope.go:117] "RemoveContainer" containerID="e120ed22e8152e166ae2c2c84da9216ced3fab7951eb96c4b927b69d6ec6a48f"
Dec 01 11:46:26 crc kubenswrapper[4958]: I1201 11:46:26.112927 4958 scope.go:117] "RemoveContainer" containerID="47ae2568e6f41867471eeb4e41f5dbb8d85e695c479d2a909abdfb4607f00d8d"
Dec 01 11:46:26 crc kubenswrapper[4958]: I1201 11:46:26.138950 4958 scope.go:117] "RemoveContainer" containerID="3857b8805d12d606d1985dc4bf5d26ed238125b19ad706c5c17738db92615797"
Dec 01 11:46:26 crc kubenswrapper[4958]: I1201 11:46:26.180149 4958 scope.go:117] "RemoveContainer" containerID="e7507cc7ac70613ec3b4941240fd4cbfd2dac545b3b1c88acf38b294732666ed"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.474976 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-dm5mf"]
Dec 01 11:46:33 crc kubenswrapper[4958]: E1201 11:46:33.476071 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3f18c40-b6d6-48e0-b976-6083ced0d56b" containerName="init"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.476089 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3f18c40-b6d6-48e0-b976-6083ced0d56b" containerName="init"
Dec 01 11:46:33 crc kubenswrapper[4958]: E1201 11:46:33.476155 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b2c2cb-9590-403a-9833-67d909640ca6" containerName="init"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.476165 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b2c2cb-9590-403a-9833-67d909640ca6" containerName="init"
Dec 01 11:46:33 crc kubenswrapper[4958]: E1201 11:46:33.476190 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3f18c40-b6d6-48e0-b976-6083ced0d56b" containerName="octavia-amphora-httpd"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.476201 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3f18c40-b6d6-48e0-b976-6083ced0d56b" containerName="octavia-amphora-httpd"
Dec 01 11:46:33 crc kubenswrapper[4958]: E1201 11:46:33.476221 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b2c2cb-9590-403a-9833-67d909640ca6" containerName="octavia-db-sync"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.476229 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b2c2cb-9590-403a-9833-67d909640ca6" containerName="octavia-db-sync"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.476505 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3f18c40-b6d6-48e0-b976-6083ced0d56b" containerName="octavia-amphora-httpd"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.476524 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2b2c2cb-9590-403a-9833-67d909640ca6" containerName="octavia-db-sync"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.478015 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.480735 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.481115 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.481276 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.486943 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-dm5mf"]
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.604122 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/6f968b15-117f-4035-99ce-d60cb09d1221-hm-ports\") pod \"octavia-healthmanager-dm5mf\" (UID: \"6f968b15-117f-4035-99ce-d60cb09d1221\") " pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.604208 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6f968b15-117f-4035-99ce-d60cb09d1221-config-data-merged\") pod \"octavia-healthmanager-dm5mf\" (UID: \"6f968b15-117f-4035-99ce-d60cb09d1221\") " pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.604234 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/6f968b15-117f-4035-99ce-d60cb09d1221-amphora-certs\") pod \"octavia-healthmanager-dm5mf\" (UID: \"6f968b15-117f-4035-99ce-d60cb09d1221\") " pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.604294 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f968b15-117f-4035-99ce-d60cb09d1221-scripts\") pod \"octavia-healthmanager-dm5mf\" (UID: \"6f968b15-117f-4035-99ce-d60cb09d1221\") " pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.604374 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f968b15-117f-4035-99ce-d60cb09d1221-combined-ca-bundle\") pod \"octavia-healthmanager-dm5mf\" (UID: \"6f968b15-117f-4035-99ce-d60cb09d1221\") " pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.604421 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f968b15-117f-4035-99ce-d60cb09d1221-config-data\") pod \"octavia-healthmanager-dm5mf\" (UID: \"6f968b15-117f-4035-99ce-d60cb09d1221\") " pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.707133 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6f968b15-117f-4035-99ce-d60cb09d1221-config-data-merged\") pod \"octavia-healthmanager-dm5mf\" (UID: \"6f968b15-117f-4035-99ce-d60cb09d1221\") " pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.707494 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/6f968b15-117f-4035-99ce-d60cb09d1221-amphora-certs\") pod \"octavia-healthmanager-dm5mf\" (UID: \"6f968b15-117f-4035-99ce-d60cb09d1221\") " pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.707675 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f968b15-117f-4035-99ce-d60cb09d1221-scripts\") pod \"octavia-healthmanager-dm5mf\" (UID: \"6f968b15-117f-4035-99ce-d60cb09d1221\") " pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.707966 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6f968b15-117f-4035-99ce-d60cb09d1221-config-data-merged\") pod \"octavia-healthmanager-dm5mf\" (UID: \"6f968b15-117f-4035-99ce-d60cb09d1221\") " pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.709326 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f968b15-117f-4035-99ce-d60cb09d1221-combined-ca-bundle\") pod \"octavia-healthmanager-dm5mf\" (UID: \"6f968b15-117f-4035-99ce-d60cb09d1221\") " pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.709529 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f968b15-117f-4035-99ce-d60cb09d1221-config-data\") pod \"octavia-healthmanager-dm5mf\" (UID: \"6f968b15-117f-4035-99ce-d60cb09d1221\") " pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.709783 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/6f968b15-117f-4035-99ce-d60cb09d1221-hm-ports\") pod \"octavia-healthmanager-dm5mf\" (UID: \"6f968b15-117f-4035-99ce-d60cb09d1221\") " pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.710829 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/6f968b15-117f-4035-99ce-d60cb09d1221-hm-ports\") pod \"octavia-healthmanager-dm5mf\" (UID: \"6f968b15-117f-4035-99ce-d60cb09d1221\") " pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.716634 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/6f968b15-117f-4035-99ce-d60cb09d1221-amphora-certs\") pod \"octavia-healthmanager-dm5mf\" (UID: \"6f968b15-117f-4035-99ce-d60cb09d1221\") " pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.716657 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f968b15-117f-4035-99ce-d60cb09d1221-scripts\") pod \"octavia-healthmanager-dm5mf\" (UID: \"6f968b15-117f-4035-99ce-d60cb09d1221\") " pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.718448 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f968b15-117f-4035-99ce-d60cb09d1221-combined-ca-bundle\") pod \"octavia-healthmanager-dm5mf\" (UID: \"6f968b15-117f-4035-99ce-d60cb09d1221\") " pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.729166 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f968b15-117f-4035-99ce-d60cb09d1221-config-data\") pod \"octavia-healthmanager-dm5mf\" (UID: \"6f968b15-117f-4035-99ce-d60cb09d1221\") " pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:33 crc kubenswrapper[4958]: I1201 11:46:33.831082 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:34 crc kubenswrapper[4958]: I1201 11:46:34.541192 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-dm5mf"]
Dec 01 11:46:34 crc kubenswrapper[4958]: I1201 11:46:34.822932 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-ppg8n"]
Dec 01 11:46:34 crc kubenswrapper[4958]: I1201 11:46:34.825136 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:34 crc kubenswrapper[4958]: I1201 11:46:34.828610 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts"
Dec 01 11:46:34 crc kubenswrapper[4958]: I1201 11:46:34.828708 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data"
Dec 01 11:46:34 crc kubenswrapper[4958]: I1201 11:46:34.834516 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-ppg8n"]
Dec 01 11:46:34 crc kubenswrapper[4958]: I1201 11:46:34.954454 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/21133df5-8822-4d92-9f6f-a22b455f5288-hm-ports\") pod \"octavia-housekeeping-ppg8n\" (UID: \"21133df5-8822-4d92-9f6f-a22b455f5288\") " pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:34 crc kubenswrapper[4958]: I1201 11:46:34.954525 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/21133df5-8822-4d92-9f6f-a22b455f5288-amphora-certs\") pod \"octavia-housekeeping-ppg8n\" (UID: \"21133df5-8822-4d92-9f6f-a22b455f5288\") " pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:34 crc kubenswrapper[4958]: I1201 11:46:34.954597 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21133df5-8822-4d92-9f6f-a22b455f5288-combined-ca-bundle\") pod \"octavia-housekeeping-ppg8n\" (UID: \"21133df5-8822-4d92-9f6f-a22b455f5288\") " pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:34 crc kubenswrapper[4958]: I1201 11:46:34.955464 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/21133df5-8822-4d92-9f6f-a22b455f5288-config-data-merged\") pod \"octavia-housekeeping-ppg8n\" (UID: \"21133df5-8822-4d92-9f6f-a22b455f5288\") " pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:34 crc kubenswrapper[4958]: I1201 11:46:34.955729 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21133df5-8822-4d92-9f6f-a22b455f5288-config-data\") pod \"octavia-housekeeping-ppg8n\" (UID: \"21133df5-8822-4d92-9f6f-a22b455f5288\") " pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:34 crc kubenswrapper[4958]: I1201 11:46:34.955897 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21133df5-8822-4d92-9f6f-a22b455f5288-scripts\") pod \"octavia-housekeeping-ppg8n\" (UID: \"21133df5-8822-4d92-9f6f-a22b455f5288\") " pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:35 crc kubenswrapper[4958]: I1201 11:46:35.058563 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21133df5-8822-4d92-9f6f-a22b455f5288-combined-ca-bundle\") pod \"octavia-housekeeping-ppg8n\" (UID: \"21133df5-8822-4d92-9f6f-a22b455f5288\") " pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:35 crc kubenswrapper[4958]: I1201 11:46:35.058622 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/21133df5-8822-4d92-9f6f-a22b455f5288-config-data-merged\") pod \"octavia-housekeeping-ppg8n\" (UID: \"21133df5-8822-4d92-9f6f-a22b455f5288\") " pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:35 crc kubenswrapper[4958]: I1201 11:46:35.058669 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21133df5-8822-4d92-9f6f-a22b455f5288-config-data\") pod \"octavia-housekeeping-ppg8n\" (UID: \"21133df5-8822-4d92-9f6f-a22b455f5288\") " pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:35 crc kubenswrapper[4958]: I1201 11:46:35.058698 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21133df5-8822-4d92-9f6f-a22b455f5288-scripts\") pod \"octavia-housekeeping-ppg8n\" (UID: \"21133df5-8822-4d92-9f6f-a22b455f5288\") " pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:35 crc kubenswrapper[4958]: I1201 11:46:35.058783 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/21133df5-8822-4d92-9f6f-a22b455f5288-hm-ports\") pod \"octavia-housekeeping-ppg8n\" (UID: \"21133df5-8822-4d92-9f6f-a22b455f5288\") " pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:35 crc kubenswrapper[4958]: I1201 11:46:35.058814 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/21133df5-8822-4d92-9f6f-a22b455f5288-amphora-certs\") pod \"octavia-housekeeping-ppg8n\" (UID: \"21133df5-8822-4d92-9f6f-a22b455f5288\") " pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:35 crc kubenswrapper[4958]: I1201 11:46:35.061641 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/21133df5-8822-4d92-9f6f-a22b455f5288-config-data-merged\") pod \"octavia-housekeeping-ppg8n\" (UID: \"21133df5-8822-4d92-9f6f-a22b455f5288\") " pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:35 crc kubenswrapper[4958]: I1201 11:46:35.062903 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/21133df5-8822-4d92-9f6f-a22b455f5288-hm-ports\") pod \"octavia-housekeeping-ppg8n\" (UID: \"21133df5-8822-4d92-9f6f-a22b455f5288\") " pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:35 crc kubenswrapper[4958]: I1201 11:46:35.066057 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21133df5-8822-4d92-9f6f-a22b455f5288-combined-ca-bundle\") pod \"octavia-housekeeping-ppg8n\" (UID: \"21133df5-8822-4d92-9f6f-a22b455f5288\") " pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:35 crc kubenswrapper[4958]: I1201 11:46:35.066405 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/21133df5-8822-4d92-9f6f-a22b455f5288-amphora-certs\") pod \"octavia-housekeeping-ppg8n\" (UID: \"21133df5-8822-4d92-9f6f-a22b455f5288\") " pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:35 crc kubenswrapper[4958]: I1201 11:46:35.066801 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21133df5-8822-4d92-9f6f-a22b455f5288-config-data\") pod \"octavia-housekeeping-ppg8n\" (UID: \"21133df5-8822-4d92-9f6f-a22b455f5288\") " pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:35 crc kubenswrapper[4958]: I1201 11:46:35.082453 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21133df5-8822-4d92-9f6f-a22b455f5288-scripts\") pod \"octavia-housekeeping-ppg8n\" (UID: \"21133df5-8822-4d92-9f6f-a22b455f5288\") " pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:35 crc kubenswrapper[4958]: I1201 11:46:35.153022 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:35 crc kubenswrapper[4958]: I1201 11:46:35.547788 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-dm5mf" event={"ID":"6f968b15-117f-4035-99ce-d60cb09d1221","Type":"ContainerStarted","Data":"ec8442ab013ea04a34beed721ba7829b197734d71f4b7e0eb000be6afc14a4bc"}
Dec 01 11:46:35 crc kubenswrapper[4958]: I1201 11:46:35.548155 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-dm5mf" event={"ID":"6f968b15-117f-4035-99ce-d60cb09d1221","Type":"ContainerStarted","Data":"2d9929f4ef5b8dfb60fc28b90a6779f64399fca8d2b71216ceb2bd4d59915d00"}
Dec 01 11:46:35 crc kubenswrapper[4958]: I1201 11:46:35.824200 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-ppg8n"]
Dec 01 11:46:36 crc kubenswrapper[4958]: I1201 11:46:36.572536 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-ppg8n" event={"ID":"21133df5-8822-4d92-9f6f-a22b455f5288","Type":"ContainerStarted","Data":"73ca4b0b427bfd3cdce1c4a4a2790300be9b4da567b8167db2d9e88a0c3917cd"}
Dec 01 11:46:36 crc kubenswrapper[4958]: I1201 11:46:36.790156 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-qcdqx"]
Dec 01 11:46:36 crc kubenswrapper[4958]: I1201 11:46:36.793522 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:36 crc kubenswrapper[4958]: I1201 11:46:36.796238 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts"
Dec 01 11:46:36 crc kubenswrapper[4958]: I1201 11:46:36.796250 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data"
Dec 01 11:46:36 crc kubenswrapper[4958]: I1201 11:46:36.811913 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-qcdqx"]
Dec 01 11:46:36 crc kubenswrapper[4958]: I1201 11:46:36.880331 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502d038a-83a5-42c6-b031-ec7f8f29de9d-config-data\") pod \"octavia-worker-qcdqx\" (UID: \"502d038a-83a5-42c6-b031-ec7f8f29de9d\") " pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:36 crc kubenswrapper[4958]: I1201 11:46:36.880505 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/502d038a-83a5-42c6-b031-ec7f8f29de9d-amphora-certs\") pod \"octavia-worker-qcdqx\" (UID: \"502d038a-83a5-42c6-b031-ec7f8f29de9d\") " pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:36 crc kubenswrapper[4958]: I1201 11:46:36.880554 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/502d038a-83a5-42c6-b031-ec7f8f29de9d-config-data-merged\") pod \"octavia-worker-qcdqx\" (UID: \"502d038a-83a5-42c6-b031-ec7f8f29de9d\") " pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:36 crc kubenswrapper[4958]: I1201 11:46:36.880942 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502d038a-83a5-42c6-b031-ec7f8f29de9d-combined-ca-bundle\") pod \"octavia-worker-qcdqx\" (UID: \"502d038a-83a5-42c6-b031-ec7f8f29de9d\") " pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:36 crc kubenswrapper[4958]: I1201 11:46:36.881008 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/502d038a-83a5-42c6-b031-ec7f8f29de9d-hm-ports\") pod \"octavia-worker-qcdqx\" (UID: \"502d038a-83a5-42c6-b031-ec7f8f29de9d\") " pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:36 crc kubenswrapper[4958]: I1201 11:46:36.881075 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/502d038a-83a5-42c6-b031-ec7f8f29de9d-scripts\") pod \"octavia-worker-qcdqx\" (UID: \"502d038a-83a5-42c6-b031-ec7f8f29de9d\") " pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:37 crc kubenswrapper[4958]: I1201 11:46:37.042248 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/502d038a-83a5-42c6-b031-ec7f8f29de9d-amphora-certs\") pod \"octavia-worker-qcdqx\" (UID: \"502d038a-83a5-42c6-b031-ec7f8f29de9d\") " pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:37 crc kubenswrapper[4958]: I1201 11:46:37.042310 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/502d038a-83a5-42c6-b031-ec7f8f29de9d-config-data-merged\") pod \"octavia-worker-qcdqx\" (UID: \"502d038a-83a5-42c6-b031-ec7f8f29de9d\") " pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:37 crc kubenswrapper[4958]: I1201 11:46:37.042389 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502d038a-83a5-42c6-b031-ec7f8f29de9d-combined-ca-bundle\") pod \"octavia-worker-qcdqx\" (UID: \"502d038a-83a5-42c6-b031-ec7f8f29de9d\") " pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:37 crc kubenswrapper[4958]: I1201 11:46:37.042413 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/502d038a-83a5-42c6-b031-ec7f8f29de9d-hm-ports\") pod \"octavia-worker-qcdqx\" (UID: \"502d038a-83a5-42c6-b031-ec7f8f29de9d\") " pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:37 crc kubenswrapper[4958]: I1201 11:46:37.042442 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/502d038a-83a5-42c6-b031-ec7f8f29de9d-scripts\") pod \"octavia-worker-qcdqx\" (UID: \"502d038a-83a5-42c6-b031-ec7f8f29de9d\") " pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:37 crc kubenswrapper[4958]: I1201 11:46:37.042465 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502d038a-83a5-42c6-b031-ec7f8f29de9d-config-data\") pod \"octavia-worker-qcdqx\" (UID: \"502d038a-83a5-42c6-b031-ec7f8f29de9d\") " pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:37 crc kubenswrapper[4958]: I1201 11:46:37.043463 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/502d038a-83a5-42c6-b031-ec7f8f29de9d-config-data-merged\") pod \"octavia-worker-qcdqx\" (UID: \"502d038a-83a5-42c6-b031-ec7f8f29de9d\") " pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:37 crc kubenswrapper[4958]: I1201 11:46:37.044682 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/502d038a-83a5-42c6-b031-ec7f8f29de9d-hm-ports\") pod \"octavia-worker-qcdqx\" (UID: \"502d038a-83a5-42c6-b031-ec7f8f29de9d\") " pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:37 crc kubenswrapper[4958]: I1201 11:46:37.049819 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/502d038a-83a5-42c6-b031-ec7f8f29de9d-scripts\") pod \"octavia-worker-qcdqx\" (UID: \"502d038a-83a5-42c6-b031-ec7f8f29de9d\") " pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:37 crc kubenswrapper[4958]: I1201 11:46:37.054446 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502d038a-83a5-42c6-b031-ec7f8f29de9d-config-data\") pod \"octavia-worker-qcdqx\" (UID: \"502d038a-83a5-42c6-b031-ec7f8f29de9d\") " pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:37 crc kubenswrapper[4958]: I1201 11:46:37.058154 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502d038a-83a5-42c6-b031-ec7f8f29de9d-combined-ca-bundle\") pod \"octavia-worker-qcdqx\" (UID: \"502d038a-83a5-42c6-b031-ec7f8f29de9d\") " pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:37 crc kubenswrapper[4958]: I1201 11:46:37.058473 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/502d038a-83a5-42c6-b031-ec7f8f29de9d-amphora-certs\") pod \"octavia-worker-qcdqx\" (UID: \"502d038a-83a5-42c6-b031-ec7f8f29de9d\") " pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:37 crc kubenswrapper[4958]: I1201 11:46:37.119114 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:37 crc kubenswrapper[4958]: I1201 11:46:37.617467 4958 generic.go:334] "Generic (PLEG): container finished" podID="6f968b15-117f-4035-99ce-d60cb09d1221" containerID="ec8442ab013ea04a34beed721ba7829b197734d71f4b7e0eb000be6afc14a4bc" exitCode=0
Dec 01 11:46:37 crc kubenswrapper[4958]: I1201 11:46:37.617562 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-dm5mf" event={"ID":"6f968b15-117f-4035-99ce-d60cb09d1221","Type":"ContainerDied","Data":"ec8442ab013ea04a34beed721ba7829b197734d71f4b7e0eb000be6afc14a4bc"}
Dec 01 11:46:37 crc kubenswrapper[4958]: I1201 11:46:37.724043 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-qcdqx"]
Dec 01 11:46:38 crc kubenswrapper[4958]: I1201 11:46:38.628298 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-ppg8n" event={"ID":"21133df5-8822-4d92-9f6f-a22b455f5288","Type":"ContainerStarted","Data":"123db47db3b07e3e6b8f25162e5ae729ab67b9671510c347376fc07e12a364b8"}
Dec 01 11:46:38 crc kubenswrapper[4958]: I1201 11:46:38.631157 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-dm5mf" event={"ID":"6f968b15-117f-4035-99ce-d60cb09d1221","Type":"ContainerStarted","Data":"348a7f18c2a0c3c7fd48c6fdb923188b0052699b597444c079bf1b09a300e388"}
Dec 01 11:46:38 crc kubenswrapper[4958]: I1201 11:46:38.631396 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:38 crc kubenswrapper[4958]: I1201 11:46:38.633309 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-qcdqx" event={"ID":"502d038a-83a5-42c6-b031-ec7f8f29de9d","Type":"ContainerStarted","Data":"677ae81de1f879265aa218fdafb2181a83a80395af17581a6ca043086af6215b"}
Dec 01 11:46:38 crc kubenswrapper[4958]: I1201 11:46:38.678968 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-dm5mf" podStartSLOduration=5.678941873 podStartE2EDuration="5.678941873s" podCreationTimestamp="2025-12-01 11:46:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:46:38.672258463 +0000 UTC m=+6446.181047520" watchObservedRunningTime="2025-12-01 11:46:38.678941873 +0000 UTC m=+6446.187730910"
Dec 01 11:46:38 crc kubenswrapper[4958]: I1201 11:46:38.798234 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28"
Dec 01 11:46:38 crc kubenswrapper[4958]: E1201 11:46:38.798596 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:46:39 crc kubenswrapper[4958]: I1201 11:46:39.645152 4958 generic.go:334] "Generic (PLEG): container finished" podID="21133df5-8822-4d92-9f6f-a22b455f5288" containerID="123db47db3b07e3e6b8f25162e5ae729ab67b9671510c347376fc07e12a364b8" exitCode=0
Dec 01 11:46:39 crc kubenswrapper[4958]: I1201 11:46:39.646826 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-ppg8n" event={"ID":"21133df5-8822-4d92-9f6f-a22b455f5288","Type":"ContainerDied","Data":"123db47db3b07e3e6b8f25162e5ae729ab67b9671510c347376fc07e12a364b8"}
Dec 01 11:46:40 crc kubenswrapper[4958]: I1201 11:46:40.657136 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-qcdqx" event={"ID":"502d038a-83a5-42c6-b031-ec7f8f29de9d","Type":"ContainerStarted","Data":"d412706f64831de6b49e34e4ae1dfc0d560845c667a72f5f7e299ce44a8f8357"}
Dec 01 11:46:40 crc kubenswrapper[4958]: I1201 11:46:40.662300 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-ppg8n" event={"ID":"21133df5-8822-4d92-9f6f-a22b455f5288","Type":"ContainerStarted","Data":"e9b0da89cf2f5a9f11b0fb41bb74da56bb80a0923e7c507e681df1b2d4321868"}
Dec 01 11:46:40 crc kubenswrapper[4958]: I1201 11:46:40.662660 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:40 crc kubenswrapper[4958]: I1201 11:46:40.707565 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-ppg8n" podStartSLOduration=4.70910165 podStartE2EDuration="6.707544121s" podCreationTimestamp="2025-12-01 11:46:34 +0000 UTC" firstStartedPulling="2025-12-01 11:46:35.852005323 +0000 UTC m=+6443.360794360" lastFinishedPulling="2025-12-01 11:46:37.850447794 +0000 UTC m=+6445.359236831" observedRunningTime="2025-12-01 11:46:40.700523971 +0000 UTC m=+6448.209313018" watchObservedRunningTime="2025-12-01 11:46:40.707544121 +0000 UTC m=+6448.216333158"
Dec 01 11:46:41 crc kubenswrapper[4958]: I1201 11:46:41.681956 4958 generic.go:334] "Generic (PLEG): container finished" podID="502d038a-83a5-42c6-b031-ec7f8f29de9d" containerID="d412706f64831de6b49e34e4ae1dfc0d560845c667a72f5f7e299ce44a8f8357" exitCode=0
Dec 01 11:46:41 crc kubenswrapper[4958]: I1201 11:46:41.682366 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-qcdqx" event={"ID":"502d038a-83a5-42c6-b031-ec7f8f29de9d","Type":"ContainerDied","Data":"d412706f64831de6b49e34e4ae1dfc0d560845c667a72f5f7e299ce44a8f8357"}
Dec 01 11:46:42 crc kubenswrapper[4958]: I1201 11:46:42.697891 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-qcdqx" event={"ID":"502d038a-83a5-42c6-b031-ec7f8f29de9d","Type":"ContainerStarted","Data":"ce7ef19a2746f930ffc0a431c2dffd1429596e73035dd8c9750cde2fe2cea6e0"}
Dec 01 11:46:42 crc kubenswrapper[4958]: I1201 11:46:42.726820 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-qcdqx" podStartSLOduration=5.259969033 podStartE2EDuration="6.726791742s" podCreationTimestamp="2025-12-01 11:46:36 +0000 UTC" firstStartedPulling="2025-12-01 11:46:37.845116663 +0000 UTC m=+6445.353905700" lastFinishedPulling="2025-12-01 11:46:39.311939372 +0000 UTC m=+6446.820728409" observedRunningTime="2025-12-01 11:46:42.719190616 +0000 UTC m=+6450.227979653" watchObservedRunningTime="2025-12-01 11:46:42.726791742 +0000 UTC m=+6450.235580779"
Dec 01 11:46:43 crc kubenswrapper[4958]: I1201 11:46:43.708890 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:48 crc kubenswrapper[4958]: I1201 11:46:48.869347 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-dm5mf"
Dec 01 11:46:50 crc kubenswrapper[4958]: I1201 11:46:50.195599 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-ppg8n"
Dec 01 11:46:51 crc kubenswrapper[4958]: I1201 11:46:51.808700 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28"
Dec 01 11:46:51 crc kubenswrapper[4958]: E1201 11:46:51.809812 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:46:52 crc kubenswrapper[4958]: I1201 11:46:52.042599 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-nrgc9"]
Dec 01 11:46:52 crc kubenswrapper[4958]: I1201 11:46:52.053121 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-nrgc9"]
Dec 01 11:46:52 crc kubenswrapper[4958]: I1201 11:46:52.156215 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-qcdqx"
Dec 01 11:46:53 crc kubenswrapper[4958]: I1201 11:46:53.811297 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a0a2476-9374-4bbb-aba1-d6861a36cc56" path="/var/lib/kubelet/pods/9a0a2476-9374-4bbb-aba1-d6861a36cc56/volumes"
Dec 01 11:47:02 crc kubenswrapper[4958]: I1201 11:47:02.039423 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5efc-account-create-bxbx6"]
Dec 01 11:47:02 crc kubenswrapper[4958]: I1201 11:47:02.053915 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5efc-account-create-bxbx6"]
Dec 01 11:47:03 crc kubenswrapper[4958]: I1201 11:47:03.813541 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28"
Dec 01 11:47:03 crc kubenswrapper[4958]: E1201 11:47:03.814159 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:47:03 crc kubenswrapper[4958]: I1201 11:47:03.820344 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13f7171b-238f-414a-9ba5-162eaf91718a" path="/var/lib/kubelet/pods/13f7171b-238f-414a-9ba5-162eaf91718a/volumes"
Dec 01 11:47:11 crc kubenswrapper[4958]: I1201 11:47:11.046540 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-fnwgw"]
Dec 01 11:47:11 crc kubenswrapper[4958]: I1201 11:47:11.063216 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-fnwgw"]
Dec 01 11:47:11 crc kubenswrapper[4958]: I1201 11:47:11.811085 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="207b3a13-aef6-4b53-9856-773957da8b9f" path="/var/lib/kubelet/pods/207b3a13-aef6-4b53-9856-773957da8b9f/volumes"
Dec 01 11:47:14 crc kubenswrapper[4958]: I1201 11:47:14.798704 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28"
Dec 01 11:47:14 crc kubenswrapper[4958]: E1201 11:47:14.799897 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:47:26 crc kubenswrapper[4958]: I1201 11:47:26.311334 4958 scope.go:117] "RemoveContainer" containerID="5f248fdbef07e9f0ce51ead3f8d3b21ca110a971f4e8e967c03f88c55d641c41"
Dec 01 11:47:26 crc kubenswrapper[4958]: I1201 11:47:26.379018 4958 scope.go:117] "RemoveContainer" containerID="620b27c1173d320928855d927b9cf9b145577183ba351a15a60e7bfbb0578c05"
Dec 01 11:47:26 crc kubenswrapper[4958]: I1201 11:47:26.412390 4958 scope.go:117] "RemoveContainer" containerID="5c11d9bb17982367b09aa71a4b0c77616aead9aa990179649735d4e11a07cc74"
Dec 01 11:47:27 crc kubenswrapper[4958]: I1201 11:47:27.797330 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28"
Dec 01 11:47:27 crc kubenswrapper[4958]: E1201 11:47:27.798019 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.373306 4958 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openstack/horizon-86c5b688c5-f62bb"] Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.375634 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.379388 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.379581 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-85f7p" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.379698 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.379802 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.407495 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86c5b688c5-f62bb"] Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.408820 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76e3aa33-cc8c-46db-9d82-cb36fe966a99-logs\") pod \"horizon-86c5b688c5-f62bb\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.409239 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxs65\" (UniqueName: \"kubernetes.io/projected/76e3aa33-cc8c-46db-9d82-cb36fe966a99-kube-api-access-hxs65\") pod \"horizon-86c5b688c5-f62bb\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.409444 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76e3aa33-cc8c-46db-9d82-cb36fe966a99-scripts\") pod \"horizon-86c5b688c5-f62bb\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.414125 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76e3aa33-cc8c-46db-9d82-cb36fe966a99-config-data\") pod \"horizon-86c5b688c5-f62bb\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.414831 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76e3aa33-cc8c-46db-9d82-cb36fe966a99-horizon-secret-key\") pod \"horizon-86c5b688c5-f62bb\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.516515 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76e3aa33-cc8c-46db-9d82-cb36fe966a99-horizon-secret-key\") pod \"horizon-86c5b688c5-f62bb\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.516577 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/76e3aa33-cc8c-46db-9d82-cb36fe966a99-logs\") pod \"horizon-86c5b688c5-f62bb\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.516628 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxs65\" (UniqueName: \"kubernetes.io/projected/76e3aa33-cc8c-46db-9d82-cb36fe966a99-kube-api-access-hxs65\") pod \"horizon-86c5b688c5-f62bb\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.516691 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76e3aa33-cc8c-46db-9d82-cb36fe966a99-scripts\") pod \"horizon-86c5b688c5-f62bb\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.516706 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76e3aa33-cc8c-46db-9d82-cb36fe966a99-config-data\") pod \"horizon-86c5b688c5-f62bb\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.517422 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.517696 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a" containerName="glance-log" containerID="cri-o://cbebfdae0b767fbdf2b8955ba768a720154f73d1ca704c4af17e9caebd607a3d" gracePeriod=30 Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.518054 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a" containerName="glance-httpd" containerID="cri-o://600ed33174aa32b2178325e633f1d913824ebccf2419cde4691c89758c3c6f3f" gracePeriod=30 Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.518107 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76e3aa33-cc8c-46db-9d82-cb36fe966a99-config-data\") pod \"horizon-86c5b688c5-f62bb\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.519074 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76e3aa33-cc8c-46db-9d82-cb36fe966a99-logs\") pod \"horizon-86c5b688c5-f62bb\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.520054 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76e3aa33-cc8c-46db-9d82-cb36fe966a99-scripts\") pod \"horizon-86c5b688c5-f62bb\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.557210 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/76e3aa33-cc8c-46db-9d82-cb36fe966a99-horizon-secret-key\") pod \"horizon-86c5b688c5-f62bb\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.567680 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxs65\" (UniqueName: \"kubernetes.io/projected/76e3aa33-cc8c-46db-9d82-cb36fe966a99-kube-api-access-hxs65\") pod \"horizon-86c5b688c5-f62bb\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.689972 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.690353 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="735d94c9-1a28-41d5-ad7b-ce9d247bdff7" containerName="glance-log" containerID="cri-o://efb05262b5b46e12a167da1ac18dcff088498492ba3a10622cd142ecf97d29db" gracePeriod=30 Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.691193 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="735d94c9-1a28-41d5-ad7b-ce9d247bdff7" containerName="glance-httpd" containerID="cri-o://f5d3d24e1bd96f4f58364299107b6a03f47077a72296b98f20be0f9b504b4317" gracePeriod=30 Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.733398 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.758773 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-69dfb9ff85-g2xdt"] Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.760535 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.779106 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-69dfb9ff85-g2xdt"] Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.826392 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a83948e-ec95-47a6-bdf9-c73444293d67-scripts\") pod \"horizon-69dfb9ff85-g2xdt\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.826547 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3a83948e-ec95-47a6-bdf9-c73444293d67-horizon-secret-key\") pod \"horizon-69dfb9ff85-g2xdt\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.826642 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jprt7\" (UniqueName: \"kubernetes.io/projected/3a83948e-ec95-47a6-bdf9-c73444293d67-kube-api-access-jprt7\") pod \"horizon-69dfb9ff85-g2xdt\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.826705 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a83948e-ec95-47a6-bdf9-c73444293d67-config-data\") pod \"horizon-69dfb9ff85-g2xdt\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.826737 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a83948e-ec95-47a6-bdf9-c73444293d67-logs\") pod \"horizon-69dfb9ff85-g2xdt\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.934694 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a83948e-ec95-47a6-bdf9-c73444293d67-scripts\") pod \"horizon-69dfb9ff85-g2xdt\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.935218 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3a83948e-ec95-47a6-bdf9-c73444293d67-horizon-secret-key\") pod \"horizon-69dfb9ff85-g2xdt\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.935307 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jprt7\" (UniqueName: \"kubernetes.io/projected/3a83948e-ec95-47a6-bdf9-c73444293d67-kube-api-access-jprt7\") pod \"horizon-69dfb9ff85-g2xdt\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.935355 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/3a83948e-ec95-47a6-bdf9-c73444293d67-config-data\") pod \"horizon-69dfb9ff85-g2xdt\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.935394 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a83948e-ec95-47a6-bdf9-c73444293d67-logs\") pod \"horizon-69dfb9ff85-g2xdt\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.936891 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a83948e-ec95-47a6-bdf9-c73444293d67-logs\") pod \"horizon-69dfb9ff85-g2xdt\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.936979 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a83948e-ec95-47a6-bdf9-c73444293d67-scripts\") pod \"horizon-69dfb9ff85-g2xdt\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.937516 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a83948e-ec95-47a6-bdf9-c73444293d67-config-data\") pod \"horizon-69dfb9ff85-g2xdt\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.945441 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3a83948e-ec95-47a6-bdf9-c73444293d67-horizon-secret-key\") pod \"horizon-69dfb9ff85-g2xdt\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:47:38 crc kubenswrapper[4958]: I1201 11:47:38.965695 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jprt7\" (UniqueName: \"kubernetes.io/projected/3a83948e-ec95-47a6-bdf9-c73444293d67-kube-api-access-jprt7\") pod \"horizon-69dfb9ff85-g2xdt\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.098503 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.241638 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-69dfb9ff85-g2xdt"] Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.274947 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-986f959c9-6gffd"] Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.276499 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.302173 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-986f959c9-6gffd"] Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.340873 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86c5b688c5-f62bb"] Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.354106 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-config-data\") pod \"horizon-986f959c9-6gffd\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") " pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.354176 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-scripts\") pod \"horizon-986f959c9-6gffd\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") " pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.354231 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhd6z\" (UniqueName: \"kubernetes.io/projected/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-kube-api-access-lhd6z\") pod \"horizon-986f959c9-6gffd\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") " pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.354256 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-horizon-secret-key\") pod \"horizon-986f959c9-6gffd\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") " pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.354420 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-logs\") pod \"horizon-986f959c9-6gffd\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") " pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.439074 4958 generic.go:334] "Generic (PLEG): container finished" podID="2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a" containerID="cbebfdae0b767fbdf2b8955ba768a720154f73d1ca704c4af17e9caebd607a3d" exitCode=143 Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.439471 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a","Type":"ContainerDied","Data":"cbebfdae0b767fbdf2b8955ba768a720154f73d1ca704c4af17e9caebd607a3d"} Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.443941 4958 generic.go:334] "Generic (PLEG): container finished" podID="735d94c9-1a28-41d5-ad7b-ce9d247bdff7" containerID="efb05262b5b46e12a167da1ac18dcff088498492ba3a10622cd142ecf97d29db" exitCode=143 Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.444015 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"735d94c9-1a28-41d5-ad7b-ce9d247bdff7","Type":"ContainerDied","Data":"efb05262b5b46e12a167da1ac18dcff088498492ba3a10622cd142ecf97d29db"} Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 
11:47:39.446253 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86c5b688c5-f62bb" event={"ID":"76e3aa33-cc8c-46db-9d82-cb36fe966a99","Type":"ContainerStarted","Data":"4e996753158e55d8d4f502494e2db53c005179f2c3cceeceecc5ae94fafcc6e7"} Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.456590 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-config-data\") pod \"horizon-986f959c9-6gffd\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") " pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.456661 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-scripts\") pod \"horizon-986f959c9-6gffd\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") " pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.456695 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhd6z\" (UniqueName: \"kubernetes.io/projected/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-kube-api-access-lhd6z\") pod \"horizon-986f959c9-6gffd\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") " pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.456732 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-horizon-secret-key\") pod \"horizon-986f959c9-6gffd\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") " pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.456766 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-logs\") pod \"horizon-986f959c9-6gffd\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") " pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:39 crc kubenswrapper[4958]: I1201 11:47:39.458446 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-scripts\") pod \"horizon-986f959c9-6gffd\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") " pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:40 crc kubenswrapper[4958]: I1201 11:47:39.459080 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-logs\") pod \"horizon-986f959c9-6gffd\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") " pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:40 crc kubenswrapper[4958]: I1201 11:47:39.459986 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-config-data\") pod \"horizon-986f959c9-6gffd\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") " pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:40 crc kubenswrapper[4958]: I1201 11:47:39.476147 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-horizon-secret-key\") pod \"horizon-986f959c9-6gffd\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") " 
pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:40 crc kubenswrapper[4958]: I1201 11:47:39.480250 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhd6z\" (UniqueName: \"kubernetes.io/projected/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-kube-api-access-lhd6z\") pod \"horizon-986f959c9-6gffd\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") " pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:40 crc kubenswrapper[4958]: I1201 11:47:39.617224 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:40 crc kubenswrapper[4958]: I1201 11:47:39.753505 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-69dfb9ff85-g2xdt"] Dec 01 11:47:40 crc kubenswrapper[4958]: W1201 11:47:39.763762 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a83948e_ec95_47a6_bdf9_c73444293d67.slice/crio-cdcefcd66a7750c2fc83bb75e131400d5ca6b77ec1446ba2cc072492fd9d92c0 WatchSource:0}: Error finding container cdcefcd66a7750c2fc83bb75e131400d5ca6b77ec1446ba2cc072492fd9d92c0: Status 404 returned error can't find the container with id cdcefcd66a7750c2fc83bb75e131400d5ca6b77ec1446ba2cc072492fd9d92c0 Dec 01 11:47:40 crc kubenswrapper[4958]: I1201 11:47:39.798236 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28" Dec 01 11:47:40 crc kubenswrapper[4958]: E1201 11:47:39.798544 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:47:40 crc kubenswrapper[4958]: I1201 11:47:40.461603 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69dfb9ff85-g2xdt" event={"ID":"3a83948e-ec95-47a6-bdf9-c73444293d67","Type":"ContainerStarted","Data":"cdcefcd66a7750c2fc83bb75e131400d5ca6b77ec1446ba2cc072492fd9d92c0"} Dec 01 11:47:40 crc kubenswrapper[4958]: I1201 11:47:40.568146 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-986f959c9-6gffd"] Dec 01 11:47:40 crc kubenswrapper[4958]: W1201 11:47:40.572926 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf3d5bd9_1a39_4fc3_b4ff_b27cdfb968b6.slice/crio-8c6c73b5cd95b8a3ee29ad7bd6e9daab1a304514fdf9924d01225f41a9d174cc WatchSource:0}: Error finding container 8c6c73b5cd95b8a3ee29ad7bd6e9daab1a304514fdf9924d01225f41a9d174cc: Status 404 returned error can't find the container with id 8c6c73b5cd95b8a3ee29ad7bd6e9daab1a304514fdf9924d01225f41a9d174cc Dec 01 11:47:41 crc kubenswrapper[4958]: I1201 11:47:41.472197 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-986f959c9-6gffd" event={"ID":"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6","Type":"ContainerStarted","Data":"8c6c73b5cd95b8a3ee29ad7bd6e9daab1a304514fdf9924d01225f41a9d174cc"} Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.503996 4958 generic.go:334] "Generic (PLEG): container finished" podID="2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a" containerID="600ed33174aa32b2178325e633f1d913824ebccf2419cde4691c89758c3c6f3f" exitCode=0 
Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.504047 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a","Type":"ContainerDied","Data":"600ed33174aa32b2178325e633f1d913824ebccf2419cde4691c89758c3c6f3f"} Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.506523 4958 generic.go:334] "Generic (PLEG): container finished" podID="735d94c9-1a28-41d5-ad7b-ce9d247bdff7" containerID="f5d3d24e1bd96f4f58364299107b6a03f47077a72296b98f20be0f9b504b4317" exitCode=0 Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.506545 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"735d94c9-1a28-41d5-ad7b-ce9d247bdff7","Type":"ContainerDied","Data":"f5d3d24e1bd96f4f58364299107b6a03f47077a72296b98f20be0f9b504b4317"} Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.753708 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.844145 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-combined-ca-bundle\") pod \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.844204 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-ceph\") pod \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.844247 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpxzf\" (UniqueName: \"kubernetes.io/projected/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-kube-api-access-kpxzf\") pod \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.844342 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-httpd-run\") pod \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.844418 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-logs\") pod \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.844489 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-config-data\") pod \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.844532 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-scripts\") pod \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\" (UID: \"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a\") " Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 
11:47:42.845767 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a" (UID: "2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.846639 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-logs" (OuterVolumeSpecName: "logs") pod "2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a" (UID: "2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.852929 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-scripts" (OuterVolumeSpecName: "scripts") pod "2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a" (UID: "2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.854017 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-ceph" (OuterVolumeSpecName: "ceph") pod "2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a" (UID: "2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.872171 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-kube-api-access-kpxzf" (OuterVolumeSpecName: "kube-api-access-kpxzf") pod "2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a" (UID: "2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a"). InnerVolumeSpecName "kube-api-access-kpxzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.883547 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a" (UID: "2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.920291 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-config-data" (OuterVolumeSpecName: "config-data") pod "2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a" (UID: "2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.947264 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.947303 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.947354 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.947371 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.947381 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpxzf\" (UniqueName: \"kubernetes.io/projected/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-kube-api-access-kpxzf\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.947390 4958 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:42 crc kubenswrapper[4958]: I1201 11:47:42.948575 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a-logs\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.525114 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a","Type":"ContainerDied","Data":"0f24379cb866c41a59c9b03209dfdaffc7ba10779adb58e412a75fdb170ee1da"} Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.525171 4958 scope.go:117] "RemoveContainer" containerID="600ed33174aa32b2178325e633f1d913824ebccf2419cde4691c89758c3c6f3f" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.525377 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.576011 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.589966 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.597073 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 11:47:43 crc kubenswrapper[4958]: E1201 11:47:43.597737 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a" containerName="glance-log" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.597783 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a" containerName="glance-log" Dec 01 11:47:43 crc kubenswrapper[4958]: E1201 11:47:43.597871 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a" containerName="glance-httpd" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.597879 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a" containerName="glance-httpd" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.598343 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a" containerName="glance-httpd" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.598368 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a" containerName="glance-log" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.600711 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.605563 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.612008 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.671879 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe8e1679-6c0a-43a4-9414-8232112e7612-scripts\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.671964 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68kqq\" (UniqueName: \"kubernetes.io/projected/fe8e1679-6c0a-43a4-9414-8232112e7612-kube-api-access-68kqq\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.672046 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe8e1679-6c0a-43a4-9414-8232112e7612-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.672091 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8e1679-6c0a-43a4-9414-8232112e7612-config-data\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.672124 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe8e1679-6c0a-43a4-9414-8232112e7612-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.672150 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fe8e1679-6c0a-43a4-9414-8232112e7612-ceph\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.672352 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe8e1679-6c0a-43a4-9414-8232112e7612-logs\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.774639 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe8e1679-6c0a-43a4-9414-8232112e7612-logs\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " 
pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.775012 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe8e1679-6c0a-43a4-9414-8232112e7612-scripts\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.775052 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68kqq\" (UniqueName: \"kubernetes.io/projected/fe8e1679-6c0a-43a4-9414-8232112e7612-kube-api-access-68kqq\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.775104 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe8e1679-6c0a-43a4-9414-8232112e7612-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.775124 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8e1679-6c0a-43a4-9414-8232112e7612-config-data\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.775143 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe8e1679-6c0a-43a4-9414-8232112e7612-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.775162 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fe8e1679-6c0a-43a4-9414-8232112e7612-ceph\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.775357 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe8e1679-6c0a-43a4-9414-8232112e7612-logs\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.777624 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe8e1679-6c0a-43a4-9414-8232112e7612-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.781740 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe8e1679-6c0a-43a4-9414-8232112e7612-scripts\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.782793 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe8e1679-6c0a-43a4-9414-8232112e7612-config-data\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.783433 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fe8e1679-6c0a-43a4-9414-8232112e7612-ceph\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.783471 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe8e1679-6c0a-43a4-9414-8232112e7612-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.814364 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68kqq\" (UniqueName: \"kubernetes.io/projected/fe8e1679-6c0a-43a4-9414-8232112e7612-kube-api-access-68kqq\") pod \"glance-default-external-api-0\" (UID: \"fe8e1679-6c0a-43a4-9414-8232112e7612\") " pod="openstack/glance-default-external-api-0" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.831294 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a" path="/var/lib/kubelet/pods/2adf7af4-7ca2-4af5-92d1-1a3fbd2b6b4a/volumes" Dec 01 11:47:43 crc kubenswrapper[4958]: I1201 11:47:43.922540 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.586904 4958 scope.go:117] "RemoveContainer" containerID="cbebfdae0b767fbdf2b8955ba768a720154f73d1ca704c4af17e9caebd607a3d" Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.771995 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.858883 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb26s\" (UniqueName: \"kubernetes.io/projected/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-kube-api-access-gb26s\") pod \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.858947 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-ceph\") pod \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.858990 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-scripts\") pod \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.859027 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-httpd-run\") pod \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.859095 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-logs\") pod \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.859123 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-combined-ca-bundle\") pod \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.859637 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "735d94c9-1a28-41d5-ad7b-ce9d247bdff7" (UID: "735d94c9-1a28-41d5-ad7b-ce9d247bdff7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.861057 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-config-data\") pod \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\" (UID: \"735d94c9-1a28-41d5-ad7b-ce9d247bdff7\") " Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.861765 4958 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.863692 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-logs" (OuterVolumeSpecName: "logs") pod "735d94c9-1a28-41d5-ad7b-ce9d247bdff7" (UID: "735d94c9-1a28-41d5-ad7b-ce9d247bdff7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.872872 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-ceph" (OuterVolumeSpecName: "ceph") pod "735d94c9-1a28-41d5-ad7b-ce9d247bdff7" (UID: "735d94c9-1a28-41d5-ad7b-ce9d247bdff7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.872959 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-kube-api-access-gb26s" (OuterVolumeSpecName: "kube-api-access-gb26s") pod "735d94c9-1a28-41d5-ad7b-ce9d247bdff7" (UID: "735d94c9-1a28-41d5-ad7b-ce9d247bdff7"). InnerVolumeSpecName "kube-api-access-gb26s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.886666 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-scripts" (OuterVolumeSpecName: "scripts") pod "735d94c9-1a28-41d5-ad7b-ce9d247bdff7" (UID: "735d94c9-1a28-41d5-ad7b-ce9d247bdff7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.936246 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "735d94c9-1a28-41d5-ad7b-ce9d247bdff7" (UID: "735d94c9-1a28-41d5-ad7b-ce9d247bdff7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.963725 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.963757 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-logs\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.963769 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.963783 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb26s\" (UniqueName: \"kubernetes.io/projected/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-kube-api-access-gb26s\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.963796 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:48 crc kubenswrapper[4958]: I1201 11:47:48.986057 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-config-data" (OuterVolumeSpecName: "config-data") pod "735d94c9-1a28-41d5-ad7b-ce9d247bdff7" (UID: "735d94c9-1a28-41d5-ad7b-ce9d247bdff7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.066823 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/735d94c9-1a28-41d5-ad7b-ce9d247bdff7-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.341858 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.595246 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86c5b688c5-f62bb" event={"ID":"76e3aa33-cc8c-46db-9d82-cb36fe966a99","Type":"ContainerStarted","Data":"713c3b6224616850fcab1fd7aed8ab1fe6676f1c27f00bf5cb337b9d1075b6ce"} Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.595291 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86c5b688c5-f62bb" event={"ID":"76e3aa33-cc8c-46db-9d82-cb36fe966a99","Type":"ContainerStarted","Data":"04348c2b6ce90180d61474c9a74d4eb2c23b65c961542826f97d7de4dcc73b8b"} Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.599632 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.599940 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"735d94c9-1a28-41d5-ad7b-ce9d247bdff7","Type":"ContainerDied","Data":"57b7720b4b12028fa168d7cd91fdee6ee10528a389e585d003e6fc7872fdc9e6"} Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.600006 4958 scope.go:117] "RemoveContainer" containerID="f5d3d24e1bd96f4f58364299107b6a03f47077a72296b98f20be0f9b504b4317" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.603742 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69dfb9ff85-g2xdt" event={"ID":"3a83948e-ec95-47a6-bdf9-c73444293d67","Type":"ContainerStarted","Data":"7109ec3f32829dd86c6cf07f6759f16a5bf7dbccd2213fd2d849b834d669790b"} Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.603775 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69dfb9ff85-g2xdt" event={"ID":"3a83948e-ec95-47a6-bdf9-c73444293d67","Type":"ContainerStarted","Data":"7b0be99e9ad2c7b8891b00a308ad9c64b9111f8c861da09b8fd9ad9aa02a1441"} Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.603834 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-69dfb9ff85-g2xdt" podUID="3a83948e-ec95-47a6-bdf9-c73444293d67" containerName="horizon-log" containerID="cri-o://7b0be99e9ad2c7b8891b00a308ad9c64b9111f8c861da09b8fd9ad9aa02a1441" gracePeriod=30 Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.603951 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-69dfb9ff85-g2xdt" podUID="3a83948e-ec95-47a6-bdf9-c73444293d67" containerName="horizon" containerID="cri-o://7109ec3f32829dd86c6cf07f6759f16a5bf7dbccd2213fd2d849b834d669790b" gracePeriod=30 Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.606646 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-986f959c9-6gffd" event={"ID":"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6","Type":"ContainerStarted","Data":"3c1e412f7dbc976857fa1fcc765688ccfafe240da6ecd120429f9b1933a6a5d6"} Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.606671 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-986f959c9-6gffd" event={"ID":"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6","Type":"ContainerStarted","Data":"f8487340023bd36ca8fa7628550caf59b7e08a558adb39f3a52beeecc6323087"} Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.614735 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fe8e1679-6c0a-43a4-9414-8232112e7612","Type":"ContainerStarted","Data":"1852da531ce3ffe3b6799621a5465f7d49ab513e88055b2ee11c7ab59566a81b"} Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.618991 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.619081 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-986f959c9-6gffd" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.628047 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-86c5b688c5-f62bb" podStartSLOduration=2.283515106 podStartE2EDuration="11.628025546s" podCreationTimestamp="2025-12-01 11:47:38 +0000 UTC" firstStartedPulling="2025-12-01 11:47:39.362939884 +0000 UTC m=+6506.871728921" lastFinishedPulling="2025-12-01 11:47:48.707450324 +0000 UTC m=+6516.216239361" observedRunningTime="2025-12-01 11:47:49.615340366 +0000 UTC m=+6517.124129403" watchObservedRunningTime="2025-12-01 11:47:49.628025546 +0000 UTC m=+6517.136814573" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.648787 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-986f959c9-6gffd" podStartSLOduration=2.529571109 podStartE2EDuration="10.648764415s" podCreationTimestamp="2025-12-01 11:47:39 +0000 UTC" firstStartedPulling="2025-12-01 11:47:40.584413399 +0000 UTC m=+6508.093202436" lastFinishedPulling="2025-12-01 11:47:48.703606705 +0000 UTC m=+6516.212395742" observedRunningTime="2025-12-01 11:47:49.644109852 +0000 UTC m=+6517.152898889" watchObservedRunningTime="2025-12-01 11:47:49.648764415 +0000 UTC m=+6517.157553452" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.657818 4958 scope.go:117] "RemoveContainer" containerID="efb05262b5b46e12a167da1ac18dcff088498492ba3a10622cd142ecf97d29db" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.671277 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-69dfb9ff85-g2xdt" podStartSLOduration=2.731629987 podStartE2EDuration="11.671257593s" podCreationTimestamp="2025-12-01 11:47:38 +0000 UTC" firstStartedPulling="2025-12-01 11:47:39.775517446 +0000 UTC m=+6507.284306483" lastFinishedPulling="2025-12-01 11:47:48.715145052 +0000 UTC m=+6516.223934089" observedRunningTime="2025-12-01 11:47:49.668157675 +0000 UTC m=+6517.176946712" watchObservedRunningTime="2025-12-01 11:47:49.671257593 +0000 UTC m=+6517.180046630" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.730381 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.741463 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.756835 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 11:47:49 crc kubenswrapper[4958]: E1201 11:47:49.757410 4958 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="735d94c9-1a28-41d5-ad7b-ce9d247bdff7" containerName="glance-log" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.757426 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="735d94c9-1a28-41d5-ad7b-ce9d247bdff7" containerName="glance-log" Dec 01 11:47:49 crc kubenswrapper[4958]: E1201 11:47:49.757469 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="735d94c9-1a28-41d5-ad7b-ce9d247bdff7" containerName="glance-httpd" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.757477 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="735d94c9-1a28-41d5-ad7b-ce9d247bdff7" containerName="glance-httpd" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.757670 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="735d94c9-1a28-41d5-ad7b-ce9d247bdff7" containerName="glance-log" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.757695 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="735d94c9-1a28-41d5-ad7b-ce9d247bdff7" containerName="glance-httpd" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.758961 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.759059 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.765087 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.779440 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3092940e-3b67-4013-834a-dcdb2865678f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.779545 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3092940e-3b67-4013-834a-dcdb2865678f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.779566 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3092940e-3b67-4013-834a-dcdb2865678f-logs\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.779657 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftjl7\" (UniqueName: \"kubernetes.io/projected/3092940e-3b67-4013-834a-dcdb2865678f-kube-api-access-ftjl7\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.779681 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3092940e-3b67-4013-834a-dcdb2865678f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " 
pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.781124 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3092940e-3b67-4013-834a-dcdb2865678f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.781232 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3092940e-3b67-4013-834a-dcdb2865678f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.818402 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="735d94c9-1a28-41d5-ad7b-ce9d247bdff7" path="/var/lib/kubelet/pods/735d94c9-1a28-41d5-ad7b-ce9d247bdff7/volumes" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.883326 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftjl7\" (UniqueName: \"kubernetes.io/projected/3092940e-3b67-4013-834a-dcdb2865678f-kube-api-access-ftjl7\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.883787 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3092940e-3b67-4013-834a-dcdb2865678f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.884560 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3092940e-3b67-4013-834a-dcdb2865678f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.884610 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3092940e-3b67-4013-834a-dcdb2865678f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.884649 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3092940e-3b67-4013-834a-dcdb2865678f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.884800 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3092940e-3b67-4013-834a-dcdb2865678f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.884824 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/3092940e-3b67-4013-834a-dcdb2865678f-logs\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.885243 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3092940e-3b67-4013-834a-dcdb2865678f-logs\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.888272 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3092940e-3b67-4013-834a-dcdb2865678f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.889586 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3092940e-3b67-4013-834a-dcdb2865678f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.890038 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3092940e-3b67-4013-834a-dcdb2865678f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.890549 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3092940e-3b67-4013-834a-dcdb2865678f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.902248 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftjl7\" (UniqueName: \"kubernetes.io/projected/3092940e-3b67-4013-834a-dcdb2865678f-kube-api-access-ftjl7\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:49 crc kubenswrapper[4958]: I1201 11:47:49.908236 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3092940e-3b67-4013-834a-dcdb2865678f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3092940e-3b67-4013-834a-dcdb2865678f\") " pod="openstack/glance-default-internal-api-0" Dec 01 11:47:50 crc kubenswrapper[4958]: I1201 11:47:50.084538 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 11:47:50 crc kubenswrapper[4958]: I1201 11:47:50.734743 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fe8e1679-6c0a-43a4-9414-8232112e7612","Type":"ContainerStarted","Data":"c8bd4242d8f5755d1a0b8da35b2d35addba0fe0f03ff60f99b6d9e0c5c380fa6"} Dec 01 11:47:50 crc kubenswrapper[4958]: I1201 11:47:50.793040 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 11:47:50 crc kubenswrapper[4958]: W1201 11:47:50.824808 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3092940e_3b67_4013_834a_dcdb2865678f.slice/crio-6f49425e71276aa8786a6e6ec1681f21e260bb703ab37d6196662c1b403e537c WatchSource:0}: Error finding container 6f49425e71276aa8786a6e6ec1681f21e260bb703ab37d6196662c1b403e537c: Status 404 returned error can't find the container with id 6f49425e71276aa8786a6e6ec1681f21e260bb703ab37d6196662c1b403e537c Dec 01 11:47:51 crc kubenswrapper[4958]: I1201 11:47:51.759114 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fe8e1679-6c0a-43a4-9414-8232112e7612","Type":"ContainerStarted","Data":"2e082b99213c02e6e04379afd49cbf49e8f783397492c85d2a789eaa7cd9799d"} Dec 01 11:47:51 crc kubenswrapper[4958]: I1201 11:47:51.780572 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3092940e-3b67-4013-834a-dcdb2865678f","Type":"ContainerStarted","Data":"ab8a6f0a47ef419d6f0a64b33c83128bef575f4c014e75c159cd3e5d0b22af3d"} Dec 01 11:47:51 crc kubenswrapper[4958]: I1201 11:47:51.780612 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3092940e-3b67-4013-834a-dcdb2865678f","Type":"ContainerStarted","Data":"6f49425e71276aa8786a6e6ec1681f21e260bb703ab37d6196662c1b403e537c"} Dec 01 11:47:51 crc kubenswrapper[4958]: I1201 11:47:51.799471 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28" Dec 01 11:47:51 crc kubenswrapper[4958]: E1201 11:47:51.799879 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:47:51 crc kubenswrapper[4958]: I1201 11:47:51.812427 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.812406936 podStartE2EDuration="8.812406936s" podCreationTimestamp="2025-12-01 11:47:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:47:51.795784944 +0000 UTC m=+6519.304573991" watchObservedRunningTime="2025-12-01 11:47:51.812406936 +0000 UTC m=+6519.321195973" Dec 01 11:47:52 crc kubenswrapper[4958]: I1201 11:47:52.796487 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"3092940e-3b67-4013-834a-dcdb2865678f","Type":"ContainerStarted","Data":"798d2d05c16d13baf937870b83ae506072d22e533529b94eea25b98d578ec56f"} Dec 01 11:47:52 crc kubenswrapper[4958]: I1201 11:47:52.841226 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.84120165 podStartE2EDuration="3.84120165s" podCreationTimestamp="2025-12-01 11:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:47:52.828203791 +0000 UTC m=+6520.336992848" watchObservedRunningTime="2025-12-01 11:47:52.84120165 +0000 UTC m=+6520.349990697" Dec 01 11:47:53 crc kubenswrapper[4958]: I1201 11:47:53.047237 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8wzxc"] Dec 01 11:47:53 crc kubenswrapper[4958]: I1201 11:47:53.058463 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-8wzxc"] Dec 01 11:47:53 crc kubenswrapper[4958]: I1201 11:47:53.817208 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="931deaef-d313-49ba-9fde-1d7eeb131d09" path="/var/lib/kubelet/pods/931deaef-d313-49ba-9fde-1d7eeb131d09/volumes" Dec 01 11:47:53 crc kubenswrapper[4958]: I1201 11:47:53.922928 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 11:47:53 crc kubenswrapper[4958]: I1201 11:47:53.924195 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 11:47:53 crc kubenswrapper[4958]: I1201 11:47:53.962464 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 11:47:53 crc kubenswrapper[4958]: I1201 11:47:53.963865 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 11:47:54 crc kubenswrapper[4958]: I1201 11:47:54.815748 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 11:47:54 crc kubenswrapper[4958]: I1201 11:47:54.816143 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 11:47:58 crc kubenswrapper[4958]: I1201 11:47:58.441662 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 11:47:58 crc kubenswrapper[4958]: I1201 11:47:58.442044 4958 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 11:47:58 crc kubenswrapper[4958]: I1201 11:47:58.688150 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 11:47:58 crc kubenswrapper[4958]: I1201 11:47:58.738032 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:58 crc kubenswrapper[4958]: I1201 11:47:58.739987 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:47:59 crc kubenswrapper[4958]: I1201 11:47:59.102050 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:47:59 crc kubenswrapper[4958]: I1201 11:47:59.620597 4958 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/horizon-986f959c9-6gffd" podUID="cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused" Dec 01 11:48:00 crc kubenswrapper[4958]: I1201 11:48:00.085092 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 11:48:00 crc kubenswrapper[4958]: I1201 11:48:00.085463 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 11:48:00 crc kubenswrapper[4958]: I1201 11:48:00.194965 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 11:48:00 crc kubenswrapper[4958]: I1201 11:48:00.210625 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 11:48:00 crc kubenswrapper[4958]: I1201 11:48:00.889278 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 11:48:00 crc kubenswrapper[4958]: I1201 11:48:00.889352 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 11:48:03 crc kubenswrapper[4958]: I1201 11:48:03.059324 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 11:48:03 crc kubenswrapper[4958]: I1201 11:48:03.059710 4958 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 11:48:03 crc kubenswrapper[4958]: I1201 11:48:03.060150 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 11:48:03 crc kubenswrapper[4958]: I1201 11:48:03.807449 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28" Dec 01 11:48:03 crc kubenswrapper[4958]: E1201 11:48:03.807934 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:48:04 crc kubenswrapper[4958]: I1201 11:48:04.056251 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-94c0-account-create-kcdk5"] Dec 01 11:48:04 crc kubenswrapper[4958]: I1201 11:48:04.072236 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-94c0-account-create-kcdk5"] Dec 01 11:48:05 crc kubenswrapper[4958]: I1201 11:48:05.820818 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564ff6dd-d3c4-48d0-b416-0b3caee09947" path="/var/lib/kubelet/pods/564ff6dd-d3c4-48d0-b416-0b3caee09947/volumes" Dec 01 11:48:08 crc kubenswrapper[4958]: I1201 11:48:08.399513 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zrt97"] Dec 01 11:48:08 crc kubenswrapper[4958]: I1201 11:48:08.402801 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zrt97" Dec 01 11:48:08 crc kubenswrapper[4958]: I1201 11:48:08.404357 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a-catalog-content\") pod \"redhat-operators-zrt97\" (UID: \"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a\") " pod="openshift-marketplace/redhat-operators-zrt97" Dec 01 11:48:08 crc kubenswrapper[4958]: I1201 11:48:08.404629 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a-utilities\") pod \"redhat-operators-zrt97\" (UID: \"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a\") " pod="openshift-marketplace/redhat-operators-zrt97" Dec 01 11:48:08 crc kubenswrapper[4958]: I1201 11:48:08.404680 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjxjz\" (UniqueName: \"kubernetes.io/projected/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a-kube-api-access-qjxjz\") pod \"redhat-operators-zrt97\" (UID: \"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a\") " pod="openshift-marketplace/redhat-operators-zrt97" Dec 01 11:48:08 crc kubenswrapper[4958]: I1201 11:48:08.452290 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zrt97"] Dec 01 11:48:08 crc kubenswrapper[4958]: I1201 11:48:08.506026 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a-utilities\") pod \"redhat-operators-zrt97\" (UID: \"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a\") " pod="openshift-marketplace/redhat-operators-zrt97" Dec 01 11:48:08 crc kubenswrapper[4958]: I1201 11:48:08.506092 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjxjz\" (UniqueName: \"kubernetes.io/projected/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a-kube-api-access-qjxjz\") pod \"redhat-operators-zrt97\" (UID: \"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a\") " pod="openshift-marketplace/redhat-operators-zrt97" Dec 01 11:48:08 crc kubenswrapper[4958]: I1201 11:48:08.506179 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a-catalog-content\") pod \"redhat-operators-zrt97\" (UID: \"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a\") " pod="openshift-marketplace/redhat-operators-zrt97" Dec 01 11:48:08 crc kubenswrapper[4958]: I1201 11:48:08.506545 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a-utilities\") pod \"redhat-operators-zrt97\" (UID: \"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a\") " pod="openshift-marketplace/redhat-operators-zrt97" Dec 01 11:48:08 crc kubenswrapper[4958]: I1201 11:48:08.506580 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a-catalog-content\") pod \"redhat-operators-zrt97\" (UID: \"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a\") " pod="openshift-marketplace/redhat-operators-zrt97" Dec 01 11:48:08 crc kubenswrapper[4958]: I1201 11:48:08.543065 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qjxjz\" (UniqueName: \"kubernetes.io/projected/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a-kube-api-access-qjxjz\") pod \"redhat-operators-zrt97\" (UID: \"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a\") " pod="openshift-marketplace/redhat-operators-zrt97" Dec 01 11:48:08 crc kubenswrapper[4958]: I1201 11:48:08.735620 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-86c5b688c5-f62bb" podUID="76e3aa33-cc8c-46db-9d82-cb36fe966a99" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.108:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.108:8080: connect: connection refused" Dec 01 11:48:08 crc kubenswrapper[4958]: I1201 11:48:08.749464 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrt97" Dec 01 11:48:09 crc kubenswrapper[4958]: W1201 11:48:09.294974 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf5fd605_4d2a_4b9d_87a0_448c46d4ab1a.slice/crio-ed3fc11b194d7bda12038ff909d0dd30bca03dd007c90f2e994d83005b997ba5 WatchSource:0}: Error finding container ed3fc11b194d7bda12038ff909d0dd30bca03dd007c90f2e994d83005b997ba5: Status 404 returned error can't find the container with id ed3fc11b194d7bda12038ff909d0dd30bca03dd007c90f2e994d83005b997ba5 Dec 01 11:48:09 crc kubenswrapper[4958]: I1201 11:48:09.320215 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zrt97"] Dec 01 11:48:09 crc kubenswrapper[4958]: I1201 11:48:09.618452 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-986f959c9-6gffd" podUID="cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused" Dec 01 11:48:09 crc kubenswrapper[4958]: I1201 11:48:09.981923 4958 generic.go:334] "Generic (PLEG): container finished" podID="df5fd605-4d2a-4b9d-87a0-448c46d4ab1a" containerID="6434ed5e3c325221afa1413997981c2236258cf69dac6ef495b13abc8dd290d7" exitCode=0 Dec 01 11:48:09 crc kubenswrapper[4958]: I1201 11:48:09.981995 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrt97" event={"ID":"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a","Type":"ContainerDied","Data":"6434ed5e3c325221afa1413997981c2236258cf69dac6ef495b13abc8dd290d7"} Dec 01 11:48:09 crc kubenswrapper[4958]: I1201 11:48:09.982023 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrt97" event={"ID":"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a","Type":"ContainerStarted","Data":"ed3fc11b194d7bda12038ff909d0dd30bca03dd007c90f2e994d83005b997ba5"} Dec 01 11:48:11 crc kubenswrapper[4958]: I1201 11:48:11.037186 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-p67pw"] Dec 01 11:48:11 crc kubenswrapper[4958]: I1201 11:48:11.049384 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-p67pw"] Dec 01 11:48:12 crc kubenswrapper[4958]: I1201 11:48:12.233450 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af559844-10a7-49dd-8b8a-58885d2c2206" path="/var/lib/kubelet/pods/af559844-10a7-49dd-8b8a-58885d2c2206/volumes" Dec 01 11:48:12 crc kubenswrapper[4958]: I1201 11:48:12.241577 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-zrt97" event={"ID":"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a","Type":"ContainerStarted","Data":"7338e721f14bcd6ab5544c631fa9259ffeec99f36c8720fb5f66eb75330119cb"} Dec 01 11:48:15 crc kubenswrapper[4958]: I1201 11:48:15.301362 4958 generic.go:334] "Generic (PLEG): container finished" podID="df5fd605-4d2a-4b9d-87a0-448c46d4ab1a" containerID="7338e721f14bcd6ab5544c631fa9259ffeec99f36c8720fb5f66eb75330119cb" exitCode=0 Dec 01 11:48:15 crc kubenswrapper[4958]: I1201 11:48:15.301449 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrt97" event={"ID":"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a","Type":"ContainerDied","Data":"7338e721f14bcd6ab5544c631fa9259ffeec99f36c8720fb5f66eb75330119cb"} Dec 01 11:48:16 crc kubenswrapper[4958]: I1201 11:48:16.319958 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrt97" event={"ID":"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a","Type":"ContainerStarted","Data":"a173f25142fb1fea59917bfca0f2486cd78c4dd1694e2e441da4865764d62708"} Dec 01 11:48:16 crc kubenswrapper[4958]: I1201 11:48:16.349029 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zrt97" podStartSLOduration=2.579242061 podStartE2EDuration="8.34900637s" podCreationTimestamp="2025-12-01 11:48:08 +0000 UTC" firstStartedPulling="2025-12-01 11:48:09.983974472 +0000 UTC m=+6537.492763499" lastFinishedPulling="2025-12-01 11:48:15.753738771 +0000 UTC m=+6543.262527808" observedRunningTime="2025-12-01 11:48:16.341356153 +0000 UTC m=+6543.850145190" watchObservedRunningTime="2025-12-01 11:48:16.34900637 +0000 UTC m=+6543.857795407" Dec 01 11:48:16 crc kubenswrapper[4958]: I1201 11:48:16.797170 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28" Dec 01 11:48:16 crc kubenswrapper[4958]: E1201 11:48:16.797457 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:48:17 crc kubenswrapper[4958]: I1201 11:48:17.427185 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="735d94c9-1a28-41d5-ad7b-ce9d247bdff7" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.42:9292/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 11:48:17 crc kubenswrapper[4958]: I1201 11:48:17.427205 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="735d94c9-1a28-41d5-ad7b-ce9d247bdff7" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.42:9292/healthcheck\": dial tcp 10.217.1.42:9292: i/o timeout (Client.Timeout exceeded while awaiting headers)" Dec 01 11:48:18 crc kubenswrapper[4958]: I1201 11:48:18.750278 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zrt97" Dec 01 11:48:18 crc kubenswrapper[4958]: I1201 11:48:18.750582 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-zrt97" Dec 01 11:48:19 crc kubenswrapper[4958]: I1201 11:48:19.795772 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zrt97" podUID="df5fd605-4d2a-4b9d-87a0-448c46d4ab1a" containerName="registry-server" probeResult="failure" output=< Dec 01 11:48:19 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s Dec 01 11:48:19 crc kubenswrapper[4958]: > Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.214056 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.340012 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a83948e-ec95-47a6-bdf9-c73444293d67-logs\") pod \"3a83948e-ec95-47a6-bdf9-c73444293d67\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.340111 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jprt7\" (UniqueName: \"kubernetes.io/projected/3a83948e-ec95-47a6-bdf9-c73444293d67-kube-api-access-jprt7\") pod \"3a83948e-ec95-47a6-bdf9-c73444293d67\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.340163 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3a83948e-ec95-47a6-bdf9-c73444293d67-horizon-secret-key\") pod \"3a83948e-ec95-47a6-bdf9-c73444293d67\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.340196 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a83948e-ec95-47a6-bdf9-c73444293d67-config-data\") pod \"3a83948e-ec95-47a6-bdf9-c73444293d67\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.340223 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a83948e-ec95-47a6-bdf9-c73444293d67-scripts\") pod \"3a83948e-ec95-47a6-bdf9-c73444293d67\" (UID: \"3a83948e-ec95-47a6-bdf9-c73444293d67\") " Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.340387 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a83948e-ec95-47a6-bdf9-c73444293d67-logs" (OuterVolumeSpecName: "logs") pod "3a83948e-ec95-47a6-bdf9-c73444293d67" (UID: "3a83948e-ec95-47a6-bdf9-c73444293d67"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.340688 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a83948e-ec95-47a6-bdf9-c73444293d67-logs\") on node \"crc\" DevicePath \"\"" Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.351012 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a83948e-ec95-47a6-bdf9-c73444293d67-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3a83948e-ec95-47a6-bdf9-c73444293d67" (UID: "3a83948e-ec95-47a6-bdf9-c73444293d67"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.351177 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a83948e-ec95-47a6-bdf9-c73444293d67-kube-api-access-jprt7" (OuterVolumeSpecName: "kube-api-access-jprt7") pod "3a83948e-ec95-47a6-bdf9-c73444293d67" (UID: "3a83948e-ec95-47a6-bdf9-c73444293d67"). InnerVolumeSpecName "kube-api-access-jprt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.365723 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a83948e-ec95-47a6-bdf9-c73444293d67-scripts" (OuterVolumeSpecName: "scripts") pod "3a83948e-ec95-47a6-bdf9-c73444293d67" (UID: "3a83948e-ec95-47a6-bdf9-c73444293d67"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.367598 4958 generic.go:334] "Generic (PLEG): container finished" podID="3a83948e-ec95-47a6-bdf9-c73444293d67" containerID="7109ec3f32829dd86c6cf07f6759f16a5bf7dbccd2213fd2d849b834d669790b" exitCode=137 Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.367635 4958 generic.go:334] "Generic (PLEG): container finished" podID="3a83948e-ec95-47a6-bdf9-c73444293d67" containerID="7b0be99e9ad2c7b8891b00a308ad9c64b9111f8c861da09b8fd9ad9aa02a1441" exitCode=137 Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.367658 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69dfb9ff85-g2xdt" event={"ID":"3a83948e-ec95-47a6-bdf9-c73444293d67","Type":"ContainerDied","Data":"7109ec3f32829dd86c6cf07f6759f16a5bf7dbccd2213fd2d849b834d669790b"} Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.367692 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69dfb9ff85-g2xdt" event={"ID":"3a83948e-ec95-47a6-bdf9-c73444293d67","Type":"ContainerDied","Data":"7b0be99e9ad2c7b8891b00a308ad9c64b9111f8c861da09b8fd9ad9aa02a1441"} Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.367710 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69dfb9ff85-g2xdt" event={"ID":"3a83948e-ec95-47a6-bdf9-c73444293d67","Type":"ContainerDied","Data":"cdcefcd66a7750c2fc83bb75e131400d5ca6b77ec1446ba2cc072492fd9d92c0"} Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.367717 4958 scope.go:117] "RemoveContainer" containerID="7109ec3f32829dd86c6cf07f6759f16a5bf7dbccd2213fd2d849b834d669790b" Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.367744 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-69dfb9ff85-g2xdt" Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.371654 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a83948e-ec95-47a6-bdf9-c73444293d67-config-data" (OuterVolumeSpecName: "config-data") pod "3a83948e-ec95-47a6-bdf9-c73444293d67" (UID: "3a83948e-ec95-47a6-bdf9-c73444293d67"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.443138 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jprt7\" (UniqueName: \"kubernetes.io/projected/3a83948e-ec95-47a6-bdf9-c73444293d67-kube-api-access-jprt7\") on node \"crc\" DevicePath \"\"" Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.443192 4958 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3a83948e-ec95-47a6-bdf9-c73444293d67-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.443203 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a83948e-ec95-47a6-bdf9-c73444293d67-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.443212 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a83948e-ec95-47a6-bdf9-c73444293d67-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.568467 4958 scope.go:117] "RemoveContainer" containerID="7b0be99e9ad2c7b8891b00a308ad9c64b9111f8c861da09b8fd9ad9aa02a1441" Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.592974 4958 scope.go:117] "RemoveContainer" containerID="7109ec3f32829dd86c6cf07f6759f16a5bf7dbccd2213fd2d849b834d669790b" Dec 01 11:48:20 crc kubenswrapper[4958]: E1201 11:48:20.593420 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7109ec3f32829dd86c6cf07f6759f16a5bf7dbccd2213fd2d849b834d669790b\": container with ID starting with 7109ec3f32829dd86c6cf07f6759f16a5bf7dbccd2213fd2d849b834d669790b not found: ID does not exist" containerID="7109ec3f32829dd86c6cf07f6759f16a5bf7dbccd2213fd2d849b834d669790b" Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.593455 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7109ec3f32829dd86c6cf07f6759f16a5bf7dbccd2213fd2d849b834d669790b"} err="failed to get container status \"7109ec3f32829dd86c6cf07f6759f16a5bf7dbccd2213fd2d849b834d669790b\": rpc error: code = NotFound desc = could not find container \"7109ec3f32829dd86c6cf07f6759f16a5bf7dbccd2213fd2d849b834d669790b\": container with ID starting with 7109ec3f32829dd86c6cf07f6759f16a5bf7dbccd2213fd2d849b834d669790b not found: ID does not exist" Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.593493 4958 scope.go:117] "RemoveContainer" containerID="7b0be99e9ad2c7b8891b00a308ad9c64b9111f8c861da09b8fd9ad9aa02a1441" Dec 01 11:48:20 crc kubenswrapper[4958]: E1201 11:48:20.593919 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b0be99e9ad2c7b8891b00a308ad9c64b9111f8c861da09b8fd9ad9aa02a1441\": container with ID starting with 7b0be99e9ad2c7b8891b00a308ad9c64b9111f8c861da09b8fd9ad9aa02a1441 not found: ID does not exist" containerID="7b0be99e9ad2c7b8891b00a308ad9c64b9111f8c861da09b8fd9ad9aa02a1441" Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.593964 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b0be99e9ad2c7b8891b00a308ad9c64b9111f8c861da09b8fd9ad9aa02a1441"} err="failed to get container status \"7b0be99e9ad2c7b8891b00a308ad9c64b9111f8c861da09b8fd9ad9aa02a1441\": rpc error: code = NotFound 
Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.593982 4958 scope.go:117] "RemoveContainer" containerID="7109ec3f32829dd86c6cf07f6759f16a5bf7dbccd2213fd2d849b834d669790b"
Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.594258 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7109ec3f32829dd86c6cf07f6759f16a5bf7dbccd2213fd2d849b834d669790b"} err="failed to get container status \"7109ec3f32829dd86c6cf07f6759f16a5bf7dbccd2213fd2d849b834d669790b\": rpc error: code = NotFound desc = could not find container \"7109ec3f32829dd86c6cf07f6759f16a5bf7dbccd2213fd2d849b834d669790b\": container with ID starting with 7109ec3f32829dd86c6cf07f6759f16a5bf7dbccd2213fd2d849b834d669790b not found: ID does not exist"
Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.594296 4958 scope.go:117] "RemoveContainer" containerID="7b0be99e9ad2c7b8891b00a308ad9c64b9111f8c861da09b8fd9ad9aa02a1441"
Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.594627 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b0be99e9ad2c7b8891b00a308ad9c64b9111f8c861da09b8fd9ad9aa02a1441"} err="failed to get container status \"7b0be99e9ad2c7b8891b00a308ad9c64b9111f8c861da09b8fd9ad9aa02a1441\": rpc error: code = NotFound desc = could not find container \"7b0be99e9ad2c7b8891b00a308ad9c64b9111f8c861da09b8fd9ad9aa02a1441\": container with ID starting with 7b0be99e9ad2c7b8891b00a308ad9c64b9111f8c861da09b8fd9ad9aa02a1441 not found: ID does not exist"
Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.840554 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-69dfb9ff85-g2xdt"]
Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.849812 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-69dfb9ff85-g2xdt"]
Dec 01 11:48:20 crc kubenswrapper[4958]: I1201 11:48:20.924581 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-86c5b688c5-f62bb"
Dec 01 11:48:21 crc kubenswrapper[4958]: I1201 11:48:21.770067 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-986f959c9-6gffd"
Dec 01 11:48:21 crc kubenswrapper[4958]: I1201 11:48:21.809946 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a83948e-ec95-47a6-bdf9-c73444293d67" path="/var/lib/kubelet/pods/3a83948e-ec95-47a6-bdf9-c73444293d67/volumes"
Dec 01 11:48:22 crc kubenswrapper[4958]: I1201 11:48:22.938536 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-86c5b688c5-f62bb"
Dec 01 11:48:23 crc kubenswrapper[4958]: I1201 11:48:23.813559 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-986f959c9-6gffd"
Dec 01 11:48:23 crc kubenswrapper[4958]: I1201 11:48:23.906627 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86c5b688c5-f62bb"]
Dec 01 11:48:23 crc kubenswrapper[4958]: I1201 11:48:23.907075 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-86c5b688c5-f62bb" podUID="76e3aa33-cc8c-46db-9d82-cb36fe966a99" containerName="horizon-log" containerID="cri-o://04348c2b6ce90180d61474c9a74d4eb2c23b65c961542826f97d7de4dcc73b8b" gracePeriod=30
Dec 01 11:48:23 crc kubenswrapper[4958]: I1201 11:48:23.907245 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-86c5b688c5-f62bb" podUID="76e3aa33-cc8c-46db-9d82-cb36fe966a99" containerName="horizon" containerID="cri-o://713c3b6224616850fcab1fd7aed8ab1fe6676f1c27f00bf5cb337b9d1075b6ce" gracePeriod=30
Dec 01 11:48:26 crc kubenswrapper[4958]: I1201 11:48:26.524777 4958 scope.go:117] "RemoveContainer" containerID="67ac8dbad7bcf79f5ad5c4e97f9bc12154b981b5a1effdba6b9fb8ef6a99c173"
Dec 01 11:48:26 crc kubenswrapper[4958]: I1201 11:48:26.559603 4958 scope.go:117] "RemoveContainer" containerID="aeb17b3447c9cc5280dccaecab40d5c30cef30ed3c76fea37040b51f2e7022e0"
Dec 01 11:48:26 crc kubenswrapper[4958]: I1201 11:48:26.619519 4958 scope.go:117] "RemoveContainer" containerID="606db69884e0cf0fb653ac2576d29d94f1bc4d8352295954366cc46f9c689c83"
Dec 01 11:48:27 crc kubenswrapper[4958]: I1201 11:48:27.454604 4958 generic.go:334] "Generic (PLEG): container finished" podID="76e3aa33-cc8c-46db-9d82-cb36fe966a99" containerID="713c3b6224616850fcab1fd7aed8ab1fe6676f1c27f00bf5cb337b9d1075b6ce" exitCode=0
Dec 01 11:48:27 crc kubenswrapper[4958]: I1201 11:48:27.454673 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86c5b688c5-f62bb" event={"ID":"76e3aa33-cc8c-46db-9d82-cb36fe966a99","Type":"ContainerDied","Data":"713c3b6224616850fcab1fd7aed8ab1fe6676f1c27f00bf5cb337b9d1075b6ce"}
Dec 01 11:48:28 crc kubenswrapper[4958]: I1201 11:48:28.735112 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-86c5b688c5-f62bb" podUID="76e3aa33-cc8c-46db-9d82-cb36fe966a99" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.108:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.108:8080: connect: connection refused"
Dec 01 11:48:29 crc kubenswrapper[4958]: I1201 11:48:29.796683 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zrt97" podUID="df5fd605-4d2a-4b9d-87a0-448c46d4ab1a" containerName="registry-server" probeResult="failure" output=<
Dec 01 11:48:29 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s
Dec 01 11:48:29 crc kubenswrapper[4958]: >
Dec 01 11:48:29 crc kubenswrapper[4958]: I1201 11:48:29.799061 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28"
Dec 01 11:48:29 crc kubenswrapper[4958]: E1201 11:48:29.799646 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:48:38 crc kubenswrapper[4958]: I1201 11:48:38.734975 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-86c5b688c5-f62bb" podUID="76e3aa33-cc8c-46db-9d82-cb36fe966a99" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.108:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.108:8080: connect: connection refused"
Dec 01 11:48:38 crc kubenswrapper[4958]: I1201 11:48:38.821648 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zrt97"
Dec 01 11:48:38 crc kubenswrapper[4958]: I1201 11:48:38.900695 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zrt97"
Dec 01 11:48:39 crc kubenswrapper[4958]: I1201 11:48:39.576181 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zrt97"]
Dec 01 11:48:40 crc kubenswrapper[4958]: I1201 11:48:40.705300 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zrt97" podUID="df5fd605-4d2a-4b9d-87a0-448c46d4ab1a" containerName="registry-server" containerID="cri-o://a173f25142fb1fea59917bfca0f2486cd78c4dd1694e2e441da4865764d62708" gracePeriod=2
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.247255 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrt97"
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.426324 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a-catalog-content\") pod \"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a\" (UID: \"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a\") "
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.426413 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a-utilities\") pod \"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a\" (UID: \"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a\") "
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.426505 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjxjz\" (UniqueName: \"kubernetes.io/projected/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a-kube-api-access-qjxjz\") pod \"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a\" (UID: \"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a\") "
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.428004 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a-utilities" (OuterVolumeSpecName: "utilities") pod "df5fd605-4d2a-4b9d-87a0-448c46d4ab1a" (UID: "df5fd605-4d2a-4b9d-87a0-448c46d4ab1a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.437021 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a-kube-api-access-qjxjz" (OuterVolumeSpecName: "kube-api-access-qjxjz") pod "df5fd605-4d2a-4b9d-87a0-448c46d4ab1a" (UID: "df5fd605-4d2a-4b9d-87a0-448c46d4ab1a"). InnerVolumeSpecName "kube-api-access-qjxjz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.529260 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.529312 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjxjz\" (UniqueName: \"kubernetes.io/projected/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a-kube-api-access-qjxjz\") on node \"crc\" DevicePath \"\""
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.572520 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df5fd605-4d2a-4b9d-87a0-448c46d4ab1a" (UID: "df5fd605-4d2a-4b9d-87a0-448c46d4ab1a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.632356 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.721638 4958 generic.go:334] "Generic (PLEG): container finished" podID="df5fd605-4d2a-4b9d-87a0-448c46d4ab1a" containerID="a173f25142fb1fea59917bfca0f2486cd78c4dd1694e2e441da4865764d62708" exitCode=0
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.721710 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zrt97"
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.721730 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrt97" event={"ID":"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a","Type":"ContainerDied","Data":"a173f25142fb1fea59917bfca0f2486cd78c4dd1694e2e441da4865764d62708"}
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.721942 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrt97" event={"ID":"df5fd605-4d2a-4b9d-87a0-448c46d4ab1a","Type":"ContainerDied","Data":"ed3fc11b194d7bda12038ff909d0dd30bca03dd007c90f2e994d83005b997ba5"}
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.721989 4958 scope.go:117] "RemoveContainer" containerID="a173f25142fb1fea59917bfca0f2486cd78c4dd1694e2e441da4865764d62708"
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.769934 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zrt97"]
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.774629 4958 scope.go:117] "RemoveContainer" containerID="7338e721f14bcd6ab5544c631fa9259ffeec99f36c8720fb5f66eb75330119cb"
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.780094 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zrt97"]
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.805416 4958 scope.go:117] "RemoveContainer" containerID="6434ed5e3c325221afa1413997981c2236258cf69dac6ef495b13abc8dd290d7"
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.813492 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df5fd605-4d2a-4b9d-87a0-448c46d4ab1a" path="/var/lib/kubelet/pods/df5fd605-4d2a-4b9d-87a0-448c46d4ab1a/volumes"
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.851316 4958 scope.go:117] "RemoveContainer" containerID="a173f25142fb1fea59917bfca0f2486cd78c4dd1694e2e441da4865764d62708"
Dec 01 11:48:41 crc kubenswrapper[4958]: E1201 11:48:41.851708 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a173f25142fb1fea59917bfca0f2486cd78c4dd1694e2e441da4865764d62708\": container with ID starting with a173f25142fb1fea59917bfca0f2486cd78c4dd1694e2e441da4865764d62708 not found: ID does not exist" containerID="a173f25142fb1fea59917bfca0f2486cd78c4dd1694e2e441da4865764d62708"
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.851760 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a173f25142fb1fea59917bfca0f2486cd78c4dd1694e2e441da4865764d62708"} err="failed to get container status \"a173f25142fb1fea59917bfca0f2486cd78c4dd1694e2e441da4865764d62708\": rpc error: code = NotFound desc = could not find container \"a173f25142fb1fea59917bfca0f2486cd78c4dd1694e2e441da4865764d62708\": container with ID starting with a173f25142fb1fea59917bfca0f2486cd78c4dd1694e2e441da4865764d62708 not found: ID does not exist"
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.851794 4958 scope.go:117] "RemoveContainer" containerID="7338e721f14bcd6ab5544c631fa9259ffeec99f36c8720fb5f66eb75330119cb"
Dec 01 11:48:41 crc kubenswrapper[4958]: E1201 11:48:41.852098 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7338e721f14bcd6ab5544c631fa9259ffeec99f36c8720fb5f66eb75330119cb\": container with ID starting with 7338e721f14bcd6ab5544c631fa9259ffeec99f36c8720fb5f66eb75330119cb not found: ID does not exist" containerID="7338e721f14bcd6ab5544c631fa9259ffeec99f36c8720fb5f66eb75330119cb"
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.852125 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7338e721f14bcd6ab5544c631fa9259ffeec99f36c8720fb5f66eb75330119cb"} err="failed to get container status \"7338e721f14bcd6ab5544c631fa9259ffeec99f36c8720fb5f66eb75330119cb\": rpc error: code = NotFound desc = could not find container \"7338e721f14bcd6ab5544c631fa9259ffeec99f36c8720fb5f66eb75330119cb\": container with ID starting with 7338e721f14bcd6ab5544c631fa9259ffeec99f36c8720fb5f66eb75330119cb not found: ID does not exist"
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.852146 4958 scope.go:117] "RemoveContainer" containerID="6434ed5e3c325221afa1413997981c2236258cf69dac6ef495b13abc8dd290d7"
Dec 01 11:48:41 crc kubenswrapper[4958]: E1201 11:48:41.852350 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6434ed5e3c325221afa1413997981c2236258cf69dac6ef495b13abc8dd290d7\": container with ID starting with 6434ed5e3c325221afa1413997981c2236258cf69dac6ef495b13abc8dd290d7 not found: ID does not exist" containerID="6434ed5e3c325221afa1413997981c2236258cf69dac6ef495b13abc8dd290d7"
Dec 01 11:48:41 crc kubenswrapper[4958]: I1201 11:48:41.852378 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6434ed5e3c325221afa1413997981c2236258cf69dac6ef495b13abc8dd290d7"} err="failed to get container status \"6434ed5e3c325221afa1413997981c2236258cf69dac6ef495b13abc8dd290d7\": rpc error: code = NotFound desc = could not find container 
\"6434ed5e3c325221afa1413997981c2236258cf69dac6ef495b13abc8dd290d7\": container with ID starting with 6434ed5e3c325221afa1413997981c2236258cf69dac6ef495b13abc8dd290d7 not found: ID does not exist" Dec 01 11:48:42 crc kubenswrapper[4958]: I1201 11:48:42.064288 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-ssvm7"] Dec 01 11:48:42 crc kubenswrapper[4958]: I1201 11:48:42.081700 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-ssvm7"] Dec 01 11:48:43 crc kubenswrapper[4958]: I1201 11:48:43.814318 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28" Dec 01 11:48:43 crc kubenswrapper[4958]: E1201 11:48:43.815046 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:48:43 crc kubenswrapper[4958]: I1201 11:48:43.819310 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bed473b-45f4-4fbf-b7ea-c23b554f578d" path="/var/lib/kubelet/pods/1bed473b-45f4-4fbf-b7ea-c23b554f578d/volumes" Dec 01 11:48:48 crc kubenswrapper[4958]: I1201 11:48:48.735809 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-86c5b688c5-f62bb" podUID="76e3aa33-cc8c-46db-9d82-cb36fe966a99" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.108:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.108:8080: connect: connection refused" Dec 01 11:48:48 crc kubenswrapper[4958]: I1201 11:48:48.736885 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:48:52 crc kubenswrapper[4958]: I1201 11:48:52.058417 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-49a4-account-create-dh9qp"] Dec 01 11:48:52 crc kubenswrapper[4958]: I1201 11:48:52.070884 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-49a4-account-create-dh9qp"] Dec 01 11:48:53 crc kubenswrapper[4958]: I1201 11:48:53.811186 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68bf25e4-bda1-497a-9510-4a06a63e5138" path="/var/lib/kubelet/pods/68bf25e4-bda1-497a-9510-4a06a63e5138/volumes" Dec 01 11:48:54 crc kubenswrapper[4958]: I1201 11:48:54.390490 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:48:54 crc kubenswrapper[4958]: I1201 11:48:54.502391 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxs65\" (UniqueName: \"kubernetes.io/projected/76e3aa33-cc8c-46db-9d82-cb36fe966a99-kube-api-access-hxs65\") pod \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " Dec 01 11:48:54 crc kubenswrapper[4958]: I1201 11:48:54.503356 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76e3aa33-cc8c-46db-9d82-cb36fe966a99-logs\") pod \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " Dec 01 11:48:54 crc kubenswrapper[4958]: I1201 11:48:54.503440 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76e3aa33-cc8c-46db-9d82-cb36fe966a99-config-data\") pod \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " Dec 01 11:48:54 crc kubenswrapper[4958]: I1201 11:48:54.503538 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76e3aa33-cc8c-46db-9d82-cb36fe966a99-horizon-secret-key\") pod \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " Dec 01 11:48:54 crc kubenswrapper[4958]: I1201 11:48:54.503696 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76e3aa33-cc8c-46db-9d82-cb36fe966a99-scripts\") pod \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\" (UID: \"76e3aa33-cc8c-46db-9d82-cb36fe966a99\") " Dec 01 11:48:54 crc kubenswrapper[4958]: I1201 11:48:54.504431 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76e3aa33-cc8c-46db-9d82-cb36fe966a99-logs" (OuterVolumeSpecName: "logs") pod "76e3aa33-cc8c-46db-9d82-cb36fe966a99" (UID: "76e3aa33-cc8c-46db-9d82-cb36fe966a99"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:48:54 crc kubenswrapper[4958]: I1201 11:48:54.504988 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76e3aa33-cc8c-46db-9d82-cb36fe966a99-logs\") on node \"crc\" DevicePath \"\"" Dec 01 11:48:54 crc kubenswrapper[4958]: I1201 11:48:54.511251 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e3aa33-cc8c-46db-9d82-cb36fe966a99-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "76e3aa33-cc8c-46db-9d82-cb36fe966a99" (UID: "76e3aa33-cc8c-46db-9d82-cb36fe966a99"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:48:54 crc kubenswrapper[4958]: I1201 11:48:54.511707 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76e3aa33-cc8c-46db-9d82-cb36fe966a99-kube-api-access-hxs65" (OuterVolumeSpecName: "kube-api-access-hxs65") pod "76e3aa33-cc8c-46db-9d82-cb36fe966a99" (UID: "76e3aa33-cc8c-46db-9d82-cb36fe966a99"). InnerVolumeSpecName "kube-api-access-hxs65". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:48:54 crc kubenswrapper[4958]: I1201 11:48:54.536872 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76e3aa33-cc8c-46db-9d82-cb36fe966a99-config-data" (OuterVolumeSpecName: "config-data") pod "76e3aa33-cc8c-46db-9d82-cb36fe966a99" (UID: "76e3aa33-cc8c-46db-9d82-cb36fe966a99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:48:54 crc kubenswrapper[4958]: I1201 11:48:54.537937 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76e3aa33-cc8c-46db-9d82-cb36fe966a99-scripts" (OuterVolumeSpecName: "scripts") pod "76e3aa33-cc8c-46db-9d82-cb36fe966a99" (UID: "76e3aa33-cc8c-46db-9d82-cb36fe966a99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:48:54 crc kubenswrapper[4958]: I1201 11:48:54.607226 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76e3aa33-cc8c-46db-9d82-cb36fe966a99-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 11:48:54 crc kubenswrapper[4958]: I1201 11:48:54.607470 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxs65\" (UniqueName: \"kubernetes.io/projected/76e3aa33-cc8c-46db-9d82-cb36fe966a99-kube-api-access-hxs65\") on node \"crc\" DevicePath \"\"" Dec 01 11:48:54 crc kubenswrapper[4958]: I1201 11:48:54.607481 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76e3aa33-cc8c-46db-9d82-cb36fe966a99-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:48:54 crc kubenswrapper[4958]: I1201 11:48:54.607489 4958 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76e3aa33-cc8c-46db-9d82-cb36fe966a99-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 01 11:48:55 crc kubenswrapper[4958]: I1201 11:48:55.003145 4958 generic.go:334] "Generic (PLEG): container finished" podID="76e3aa33-cc8c-46db-9d82-cb36fe966a99" containerID="04348c2b6ce90180d61474c9a74d4eb2c23b65c961542826f97d7de4dcc73b8b" exitCode=137 Dec 01 11:48:55 crc kubenswrapper[4958]: I1201 11:48:55.003204 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86c5b688c5-f62bb" event={"ID":"76e3aa33-cc8c-46db-9d82-cb36fe966a99","Type":"ContainerDied","Data":"04348c2b6ce90180d61474c9a74d4eb2c23b65c961542826f97d7de4dcc73b8b"} Dec 01 11:48:55 crc kubenswrapper[4958]: I1201 11:48:55.003255 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86c5b688c5-f62bb" event={"ID":"76e3aa33-cc8c-46db-9d82-cb36fe966a99","Type":"ContainerDied","Data":"4e996753158e55d8d4f502494e2db53c005179f2c3cceeceecc5ae94fafcc6e7"} Dec 01 11:48:55 crc kubenswrapper[4958]: I1201 11:48:55.003284 4958 scope.go:117] "RemoveContainer" containerID="713c3b6224616850fcab1fd7aed8ab1fe6676f1c27f00bf5cb337b9d1075b6ce" Dec 01 11:48:55 crc kubenswrapper[4958]: I1201 11:48:55.003299 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86c5b688c5-f62bb" Dec 01 11:48:55 crc kubenswrapper[4958]: I1201 11:48:55.063914 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86c5b688c5-f62bb"] Dec 01 11:48:55 crc kubenswrapper[4958]: I1201 11:48:55.073265 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-86c5b688c5-f62bb"] Dec 01 11:48:55 crc kubenswrapper[4958]: I1201 11:48:55.224862 4958 scope.go:117] "RemoveContainer" containerID="04348c2b6ce90180d61474c9a74d4eb2c23b65c961542826f97d7de4dcc73b8b" Dec 01 11:48:55 crc kubenswrapper[4958]: I1201 11:48:55.251030 4958 scope.go:117] "RemoveContainer" containerID="713c3b6224616850fcab1fd7aed8ab1fe6676f1c27f00bf5cb337b9d1075b6ce" Dec 01 11:48:55 crc kubenswrapper[4958]: E1201 11:48:55.251449 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"713c3b6224616850fcab1fd7aed8ab1fe6676f1c27f00bf5cb337b9d1075b6ce\": container with ID starting with 713c3b6224616850fcab1fd7aed8ab1fe6676f1c27f00bf5cb337b9d1075b6ce not found: ID does not exist" containerID="713c3b6224616850fcab1fd7aed8ab1fe6676f1c27f00bf5cb337b9d1075b6ce" Dec 01 11:48:55 crc kubenswrapper[4958]: I1201 11:48:55.251500 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"713c3b6224616850fcab1fd7aed8ab1fe6676f1c27f00bf5cb337b9d1075b6ce"} err="failed to get container status \"713c3b6224616850fcab1fd7aed8ab1fe6676f1c27f00bf5cb337b9d1075b6ce\": rpc error: code = NotFound desc = could not find container \"713c3b6224616850fcab1fd7aed8ab1fe6676f1c27f00bf5cb337b9d1075b6ce\": container with ID starting with 713c3b6224616850fcab1fd7aed8ab1fe6676f1c27f00bf5cb337b9d1075b6ce not found: ID does not exist" Dec 01 11:48:55 crc kubenswrapper[4958]: I1201 11:48:55.251524 4958 scope.go:117] "RemoveContainer" containerID="04348c2b6ce90180d61474c9a74d4eb2c23b65c961542826f97d7de4dcc73b8b" Dec 01 11:48:55 crc kubenswrapper[4958]: E1201 11:48:55.251959 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04348c2b6ce90180d61474c9a74d4eb2c23b65c961542826f97d7de4dcc73b8b\": container with ID starting with 04348c2b6ce90180d61474c9a74d4eb2c23b65c961542826f97d7de4dcc73b8b not found: ID does not exist" containerID="04348c2b6ce90180d61474c9a74d4eb2c23b65c961542826f97d7de4dcc73b8b" Dec 01 11:48:55 crc kubenswrapper[4958]: I1201 11:48:55.251982 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04348c2b6ce90180d61474c9a74d4eb2c23b65c961542826f97d7de4dcc73b8b"} err="failed to get container status \"04348c2b6ce90180d61474c9a74d4eb2c23b65c961542826f97d7de4dcc73b8b\": rpc error: code = NotFound desc = could not find container \"04348c2b6ce90180d61474c9a74d4eb2c23b65c961542826f97d7de4dcc73b8b\": container with ID starting with 04348c2b6ce90180d61474c9a74d4eb2c23b65c961542826f97d7de4dcc73b8b not found: ID does not exist" Dec 01 11:48:55 crc kubenswrapper[4958]: I1201 11:48:55.822483 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76e3aa33-cc8c-46db-9d82-cb36fe966a99" path="/var/lib/kubelet/pods/76e3aa33-cc8c-46db-9d82-cb36fe966a99/volumes" Dec 01 11:48:57 crc kubenswrapper[4958]: I1201 11:48:57.798357 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28" Dec 01 11:48:57 crc kubenswrapper[4958]: E1201 11:48:57.800217 4958 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:48:59 crc kubenswrapper[4958]: I1201 11:48:59.044044 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-kfczd"] Dec 01 11:48:59 crc kubenswrapper[4958]: I1201 11:48:59.053296 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-kfczd"] Dec 01 11:48:59 crc kubenswrapper[4958]: I1201 11:48:59.820464 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71967bd5-6523-4a00-86c5-a43c5994f71a" path="/var/lib/kubelet/pods/71967bd5-6523-4a00-86c5-a43c5994f71a/volumes" Dec 01 11:49:11 crc kubenswrapper[4958]: I1201 11:49:11.797902 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28" Dec 01 11:49:11 crc kubenswrapper[4958]: E1201 11:49:11.799150 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:49:26 crc kubenswrapper[4958]: I1201 11:49:26.797627 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28" Dec 01 11:49:26 crc kubenswrapper[4958]: E1201 11:49:26.798448 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:49:26 crc kubenswrapper[4958]: I1201 11:49:26.847377 4958 scope.go:117] "RemoveContainer" containerID="a423a22c79025a9e0f919f5d574502a05822e350af70fbc09516ebd09cd7ea35" Dec 01 11:49:26 crc kubenswrapper[4958]: I1201 11:49:26.872968 4958 scope.go:117] "RemoveContainer" containerID="31696ea308d2423cc2605ca9821bc97f8e226233d3cc9b0bf9f306681d3d08b8" Dec 01 11:49:26 crc kubenswrapper[4958]: I1201 11:49:26.935680 4958 scope.go:117] "RemoveContainer" containerID="83a43c55ef38b8c052a5ceebe966a0e40a437d035eea8358f995e418d88f7351" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.510815 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-65dfcfb885-6c2cp"] Dec 01 11:49:31 crc kubenswrapper[4958]: E1201 11:49:31.511993 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e3aa33-cc8c-46db-9d82-cb36fe966a99" containerName="horizon" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.512014 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e3aa33-cc8c-46db-9d82-cb36fe966a99" containerName="horizon" Dec 01 11:49:31 crc kubenswrapper[4958]: E1201 11:49:31.512038 4958 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3a83948e-ec95-47a6-bdf9-c73444293d67" containerName="horizon" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.512046 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a83948e-ec95-47a6-bdf9-c73444293d67" containerName="horizon" Dec 01 11:49:31 crc kubenswrapper[4958]: E1201 11:49:31.512062 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5fd605-4d2a-4b9d-87a0-448c46d4ab1a" containerName="extract-content" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.512071 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5fd605-4d2a-4b9d-87a0-448c46d4ab1a" containerName="extract-content" Dec 01 11:49:31 crc kubenswrapper[4958]: E1201 11:49:31.512284 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a83948e-ec95-47a6-bdf9-c73444293d67" containerName="horizon-log" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.512292 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a83948e-ec95-47a6-bdf9-c73444293d67" containerName="horizon-log" Dec 01 11:49:31 crc kubenswrapper[4958]: E1201 11:49:31.512324 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5fd605-4d2a-4b9d-87a0-448c46d4ab1a" containerName="extract-utilities" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.512334 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5fd605-4d2a-4b9d-87a0-448c46d4ab1a" containerName="extract-utilities" Dec 01 11:49:31 crc kubenswrapper[4958]: E1201 11:49:31.512344 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5fd605-4d2a-4b9d-87a0-448c46d4ab1a" containerName="registry-server" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.512351 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5fd605-4d2a-4b9d-87a0-448c46d4ab1a" containerName="registry-server" Dec 01 11:49:31 crc kubenswrapper[4958]: E1201 11:49:31.512376 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e3aa33-cc8c-46db-9d82-cb36fe966a99" containerName="horizon-log" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.512384 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e3aa33-cc8c-46db-9d82-cb36fe966a99" containerName="horizon-log" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.512647 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a83948e-ec95-47a6-bdf9-c73444293d67" containerName="horizon" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.512667 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5fd605-4d2a-4b9d-87a0-448c46d4ab1a" containerName="registry-server" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.512687 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="76e3aa33-cc8c-46db-9d82-cb36fe966a99" containerName="horizon-log" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.512694 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="76e3aa33-cc8c-46db-9d82-cb36fe966a99" containerName="horizon" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.512707 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a83948e-ec95-47a6-bdf9-c73444293d67" containerName="horizon-log" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.514158 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.532454 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ca004e01-200e-461c-ad14-68752150d940-horizon-secret-key\") pod \"horizon-65dfcfb885-6c2cp\" (UID: \"ca004e01-200e-461c-ad14-68752150d940\") " pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.532604 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca004e01-200e-461c-ad14-68752150d940-scripts\") pod \"horizon-65dfcfb885-6c2cp\" (UID: \"ca004e01-200e-461c-ad14-68752150d940\") " pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.532736 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lxjz\" (UniqueName: \"kubernetes.io/projected/ca004e01-200e-461c-ad14-68752150d940-kube-api-access-2lxjz\") pod \"horizon-65dfcfb885-6c2cp\" (UID: \"ca004e01-200e-461c-ad14-68752150d940\") " pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.533003 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca004e01-200e-461c-ad14-68752150d940-config-data\") pod \"horizon-65dfcfb885-6c2cp\" (UID: \"ca004e01-200e-461c-ad14-68752150d940\") " pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.533064 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca004e01-200e-461c-ad14-68752150d940-logs\") pod \"horizon-65dfcfb885-6c2cp\" (UID: \"ca004e01-200e-461c-ad14-68752150d940\") " pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.536817 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-65dfcfb885-6c2cp"] Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.634529 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca004e01-200e-461c-ad14-68752150d940-config-data\") pod \"horizon-65dfcfb885-6c2cp\" (UID: \"ca004e01-200e-461c-ad14-68752150d940\") " pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.634588 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca004e01-200e-461c-ad14-68752150d940-logs\") pod \"horizon-65dfcfb885-6c2cp\" (UID: \"ca004e01-200e-461c-ad14-68752150d940\") " pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.634640 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ca004e01-200e-461c-ad14-68752150d940-horizon-secret-key\") pod \"horizon-65dfcfb885-6c2cp\" (UID: \"ca004e01-200e-461c-ad14-68752150d940\") " pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.634709 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ca004e01-200e-461c-ad14-68752150d940-scripts\") pod \"horizon-65dfcfb885-6c2cp\" (UID: \"ca004e01-200e-461c-ad14-68752150d940\") " pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.634761 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lxjz\" (UniqueName: \"kubernetes.io/projected/ca004e01-200e-461c-ad14-68752150d940-kube-api-access-2lxjz\") pod \"horizon-65dfcfb885-6c2cp\" (UID: \"ca004e01-200e-461c-ad14-68752150d940\") " pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.635627 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca004e01-200e-461c-ad14-68752150d940-logs\") pod \"horizon-65dfcfb885-6c2cp\" (UID: \"ca004e01-200e-461c-ad14-68752150d940\") " pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.636307 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca004e01-200e-461c-ad14-68752150d940-scripts\") pod \"horizon-65dfcfb885-6c2cp\" (UID: \"ca004e01-200e-461c-ad14-68752150d940\") " pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.636575 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca004e01-200e-461c-ad14-68752150d940-config-data\") pod \"horizon-65dfcfb885-6c2cp\" (UID: \"ca004e01-200e-461c-ad14-68752150d940\") " pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.642270 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ca004e01-200e-461c-ad14-68752150d940-horizon-secret-key\") pod \"horizon-65dfcfb885-6c2cp\" (UID: \"ca004e01-200e-461c-ad14-68752150d940\") " pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.652601 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lxjz\" (UniqueName: \"kubernetes.io/projected/ca004e01-200e-461c-ad14-68752150d940-kube-api-access-2lxjz\") pod \"horizon-65dfcfb885-6c2cp\" (UID: \"ca004e01-200e-461c-ad14-68752150d940\") " pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:31 crc kubenswrapper[4958]: I1201 11:49:31.935528 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:32 crc kubenswrapper[4958]: I1201 11:49:32.508374 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-65dfcfb885-6c2cp"] Dec 01 11:49:32 crc kubenswrapper[4958]: I1201 11:49:32.514390 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65dfcfb885-6c2cp" event={"ID":"ca004e01-200e-461c-ad14-68752150d940","Type":"ContainerStarted","Data":"2efe394208db1f6d49cf0e26ca5114c0c5b2bb45a7d4eca1f91e4fafc2d0d144"} Dec 01 11:49:33 crc kubenswrapper[4958]: I1201 11:49:33.166490 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-s9sqt"] Dec 01 11:49:33 crc kubenswrapper[4958]: I1201 11:49:33.168506 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-s9sqt" Dec 01 11:49:33 crc kubenswrapper[4958]: I1201 11:49:33.191389 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-s9sqt"] Dec 01 11:49:33 crc kubenswrapper[4958]: I1201 11:49:33.321212 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tjtq\" (UniqueName: \"kubernetes.io/projected/eb73270a-b4ea-4b40-8c82-9e9ceb53a719-kube-api-access-5tjtq\") pod \"heat-db-create-s9sqt\" (UID: \"eb73270a-b4ea-4b40-8c82-9e9ceb53a719\") " pod="openstack/heat-db-create-s9sqt" Dec 01 11:49:33 crc kubenswrapper[4958]: I1201 11:49:33.424251 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tjtq\" (UniqueName: \"kubernetes.io/projected/eb73270a-b4ea-4b40-8c82-9e9ceb53a719-kube-api-access-5tjtq\") pod \"heat-db-create-s9sqt\" (UID: \"eb73270a-b4ea-4b40-8c82-9e9ceb53a719\") " pod="openstack/heat-db-create-s9sqt" Dec 01 11:49:33 crc kubenswrapper[4958]: I1201 11:49:33.446538 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tjtq\" (UniqueName: \"kubernetes.io/projected/eb73270a-b4ea-4b40-8c82-9e9ceb53a719-kube-api-access-5tjtq\") pod \"heat-db-create-s9sqt\" (UID: \"eb73270a-b4ea-4b40-8c82-9e9ceb53a719\") " pod="openstack/heat-db-create-s9sqt" Dec 01 11:49:33 crc kubenswrapper[4958]: I1201 11:49:33.496435 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-s9sqt" Dec 01 11:49:33 crc kubenswrapper[4958]: I1201 11:49:33.526167 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65dfcfb885-6c2cp" event={"ID":"ca004e01-200e-461c-ad14-68752150d940","Type":"ContainerStarted","Data":"8a3a061410909e5844027bcebf9bbfcafe3d7b3fe4f67633054bc40a0d015eaa"} Dec 01 11:49:33 crc kubenswrapper[4958]: I1201 11:49:33.526217 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65dfcfb885-6c2cp" event={"ID":"ca004e01-200e-461c-ad14-68752150d940","Type":"ContainerStarted","Data":"f94020d5bb2d8da371d6e0b19e7f166a93ef0db98daef4d5bdcf32d21ed072b8"} Dec 01 11:49:33 crc kubenswrapper[4958]: I1201 11:49:33.559033 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-65dfcfb885-6c2cp" podStartSLOduration=2.559008468 podStartE2EDuration="2.559008468s" podCreationTimestamp="2025-12-01 11:49:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:49:33.551568297 +0000 UTC m=+6621.060357334" watchObservedRunningTime="2025-12-01 11:49:33.559008468 +0000 UTC m=+6621.067797505" Dec 01 11:49:34 crc kubenswrapper[4958]: I1201 11:49:34.015829 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-s9sqt"] Dec 01 11:49:34 crc kubenswrapper[4958]: W1201 11:49:34.020506 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb73270a_b4ea_4b40_8c82_9e9ceb53a719.slice/crio-18ce6589c6d1b45eabe40c947e37834b953dc78bab31584847c35de779d0fed5 WatchSource:0}: Error finding container 18ce6589c6d1b45eabe40c947e37834b953dc78bab31584847c35de779d0fed5: Status 404 returned error can't find the container with id 18ce6589c6d1b45eabe40c947e37834b953dc78bab31584847c35de779d0fed5 Dec 01 11:49:34 crc kubenswrapper[4958]: I1201 11:49:34.539957 4958 generic.go:334] "Generic (PLEG): 
container finished" podID="eb73270a-b4ea-4b40-8c82-9e9ceb53a719" containerID="47452c900b775e2f7df3b71580d8f579f7d88d78ed88bdcd16ce6e99655045a1" exitCode=0 Dec 01 11:49:34 crc kubenswrapper[4958]: I1201 11:49:34.540004 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-s9sqt" event={"ID":"eb73270a-b4ea-4b40-8c82-9e9ceb53a719","Type":"ContainerDied","Data":"47452c900b775e2f7df3b71580d8f579f7d88d78ed88bdcd16ce6e99655045a1"} Dec 01 11:49:34 crc kubenswrapper[4958]: I1201 11:49:34.540213 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-s9sqt" event={"ID":"eb73270a-b4ea-4b40-8c82-9e9ceb53a719","Type":"ContainerStarted","Data":"18ce6589c6d1b45eabe40c947e37834b953dc78bab31584847c35de779d0fed5"} Dec 01 11:49:35 crc kubenswrapper[4958]: I1201 11:49:35.928518 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-s9sqt" Dec 01 11:49:36 crc kubenswrapper[4958]: I1201 11:49:36.066559 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tjtq\" (UniqueName: \"kubernetes.io/projected/eb73270a-b4ea-4b40-8c82-9e9ceb53a719-kube-api-access-5tjtq\") pod \"eb73270a-b4ea-4b40-8c82-9e9ceb53a719\" (UID: \"eb73270a-b4ea-4b40-8c82-9e9ceb53a719\") " Dec 01 11:49:36 crc kubenswrapper[4958]: I1201 11:49:36.078639 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb73270a-b4ea-4b40-8c82-9e9ceb53a719-kube-api-access-5tjtq" (OuterVolumeSpecName: "kube-api-access-5tjtq") pod "eb73270a-b4ea-4b40-8c82-9e9ceb53a719" (UID: "eb73270a-b4ea-4b40-8c82-9e9ceb53a719"). InnerVolumeSpecName "kube-api-access-5tjtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:49:36 crc kubenswrapper[4958]: I1201 11:49:36.169550 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tjtq\" (UniqueName: \"kubernetes.io/projected/eb73270a-b4ea-4b40-8c82-9e9ceb53a719-kube-api-access-5tjtq\") on node \"crc\" DevicePath \"\"" Dec 01 11:49:36 crc kubenswrapper[4958]: I1201 11:49:36.564063 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-s9sqt" event={"ID":"eb73270a-b4ea-4b40-8c82-9e9ceb53a719","Type":"ContainerDied","Data":"18ce6589c6d1b45eabe40c947e37834b953dc78bab31584847c35de779d0fed5"} Dec 01 11:49:36 crc kubenswrapper[4958]: I1201 11:49:36.564117 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18ce6589c6d1b45eabe40c947e37834b953dc78bab31584847c35de779d0fed5" Dec 01 11:49:36 crc kubenswrapper[4958]: I1201 11:49:36.564141 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-s9sqt" Dec 01 11:49:41 crc kubenswrapper[4958]: I1201 11:49:41.798650 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28" Dec 01 11:49:41 crc kubenswrapper[4958]: E1201 11:49:41.800511 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:49:41 crc kubenswrapper[4958]: I1201 11:49:41.937010 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:41 crc kubenswrapper[4958]: I1201 11:49:41.937368 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-65dfcfb885-6c2cp" Dec 01 11:49:43 crc kubenswrapper[4958]: I1201 11:49:43.182408 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-a906-account-create-kmtzz"] Dec 01 11:49:43 crc kubenswrapper[4958]: E1201 11:49:43.183291 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb73270a-b4ea-4b40-8c82-9e9ceb53a719" containerName="mariadb-database-create" Dec 01 11:49:43 crc kubenswrapper[4958]: I1201 11:49:43.183333 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb73270a-b4ea-4b40-8c82-9e9ceb53a719" containerName="mariadb-database-create" Dec 01 11:49:43 crc kubenswrapper[4958]: I1201 11:49:43.183629 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb73270a-b4ea-4b40-8c82-9e9ceb53a719" containerName="mariadb-database-create" Dec 01 11:49:43 crc kubenswrapper[4958]: I1201 11:49:43.184600 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-a906-account-create-kmtzz" Dec 01 11:49:43 crc kubenswrapper[4958]: I1201 11:49:43.186494 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 01 11:49:43 crc kubenswrapper[4958]: I1201 11:49:43.193996 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-a906-account-create-kmtzz"] Dec 01 11:49:43 crc kubenswrapper[4958]: I1201 11:49:43.348830 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gd98\" (UniqueName: \"kubernetes.io/projected/1d87b363-c828-4b73-8c31-b3dd67ebce08-kube-api-access-7gd98\") pod \"heat-a906-account-create-kmtzz\" (UID: \"1d87b363-c828-4b73-8c31-b3dd67ebce08\") " pod="openstack/heat-a906-account-create-kmtzz" Dec 01 11:49:43 crc kubenswrapper[4958]: I1201 11:49:43.451407 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gd98\" (UniqueName: \"kubernetes.io/projected/1d87b363-c828-4b73-8c31-b3dd67ebce08-kube-api-access-7gd98\") pod \"heat-a906-account-create-kmtzz\" (UID: \"1d87b363-c828-4b73-8c31-b3dd67ebce08\") " pod="openstack/heat-a906-account-create-kmtzz" Dec 01 11:49:43 crc kubenswrapper[4958]: I1201 11:49:43.473065 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gd98\" (UniqueName: \"kubernetes.io/projected/1d87b363-c828-4b73-8c31-b3dd67ebce08-kube-api-access-7gd98\") pod \"heat-a906-account-create-kmtzz\" (UID: \"1d87b363-c828-4b73-8c31-b3dd67ebce08\") " pod="openstack/heat-a906-account-create-kmtzz" Dec 01 11:49:43 crc kubenswrapper[4958]: I1201 11:49:43.509268 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a906-account-create-kmtzz" Dec 01 11:49:44 crc kubenswrapper[4958]: I1201 11:49:44.040583 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-a906-account-create-kmtzz"] Dec 01 11:49:44 crc kubenswrapper[4958]: I1201 11:49:44.698359 4958 generic.go:334] "Generic (PLEG): container finished" podID="1d87b363-c828-4b73-8c31-b3dd67ebce08" containerID="01445f920b6113475195b247d5fe27db4cec3d4ae1aa583c69d7942bd3fd3f92" exitCode=0 Dec 01 11:49:44 crc kubenswrapper[4958]: I1201 11:49:44.700563 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a906-account-create-kmtzz" event={"ID":"1d87b363-c828-4b73-8c31-b3dd67ebce08","Type":"ContainerDied","Data":"01445f920b6113475195b247d5fe27db4cec3d4ae1aa583c69d7942bd3fd3f92"} Dec 01 11:49:44 crc kubenswrapper[4958]: I1201 11:49:44.700756 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a906-account-create-kmtzz" event={"ID":"1d87b363-c828-4b73-8c31-b3dd67ebce08","Type":"ContainerStarted","Data":"f5c38d770a079729a960e53d0eac4e1f930192a810e00decf549eee814b95f50"} Dec 01 11:49:46 crc kubenswrapper[4958]: I1201 11:49:46.114372 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-a906-account-create-kmtzz" Dec 01 11:49:46 crc kubenswrapper[4958]: I1201 11:49:46.139408 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gd98\" (UniqueName: \"kubernetes.io/projected/1d87b363-c828-4b73-8c31-b3dd67ebce08-kube-api-access-7gd98\") pod \"1d87b363-c828-4b73-8c31-b3dd67ebce08\" (UID: \"1d87b363-c828-4b73-8c31-b3dd67ebce08\") " Dec 01 11:49:46 crc kubenswrapper[4958]: I1201 11:49:46.148300 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d87b363-c828-4b73-8c31-b3dd67ebce08-kube-api-access-7gd98" (OuterVolumeSpecName: "kube-api-access-7gd98") pod "1d87b363-c828-4b73-8c31-b3dd67ebce08" (UID: "1d87b363-c828-4b73-8c31-b3dd67ebce08"). InnerVolumeSpecName "kube-api-access-7gd98". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:49:46 crc kubenswrapper[4958]: I1201 11:49:46.242977 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gd98\" (UniqueName: \"kubernetes.io/projected/1d87b363-c828-4b73-8c31-b3dd67ebce08-kube-api-access-7gd98\") on node \"crc\" DevicePath \"\"" Dec 01 11:49:46 crc kubenswrapper[4958]: I1201 11:49:46.722243 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-a906-account-create-kmtzz" event={"ID":"1d87b363-c828-4b73-8c31-b3dd67ebce08","Type":"ContainerDied","Data":"f5c38d770a079729a960e53d0eac4e1f930192a810e00decf549eee814b95f50"} Dec 01 11:49:46 crc kubenswrapper[4958]: I1201 11:49:46.722323 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5c38d770a079729a960e53d0eac4e1f930192a810e00decf549eee814b95f50" Dec 01 11:49:46 crc kubenswrapper[4958]: I1201 11:49:46.722369 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-a906-account-create-kmtzz" Dec 01 11:49:48 crc kubenswrapper[4958]: I1201 11:49:48.261916 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-xwjqn"] Dec 01 11:49:48 crc kubenswrapper[4958]: E1201 11:49:48.266537 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d87b363-c828-4b73-8c31-b3dd67ebce08" containerName="mariadb-account-create" Dec 01 11:49:48 crc kubenswrapper[4958]: I1201 11:49:48.266565 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d87b363-c828-4b73-8c31-b3dd67ebce08" containerName="mariadb-account-create" Dec 01 11:49:48 crc kubenswrapper[4958]: I1201 11:49:48.266830 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d87b363-c828-4b73-8c31-b3dd67ebce08" containerName="mariadb-account-create" Dec 01 11:49:48 crc kubenswrapper[4958]: I1201 11:49:48.267763 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-xwjqn" Dec 01 11:49:48 crc kubenswrapper[4958]: I1201 11:49:48.274740 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-6wp7q" Dec 01 11:49:48 crc kubenswrapper[4958]: I1201 11:49:48.275017 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 01 11:49:48 crc kubenswrapper[4958]: I1201 11:49:48.286457 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhqq9\" (UniqueName: \"kubernetes.io/projected/5616721a-846c-416c-91cf-9275a529232b-kube-api-access-bhqq9\") pod \"heat-db-sync-xwjqn\" (UID: \"5616721a-846c-416c-91cf-9275a529232b\") " pod="openstack/heat-db-sync-xwjqn" Dec 01 11:49:48 crc kubenswrapper[4958]: I1201 11:49:48.286512 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5616721a-846c-416c-91cf-9275a529232b-combined-ca-bundle\") pod \"heat-db-sync-xwjqn\" (UID: \"5616721a-846c-416c-91cf-9275a529232b\") " pod="openstack/heat-db-sync-xwjqn" Dec 01 11:49:48 crc kubenswrapper[4958]: I1201 11:49:48.286675 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5616721a-846c-416c-91cf-9275a529232b-config-data\") pod \"heat-db-sync-xwjqn\" (UID: \"5616721a-846c-416c-91cf-9275a529232b\") " pod="openstack/heat-db-sync-xwjqn" Dec 01 11:49:48 crc kubenswrapper[4958]: I1201 11:49:48.296377 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-xwjqn"] Dec 01 11:49:48 crc kubenswrapper[4958]: I1201 11:49:48.388443 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhqq9\" (UniqueName: \"kubernetes.io/projected/5616721a-846c-416c-91cf-9275a529232b-kube-api-access-bhqq9\") pod \"heat-db-sync-xwjqn\" (UID: \"5616721a-846c-416c-91cf-9275a529232b\") " pod="openstack/heat-db-sync-xwjqn" Dec 01 11:49:48 crc kubenswrapper[4958]: I1201 11:49:48.388504 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5616721a-846c-416c-91cf-9275a529232b-combined-ca-bundle\") pod \"heat-db-sync-xwjqn\" (UID: \"5616721a-846c-416c-91cf-9275a529232b\") " pod="openstack/heat-db-sync-xwjqn" Dec 01 11:49:48 crc kubenswrapper[4958]: I1201 11:49:48.388638 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5616721a-846c-416c-91cf-9275a529232b-config-data\") pod \"heat-db-sync-xwjqn\" (UID: \"5616721a-846c-416c-91cf-9275a529232b\") " pod="openstack/heat-db-sync-xwjqn" Dec 01 11:49:48 crc kubenswrapper[4958]: I1201 11:49:48.395003 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5616721a-846c-416c-91cf-9275a529232b-combined-ca-bundle\") pod \"heat-db-sync-xwjqn\" (UID: \"5616721a-846c-416c-91cf-9275a529232b\") " pod="openstack/heat-db-sync-xwjqn" Dec 01 11:49:48 crc kubenswrapper[4958]: I1201 11:49:48.407418 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5616721a-846c-416c-91cf-9275a529232b-config-data\") pod \"heat-db-sync-xwjqn\" (UID: \"5616721a-846c-416c-91cf-9275a529232b\") " pod="openstack/heat-db-sync-xwjqn" 
Dec 01 11:49:48 crc kubenswrapper[4958]: I1201 11:49:48.412306 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhqq9\" (UniqueName: \"kubernetes.io/projected/5616721a-846c-416c-91cf-9275a529232b-kube-api-access-bhqq9\") pod \"heat-db-sync-xwjqn\" (UID: \"5616721a-846c-416c-91cf-9275a529232b\") " pod="openstack/heat-db-sync-xwjqn"
Dec 01 11:49:48 crc kubenswrapper[4958]: I1201 11:49:48.602745 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-xwjqn"
Dec 01 11:49:49 crc kubenswrapper[4958]: I1201 11:49:49.091122 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-xwjqn"]
Dec 01 11:49:49 crc kubenswrapper[4958]: I1201 11:49:49.107541 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 11:49:49 crc kubenswrapper[4958]: I1201 11:49:49.753899 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xwjqn" event={"ID":"5616721a-846c-416c-91cf-9275a529232b","Type":"ContainerStarted","Data":"3f92e582cab505dee056567f045c74eb419197c0504dc5ce2135fa2fc3b0ce28"}
Dec 01 11:49:51 crc kubenswrapper[4958]: I1201 11:49:51.938968 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-65dfcfb885-6c2cp" podUID="ca004e01-200e-461c-ad14-68752150d940" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused"
Dec 01 11:49:52 crc kubenswrapper[4958]: I1201 11:49:52.798754 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28"
Dec 01 11:49:52 crc kubenswrapper[4958]: E1201 11:49:52.800816 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:49:56 crc kubenswrapper[4958]: I1201 11:49:56.046301 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-44xr7"]
Dec 01 11:49:56 crc kubenswrapper[4958]: I1201 11:49:56.080134 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-n4kzb"]
Dec 01 11:49:56 crc kubenswrapper[4958]: I1201 11:49:56.089024 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-n4kzb"]
Dec 01 11:49:56 crc kubenswrapper[4958]: I1201 11:49:56.112056 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-44xr7"]
Dec 01 11:49:57 crc kubenswrapper[4958]: I1201 11:49:57.034737 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-xn4w4"]
Dec 01 11:49:57 crc kubenswrapper[4958]: I1201 11:49:57.043656 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-xn4w4"]
Dec 01 11:49:57 crc kubenswrapper[4958]: I1201 11:49:57.809128 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0188611e-a44a-49cf-9814-008b7487bb17" path="/var/lib/kubelet/pods/0188611e-a44a-49cf-9814-008b7487bb17/volumes"
Dec 01 11:49:57 crc kubenswrapper[4958]: I1201 11:49:57.810274 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf85e447-bf63-4cd6-a30c-b70b7bc75868" path="/var/lib/kubelet/pods/cf85e447-bf63-4cd6-a30c-b70b7bc75868/volumes"
Dec 01 11:49:57 crc kubenswrapper[4958]: I1201 11:49:57.810887 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df494e10-0447-4b53-a84e-82156db28933" path="/var/lib/kubelet/pods/df494e10-0447-4b53-a84e-82156db28933/volumes"
Dec 01 11:49:57 crc kubenswrapper[4958]: I1201 11:49:57.849165 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xwjqn" event={"ID":"5616721a-846c-416c-91cf-9275a529232b","Type":"ContainerStarted","Data":"416e999dd021c69d33b07c2df9d91a44d6d917f05144ed13ca27b0c284c5bd12"}
Dec 01 11:49:57 crc kubenswrapper[4958]: I1201 11:49:57.880106 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-xwjqn" podStartSLOduration=2.435616543 podStartE2EDuration="9.880076873s" podCreationTimestamp="2025-12-01 11:49:48 +0000 UTC" firstStartedPulling="2025-12-01 11:49:49.107266355 +0000 UTC m=+6636.616055402" lastFinishedPulling="2025-12-01 11:49:56.551726695 +0000 UTC m=+6644.060515732" observedRunningTime="2025-12-01 11:49:57.866681223 +0000 UTC m=+6645.375470280" watchObservedRunningTime="2025-12-01 11:49:57.880076873 +0000 UTC m=+6645.388865950"
Dec 01 11:49:59 crc kubenswrapper[4958]: I1201 11:49:59.880787 4958 generic.go:334] "Generic (PLEG): container finished" podID="5616721a-846c-416c-91cf-9275a529232b" containerID="416e999dd021c69d33b07c2df9d91a44d6d917f05144ed13ca27b0c284c5bd12" exitCode=0
Dec 01 11:49:59 crc kubenswrapper[4958]: I1201 11:49:59.880884 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xwjqn" event={"ID":"5616721a-846c-416c-91cf-9275a529232b","Type":"ContainerDied","Data":"416e999dd021c69d33b07c2df9d91a44d6d917f05144ed13ca27b0c284c5bd12"}
Dec 01 11:50:01 crc kubenswrapper[4958]: I1201 11:50:01.316505 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-xwjqn"
Dec 01 11:50:01 crc kubenswrapper[4958]: I1201 11:50:01.461274 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5616721a-846c-416c-91cf-9275a529232b-combined-ca-bundle\") pod \"5616721a-846c-416c-91cf-9275a529232b\" (UID: \"5616721a-846c-416c-91cf-9275a529232b\") "
Dec 01 11:50:01 crc kubenswrapper[4958]: I1201 11:50:01.462219 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhqq9\" (UniqueName: \"kubernetes.io/projected/5616721a-846c-416c-91cf-9275a529232b-kube-api-access-bhqq9\") pod \"5616721a-846c-416c-91cf-9275a529232b\" (UID: \"5616721a-846c-416c-91cf-9275a529232b\") "
Dec 01 11:50:01 crc kubenswrapper[4958]: I1201 11:50:01.462564 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5616721a-846c-416c-91cf-9275a529232b-config-data\") pod \"5616721a-846c-416c-91cf-9275a529232b\" (UID: \"5616721a-846c-416c-91cf-9275a529232b\") "
Dec 01 11:50:01 crc kubenswrapper[4958]: I1201 11:50:01.467331 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5616721a-846c-416c-91cf-9275a529232b-kube-api-access-bhqq9" (OuterVolumeSpecName: "kube-api-access-bhqq9") pod "5616721a-846c-416c-91cf-9275a529232b" (UID: "5616721a-846c-416c-91cf-9275a529232b"). InnerVolumeSpecName "kube-api-access-bhqq9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:50:01 crc kubenswrapper[4958]: I1201 11:50:01.497974 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5616721a-846c-416c-91cf-9275a529232b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5616721a-846c-416c-91cf-9275a529232b" (UID: "5616721a-846c-416c-91cf-9275a529232b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:50:01 crc kubenswrapper[4958]: I1201 11:50:01.565027 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5616721a-846c-416c-91cf-9275a529232b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 11:50:01 crc kubenswrapper[4958]: I1201 11:50:01.565316 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhqq9\" (UniqueName: \"kubernetes.io/projected/5616721a-846c-416c-91cf-9275a529232b-kube-api-access-bhqq9\") on node \"crc\" DevicePath \"\""
Dec 01 11:50:01 crc kubenswrapper[4958]: I1201 11:50:01.567675 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5616721a-846c-416c-91cf-9275a529232b-config-data" (OuterVolumeSpecName: "config-data") pod "5616721a-846c-416c-91cf-9275a529232b" (UID: "5616721a-846c-416c-91cf-9275a529232b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:50:01 crc kubenswrapper[4958]: I1201 11:50:01.667026 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5616721a-846c-416c-91cf-9275a529232b-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 11:50:01 crc kubenswrapper[4958]: I1201 11:50:01.905179 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-xwjqn" event={"ID":"5616721a-846c-416c-91cf-9275a529232b","Type":"ContainerDied","Data":"3f92e582cab505dee056567f045c74eb419197c0504dc5ce2135fa2fc3b0ce28"}
Dec 01 11:50:01 crc kubenswrapper[4958]: I1201 11:50:01.905222 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f92e582cab505dee056567f045c74eb419197c0504dc5ce2135fa2fc3b0ce28"
Dec 01 11:50:01 crc kubenswrapper[4958]: I1201 11:50:01.905254 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-xwjqn"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.172264 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-bcdcd649d-27jl2"]
Dec 01 11:50:03 crc kubenswrapper[4958]: E1201 11:50:03.173069 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5616721a-846c-416c-91cf-9275a529232b" containerName="heat-db-sync"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.173083 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="5616721a-846c-416c-91cf-9275a529232b" containerName="heat-db-sync"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.173289 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="5616721a-846c-416c-91cf-9275a529232b" containerName="heat-db-sync"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.174080 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-bcdcd649d-27jl2"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.179480 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.179648 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.179747 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-6wp7q"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.217009 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-bcdcd649d-27jl2"]
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.262049 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e7fe406-6242-473c-b7c5-cc1a32a276dd-config-data-custom\") pod \"heat-engine-bcdcd649d-27jl2\" (UID: \"6e7fe406-6242-473c-b7c5-cc1a32a276dd\") " pod="openstack/heat-engine-bcdcd649d-27jl2"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.262332 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7fe406-6242-473c-b7c5-cc1a32a276dd-combined-ca-bundle\") pod \"heat-engine-bcdcd649d-27jl2\" (UID: \"6e7fe406-6242-473c-b7c5-cc1a32a276dd\") " pod="openstack/heat-engine-bcdcd649d-27jl2"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.262550 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd6fs\" (UniqueName: \"kubernetes.io/projected/6e7fe406-6242-473c-b7c5-cc1a32a276dd-kube-api-access-kd6fs\") pod \"heat-engine-bcdcd649d-27jl2\" (UID: \"6e7fe406-6242-473c-b7c5-cc1a32a276dd\") " pod="openstack/heat-engine-bcdcd649d-27jl2"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.262638 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e7fe406-6242-473c-b7c5-cc1a32a276dd-config-data\") pod \"heat-engine-bcdcd649d-27jl2\" (UID: \"6e7fe406-6242-473c-b7c5-cc1a32a276dd\") " pod="openstack/heat-engine-bcdcd649d-27jl2"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.313361 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5b6459cc5d-4vd4c"]
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.315308 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5b6459cc5d-4vd4c"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.320239 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.321857 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5b6459cc5d-4vd4c"]
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.339757 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-78fb45d99c-fzz4f"]
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.342317 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-78fb45d99c-fzz4f"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.346252 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.362696 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-78fb45d99c-fzz4f"]
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.368093 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7fe406-6242-473c-b7c5-cc1a32a276dd-combined-ca-bundle\") pod \"heat-engine-bcdcd649d-27jl2\" (UID: \"6e7fe406-6242-473c-b7c5-cc1a32a276dd\") " pod="openstack/heat-engine-bcdcd649d-27jl2"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.368306 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj4vs\" (UniqueName: \"kubernetes.io/projected/c402ab81-b8b7-46e3-8279-0cb38d1a1cb6-kube-api-access-fj4vs\") pod \"heat-cfnapi-78fb45d99c-fzz4f\" (UID: \"c402ab81-b8b7-46e3-8279-0cb38d1a1cb6\") " pod="openstack/heat-cfnapi-78fb45d99c-fzz4f"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.368398 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c402ab81-b8b7-46e3-8279-0cb38d1a1cb6-combined-ca-bundle\") pod \"heat-cfnapi-78fb45d99c-fzz4f\" (UID: \"c402ab81-b8b7-46e3-8279-0cb38d1a1cb6\") " pod="openstack/heat-cfnapi-78fb45d99c-fzz4f"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.368590 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd6fs\" (UniqueName: \"kubernetes.io/projected/6e7fe406-6242-473c-b7c5-cc1a32a276dd-kube-api-access-kd6fs\") pod \"heat-engine-bcdcd649d-27jl2\" (UID: \"6e7fe406-6242-473c-b7c5-cc1a32a276dd\") " pod="openstack/heat-engine-bcdcd649d-27jl2"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.368690 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e7fe406-6242-473c-b7c5-cc1a32a276dd-config-data\") pod \"heat-engine-bcdcd649d-27jl2\" (UID: \"6e7fe406-6242-473c-b7c5-cc1a32a276dd\") " pod="openstack/heat-engine-bcdcd649d-27jl2"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.368747 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c402ab81-b8b7-46e3-8279-0cb38d1a1cb6-config-data\") pod \"heat-cfnapi-78fb45d99c-fzz4f\" (UID: \"c402ab81-b8b7-46e3-8279-0cb38d1a1cb6\") " pod="openstack/heat-cfnapi-78fb45d99c-fzz4f"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.368784 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5293b4ea-1edc-408f-b345-1e342d98f9cb-config-data-custom\") pod \"heat-api-5b6459cc5d-4vd4c\" (UID: \"5293b4ea-1edc-408f-b345-1e342d98f9cb\") " pod="openstack/heat-api-5b6459cc5d-4vd4c"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.369230 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e7fe406-6242-473c-b7c5-cc1a32a276dd-config-data-custom\") pod \"heat-engine-bcdcd649d-27jl2\" (UID: \"6e7fe406-6242-473c-b7c5-cc1a32a276dd\") " pod="openstack/heat-engine-bcdcd649d-27jl2"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.369284 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkj5z\" (UniqueName: \"kubernetes.io/projected/5293b4ea-1edc-408f-b345-1e342d98f9cb-kube-api-access-wkj5z\") pod \"heat-api-5b6459cc5d-4vd4c\" (UID: \"5293b4ea-1edc-408f-b345-1e342d98f9cb\") " pod="openstack/heat-api-5b6459cc5d-4vd4c"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.369344 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5293b4ea-1edc-408f-b345-1e342d98f9cb-combined-ca-bundle\") pod \"heat-api-5b6459cc5d-4vd4c\" (UID: \"5293b4ea-1edc-408f-b345-1e342d98f9cb\") " pod="openstack/heat-api-5b6459cc5d-4vd4c"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.369504 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c402ab81-b8b7-46e3-8279-0cb38d1a1cb6-config-data-custom\") pod \"heat-cfnapi-78fb45d99c-fzz4f\" (UID: \"c402ab81-b8b7-46e3-8279-0cb38d1a1cb6\") " pod="openstack/heat-cfnapi-78fb45d99c-fzz4f"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.369600 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5293b4ea-1edc-408f-b345-1e342d98f9cb-config-data\") pod \"heat-api-5b6459cc5d-4vd4c\" (UID: \"5293b4ea-1edc-408f-b345-1e342d98f9cb\") " pod="openstack/heat-api-5b6459cc5d-4vd4c"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.375250 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e7fe406-6242-473c-b7c5-cc1a32a276dd-config-data-custom\") pod \"heat-engine-bcdcd649d-27jl2\" (UID: \"6e7fe406-6242-473c-b7c5-cc1a32a276dd\") " pod="openstack/heat-engine-bcdcd649d-27jl2"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.377794 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e7fe406-6242-473c-b7c5-cc1a32a276dd-combined-ca-bundle\") pod \"heat-engine-bcdcd649d-27jl2\" (UID: \"6e7fe406-6242-473c-b7c5-cc1a32a276dd\") " pod="openstack/heat-engine-bcdcd649d-27jl2"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.383093 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e7fe406-6242-473c-b7c5-cc1a32a276dd-config-data\") pod \"heat-engine-bcdcd649d-27jl2\" (UID: \"6e7fe406-6242-473c-b7c5-cc1a32a276dd\") " pod="openstack/heat-engine-bcdcd649d-27jl2"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.410746 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd6fs\" (UniqueName: \"kubernetes.io/projected/6e7fe406-6242-473c-b7c5-cc1a32a276dd-kube-api-access-kd6fs\") pod \"heat-engine-bcdcd649d-27jl2\" (UID: \"6e7fe406-6242-473c-b7c5-cc1a32a276dd\") " pod="openstack/heat-engine-bcdcd649d-27jl2"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.473254 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c402ab81-b8b7-46e3-8279-0cb38d1a1cb6-config-data\") pod \"heat-cfnapi-78fb45d99c-fzz4f\" (UID: \"c402ab81-b8b7-46e3-8279-0cb38d1a1cb6\") " pod="openstack/heat-cfnapi-78fb45d99c-fzz4f"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.473317 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5293b4ea-1edc-408f-b345-1e342d98f9cb-config-data-custom\") pod \"heat-api-5b6459cc5d-4vd4c\" (UID: \"5293b4ea-1edc-408f-b345-1e342d98f9cb\") " pod="openstack/heat-api-5b6459cc5d-4vd4c"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.473427 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkj5z\" (UniqueName: \"kubernetes.io/projected/5293b4ea-1edc-408f-b345-1e342d98f9cb-kube-api-access-wkj5z\") pod \"heat-api-5b6459cc5d-4vd4c\" (UID: \"5293b4ea-1edc-408f-b345-1e342d98f9cb\") " pod="openstack/heat-api-5b6459cc5d-4vd4c"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.473461 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5293b4ea-1edc-408f-b345-1e342d98f9cb-combined-ca-bundle\") pod \"heat-api-5b6459cc5d-4vd4c\" (UID: \"5293b4ea-1edc-408f-b345-1e342d98f9cb\") " pod="openstack/heat-api-5b6459cc5d-4vd4c"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.473527 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c402ab81-b8b7-46e3-8279-0cb38d1a1cb6-config-data-custom\") pod \"heat-cfnapi-78fb45d99c-fzz4f\" (UID: \"c402ab81-b8b7-46e3-8279-0cb38d1a1cb6\") " pod="openstack/heat-cfnapi-78fb45d99c-fzz4f"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.473558 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5293b4ea-1edc-408f-b345-1e342d98f9cb-config-data\") pod \"heat-api-5b6459cc5d-4vd4c\" (UID: \"5293b4ea-1edc-408f-b345-1e342d98f9cb\") " pod="openstack/heat-api-5b6459cc5d-4vd4c"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.473613 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj4vs\" (UniqueName: \"kubernetes.io/projected/c402ab81-b8b7-46e3-8279-0cb38d1a1cb6-kube-api-access-fj4vs\") pod \"heat-cfnapi-78fb45d99c-fzz4f\" (UID: \"c402ab81-b8b7-46e3-8279-0cb38d1a1cb6\") " pod="openstack/heat-cfnapi-78fb45d99c-fzz4f"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.473661 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c402ab81-b8b7-46e3-8279-0cb38d1a1cb6-combined-ca-bundle\") pod \"heat-cfnapi-78fb45d99c-fzz4f\" (UID: \"c402ab81-b8b7-46e3-8279-0cb38d1a1cb6\") " pod="openstack/heat-cfnapi-78fb45d99c-fzz4f"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.478168 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c402ab81-b8b7-46e3-8279-0cb38d1a1cb6-combined-ca-bundle\") pod \"heat-cfnapi-78fb45d99c-fzz4f\" (UID: \"c402ab81-b8b7-46e3-8279-0cb38d1a1cb6\") " pod="openstack/heat-cfnapi-78fb45d99c-fzz4f"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.478760 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c402ab81-b8b7-46e3-8279-0cb38d1a1cb6-config-data-custom\") pod \"heat-cfnapi-78fb45d99c-fzz4f\" (UID: \"c402ab81-b8b7-46e3-8279-0cb38d1a1cb6\") " pod="openstack/heat-cfnapi-78fb45d99c-fzz4f"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.479731 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5293b4ea-1edc-408f-b345-1e342d98f9cb-config-data\") pod \"heat-api-5b6459cc5d-4vd4c\" (UID: \"5293b4ea-1edc-408f-b345-1e342d98f9cb\") " pod="openstack/heat-api-5b6459cc5d-4vd4c"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.490693 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5293b4ea-1edc-408f-b345-1e342d98f9cb-combined-ca-bundle\") pod \"heat-api-5b6459cc5d-4vd4c\" (UID: \"5293b4ea-1edc-408f-b345-1e342d98f9cb\") " pod="openstack/heat-api-5b6459cc5d-4vd4c"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.491135 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5293b4ea-1edc-408f-b345-1e342d98f9cb-config-data-custom\") pod \"heat-api-5b6459cc5d-4vd4c\" (UID: \"5293b4ea-1edc-408f-b345-1e342d98f9cb\") " pod="openstack/heat-api-5b6459cc5d-4vd4c"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.493605 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c402ab81-b8b7-46e3-8279-0cb38d1a1cb6-config-data\") pod \"heat-cfnapi-78fb45d99c-fzz4f\" (UID: \"c402ab81-b8b7-46e3-8279-0cb38d1a1cb6\") " pod="openstack/heat-cfnapi-78fb45d99c-fzz4f"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.493987 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkj5z\" (UniqueName: \"kubernetes.io/projected/5293b4ea-1edc-408f-b345-1e342d98f9cb-kube-api-access-wkj5z\") pod \"heat-api-5b6459cc5d-4vd4c\" (UID: \"5293b4ea-1edc-408f-b345-1e342d98f9cb\") " pod="openstack/heat-api-5b6459cc5d-4vd4c"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.498121 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-bcdcd649d-27jl2"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.498931 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj4vs\" (UniqueName: \"kubernetes.io/projected/c402ab81-b8b7-46e3-8279-0cb38d1a1cb6-kube-api-access-fj4vs\") pod \"heat-cfnapi-78fb45d99c-fzz4f\" (UID: \"c402ab81-b8b7-46e3-8279-0cb38d1a1cb6\") " pod="openstack/heat-cfnapi-78fb45d99c-fzz4f"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.637304 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5b6459cc5d-4vd4c"
Dec 01 11:50:03 crc kubenswrapper[4958]: I1201 11:50:03.682294 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-78fb45d99c-fzz4f"
Dec 01 11:50:04 crc kubenswrapper[4958]: I1201 11:50:04.089823 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-bcdcd649d-27jl2"]
Dec 01 11:50:04 crc kubenswrapper[4958]: I1201 11:50:04.250149 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5b6459cc5d-4vd4c"]
Dec 01 11:50:04 crc kubenswrapper[4958]: I1201 11:50:04.356426 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-78fb45d99c-fzz4f"]
Dec 01 11:50:04 crc kubenswrapper[4958]: W1201 11:50:04.360282 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc402ab81_b8b7_46e3_8279_0cb38d1a1cb6.slice/crio-77d758c13fe88b452e856d8dd7b04519e9b822bcfb8c7a17ba72539e3695468e WatchSource:0}: Error finding container 77d758c13fe88b452e856d8dd7b04519e9b822bcfb8c7a17ba72539e3695468e: Status 404 returned error can't find the container with id 77d758c13fe88b452e856d8dd7b04519e9b822bcfb8c7a17ba72539e3695468e
Dec 01 11:50:04 crc kubenswrapper[4958]: I1201 11:50:04.540408 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-65dfcfb885-6c2cp"
Dec 01 11:50:04 crc kubenswrapper[4958]: I1201 11:50:04.947328 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-78fb45d99c-fzz4f" event={"ID":"c402ab81-b8b7-46e3-8279-0cb38d1a1cb6","Type":"ContainerStarted","Data":"77d758c13fe88b452e856d8dd7b04519e9b822bcfb8c7a17ba72539e3695468e"}
Dec 01 11:50:04 crc kubenswrapper[4958]: I1201 11:50:04.948704 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b6459cc5d-4vd4c" event={"ID":"5293b4ea-1edc-408f-b345-1e342d98f9cb","Type":"ContainerStarted","Data":"bda0b2aab948c4c260da850d8c83080d54a9804d1b97c62645ad4e64a0d92056"}
Dec 01 11:50:04 crc kubenswrapper[4958]: I1201 11:50:04.950634 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-bcdcd649d-27jl2" event={"ID":"6e7fe406-6242-473c-b7c5-cc1a32a276dd","Type":"ContainerStarted","Data":"49719ca912c8df8c75aec24e499895a20834d1460a81110b5fa10edb36c17463"}
Dec 01 11:50:04 crc kubenswrapper[4958]: I1201 11:50:04.950678 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-bcdcd649d-27jl2" event={"ID":"6e7fe406-6242-473c-b7c5-cc1a32a276dd","Type":"ContainerStarted","Data":"00e8928345e121cb743b47008054e89f593f8175a5a7cc28058029203a177718"}
Dec 01 11:50:04 crc kubenswrapper[4958]: I1201 11:50:04.950750 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-bcdcd649d-27jl2"
Dec 01 11:50:04 crc kubenswrapper[4958]: I1201 11:50:04.973648 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-bcdcd649d-27jl2" podStartSLOduration=2.973629343 podStartE2EDuration="2.973629343s" podCreationTimestamp="2025-12-01 11:50:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:50:04.968090556 +0000 UTC m=+6652.476879593" watchObservedRunningTime="2025-12-01 11:50:04.973629343 +0000 UTC m=+6652.482418380"
Dec 01 11:50:05 crc kubenswrapper[4958]: I1201 11:50:05.797939 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28"
Dec 01 11:50:05 crc kubenswrapper[4958]: E1201 11:50:05.798655 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:50:06 crc kubenswrapper[4958]: I1201 11:50:06.860931 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-65dfcfb885-6c2cp"
Dec 01 11:50:06 crc kubenswrapper[4958]: I1201 11:50:06.987425 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-986f959c9-6gffd"]
Dec 01 11:50:06 crc kubenswrapper[4958]: I1201 11:50:06.987678 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-986f959c9-6gffd" podUID="cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" containerName="horizon-log" containerID="cri-o://f8487340023bd36ca8fa7628550caf59b7e08a558adb39f3a52beeecc6323087" gracePeriod=30
Dec 01 11:50:06 crc kubenswrapper[4958]: I1201 11:50:06.987810 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-986f959c9-6gffd" podUID="cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" containerName="horizon" containerID="cri-o://3c1e412f7dbc976857fa1fcc765688ccfafe240da6ecd120429f9b1933a6a5d6" gracePeriod=30
Dec 01 11:50:07 crc kubenswrapper[4958]: I1201 11:50:07.051560 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cad8-account-create-rffzk"]
Dec 01 11:50:07 crc kubenswrapper[4958]: I1201 11:50:07.062273 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-473c-account-create-cc2kr"]
Dec 01 11:50:07 crc kubenswrapper[4958]: I1201 11:50:07.084552 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-aafc-account-create-w2pcf"]
Dec 01 11:50:07 crc kubenswrapper[4958]: I1201 11:50:07.105125 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-473c-account-create-cc2kr"]
Dec 01 11:50:07 crc kubenswrapper[4958]: I1201 11:50:07.122022 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-aafc-account-create-w2pcf"]
Dec 01 11:50:07 crc kubenswrapper[4958]: I1201 11:50:07.139710 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cad8-account-create-rffzk"]
Dec 01 11:50:07 crc kubenswrapper[4958]: I1201 11:50:07.812552 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72d863b4-7434-4304-9098-ef73c392c23f" path="/var/lib/kubelet/pods/72d863b4-7434-4304-9098-ef73c392c23f/volumes"
Dec 01 11:50:07 crc kubenswrapper[4958]: I1201 11:50:07.813587 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="798fedf2-f691-4307-b222-993c0e027046" path="/var/lib/kubelet/pods/798fedf2-f691-4307-b222-993c0e027046/volumes"
Dec 01 11:50:07 crc kubenswrapper[4958]: I1201 11:50:07.814286 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea5407f0-84dc-4fec-aec1-c744e117ef1c" path="/var/lib/kubelet/pods/ea5407f0-84dc-4fec-aec1-c744e117ef1c/volumes"
Dec 01 11:50:08 crc kubenswrapper[4958]: I1201 11:50:08.018354 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-78fb45d99c-fzz4f" event={"ID":"c402ab81-b8b7-46e3-8279-0cb38d1a1cb6","Type":"ContainerStarted","Data":"da19c64e8b77b3e83c75b9f3acf04ed84ac90452c7d782e97c3a027c5575ec67"}
Dec 01 11:50:08 crc kubenswrapper[4958]: I1201 11:50:08.018476 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-78fb45d99c-fzz4f"
Dec 01 11:50:08 crc kubenswrapper[4958]: I1201 11:50:08.020818 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b6459cc5d-4vd4c" event={"ID":"5293b4ea-1edc-408f-b345-1e342d98f9cb","Type":"ContainerStarted","Data":"eb02cc4d2913d0bf1e0f4a4efb13448efa228c56e6a9b7d226a1b0f8451c61f8"}
Dec 01 11:50:08 crc kubenswrapper[4958]: I1201 11:50:08.021398 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5b6459cc5d-4vd4c"
Dec 01 11:50:08 crc kubenswrapper[4958]: I1201 11:50:08.049042 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-78fb45d99c-fzz4f" podStartSLOduration=2.559262689 podStartE2EDuration="5.049020117s" podCreationTimestamp="2025-12-01 11:50:03 +0000 UTC" firstStartedPulling="2025-12-01 11:50:04.362774053 +0000 UTC m=+6651.871563090" lastFinishedPulling="2025-12-01 11:50:06.852531481 +0000 UTC m=+6654.361320518" observedRunningTime="2025-12-01 11:50:08.044661853 +0000 UTC m=+6655.553450890" watchObservedRunningTime="2025-12-01 11:50:08.049020117 +0000 UTC m=+6655.557809174"
Dec 01 11:50:08 crc kubenswrapper[4958]: I1201 11:50:08.070505 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5b6459cc5d-4vd4c" podStartSLOduration=2.485386422 podStartE2EDuration="5.070477476s" podCreationTimestamp="2025-12-01 11:50:03 +0000 UTC" firstStartedPulling="2025-12-01 11:50:04.267519549 +0000 UTC m=+6651.776308586" lastFinishedPulling="2025-12-01 11:50:06.852610593 +0000 UTC m=+6654.361399640" observedRunningTime="2025-12-01 11:50:08.062029046 +0000 UTC m=+6655.570818073" watchObservedRunningTime="2025-12-01 11:50:08.070477476 +0000 UTC m=+6655.579266523"
Dec 01 11:50:10 crc kubenswrapper[4958]: I1201 11:50:10.161098 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-986f959c9-6gffd" podUID="cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:56164->10.217.1.110:8080: read: connection reset by peer"
Dec 01 11:50:11 crc kubenswrapper[4958]: I1201 11:50:11.057643 4958 generic.go:334] "Generic (PLEG): container finished" podID="cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" containerID="3c1e412f7dbc976857fa1fcc765688ccfafe240da6ecd120429f9b1933a6a5d6" exitCode=0
Dec 01 11:50:11 crc kubenswrapper[4958]: I1201 11:50:11.057713 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-986f959c9-6gffd" event={"ID":"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6","Type":"ContainerDied","Data":"3c1e412f7dbc976857fa1fcc765688ccfafe240da6ecd120429f9b1933a6a5d6"}
Dec 01 11:50:15 crc kubenswrapper[4958]: I1201 11:50:15.028537 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5b6459cc5d-4vd4c"
Dec 01 11:50:15 crc kubenswrapper[4958]: I1201 11:50:15.239702 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-78fb45d99c-fzz4f"
Dec 01 11:50:17 crc kubenswrapper[4958]: I1201 11:50:17.104273 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xswp8"]
Dec 01 11:50:17 crc kubenswrapper[4958]: I1201 11:50:17.119944 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xswp8"]
Dec 01 11:50:17 crc kubenswrapper[4958]: I1201 11:50:17.810897 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f956bfc8-32c9-4b41-877c-f80d520c011b" path="/var/lib/kubelet/pods/f956bfc8-32c9-4b41-877c-f80d520c011b/volumes"
Dec 01 11:50:19 crc kubenswrapper[4958]: I1201 11:50:19.618008 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-986f959c9-6gffd" podUID="cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused"
Dec 01 11:50:19 crc kubenswrapper[4958]: I1201 11:50:19.798086 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28"
Dec 01 11:50:19 crc kubenswrapper[4958]: E1201 11:50:19.798552 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:50:23 crc kubenswrapper[4958]: I1201 11:50:23.530668 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-bcdcd649d-27jl2"
Dec 01 11:50:27 crc kubenswrapper[4958]: I1201 11:50:27.100803 4958 scope.go:117] "RemoveContainer" containerID="0f323b244b7fdbf654648873ed12dc2b7f7137c4a35fe1ae5adb12f1e7b9fbd1"
Dec 01 11:50:27 crc kubenswrapper[4958]: I1201 11:50:27.147108 4958 scope.go:117] "RemoveContainer" containerID="0e4dbb913579e44799bcfd95f28a0a272cd47aeee85f76dcaf8ffcf18e33d4f6"
Dec 01 11:50:27 crc kubenswrapper[4958]: I1201 11:50:27.181397 4958 scope.go:117] "RemoveContainer" containerID="bf9115d7ab3438479e0b01378564de2247c19ce96890443d01f3774d43973316"
Dec 01 11:50:27 crc kubenswrapper[4958]: I1201 11:50:27.224141 4958 scope.go:117] "RemoveContainer" containerID="c91efe3c9d3923d2f69c367f80b292f958b5f6877131e3708b7e876c94e6cf2e"
Dec 01 11:50:27 crc kubenswrapper[4958]: I1201 11:50:27.344533 4958 scope.go:117] "RemoveContainer" containerID="b478729ae7239b1c67a9e1c302ebc35bfb6c533488e0480f346bdf83f7db827b"
Dec 01 11:50:27 crc kubenswrapper[4958]: I1201 11:50:27.368013 4958 scope.go:117] "RemoveContainer" containerID="a4b2d5dc2a4b399dad04c44e02d31b99c1d70e35410e0cdd9b3fdc0a7890002c"
Dec 01 11:50:27 crc kubenswrapper[4958]: I1201 11:50:27.426989 4958 scope.go:117] "RemoveContainer" containerID="f094f2600ff580778a632f53749dc734c2d087e367f8413417ecc16f7c5a5fdf"
Dec 01 11:50:29 crc kubenswrapper[4958]: I1201 11:50:29.618344 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-986f959c9-6gffd" podUID="cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused"
Dec 01 11:50:29 crc kubenswrapper[4958]: I1201 11:50:29.619380 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-986f959c9-6gffd"
Dec 01 11:50:31 crc kubenswrapper[4958]: I1201 11:50:31.047548 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pdwfr"]
Dec 01 11:50:31 crc kubenswrapper[4958]: I1201 11:50:31.056101 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pdwfr"]
Dec 01 11:50:31 crc kubenswrapper[4958]: I1201 11:50:31.811197 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69f0027d-f2a8-48b3-8e17-cb404a33a2d6" path="/var/lib/kubelet/pods/69f0027d-f2a8-48b3-8e17-cb404a33a2d6/volumes"
Dec 01 11:50:32 crc kubenswrapper[4958]: I1201 11:50:32.037298 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-gmblw"]
Dec 01 11:50:32 crc kubenswrapper[4958]: I1201 11:50:32.048973 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-gmblw"]
Dec 01 11:50:32 crc kubenswrapper[4958]: I1201 11:50:32.507453 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz"]
Dec 01 11:50:32 crc kubenswrapper[4958]: I1201 11:50:32.526336 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz"
Dec 01 11:50:32 crc kubenswrapper[4958]: I1201 11:50:32.536275 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 01 11:50:32 crc kubenswrapper[4958]: I1201 11:50:32.539918 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz"]
Dec 01 11:50:32 crc kubenswrapper[4958]: I1201 11:50:32.627332 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3c490fa-b355-4f5c-82e4-5df3975a736c-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz\" (UID: \"b3c490fa-b355-4f5c-82e4-5df3975a736c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz"
Dec 01 11:50:32 crc kubenswrapper[4958]: I1201 11:50:32.627395 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg4d5\" (UniqueName: \"kubernetes.io/projected/b3c490fa-b355-4f5c-82e4-5df3975a736c-kube-api-access-xg4d5\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz\" (UID: \"b3c490fa-b355-4f5c-82e4-5df3975a736c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz"
Dec 01 11:50:32 crc kubenswrapper[4958]: I1201 11:50:32.627490 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3c490fa-b355-4f5c-82e4-5df3975a736c-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz\" (UID: \"b3c490fa-b355-4f5c-82e4-5df3975a736c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz"
Dec 01 11:50:32 crc kubenswrapper[4958]: I1201 11:50:32.729262 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg4d5\" (UniqueName: \"kubernetes.io/projected/b3c490fa-b355-4f5c-82e4-5df3975a736c-kube-api-access-xg4d5\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz\" (UID: \"b3c490fa-b355-4f5c-82e4-5df3975a736c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz"
Dec 01 11:50:32 crc kubenswrapper[4958]: I1201 11:50:32.729303 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3c490fa-b355-4f5c-82e4-5df3975a736c-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz\" (UID: \"b3c490fa-b355-4f5c-82e4-5df3975a736c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz"
Dec 01 11:50:32 crc kubenswrapper[4958]: I1201 11:50:32.729380 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3c490fa-b355-4f5c-82e4-5df3975a736c-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz\" (UID: \"b3c490fa-b355-4f5c-82e4-5df3975a736c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz"
Dec 01 11:50:32 crc kubenswrapper[4958]: I1201 11:50:32.730022 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3c490fa-b355-4f5c-82e4-5df3975a736c-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz\" (UID: \"b3c490fa-b355-4f5c-82e4-5df3975a736c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz"
Dec 01 11:50:32 crc kubenswrapper[4958]: I1201 11:50:32.730161 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3c490fa-b355-4f5c-82e4-5df3975a736c-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz\" (UID: \"b3c490fa-b355-4f5c-82e4-5df3975a736c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz"
Dec 01 11:50:32 crc kubenswrapper[4958]: I1201 11:50:32.761717 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg4d5\" (UniqueName: \"kubernetes.io/projected/b3c490fa-b355-4f5c-82e4-5df3975a736c-kube-api-access-xg4d5\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz\" (UID: \"b3c490fa-b355-4f5c-82e4-5df3975a736c\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz"
Dec 01 11:50:32 crc kubenswrapper[4958]: I1201 11:50:32.858376 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz"
Dec 01 11:50:33 crc kubenswrapper[4958]: I1201 11:50:33.380156 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz"]
Dec 01 11:50:33 crc kubenswrapper[4958]: I1201 11:50:33.812367 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28"
Dec 01 11:50:33 crc kubenswrapper[4958]: I1201 11:50:33.817907 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b385e57-4fa1-4d10-afef-2cd942607c11" path="/var/lib/kubelet/pods/0b385e57-4fa1-4d10-afef-2cd942607c11/volumes"
Dec 01 11:50:33 crc kubenswrapper[4958]: E1201 11:50:33.820075 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:50:34 crc kubenswrapper[4958]: I1201 11:50:34.317795 4958 generic.go:334] "Generic (PLEG): container finished" podID="b3c490fa-b355-4f5c-82e4-5df3975a736c" containerID="38f9a8d6aef4a3c9946fe6eae5f1be5df23c550ca23f7643588370c4eb4c9068" exitCode=0
Dec 01 11:50:34 crc kubenswrapper[4958]: I1201 11:50:34.317867 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz" event={"ID":"b3c490fa-b355-4f5c-82e4-5df3975a736c","Type":"ContainerDied","Data":"38f9a8d6aef4a3c9946fe6eae5f1be5df23c550ca23f7643588370c4eb4c9068"}
Dec 01 11:50:34 crc kubenswrapper[4958]: I1201 11:50:34.318151 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz" event={"ID":"b3c490fa-b355-4f5c-82e4-5df3975a736c","Type":"ContainerStarted","Data":"44c06326404305f90cfbd1e5aa22d9eb09399be13dac2c83096f243ba0c0ba3b"}
Dec 01 11:50:36 crc kubenswrapper[4958]: I1201 11:50:36.342789 4958 generic.go:334] "Generic (PLEG): container finished" podID="b3c490fa-b355-4f5c-82e4-5df3975a736c" containerID="b381d1217bb319ea7293cf05250061890debdebe6d0aefdf491c2d72aab6b118" exitCode=0
Dec 01 11:50:36 crc kubenswrapper[4958]: I1201 11:50:36.343323 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz" event={"ID":"b3c490fa-b355-4f5c-82e4-5df3975a736c","Type":"ContainerDied","Data":"b381d1217bb319ea7293cf05250061890debdebe6d0aefdf491c2d72aab6b118"}
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.354943 4958 generic.go:334] "Generic (PLEG): container finished" podID="cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" containerID="f8487340023bd36ca8fa7628550caf59b7e08a558adb39f3a52beeecc6323087" exitCode=137
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.355030 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-986f959c9-6gffd" event={"ID":"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6","Type":"ContainerDied","Data":"f8487340023bd36ca8fa7628550caf59b7e08a558adb39f3a52beeecc6323087"}
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.355726 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-986f959c9-6gffd" event={"ID":"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6","Type":"ContainerDied","Data":"8c6c73b5cd95b8a3ee29ad7bd6e9daab1a304514fdf9924d01225f41a9d174cc"}
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.355744 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c6c73b5cd95b8a3ee29ad7bd6e9daab1a304514fdf9924d01225f41a9d174cc"
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.359209 4958 generic.go:334] "Generic (PLEG): container finished" podID="b3c490fa-b355-4f5c-82e4-5df3975a736c" containerID="2cdae39bdf6af05af8c2305710abe4b583f06ffefceb5a0e3199efca3ced8bce" exitCode=0
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.359299 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz" event={"ID":"b3c490fa-b355-4f5c-82e4-5df3975a736c","Type":"ContainerDied","Data":"2cdae39bdf6af05af8c2305710abe4b583f06ffefceb5a0e3199efca3ced8bce"}
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.429225 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-986f959c9-6gffd"
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.629038 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-logs\") pod \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") "
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.629086 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhd6z\" (UniqueName: \"kubernetes.io/projected/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-kube-api-access-lhd6z\") pod \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") "
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.629325 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-horizon-secret-key\") pod \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") "
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.629392 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-config-data\") pod \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") "
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.629492 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-scripts\") pod \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\" (UID: \"cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6\") "
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.629613 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-logs" (OuterVolumeSpecName: "logs") pod "cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" (UID: "cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.629942 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-logs\") on node \"crc\" DevicePath \"\""
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.641104 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-kube-api-access-lhd6z" (OuterVolumeSpecName: "kube-api-access-lhd6z") pod "cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" (UID: "cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6"). InnerVolumeSpecName "kube-api-access-lhd6z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.644001 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" (UID: "cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.656291 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-config-data" (OuterVolumeSpecName: "config-data") pod "cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" (UID: "cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.656762 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-scripts" (OuterVolumeSpecName: "scripts") pod "cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" (UID: "cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.732266 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.732301 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.732312 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhd6z\" (UniqueName: \"kubernetes.io/projected/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-kube-api-access-lhd6z\") on node \"crc\" DevicePath \"\""
Dec 01 11:50:37 crc kubenswrapper[4958]: I1201 11:50:37.732325 4958 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 01 11:50:38 crc kubenswrapper[4958]: I1201 11:50:38.372750 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-986f959c9-6gffd"
Dec 01 11:50:38 crc kubenswrapper[4958]: I1201 11:50:38.423001 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-986f959c9-6gffd"]
Dec 01 11:50:38 crc kubenswrapper[4958]: I1201 11:50:38.439838 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-986f959c9-6gffd"]
Dec 01 11:50:38 crc kubenswrapper[4958]: I1201 11:50:38.801321 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz"
Dec 01 11:50:38 crc kubenswrapper[4958]: I1201 11:50:38.972201 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3c490fa-b355-4f5c-82e4-5df3975a736c-util\") pod \"b3c490fa-b355-4f5c-82e4-5df3975a736c\" (UID: \"b3c490fa-b355-4f5c-82e4-5df3975a736c\") "
Dec 01 11:50:38 crc kubenswrapper[4958]: I1201 11:50:38.972799 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg4d5\" (UniqueName: \"kubernetes.io/projected/b3c490fa-b355-4f5c-82e4-5df3975a736c-kube-api-access-xg4d5\") pod \"b3c490fa-b355-4f5c-82e4-5df3975a736c\" (UID: \"b3c490fa-b355-4f5c-82e4-5df3975a736c\") "
Dec 01 11:50:38 crc kubenswrapper[4958]: I1201 11:50:38.973686 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3c490fa-b355-4f5c-82e4-5df3975a736c-bundle\") pod \"b3c490fa-b355-4f5c-82e4-5df3975a736c\" (UID: \"b3c490fa-b355-4f5c-82e4-5df3975a736c\") "
Dec 01 11:50:38 crc kubenswrapper[4958]: I1201 11:50:38.975694 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3c490fa-b355-4f5c-82e4-5df3975a736c-bundle" (OuterVolumeSpecName: "bundle") pod "b3c490fa-b355-4f5c-82e4-5df3975a736c" (UID: "b3c490fa-b355-4f5c-82e4-5df3975a736c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:50:38 crc kubenswrapper[4958]: I1201 11:50:38.983109 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3c490fa-b355-4f5c-82e4-5df3975a736c-util" (OuterVolumeSpecName: "util") pod "b3c490fa-b355-4f5c-82e4-5df3975a736c" (UID: "b3c490fa-b355-4f5c-82e4-5df3975a736c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:50:38 crc kubenswrapper[4958]: I1201 11:50:38.983350 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3c490fa-b355-4f5c-82e4-5df3975a736c-kube-api-access-xg4d5" (OuterVolumeSpecName: "kube-api-access-xg4d5") pod "b3c490fa-b355-4f5c-82e4-5df3975a736c" (UID: "b3c490fa-b355-4f5c-82e4-5df3975a736c"). InnerVolumeSpecName "kube-api-access-xg4d5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:50:39 crc kubenswrapper[4958]: I1201 11:50:39.086644 4958 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3c490fa-b355-4f5c-82e4-5df3975a736c-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 11:50:39 crc kubenswrapper[4958]: I1201 11:50:39.086678 4958 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3c490fa-b355-4f5c-82e4-5df3975a736c-util\") on node \"crc\" DevicePath \"\""
Dec 01 11:50:39 crc kubenswrapper[4958]: I1201 11:50:39.086688 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg4d5\" (UniqueName: \"kubernetes.io/projected/b3c490fa-b355-4f5c-82e4-5df3975a736c-kube-api-access-xg4d5\") on node \"crc\" DevicePath \"\""
Dec 01 11:50:39 crc kubenswrapper[4958]: I1201 11:50:39.389391 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz" event={"ID":"b3c490fa-b355-4f5c-82e4-5df3975a736c","Type":"ContainerDied","Data":"44c06326404305f90cfbd1e5aa22d9eb09399be13dac2c83096f243ba0c0ba3b"}
Dec 01 11:50:39 crc kubenswrapper[4958]: I1201 11:50:39.389454 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44c06326404305f90cfbd1e5aa22d9eb09399be13dac2c83096f243ba0c0ba3b"
Dec 01 11:50:39 crc kubenswrapper[4958]: I1201 11:50:39.389539 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz"
Dec 01 11:50:39 crc kubenswrapper[4958]: I1201 11:50:39.814132 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" path="/var/lib/kubelet/pods/cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6/volumes"
Dec 01 11:50:46 crc kubenswrapper[4958]: I1201 11:50:46.109449 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-v5jcd"]
Dec 01 11:50:46 crc kubenswrapper[4958]: I1201 11:50:46.117230 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-v5jcd"]
Dec 01 11:50:47 crc kubenswrapper[4958]: I1201 11:50:47.827840 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c13e7610-d97c-4701-bfbc-a7e97d3f3909" path="/var/lib/kubelet/pods/c13e7610-d97c-4701-bfbc-a7e97d3f3909/volumes"
Dec 01 11:50:48 crc kubenswrapper[4958]: I1201 11:50:48.797986 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28"
Dec 01 11:50:48 crc kubenswrapper[4958]: E1201 11:50:48.798659 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.042745 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-bbtwb"]
Dec 01 11:50:51 crc kubenswrapper[4958]: E1201 11:50:51.043820 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" containerName="horizon"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.043861 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" containerName="horizon"
Dec 01 11:50:51 crc kubenswrapper[4958]: E1201 11:50:51.043897 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3c490fa-b355-4f5c-82e4-5df3975a736c" containerName="util"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.043906 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3c490fa-b355-4f5c-82e4-5df3975a736c" containerName="util"
Dec 01 11:50:51 crc kubenswrapper[4958]: E1201 11:50:51.043955 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3c490fa-b355-4f5c-82e4-5df3975a736c" containerName="pull"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.043966 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3c490fa-b355-4f5c-82e4-5df3975a736c" containerName="pull"
Dec 01 11:50:51 crc kubenswrapper[4958]: E1201 11:50:51.043983 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3c490fa-b355-4f5c-82e4-5df3975a736c" containerName="extract"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.043992 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3c490fa-b355-4f5c-82e4-5df3975a736c" containerName="extract"
Dec 01 11:50:51 crc kubenswrapper[4958]: E1201 11:50:51.044006 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" containerName="horizon-log"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.044014 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" containerName="horizon-log"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.044281 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" containerName="horizon-log"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.044313 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf3d5bd9-1a39-4fc3-b4ff-b27cdfb968b6" containerName="horizon"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.044327 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3c490fa-b355-4f5c-82e4-5df3975a736c" containerName="extract"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.045137 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-bbtwb"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.047723 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.048102 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-r68dn"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.048346 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.052798 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-bbtwb"]
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.113057 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r"]
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.115501 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.119736 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.119998 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-cxfhj"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.140526 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb"]
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.142146 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.151742 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3ecfd6a8-cdee-4d7a-8a4b-9158b7d91b05-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb\" (UID: \"3ecfd6a8-cdee-4d7a-8a4b-9158b7d91b05\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.151853 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3ecfd6a8-cdee-4d7a-8a4b-9158b7d91b05-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb\" (UID: \"3ecfd6a8-cdee-4d7a-8a4b-9158b7d91b05\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.151904 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcsz4\" (UniqueName: \"kubernetes.io/projected/56cf7323-f1e0-4c22-9890-aa06bf839456-kube-api-access-bcsz4\") pod \"obo-prometheus-operator-668cf9dfbb-bbtwb\" (UID: \"56cf7323-f1e0-4c22-9890-aa06bf839456\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-bbtwb"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.151967 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4a3a2039-e66b-4c1f-b95e-0665f82fdd0b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r\" (UID: \"4a3a2039-e66b-4c1f-b95e-0665f82fdd0b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.151991 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4a3a2039-e66b-4c1f-b95e-0665f82fdd0b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r\" (UID: \"4a3a2039-e66b-4c1f-b95e-0665f82fdd0b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.156137 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r"]
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.170973 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb"]
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.253884 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcsz4\" (UniqueName: \"kubernetes.io/projected/56cf7323-f1e0-4c22-9890-aa06bf839456-kube-api-access-bcsz4\") pod \"obo-prometheus-operator-668cf9dfbb-bbtwb\" (UID: \"56cf7323-f1e0-4c22-9890-aa06bf839456\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-bbtwb"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.253956 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4a3a2039-e66b-4c1f-b95e-0665f82fdd0b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r\" (UID: \"4a3a2039-e66b-4c1f-b95e-0665f82fdd0b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.254001 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4a3a2039-e66b-4c1f-b95e-0665f82fdd0b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r\" (UID: \"4a3a2039-e66b-4c1f-b95e-0665f82fdd0b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.254102 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3ecfd6a8-cdee-4d7a-8a4b-9158b7d91b05-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb\" (UID: \"3ecfd6a8-cdee-4d7a-8a4b-9158b7d91b05\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.254179 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3ecfd6a8-cdee-4d7a-8a4b-9158b7d91b05-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb\" (UID: \"3ecfd6a8-cdee-4d7a-8a4b-9158b7d91b05\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.261993 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3ecfd6a8-cdee-4d7a-8a4b-9158b7d91b05-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb\" (UID: \"3ecfd6a8-cdee-4d7a-8a4b-9158b7d91b05\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.262076 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4a3a2039-e66b-4c1f-b95e-0665f82fdd0b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r\" (UID: \"4a3a2039-e66b-4c1f-b95e-0665f82fdd0b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r"
Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.262777 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName:
\"kubernetes.io/secret/3ecfd6a8-cdee-4d7a-8a4b-9158b7d91b05-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb\" (UID: \"3ecfd6a8-cdee-4d7a-8a4b-9158b7d91b05\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.269178 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4a3a2039-e66b-4c1f-b95e-0665f82fdd0b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r\" (UID: \"4a3a2039-e66b-4c1f-b95e-0665f82fdd0b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.299924 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-qqt7d"] Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.301708 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-qqt7d" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.303627 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcsz4\" (UniqueName: \"kubernetes.io/projected/56cf7323-f1e0-4c22-9890-aa06bf839456-kube-api-access-bcsz4\") pod \"obo-prometheus-operator-668cf9dfbb-bbtwb\" (UID: \"56cf7323-f1e0-4c22-9890-aa06bf839456\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-bbtwb" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.306633 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-52nn2" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.306874 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.322487 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-qqt7d"] Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.357210 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv262\" (UniqueName: \"kubernetes.io/projected/84673714-0a83-4e97-bbfb-bd5bf7716f71-kube-api-access-jv262\") pod \"observability-operator-d8bb48f5d-qqt7d\" (UID: \"84673714-0a83-4e97-bbfb-bd5bf7716f71\") " pod="openshift-operators/observability-operator-d8bb48f5d-qqt7d" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.357274 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/84673714-0a83-4e97-bbfb-bd5bf7716f71-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-qqt7d\" (UID: \"84673714-0a83-4e97-bbfb-bd5bf7716f71\") " pod="openshift-operators/observability-operator-d8bb48f5d-qqt7d" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.366236 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-bbtwb" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.444454 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.465451 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv262\" (UniqueName: \"kubernetes.io/projected/84673714-0a83-4e97-bbfb-bd5bf7716f71-kube-api-access-jv262\") pod \"observability-operator-d8bb48f5d-qqt7d\" (UID: \"84673714-0a83-4e97-bbfb-bd5bf7716f71\") " pod="openshift-operators/observability-operator-d8bb48f5d-qqt7d" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.465536 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/84673714-0a83-4e97-bbfb-bd5bf7716f71-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-qqt7d\" (UID: \"84673714-0a83-4e97-bbfb-bd5bf7716f71\") " pod="openshift-operators/observability-operator-d8bb48f5d-qqt7d" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.466397 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.479126 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/84673714-0a83-4e97-bbfb-bd5bf7716f71-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-qqt7d\" (UID: \"84673714-0a83-4e97-bbfb-bd5bf7716f71\") " pod="openshift-operators/observability-operator-d8bb48f5d-qqt7d" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.493294 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-wddt7"] Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.494613 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv262\" (UniqueName: \"kubernetes.io/projected/84673714-0a83-4e97-bbfb-bd5bf7716f71-kube-api-access-jv262\") pod \"observability-operator-d8bb48f5d-qqt7d\" (UID: \"84673714-0a83-4e97-bbfb-bd5bf7716f71\") " pod="openshift-operators/observability-operator-d8bb48f5d-qqt7d" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.495239 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-wddt7" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.498564 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-lhmx5" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.557743 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-wddt7"] Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.567522 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6969z\" (UniqueName: \"kubernetes.io/projected/e1214a3e-b12c-4cf4-8230-8fb09334635e-kube-api-access-6969z\") pod \"perses-operator-5446b9c989-wddt7\" (UID: \"e1214a3e-b12c-4cf4-8230-8fb09334635e\") " pod="openshift-operators/perses-operator-5446b9c989-wddt7" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.567608 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/e1214a3e-b12c-4cf4-8230-8fb09334635e-openshift-service-ca\") pod \"perses-operator-5446b9c989-wddt7\" (UID: \"e1214a3e-b12c-4cf4-8230-8fb09334635e\") " pod="openshift-operators/perses-operator-5446b9c989-wddt7" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.679661 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-qqt7d" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.683586 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6969z\" (UniqueName: \"kubernetes.io/projected/e1214a3e-b12c-4cf4-8230-8fb09334635e-kube-api-access-6969z\") pod \"perses-operator-5446b9c989-wddt7\" (UID: \"e1214a3e-b12c-4cf4-8230-8fb09334635e\") " pod="openshift-operators/perses-operator-5446b9c989-wddt7" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.683680 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/e1214a3e-b12c-4cf4-8230-8fb09334635e-openshift-service-ca\") pod \"perses-operator-5446b9c989-wddt7\" (UID: \"e1214a3e-b12c-4cf4-8230-8fb09334635e\") " pod="openshift-operators/perses-operator-5446b9c989-wddt7" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.686728 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/e1214a3e-b12c-4cf4-8230-8fb09334635e-openshift-service-ca\") pod \"perses-operator-5446b9c989-wddt7\" (UID: \"e1214a3e-b12c-4cf4-8230-8fb09334635e\") " pod="openshift-operators/perses-operator-5446b9c989-wddt7" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.714450 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6969z\" (UniqueName: \"kubernetes.io/projected/e1214a3e-b12c-4cf4-8230-8fb09334635e-kube-api-access-6969z\") pod \"perses-operator-5446b9c989-wddt7\" (UID: \"e1214a3e-b12c-4cf4-8230-8fb09334635e\") " pod="openshift-operators/perses-operator-5446b9c989-wddt7" Dec 01 11:50:51 crc kubenswrapper[4958]: I1201 11:50:51.846759 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-wddt7" Dec 01 11:50:52 crc kubenswrapper[4958]: I1201 11:50:52.072220 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-bbtwb"] Dec 01 11:50:52 crc kubenswrapper[4958]: W1201 11:50:52.257898 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ecfd6a8_cdee_4d7a_8a4b_9158b7d91b05.slice/crio-9764a0a29b857b53d1fb6db457f2dd872cfa677a623c70635b4b97afda3dc6e6 WatchSource:0}: Error finding container 9764a0a29b857b53d1fb6db457f2dd872cfa677a623c70635b4b97afda3dc6e6: Status 404 returned error can't find the container with id 9764a0a29b857b53d1fb6db457f2dd872cfa677a623c70635b4b97afda3dc6e6 Dec 01 11:50:52 crc kubenswrapper[4958]: I1201 11:50:52.259780 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb"] Dec 01 11:50:52 crc kubenswrapper[4958]: I1201 11:50:52.472967 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r"] Dec 01 11:50:52 crc kubenswrapper[4958]: I1201 11:50:52.500433 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-qqt7d"] Dec 01 11:50:52 crc kubenswrapper[4958]: W1201 11:50:52.508615 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84673714_0a83_4e97_bbfb_bd5bf7716f71.slice/crio-49813cfaea156296c25442dadd5edda72c147be37d9e31d7d7c7b91ec9b7fce4 WatchSource:0}: Error finding container 49813cfaea156296c25442dadd5edda72c147be37d9e31d7d7c7b91ec9b7fce4: Status 404 returned error can't find the container with id 49813cfaea156296c25442dadd5edda72c147be37d9e31d7d7c7b91ec9b7fce4 Dec 01 11:50:52 crc kubenswrapper[4958]: W1201 11:50:52.658252 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1214a3e_b12c_4cf4_8230_8fb09334635e.slice/crio-334042e6b176e52f2353c497e64d96b01319c73557e56a0d47d36b6007959f1c WatchSource:0}: Error finding container 334042e6b176e52f2353c497e64d96b01319c73557e56a0d47d36b6007959f1c: Status 404 returned error can't find the container with id 334042e6b176e52f2353c497e64d96b01319c73557e56a0d47d36b6007959f1c Dec 01 11:50:52 crc kubenswrapper[4958]: I1201 11:50:52.672422 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-wddt7"] Dec 01 11:50:52 crc kubenswrapper[4958]: I1201 11:50:52.812417 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-bbtwb" event={"ID":"56cf7323-f1e0-4c22-9890-aa06bf839456","Type":"ContainerStarted","Data":"3253039cc77a89a223b0a7d912f69c7a1ee3c0e578cf2de381ad4fcb5d42c0d8"} Dec 01 11:50:52 crc kubenswrapper[4958]: I1201 11:50:52.814187 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-wddt7" event={"ID":"e1214a3e-b12c-4cf4-8230-8fb09334635e","Type":"ContainerStarted","Data":"334042e6b176e52f2353c497e64d96b01319c73557e56a0d47d36b6007959f1c"} Dec 01 11:50:52 crc kubenswrapper[4958]: I1201 11:50:52.816363 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r" 
event={"ID":"4a3a2039-e66b-4c1f-b95e-0665f82fdd0b","Type":"ContainerStarted","Data":"1090053c4a8f20dc284db0c20fd02463dabb23091341dd971fee5b441c40cf8e"} Dec 01 11:50:52 crc kubenswrapper[4958]: I1201 11:50:52.818036 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb" event={"ID":"3ecfd6a8-cdee-4d7a-8a4b-9158b7d91b05","Type":"ContainerStarted","Data":"9764a0a29b857b53d1fb6db457f2dd872cfa677a623c70635b4b97afda3dc6e6"} Dec 01 11:50:52 crc kubenswrapper[4958]: I1201 11:50:52.819706 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-qqt7d" event={"ID":"84673714-0a83-4e97-bbfb-bd5bf7716f71","Type":"ContainerStarted","Data":"49813cfaea156296c25442dadd5edda72c147be37d9e31d7d7c7b91ec9b7fce4"} Dec 01 11:50:53 crc kubenswrapper[4958]: I1201 11:50:53.244601 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z2r66"] Dec 01 11:50:53 crc kubenswrapper[4958]: I1201 11:50:53.249800 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z2r66" Dec 01 11:50:53 crc kubenswrapper[4958]: I1201 11:50:53.259057 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z2r66"] Dec 01 11:50:53 crc kubenswrapper[4958]: I1201 11:50:53.326530 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44224579-1880-4630-8609-f8fb6ab8cb92-catalog-content\") pod \"community-operators-z2r66\" (UID: \"44224579-1880-4630-8609-f8fb6ab8cb92\") " pod="openshift-marketplace/community-operators-z2r66" Dec 01 11:50:53 crc kubenswrapper[4958]: I1201 11:50:53.326766 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44224579-1880-4630-8609-f8fb6ab8cb92-utilities\") pod \"community-operators-z2r66\" (UID: \"44224579-1880-4630-8609-f8fb6ab8cb92\") " pod="openshift-marketplace/community-operators-z2r66" Dec 01 11:50:53 crc kubenswrapper[4958]: I1201 11:50:53.326922 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4f77\" (UniqueName: \"kubernetes.io/projected/44224579-1880-4630-8609-f8fb6ab8cb92-kube-api-access-x4f77\") pod \"community-operators-z2r66\" (UID: \"44224579-1880-4630-8609-f8fb6ab8cb92\") " pod="openshift-marketplace/community-operators-z2r66" Dec 01 11:50:53 crc kubenswrapper[4958]: I1201 11:50:53.428911 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44224579-1880-4630-8609-f8fb6ab8cb92-utilities\") pod \"community-operators-z2r66\" (UID: \"44224579-1880-4630-8609-f8fb6ab8cb92\") " pod="openshift-marketplace/community-operators-z2r66" Dec 01 11:50:53 crc kubenswrapper[4958]: I1201 11:50:53.429014 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4f77\" (UniqueName: \"kubernetes.io/projected/44224579-1880-4630-8609-f8fb6ab8cb92-kube-api-access-x4f77\") pod \"community-operators-z2r66\" (UID: \"44224579-1880-4630-8609-f8fb6ab8cb92\") " pod="openshift-marketplace/community-operators-z2r66" Dec 01 11:50:53 crc kubenswrapper[4958]: I1201 11:50:53.429080 4958 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44224579-1880-4630-8609-f8fb6ab8cb92-catalog-content\") pod \"community-operators-z2r66\" (UID: \"44224579-1880-4630-8609-f8fb6ab8cb92\") " pod="openshift-marketplace/community-operators-z2r66" Dec 01 11:50:53 crc kubenswrapper[4958]: I1201 11:50:53.429609 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44224579-1880-4630-8609-f8fb6ab8cb92-catalog-content\") pod \"community-operators-z2r66\" (UID: \"44224579-1880-4630-8609-f8fb6ab8cb92\") " pod="openshift-marketplace/community-operators-z2r66" Dec 01 11:50:53 crc kubenswrapper[4958]: I1201 11:50:53.429889 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44224579-1880-4630-8609-f8fb6ab8cb92-utilities\") pod \"community-operators-z2r66\" (UID: \"44224579-1880-4630-8609-f8fb6ab8cb92\") " pod="openshift-marketplace/community-operators-z2r66" Dec 01 11:50:53 crc kubenswrapper[4958]: I1201 11:50:53.455046 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4f77\" (UniqueName: \"kubernetes.io/projected/44224579-1880-4630-8609-f8fb6ab8cb92-kube-api-access-x4f77\") pod \"community-operators-z2r66\" (UID: \"44224579-1880-4630-8609-f8fb6ab8cb92\") " pod="openshift-marketplace/community-operators-z2r66" Dec 01 11:50:53 crc kubenswrapper[4958]: I1201 11:50:53.592899 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z2r66" Dec 01 11:50:54 crc kubenswrapper[4958]: I1201 11:50:54.390476 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z2r66"] Dec 01 11:50:54 crc kubenswrapper[4958]: I1201 11:50:54.864665 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2r66" event={"ID":"44224579-1880-4630-8609-f8fb6ab8cb92","Type":"ContainerStarted","Data":"4987a0ddd8f90e91559513a2212dd3a2cba6ed86cd94952665f698e7050140f5"} Dec 01 11:50:54 crc kubenswrapper[4958]: I1201 11:50:54.865001 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2r66" event={"ID":"44224579-1880-4630-8609-f8fb6ab8cb92","Type":"ContainerStarted","Data":"f245fbf9419b1cc58b8eb84a3565046b29691a82216733adafe08f0b95abbad8"} Dec 01 11:50:55 crc kubenswrapper[4958]: I1201 11:50:55.928667 4958 generic.go:334] "Generic (PLEG): container finished" podID="44224579-1880-4630-8609-f8fb6ab8cb92" containerID="4987a0ddd8f90e91559513a2212dd3a2cba6ed86cd94952665f698e7050140f5" exitCode=0 Dec 01 11:50:55 crc kubenswrapper[4958]: I1201 11:50:55.929341 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2r66" event={"ID":"44224579-1880-4630-8609-f8fb6ab8cb92","Type":"ContainerDied","Data":"4987a0ddd8f90e91559513a2212dd3a2cba6ed86cd94952665f698e7050140f5"} Dec 01 11:51:01 crc kubenswrapper[4958]: I1201 11:51:01.797626 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28" Dec 01 11:51:04 crc kubenswrapper[4958]: I1201 11:51:04.035783 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-qqt7d" event={"ID":"84673714-0a83-4e97-bbfb-bd5bf7716f71","Type":"ContainerStarted","Data":"857af2d80ff695e9e220b97df846aea4667e14afe10c45dcf8d98f0b5d6c90a7"} Dec 01 11:51:04 
crc kubenswrapper[4958]: I1201 11:51:04.036408 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-qqt7d" Dec 01 11:51:04 crc kubenswrapper[4958]: I1201 11:51:04.041052 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-qqt7d" Dec 01 11:51:04 crc kubenswrapper[4958]: I1201 11:51:04.041092 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"f2f66a236e3c207fc7e3f3fd8d2b5bb72d1175a1c971bcc5a6d41f218a93b853"} Dec 01 11:51:04 crc kubenswrapper[4958]: I1201 11:51:04.043202 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-wddt7" event={"ID":"e1214a3e-b12c-4cf4-8230-8fb09334635e","Type":"ContainerStarted","Data":"02612938d29871fcece64e34fa86bad70c0998a3c96c3af6330dacb246466fdc"} Dec 01 11:51:04 crc kubenswrapper[4958]: I1201 11:51:04.043793 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-wddt7" Dec 01 11:51:04 crc kubenswrapper[4958]: I1201 11:51:04.045454 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-bbtwb" event={"ID":"56cf7323-f1e0-4c22-9890-aa06bf839456","Type":"ContainerStarted","Data":"baf26f5e45335a8aef257659189e82bb47a947638f1d47e39f8c629ebf61eb45"} Dec 01 11:51:04 crc kubenswrapper[4958]: I1201 11:51:04.047889 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r" event={"ID":"4a3a2039-e66b-4c1f-b95e-0665f82fdd0b","Type":"ContainerStarted","Data":"78ba4c608421c00762aa678fc6023d87ca6d7fd74bc8cc2331803df034078278"} Dec 01 11:51:04 crc kubenswrapper[4958]: I1201 11:51:04.051415 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb" event={"ID":"3ecfd6a8-cdee-4d7a-8a4b-9158b7d91b05","Type":"ContainerStarted","Data":"dd4358846518a23d46a8ba34e13b55a9a79126184a4dd33534c712a7e2873df1"} Dec 01 11:51:04 crc kubenswrapper[4958]: I1201 11:51:04.069661 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-qqt7d" podStartSLOduration=2.491641861 podStartE2EDuration="13.069631575s" podCreationTimestamp="2025-12-01 11:50:51 +0000 UTC" firstStartedPulling="2025-12-01 11:50:52.521694947 +0000 UTC m=+6700.030483984" lastFinishedPulling="2025-12-01 11:51:03.099684661 +0000 UTC m=+6710.608473698" observedRunningTime="2025-12-01 11:51:04.056919175 +0000 UTC m=+6711.565708212" watchObservedRunningTime="2025-12-01 11:51:04.069631575 +0000 UTC m=+6711.578420612" Dec 01 11:51:04 crc kubenswrapper[4958]: I1201 11:51:04.166049 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-wddt7" podStartSLOduration=2.786677167 podStartE2EDuration="13.166033802s" podCreationTimestamp="2025-12-01 11:50:51 +0000 UTC" firstStartedPulling="2025-12-01 11:50:52.661256649 +0000 UTC m=+6700.170045686" lastFinishedPulling="2025-12-01 11:51:03.040613284 +0000 UTC m=+6710.549402321" observedRunningTime="2025-12-01 11:51:04.139365565 +0000 UTC m=+6711.648154602" watchObservedRunningTime="2025-12-01 11:51:04.166033802 +0000 
UTC m=+6711.674822839" Dec 01 11:51:04 crc kubenswrapper[4958]: I1201 11:51:04.175563 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r" podStartSLOduration=2.624555385 podStartE2EDuration="13.175545242s" podCreationTimestamp="2025-12-01 11:50:51 +0000 UTC" firstStartedPulling="2025-12-01 11:50:52.488453174 +0000 UTC m=+6699.997242211" lastFinishedPulling="2025-12-01 11:51:03.039443031 +0000 UTC m=+6710.548232068" observedRunningTime="2025-12-01 11:51:04.165518977 +0000 UTC m=+6711.674308014" watchObservedRunningTime="2025-12-01 11:51:04.175545242 +0000 UTC m=+6711.684334279" Dec 01 11:51:04 crc kubenswrapper[4958]: I1201 11:51:04.238962 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb" podStartSLOduration=2.454916179 podStartE2EDuration="13.238943112s" podCreationTimestamp="2025-12-01 11:50:51 +0000 UTC" firstStartedPulling="2025-12-01 11:50:52.273206943 +0000 UTC m=+6699.781995980" lastFinishedPulling="2025-12-01 11:51:03.057233876 +0000 UTC m=+6710.566022913" observedRunningTime="2025-12-01 11:51:04.227042274 +0000 UTC m=+6711.735831311" watchObservedRunningTime="2025-12-01 11:51:04.238943112 +0000 UTC m=+6711.747732149" Dec 01 11:51:04 crc kubenswrapper[4958]: I1201 11:51:04.299279 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-bbtwb" podStartSLOduration=2.340360277 podStartE2EDuration="13.299255954s" podCreationTimestamp="2025-12-01 11:50:51 +0000 UTC" firstStartedPulling="2025-12-01 11:50:52.096793375 +0000 UTC m=+6699.605582412" lastFinishedPulling="2025-12-01 11:51:03.055689052 +0000 UTC m=+6710.564478089" observedRunningTime="2025-12-01 11:51:04.294324924 +0000 UTC m=+6711.803113971" watchObservedRunningTime="2025-12-01 11:51:04.299255954 +0000 UTC m=+6711.808044991" Dec 01 11:51:11 crc kubenswrapper[4958]: I1201 11:51:11.849830 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-wddt7" Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.195091 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2r66" event={"ID":"44224579-1880-4630-8609-f8fb6ab8cb92","Type":"ContainerStarted","Data":"567553f8a6a139dc32820fdece06ddebb736e5085742a1421bc951d7bedbc4d3"} Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.480594 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.480884 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="c556ea7c-505e-4575-a0cf-f313ec8058b1" containerName="openstackclient" containerID="cri-o://4a4ea7fe5561c436cc39172b8f7674a704670cf922c5ebf27c16cc15a6f80bb2" gracePeriod=2 Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.494338 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.623884 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 01 11:51:14 crc kubenswrapper[4958]: E1201 11:51:14.624638 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c556ea7c-505e-4575-a0cf-f313ec8058b1" containerName="openstackclient" Dec 01 11:51:14 crc 
kubenswrapper[4958]: I1201 11:51:14.624652 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c556ea7c-505e-4575-a0cf-f313ec8058b1" containerName="openstackclient" Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.624898 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c556ea7c-505e-4575-a0cf-f313ec8058b1" containerName="openstackclient" Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.625611 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.647465 4958 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c556ea7c-505e-4575-a0cf-f313ec8058b1" podUID="8b407506-fd21-473f-b33f-497c67f512b2" Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.685586 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.712406 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 01 11:51:14 crc kubenswrapper[4958]: E1201 11:51:14.713359 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-ds99v openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="8b407506-fd21-473f-b33f-497c67f512b2" Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.721908 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.748938 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.750666 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.761089 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.811838 4958 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="8b407506-fd21-473f-b33f-497c67f512b2" podUID="2d7ca993-e5bd-450d-aa84-09be723e1764" Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.821999 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8b407506-fd21-473f-b33f-497c67f512b2-openstack-config\") pod \"openstackclient\" (UID: \"8b407506-fd21-473f-b33f-497c67f512b2\") " pod="openstack/openstackclient" Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.822215 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8b407506-fd21-473f-b33f-497c67f512b2-openstack-config-secret\") pod \"openstackclient\" (UID: \"8b407506-fd21-473f-b33f-497c67f512b2\") " pod="openstack/openstackclient" Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.822346 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds99v\" (UniqueName: \"kubernetes.io/projected/8b407506-fd21-473f-b33f-497c67f512b2-kube-api-access-ds99v\") pod \"openstackclient\" (UID: \"8b407506-fd21-473f-b33f-497c67f512b2\") " pod="openstack/openstackclient" Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.925309 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d7ca993-e5bd-450d-aa84-09be723e1764-openstack-config-secret\") pod \"openstackclient\" (UID: \"2d7ca993-e5bd-450d-aa84-09be723e1764\") " pod="openstack/openstackclient" Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.925359 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2d7ca993-e5bd-450d-aa84-09be723e1764-openstack-config\") pod \"openstackclient\" (UID: \"2d7ca993-e5bd-450d-aa84-09be723e1764\") " pod="openstack/openstackclient" Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.925429 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8b407506-fd21-473f-b33f-497c67f512b2-openstack-config\") pod \"openstackclient\" (UID: \"8b407506-fd21-473f-b33f-497c67f512b2\") " pod="openstack/openstackclient" Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.927078 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8b407506-fd21-473f-b33f-497c67f512b2-openstack-config\") pod \"openstackclient\" (UID: \"8b407506-fd21-473f-b33f-497c67f512b2\") " pod="openstack/openstackclient" Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.927147 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8b407506-fd21-473f-b33f-497c67f512b2-openstack-config-secret\") pod \"openstackclient\" (UID: \"8b407506-fd21-473f-b33f-497c67f512b2\") " pod="openstack/openstackclient" 
Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.927196 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vl77\" (UniqueName: \"kubernetes.io/projected/2d7ca993-e5bd-450d-aa84-09be723e1764-kube-api-access-9vl77\") pod \"openstackclient\" (UID: \"2d7ca993-e5bd-450d-aa84-09be723e1764\") " pod="openstack/openstackclient" Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.927302 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds99v\" (UniqueName: \"kubernetes.io/projected/8b407506-fd21-473f-b33f-497c67f512b2-kube-api-access-ds99v\") pod \"openstackclient\" (UID: \"8b407506-fd21-473f-b33f-497c67f512b2\") " pod="openstack/openstackclient" Dec 01 11:51:14 crc kubenswrapper[4958]: E1201 11:51:14.931129 4958 projected.go:194] Error preparing data for projected volume kube-api-access-ds99v for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (8b407506-fd21-473f-b33f-497c67f512b2) does not match the UID in record. The object might have been deleted and then recreated Dec 01 11:51:14 crc kubenswrapper[4958]: E1201 11:51:14.931276 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8b407506-fd21-473f-b33f-497c67f512b2-kube-api-access-ds99v podName:8b407506-fd21-473f-b33f-497c67f512b2 nodeName:}" failed. No retries permitted until 2025-12-01 11:51:15.431242571 +0000 UTC m=+6722.940031608 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ds99v" (UniqueName: "kubernetes.io/projected/8b407506-fd21-473f-b33f-497c67f512b2-kube-api-access-ds99v") pod "openstackclient" (UID: "8b407506-fd21-473f-b33f-497c67f512b2") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (8b407506-fd21-473f-b33f-497c67f512b2) does not match the UID in record. The object might have been deleted and then recreated Dec 01 11:51:14 crc kubenswrapper[4958]: I1201 11:51:14.964412 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8b407506-fd21-473f-b33f-497c67f512b2-openstack-config-secret\") pod \"openstackclient\" (UID: \"8b407506-fd21-473f-b33f-497c67f512b2\") " pod="openstack/openstackclient" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.011333 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.013458 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.018474 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-8t5x8" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.029027 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d7ca993-e5bd-450d-aa84-09be723e1764-openstack-config-secret\") pod \"openstackclient\" (UID: \"2d7ca993-e5bd-450d-aa84-09be723e1764\") " pod="openstack/openstackclient" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.029081 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2d7ca993-e5bd-450d-aa84-09be723e1764-openstack-config\") pod \"openstackclient\" (UID: \"2d7ca993-e5bd-450d-aa84-09be723e1764\") " pod="openstack/openstackclient" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.029149 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgwsl\" (UniqueName: \"kubernetes.io/projected/bf17853a-4517-4e36-98b1-8d69f5c94af3-kube-api-access-mgwsl\") pod \"kube-state-metrics-0\" (UID: \"bf17853a-4517-4e36-98b1-8d69f5c94af3\") " pod="openstack/kube-state-metrics-0" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.029219 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vl77\" (UniqueName: \"kubernetes.io/projected/2d7ca993-e5bd-450d-aa84-09be723e1764-kube-api-access-9vl77\") pod \"openstackclient\" (UID: \"2d7ca993-e5bd-450d-aa84-09be723e1764\") " pod="openstack/openstackclient" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.030908 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2d7ca993-e5bd-450d-aa84-09be723e1764-openstack-config\") pod \"openstackclient\" (UID: \"2d7ca993-e5bd-450d-aa84-09be723e1764\") " pod="openstack/openstackclient" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.033110 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.045154 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d7ca993-e5bd-450d-aa84-09be723e1764-openstack-config-secret\") pod \"openstackclient\" (UID: \"2d7ca993-e5bd-450d-aa84-09be723e1764\") " pod="openstack/openstackclient" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.068703 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vl77\" (UniqueName: \"kubernetes.io/projected/2d7ca993-e5bd-450d-aa84-09be723e1764-kube-api-access-9vl77\") pod \"openstackclient\" (UID: \"2d7ca993-e5bd-450d-aa84-09be723e1764\") " pod="openstack/openstackclient" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.129998 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgwsl\" (UniqueName: \"kubernetes.io/projected/bf17853a-4517-4e36-98b1-8d69f5c94af3-kube-api-access-mgwsl\") pod \"kube-state-metrics-0\" (UID: \"bf17853a-4517-4e36-98b1-8d69f5c94af3\") " pod="openstack/kube-state-metrics-0" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.151606 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.207195 4958 generic.go:334] "Generic (PLEG): container finished" podID="44224579-1880-4630-8609-f8fb6ab8cb92" containerID="567553f8a6a139dc32820fdece06ddebb736e5085742a1421bc951d7bedbc4d3" exitCode=0 Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.207274 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.208814 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2r66" event={"ID":"44224579-1880-4630-8609-f8fb6ab8cb92","Type":"ContainerDied","Data":"567553f8a6a139dc32820fdece06ddebb736e5085742a1421bc951d7bedbc4d3"} Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.223276 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.343472 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8b407506-fd21-473f-b33f-497c67f512b2-openstack-config-secret\") pod \"8b407506-fd21-473f-b33f-497c67f512b2\" (UID: \"8b407506-fd21-473f-b33f-497c67f512b2\") " Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.343805 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8b407506-fd21-473f-b33f-497c67f512b2-openstack-config\") pod \"8b407506-fd21-473f-b33f-497c67f512b2\" (UID: \"8b407506-fd21-473f-b33f-497c67f512b2\") " Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.351650 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b407506-fd21-473f-b33f-497c67f512b2-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "8b407506-fd21-473f-b33f-497c67f512b2" (UID: "8b407506-fd21-473f-b33f-497c67f512b2"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.351998 4958 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="8b407506-fd21-473f-b33f-497c67f512b2" podUID="2d7ca993-e5bd-450d-aa84-09be723e1764" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.353654 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgwsl\" (UniqueName: \"kubernetes.io/projected/bf17853a-4517-4e36-98b1-8d69f5c94af3-kube-api-access-mgwsl\") pod \"kube-state-metrics-0\" (UID: \"bf17853a-4517-4e36-98b1-8d69f5c94af3\") " pod="openstack/kube-state-metrics-0" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.354339 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b407506-fd21-473f-b33f-497c67f512b2-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "8b407506-fd21-473f-b33f-497c67f512b2" (UID: "8b407506-fd21-473f-b33f-497c67f512b2"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.355891 4958 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8b407506-fd21-473f-b33f-497c67f512b2-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.355942 4958 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8b407506-fd21-473f-b33f-497c67f512b2-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.355961 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds99v\" (UniqueName: \"kubernetes.io/projected/8b407506-fd21-473f-b33f-497c67f512b2-kube-api-access-ds99v\") on node \"crc\" DevicePath \"\"" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.652658 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.828182 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b407506-fd21-473f-b33f-497c67f512b2" path="/var/lib/kubelet/pods/8b407506-fd21-473f-b33f-497c67f512b2/volumes" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.829393 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.872703 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.872873 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.881360 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.881360 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.881377 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.881556 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-x6bs8" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.881748 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Dec 01 11:51:15 crc kubenswrapper[4958]: I1201 11:51:15.988588 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0" Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.090341 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0" Dec 
01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.090415 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.090450 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.090495 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.090643 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.090681 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.090710 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r4mz\" (UniqueName: \"kubernetes.io/projected/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-kube-api-access-6r4mz\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.184638 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.192263 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.192334 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.192382 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r4mz\" (UniqueName: \"kubernetes.io/projected/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-kube-api-access-6r4mz\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.192486 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.192512 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.192549 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.193228 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.199420 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.201700 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.205220 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.209191 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.223770 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.231776 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r4mz\" (UniqueName: \"kubernetes.io/projected/59f9a43a-9d58-4efb-ad19-9d1e4b632fdc-kube-api-access-6r4mz\") pod \"alertmanager-metric-storage-0\" (UID: \"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc\") " pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.236683 4958 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="8b407506-fd21-473f-b33f-497c67f512b2" podUID="2d7ca993-e5bd-450d-aa84-09be723e1764"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.503243 4958 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="8b407506-fd21-473f-b33f-497c67f512b2" podUID="2d7ca993-e5bd-450d-aa84-09be723e1764"
Dec 01 11:51:16 crc kubenswrapper[4958]: I1201 11:51:16.524631 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:17 crc kubenswrapper[4958]: I1201 11:51:17.094090 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 01 11:51:17 crc kubenswrapper[4958]: I1201 11:51:17.268475 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bf17853a-4517-4e36-98b1-8d69f5c94af3","Type":"ContainerStarted","Data":"d2d9700e96a090cbb83578a7d1263aa9cae48ac99d00dae7278a5a1df0abcc8d"}
Dec 01 11:51:17 crc kubenswrapper[4958]: I1201 11:51:17.283580 4958 generic.go:334] "Generic (PLEG): container finished" podID="c556ea7c-505e-4575-a0cf-f313ec8058b1" containerID="4a4ea7fe5561c436cc39172b8f7674a704670cf922c5ebf27c16cc15a6f80bb2" exitCode=137
Dec 01 11:51:17 crc kubenswrapper[4958]: I1201 11:51:17.398343 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 01 11:51:17 crc kubenswrapper[4958]: W1201 11:51:17.412061 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d7ca993_e5bd_450d_aa84_09be723e1764.slice/crio-afc6f23829a6bbbdcaa05d8b5842cc323f2a10f3e6907ec2d1963ebf4bbb6315 WatchSource:0}: Error finding container afc6f23829a6bbbdcaa05d8b5842cc323f2a10f3e6907ec2d1963ebf4bbb6315: Status 404 returned error can't find the container with id afc6f23829a6bbbdcaa05d8b5842cc323f2a10f3e6907ec2d1963ebf4bbb6315
Dec 01 11:51:17 crc kubenswrapper[4958]: I1201 11:51:17.657007 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.260273 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.360725 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.362220 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.366286 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.367242 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.368246 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.375319 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-nbgpw"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.376522 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.381091 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.385553 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63f4e940c78bf3afb62cbea024ff5b16cff2ef31695cd61afb2d9c70b5891ff0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.402460 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc","Type":"ContainerStarted","Data":"17a4f44cfed2e31a8f5dc6eec3c63b63041425351c80dd2fd379a7f725ff757f"}
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.440395 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2r66" event={"ID":"44224579-1880-4630-8609-f8fb6ab8cb92","Type":"ContainerStarted","Data":"c076ffd3f5b7cb4fe0b74c0ef17a568c6d1d1f1579ae1d6975d5daaf31c1924b"}
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.457382 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2d7ca993-e5bd-450d-aa84-09be723e1764","Type":"ContainerStarted","Data":"afc6f23829a6bbbdcaa05d8b5842cc323f2a10f3e6907ec2d1963ebf4bbb6315"}
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.476934 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.542067 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z2r66" podStartSLOduration=6.513459418 podStartE2EDuration="25.542043833s" podCreationTimestamp="2025-12-01 11:50:53 +0000 UTC" firstStartedPulling="2025-12-01 11:50:57.835529465 +0000 UTC m=+6705.344318502" lastFinishedPulling="2025-12-01 11:51:16.86411388 +0000 UTC m=+6724.372902917" observedRunningTime="2025-12-01 11:51:18.520288795 +0000 UTC m=+6726.029077832" watchObservedRunningTime="2025-12-01 11:51:18.542043833 +0000 UTC m=+6726.050832870"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.558074 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.558163 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-config\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.558198 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-13d40b5e-bc18-4b97-b9bc-70a472934f41\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13d40b5e-bc18-4b97-b9bc-70a472934f41\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.558231 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.558331 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvnnh\" (UniqueName: \"kubernetes.io/projected/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-kube-api-access-pvnnh\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.558569 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.558684 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.558870 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.660288 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c556ea7c-505e-4575-a0cf-f313ec8058b1-openstack-config-secret\") pod \"c556ea7c-505e-4575-a0cf-f313ec8058b1\" (UID: \"c556ea7c-505e-4575-a0cf-f313ec8058b1\") "
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.660962 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c556ea7c-505e-4575-a0cf-f313ec8058b1-openstack-config\") pod \"c556ea7c-505e-4575-a0cf-f313ec8058b1\" (UID: \"c556ea7c-505e-4575-a0cf-f313ec8058b1\") "
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.661139 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqgng\" (UniqueName: \"kubernetes.io/projected/c556ea7c-505e-4575-a0cf-f313ec8058b1-kube-api-access-mqgng\") pod \"c556ea7c-505e-4575-a0cf-f313ec8058b1\" (UID: \"c556ea7c-505e-4575-a0cf-f313ec8058b1\") "
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.661653 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.661885 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.662095 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-config\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.665184 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-13d40b5e-bc18-4b97-b9bc-70a472934f41\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13d40b5e-bc18-4b97-b9bc-70a472934f41\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.665339 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.665471 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvnnh\" (UniqueName: \"kubernetes.io/projected/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-kube-api-access-pvnnh\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.665663 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.665836 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.665111 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.668957 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c556ea7c-505e-4575-a0cf-f313ec8058b1-kube-api-access-mqgng" (OuterVolumeSpecName: "kube-api-access-mqgng") pod "c556ea7c-505e-4575-a0cf-f313ec8058b1" (UID: "c556ea7c-505e-4575-a0cf-f313ec8058b1"). InnerVolumeSpecName "kube-api-access-mqgng". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.685654 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.685982 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-config\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.686464 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.686892 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.687045 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.696686 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvnnh\" (UniqueName: \"kubernetes.io/projected/774c8c4e-7ee4-4fb0-ad55-0894fa086b49-kube-api-access-pvnnh\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.710312 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c556ea7c-505e-4575-a0cf-f313ec8058b1-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c556ea7c-505e-4575-a0cf-f313ec8058b1" (UID: "c556ea7c-505e-4575-a0cf-f313ec8058b1"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.710921 4958 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.710964 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-13d40b5e-bc18-4b97-b9bc-70a472934f41\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13d40b5e-bc18-4b97-b9bc-70a472934f41\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e31e0b5cc49d9d9c5fc3732c9516c4879c42259dbfdd726688a82d570fc235f1/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.768657 4958 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c556ea7c-505e-4575-a0cf-f313ec8058b1-openstack-config\") on node \"crc\" DevicePath \"\""
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.768700 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqgng\" (UniqueName: \"kubernetes.io/projected/c556ea7c-505e-4575-a0cf-f313ec8058b1-kube-api-access-mqgng\") on node \"crc\" DevicePath \"\""
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.795673 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-13d40b5e-bc18-4b97-b9bc-70a472934f41\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13d40b5e-bc18-4b97-b9bc-70a472934f41\") pod \"prometheus-metric-storage-0\" (UID: \"774c8c4e-7ee4-4fb0-ad55-0894fa086b49\") " pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.797913 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c556ea7c-505e-4575-a0cf-f313ec8058b1-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c556ea7c-505e-4575-a0cf-f313ec8058b1" (UID: "c556ea7c-505e-4575-a0cf-f313ec8058b1"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.871452 4958 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c556ea7c-505e-4575-a0cf-f313ec8058b1-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Dec 01 11:51:18 crc kubenswrapper[4958]: I1201 11:51:18.891065 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:19 crc kubenswrapper[4958]: I1201 11:51:19.479293 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bf17853a-4517-4e36-98b1-8d69f5c94af3","Type":"ContainerStarted","Data":"62a5d445758765a7599a38a6cee8fc5de513b8791eae540d64682eb5537fb36d"}
Dec 01 11:51:19 crc kubenswrapper[4958]: I1201 11:51:19.481030 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Dec 01 11:51:19 crc kubenswrapper[4958]: I1201 11:51:19.483944 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 01 11:51:19 crc kubenswrapper[4958]: I1201 11:51:19.493620 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2d7ca993-e5bd-450d-aa84-09be723e1764","Type":"ContainerStarted","Data":"0884f55b852ddb36b22b808cc72eaa2a8797782d3435cc44e4c81082a526c0a9"}
Dec 01 11:51:19 crc kubenswrapper[4958]: I1201 11:51:19.549589 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=4.83646482 podStartE2EDuration="5.549565364s" podCreationTimestamp="2025-12-01 11:51:14 +0000 UTC" firstStartedPulling="2025-12-01 11:51:17.133468096 +0000 UTC m=+6724.642257123" lastFinishedPulling="2025-12-01 11:51:17.84656863 +0000 UTC m=+6725.355357667" observedRunningTime="2025-12-01 11:51:19.545270832 +0000 UTC m=+6727.054059869" watchObservedRunningTime="2025-12-01 11:51:19.549565364 +0000 UTC m=+6727.058354391"
Dec 01 11:51:19 crc kubenswrapper[4958]: I1201 11:51:19.552187 4958 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c556ea7c-505e-4575-a0cf-f313ec8058b1" podUID="2d7ca993-e5bd-450d-aa84-09be723e1764"
Dec 01 11:51:19 crc kubenswrapper[4958]: I1201 11:51:19.585255 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=5.585228626 podStartE2EDuration="5.585228626s" podCreationTimestamp="2025-12-01 11:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:51:19.574355038 +0000 UTC m=+6727.083144075" watchObservedRunningTime="2025-12-01 11:51:19.585228626 +0000 UTC m=+6727.094017663"
Dec 01 11:51:19 crc kubenswrapper[4958]: I1201 11:51:19.629883 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 01 11:51:19 crc kubenswrapper[4958]: I1201 11:51:19.818653 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c556ea7c-505e-4575-a0cf-f313ec8058b1" path="/var/lib/kubelet/pods/c556ea7c-505e-4575-a0cf-f313ec8058b1/volumes"
Dec 01 11:51:20 crc kubenswrapper[4958]: I1201 11:51:20.502658 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"774c8c4e-7ee4-4fb0-ad55-0894fa086b49","Type":"ContainerStarted","Data":"f482f9a415e17a731cbebed41d8d6b87cf693d5f52b41827af4d25b7559c98c6"}
Dec 01 11:51:23 crc kubenswrapper[4958]: I1201 11:51:23.594057 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z2r66"
Dec 01 11:51:23 crc kubenswrapper[4958]: I1201 11:51:23.594613 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z2r66"
Dec 01 11:51:24 crc kubenswrapper[4958]: I1201 11:51:24.703257 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-z2r66" podUID="44224579-1880-4630-8609-f8fb6ab8cb92" containerName="registry-server" probeResult="failure" output=<
Dec 01 11:51:24 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s
Dec 01 11:51:24 crc kubenswrapper[4958]: >
Dec 01 11:51:25 crc kubenswrapper[4958]: I1201 11:51:25.727497 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"774c8c4e-7ee4-4fb0-ad55-0894fa086b49","Type":"ContainerStarted","Data":"3db19f7765187b481852bd4c0413fef848216f9f42e46e1d977eba51a2e4dd6c"}
Dec 01 11:51:25 crc kubenswrapper[4958]: I1201 11:51:25.890001 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Dec 01 11:51:26 crc kubenswrapper[4958]: I1201 11:51:26.747340 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc","Type":"ContainerStarted","Data":"724ddb3b938640a7d43bfd5c43003cd79a964b70c8cf52c5590bd540fe48cc27"}
Dec 01 11:51:27 crc kubenswrapper[4958]: I1201 11:51:27.589360 4958 scope.go:117] "RemoveContainer" containerID="13d722706dcbd75e260252df7ceb9ed529d78680913a5efd0945b3d898eb8b49"
Dec 01 11:51:27 crc kubenswrapper[4958]: I1201 11:51:27.657618 4958 scope.go:117] "RemoveContainer" containerID="f3f7f03420186ef17424c6d5486d41ed7a7f0baa119081e8674280b9b7f9c82f"
Dec 01 11:51:27 crc kubenswrapper[4958]: I1201 11:51:27.720180 4958 scope.go:117] "RemoveContainer" containerID="215f0616431a98c2cb2fc8d89ae596aab07c48861a10b2ea35a33058e8dcc2ab"
Dec 01 11:51:27 crc kubenswrapper[4958]: I1201 11:51:27.789370 4958 scope.go:117] "RemoveContainer" containerID="66ebdcb1705867395e3ae0597e83a9c008d70ada8f0c9c6a59369677354b3847"
Dec 01 11:51:27 crc kubenswrapper[4958]: I1201 11:51:27.862123 4958 scope.go:117] "RemoveContainer" containerID="4a4ea7fe5561c436cc39172b8f7674a704670cf922c5ebf27c16cc15a6f80bb2"
Dec 01 11:51:31 crc kubenswrapper[4958]: I1201 11:51:31.036398 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-ttq72"]
Dec 01 11:51:31 crc kubenswrapper[4958]: I1201 11:51:31.045621 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-ttq72"]
Dec 01 11:51:31 crc kubenswrapper[4958]: I1201 11:51:31.815977 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="492cddee-1cc4-49b1-8be6-078adc4fa108" path="/var/lib/kubelet/pods/492cddee-1cc4-49b1-8be6-078adc4fa108/volumes"
Dec 01 11:51:32 crc kubenswrapper[4958]: I1201 11:51:32.824363 4958 generic.go:334] "Generic (PLEG): container finished" podID="59f9a43a-9d58-4efb-ad19-9d1e4b632fdc" containerID="724ddb3b938640a7d43bfd5c43003cd79a964b70c8cf52c5590bd540fe48cc27" exitCode=0
Dec 01 11:51:32 crc kubenswrapper[4958]: I1201 11:51:32.824449 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc","Type":"ContainerDied","Data":"724ddb3b938640a7d43bfd5c43003cd79a964b70c8cf52c5590bd540fe48cc27"}
Dec 01 11:51:32 crc kubenswrapper[4958]: I1201 11:51:32.828073 4958 generic.go:334] "Generic (PLEG): container finished" podID="774c8c4e-7ee4-4fb0-ad55-0894fa086b49" containerID="3db19f7765187b481852bd4c0413fef848216f9f42e46e1d977eba51a2e4dd6c" exitCode=0
Dec 01 11:51:32 crc kubenswrapper[4958]: I1201 11:51:32.828102 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"774c8c4e-7ee4-4fb0-ad55-0894fa086b49","Type":"ContainerDied","Data":"3db19f7765187b481852bd4c0413fef848216f9f42e46e1d977eba51a2e4dd6c"}
Dec 01 11:51:33 crc kubenswrapper[4958]: I1201 11:51:33.672996 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z2r66"
Dec 01 11:51:33 crc kubenswrapper[4958]: I1201 11:51:33.739673 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z2r66"
Dec 01 11:51:33 crc kubenswrapper[4958]: I1201 11:51:33.814184 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z2r66"]
Dec 01 11:51:33 crc kubenswrapper[4958]: I1201 11:51:33.916142 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zzzc4"]
Dec 01 11:51:33 crc kubenswrapper[4958]: I1201 11:51:33.916392 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zzzc4" podUID="c23bec71-7e13-42d4-9f5c-23903e86f112" containerName="registry-server" containerID="cri-o://aeae2e2ae7bbf7485b7b6f2e99d587f374b979efcefd827893a324d77be9ce97" gracePeriod=2
Dec 01 11:51:34 crc kubenswrapper[4958]: I1201 11:51:34.618562 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zzzc4"
Dec 01 11:51:34 crc kubenswrapper[4958]: I1201 11:51:34.698461 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c23bec71-7e13-42d4-9f5c-23903e86f112-utilities\") pod \"c23bec71-7e13-42d4-9f5c-23903e86f112\" (UID: \"c23bec71-7e13-42d4-9f5c-23903e86f112\") "
Dec 01 11:51:34 crc kubenswrapper[4958]: I1201 11:51:34.699222 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8qfh\" (UniqueName: \"kubernetes.io/projected/c23bec71-7e13-42d4-9f5c-23903e86f112-kube-api-access-c8qfh\") pod \"c23bec71-7e13-42d4-9f5c-23903e86f112\" (UID: \"c23bec71-7e13-42d4-9f5c-23903e86f112\") "
Dec 01 11:51:34 crc kubenswrapper[4958]: I1201 11:51:34.699148 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c23bec71-7e13-42d4-9f5c-23903e86f112-utilities" (OuterVolumeSpecName: "utilities") pod "c23bec71-7e13-42d4-9f5c-23903e86f112" (UID: "c23bec71-7e13-42d4-9f5c-23903e86f112"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:51:34 crc kubenswrapper[4958]: I1201 11:51:34.699601 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c23bec71-7e13-42d4-9f5c-23903e86f112-catalog-content\") pod \"c23bec71-7e13-42d4-9f5c-23903e86f112\" (UID: \"c23bec71-7e13-42d4-9f5c-23903e86f112\") "
Dec 01 11:51:34 crc kubenswrapper[4958]: I1201 11:51:34.700337 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c23bec71-7e13-42d4-9f5c-23903e86f112-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 11:51:34 crc kubenswrapper[4958]: I1201 11:51:34.709202 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c23bec71-7e13-42d4-9f5c-23903e86f112-kube-api-access-c8qfh" (OuterVolumeSpecName: "kube-api-access-c8qfh") pod "c23bec71-7e13-42d4-9f5c-23903e86f112" (UID: "c23bec71-7e13-42d4-9f5c-23903e86f112"). InnerVolumeSpecName "kube-api-access-c8qfh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:51:34 crc kubenswrapper[4958]: I1201 11:51:34.773988 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c23bec71-7e13-42d4-9f5c-23903e86f112-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c23bec71-7e13-42d4-9f5c-23903e86f112" (UID: "c23bec71-7e13-42d4-9f5c-23903e86f112"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:51:34 crc kubenswrapper[4958]: I1201 11:51:34.802284 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c23bec71-7e13-42d4-9f5c-23903e86f112-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 11:51:34 crc kubenswrapper[4958]: I1201 11:51:34.802326 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8qfh\" (UniqueName: \"kubernetes.io/projected/c23bec71-7e13-42d4-9f5c-23903e86f112-kube-api-access-c8qfh\") on node \"crc\" DevicePath \"\""
Dec 01 11:51:34 crc kubenswrapper[4958]: I1201 11:51:34.856688 4958 generic.go:334] "Generic (PLEG): container finished" podID="c23bec71-7e13-42d4-9f5c-23903e86f112" containerID="aeae2e2ae7bbf7485b7b6f2e99d587f374b979efcefd827893a324d77be9ce97" exitCode=0
Dec 01 11:51:34 crc kubenswrapper[4958]: I1201 11:51:34.856766 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zzzc4"
Dec 01 11:51:34 crc kubenswrapper[4958]: I1201 11:51:34.857048 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzzc4" event={"ID":"c23bec71-7e13-42d4-9f5c-23903e86f112","Type":"ContainerDied","Data":"aeae2e2ae7bbf7485b7b6f2e99d587f374b979efcefd827893a324d77be9ce97"}
Dec 01 11:51:34 crc kubenswrapper[4958]: I1201 11:51:34.857154 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzzc4" event={"ID":"c23bec71-7e13-42d4-9f5c-23903e86f112","Type":"ContainerDied","Data":"24dc302f3131843b918a1a72192c07697de5c055102dc64730d8a3bcf9a246c6"}
Dec 01 11:51:34 crc kubenswrapper[4958]: I1201 11:51:34.857246 4958 scope.go:117] "RemoveContainer" containerID="aeae2e2ae7bbf7485b7b6f2e99d587f374b979efcefd827893a324d77be9ce97"
Dec 01 11:51:34 crc kubenswrapper[4958]: I1201 11:51:34.926229 4958 scope.go:117] "RemoveContainer" containerID="53c5fc571c6af6d611d1af255f267f625734ddecd054aa2023a48c712755b471"
Dec 01 11:51:34 crc kubenswrapper[4958]: I1201 11:51:34.931935 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zzzc4"]
Dec 01 11:51:34 crc kubenswrapper[4958]: I1201 11:51:34.941070 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zzzc4"]
Dec 01 11:51:35 crc kubenswrapper[4958]: I1201 11:51:35.001666 4958 scope.go:117] "RemoveContainer" containerID="3f38eb48fa73f2d6533f7bbe3b90f1451e6ff4f10a93a166f701c928c9132c99"
Dec 01 11:51:35 crc kubenswrapper[4958]: I1201 11:51:35.092069 4958 scope.go:117] "RemoveContainer" containerID="aeae2e2ae7bbf7485b7b6f2e99d587f374b979efcefd827893a324d77be9ce97"
Dec 01 11:51:35 crc kubenswrapper[4958]: E1201 11:51:35.097019 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeae2e2ae7bbf7485b7b6f2e99d587f374b979efcefd827893a324d77be9ce97\": container with ID starting with aeae2e2ae7bbf7485b7b6f2e99d587f374b979efcefd827893a324d77be9ce97 not found: ID does not exist" containerID="aeae2e2ae7bbf7485b7b6f2e99d587f374b979efcefd827893a324d77be9ce97"
Dec 01 11:51:35 crc kubenswrapper[4958]: I1201 11:51:35.097080 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeae2e2ae7bbf7485b7b6f2e99d587f374b979efcefd827893a324d77be9ce97"} err="failed to get container status \"aeae2e2ae7bbf7485b7b6f2e99d587f374b979efcefd827893a324d77be9ce97\": rpc error: code = NotFound desc = could not find container \"aeae2e2ae7bbf7485b7b6f2e99d587f374b979efcefd827893a324d77be9ce97\": container with ID starting with aeae2e2ae7bbf7485b7b6f2e99d587f374b979efcefd827893a324d77be9ce97 not found: ID does not exist"
Dec 01 11:51:35 crc kubenswrapper[4958]: I1201 11:51:35.097109 4958 scope.go:117] "RemoveContainer" containerID="53c5fc571c6af6d611d1af255f267f625734ddecd054aa2023a48c712755b471"
Dec 01 11:51:35 crc kubenswrapper[4958]: E1201 11:51:35.104035 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53c5fc571c6af6d611d1af255f267f625734ddecd054aa2023a48c712755b471\": container with ID starting with 53c5fc571c6af6d611d1af255f267f625734ddecd054aa2023a48c712755b471 not found: ID does not exist" containerID="53c5fc571c6af6d611d1af255f267f625734ddecd054aa2023a48c712755b471"
Dec 01 11:51:35 crc kubenswrapper[4958]: I1201 11:51:35.104095 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53c5fc571c6af6d611d1af255f267f625734ddecd054aa2023a48c712755b471"} err="failed to get container status \"53c5fc571c6af6d611d1af255f267f625734ddecd054aa2023a48c712755b471\": rpc error: code = NotFound desc = could not find container \"53c5fc571c6af6d611d1af255f267f625734ddecd054aa2023a48c712755b471\": container with ID starting with 53c5fc571c6af6d611d1af255f267f625734ddecd054aa2023a48c712755b471 not found: ID does not exist"
Dec 01 11:51:35 crc kubenswrapper[4958]: I1201 11:51:35.104124 4958 scope.go:117] "RemoveContainer" containerID="3f38eb48fa73f2d6533f7bbe3b90f1451e6ff4f10a93a166f701c928c9132c99"
Dec 01 11:51:35 crc kubenswrapper[4958]: E1201 11:51:35.108803 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f38eb48fa73f2d6533f7bbe3b90f1451e6ff4f10a93a166f701c928c9132c99\": container with ID starting with 3f38eb48fa73f2d6533f7bbe3b90f1451e6ff4f10a93a166f701c928c9132c99 not found: ID does not exist" containerID="3f38eb48fa73f2d6533f7bbe3b90f1451e6ff4f10a93a166f701c928c9132c99"
Dec 01 11:51:35 crc kubenswrapper[4958]: I1201 11:51:35.108892 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f38eb48fa73f2d6533f7bbe3b90f1451e6ff4f10a93a166f701c928c9132c99"} err="failed to get container status \"3f38eb48fa73f2d6533f7bbe3b90f1451e6ff4f10a93a166f701c928c9132c99\": rpc error: code = NotFound desc = could not find container \"3f38eb48fa73f2d6533f7bbe3b90f1451e6ff4f10a93a166f701c928c9132c99\": container with ID starting with 3f38eb48fa73f2d6533f7bbe3b90f1451e6ff4f10a93a166f701c928c9132c99 not found: ID does not exist"
Dec 01 11:51:35 crc kubenswrapper[4958]: I1201 11:51:35.816600 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c23bec71-7e13-42d4-9f5c-23903e86f112" path="/var/lib/kubelet/pods/c23bec71-7e13-42d4-9f5c-23903e86f112/volumes"
Dec 01 11:51:37 crc kubenswrapper[4958]: I1201 11:51:37.950210 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc","Type":"ContainerStarted","Data":"cee164b2012b553dba9bbf540b143089ef7f4c63ec302d3b119f31d72a87cf3e"}
Dec 01 11:51:40 crc kubenswrapper[4958]: I1201 11:51:40.984946 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"774c8c4e-7ee4-4fb0-ad55-0894fa086b49","Type":"ContainerStarted","Data":"25a9bf1a6424d594ebb6c9ec878c5691ce3dccb374b65693e036332114e1bf89"}
Dec 01 11:51:41 crc kubenswrapper[4958]: I1201 11:51:41.034368 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-728c-account-create-d65k4"]
Dec 01 11:51:41 crc kubenswrapper[4958]: I1201 11:51:41.045825 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-728c-account-create-d65k4"]
Dec 01 11:51:41 crc kubenswrapper[4958]: I1201 11:51:41.810773 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce588a72-84ad-4d78-a641-4bd415c7ff03" path="/var/lib/kubelet/pods/ce588a72-84ad-4d78-a641-4bd415c7ff03/volumes"
Dec 01 11:51:42 crc kubenswrapper[4958]: I1201 11:51:42.407142 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cjnrm"]
Dec 01 11:51:42 crc kubenswrapper[4958]: E1201 11:51:42.407504 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c23bec71-7e13-42d4-9f5c-23903e86f112" containerName="extract-utilities"
Dec 01 11:51:42 crc kubenswrapper[4958]: I1201 11:51:42.407516 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23bec71-7e13-42d4-9f5c-23903e86f112" containerName="extract-utilities"
Dec 01 11:51:42 crc kubenswrapper[4958]: E1201 11:51:42.407554 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c23bec71-7e13-42d4-9f5c-23903e86f112" containerName="extract-content"
Dec 01 11:51:42 crc kubenswrapper[4958]: I1201 11:51:42.407560 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23bec71-7e13-42d4-9f5c-23903e86f112" containerName="extract-content"
Dec 01 11:51:42 crc kubenswrapper[4958]: E1201 11:51:42.407569 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c23bec71-7e13-42d4-9f5c-23903e86f112" containerName="registry-server"
Dec 01 11:51:42 crc kubenswrapper[4958]: I1201 11:51:42.407576 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23bec71-7e13-42d4-9f5c-23903e86f112" containerName="registry-server"
Dec 01 11:51:42 crc kubenswrapper[4958]: I1201 11:51:42.407772 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c23bec71-7e13-42d4-9f5c-23903e86f112" containerName="registry-server"
Dec 01 11:51:42 crc kubenswrapper[4958]: I1201 11:51:42.409341 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cjnrm"
Dec 01 11:51:42 crc kubenswrapper[4958]: I1201 11:51:42.433239 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cjnrm"]
Dec 01 11:51:42 crc kubenswrapper[4958]: I1201 11:51:42.604927 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b04597e-a60b-4828-a01b-9a8a2593e236-utilities\") pod \"certified-operators-cjnrm\" (UID: \"6b04597e-a60b-4828-a01b-9a8a2593e236\") " pod="openshift-marketplace/certified-operators-cjnrm"
Dec 01 11:51:42 crc kubenswrapper[4958]: I1201 11:51:42.604999 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k26mw\" (UniqueName: \"kubernetes.io/projected/6b04597e-a60b-4828-a01b-9a8a2593e236-kube-api-access-k26mw\") pod \"certified-operators-cjnrm\" (UID: \"6b04597e-a60b-4828-a01b-9a8a2593e236\") " pod="openshift-marketplace/certified-operators-cjnrm"
Dec 01 11:51:42 crc kubenswrapper[4958]: I1201 11:51:42.605024 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b04597e-a60b-4828-a01b-9a8a2593e236-catalog-content\") pod \"certified-operators-cjnrm\" (UID: \"6b04597e-a60b-4828-a01b-9a8a2593e236\") " pod="openshift-marketplace/certified-operators-cjnrm"
Dec 01 11:51:42 crc kubenswrapper[4958]: I1201 11:51:42.707717 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b04597e-a60b-4828-a01b-9a8a2593e236-utilities\") pod \"certified-operators-cjnrm\" (UID: \"6b04597e-a60b-4828-a01b-9a8a2593e236\") " pod="openshift-marketplace/certified-operators-cjnrm"
Dec 01 11:51:42 crc kubenswrapper[4958]: I1201 11:51:42.707822 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k26mw\" (UniqueName: \"kubernetes.io/projected/6b04597e-a60b-4828-a01b-9a8a2593e236-kube-api-access-k26mw\") pod \"certified-operators-cjnrm\" (UID: \"6b04597e-a60b-4828-a01b-9a8a2593e236\") " pod="openshift-marketplace/certified-operators-cjnrm"
Dec 01 11:51:42 crc kubenswrapper[4958]: I1201 11:51:42.707893 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b04597e-a60b-4828-a01b-9a8a2593e236-catalog-content\") pod \"certified-operators-cjnrm\" (UID: \"6b04597e-a60b-4828-a01b-9a8a2593e236\") " pod="openshift-marketplace/certified-operators-cjnrm"
Dec 01 11:51:42 crc kubenswrapper[4958]: I1201 11:51:42.708521 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b04597e-a60b-4828-a01b-9a8a2593e236-utilities\") pod \"certified-operators-cjnrm\" (UID: \"6b04597e-a60b-4828-a01b-9a8a2593e236\") " pod="openshift-marketplace/certified-operators-cjnrm"
Dec 01 11:51:42 crc kubenswrapper[4958]: I1201 11:51:42.708541 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b04597e-a60b-4828-a01b-9a8a2593e236-catalog-content\") pod \"certified-operators-cjnrm\" (UID: \"6b04597e-a60b-4828-a01b-9a8a2593e236\") " pod="openshift-marketplace/certified-operators-cjnrm"
Dec 01 11:51:42 crc kubenswrapper[4958]: I1201 11:51:42.763511 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k26mw\" (UniqueName: \"kubernetes.io/projected/6b04597e-a60b-4828-a01b-9a8a2593e236-kube-api-access-k26mw\") pod \"certified-operators-cjnrm\" (UID: \"6b04597e-a60b-4828-a01b-9a8a2593e236\") " pod="openshift-marketplace/certified-operators-cjnrm"
Dec 01 11:51:43 crc kubenswrapper[4958]: I1201 11:51:43.064999 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cjnrm"
Dec 01 11:51:43 crc kubenswrapper[4958]: I1201 11:51:43.714784 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cjnrm"]
Dec 01 11:51:43 crc kubenswrapper[4958]: W1201 11:51:43.773250 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b04597e_a60b_4828_a01b_9a8a2593e236.slice/crio-89be741231437cb3409b8ebd00cfad991ede3b5e3136901c45f1aa56cc46c3e9 WatchSource:0}: Error finding container 89be741231437cb3409b8ebd00cfad991ede3b5e3136901c45f1aa56cc46c3e9: Status 404 returned error can't find the container with id 89be741231437cb3409b8ebd00cfad991ede3b5e3136901c45f1aa56cc46c3e9
Dec 01 11:51:44 crc kubenswrapper[4958]: I1201 11:51:44.025272 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"59f9a43a-9d58-4efb-ad19-9d1e4b632fdc","Type":"ContainerStarted","Data":"fa05338d6243e5938e033aec38ae62c540cd4f20bba7f4468bd6badfab89fc6c"}
Dec 01 11:51:44 crc kubenswrapper[4958]: I1201 11:51:44.025531 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:44 crc kubenswrapper[4958]: I1201 11:51:44.029055 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0"
Dec 01 11:51:44 crc kubenswrapper[4958]: I1201 11:51:44.029792 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjnrm" event={"ID":"6b04597e-a60b-4828-a01b-9a8a2593e236","Type":"ContainerStarted","Data":"91b2431aa9034d1fa870354b08408dcb3283918220b5c3f59e68751aebfb4972"}
Dec 01 11:51:44 crc kubenswrapper[4958]: I1201 11:51:44.029820 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjnrm" event={"ID":"6b04597e-a60b-4828-a01b-9a8a2593e236","Type":"ContainerStarted","Data":"89be741231437cb3409b8ebd00cfad991ede3b5e3136901c45f1aa56cc46c3e9"}
Dec 01 11:51:44 crc kubenswrapper[4958]: I1201 11:51:44.052518 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=9.318148072 podStartE2EDuration="29.052498632s" podCreationTimestamp="2025-12-01 11:51:15 +0000 UTC" firstStartedPulling="2025-12-01 11:51:17.691013584 +0000 UTC m=+6725.199802621" lastFinishedPulling="2025-12-01 11:51:37.425364144 +0000 UTC m=+6744.934153181" observedRunningTime="2025-12-01 11:51:44.049282121 +0000 UTC m=+6751.558071158" watchObservedRunningTime="2025-12-01 11:51:44.052498632 +0000 UTC m=+6751.561287669"
Dec 01 11:51:45 crc kubenswrapper[4958]: I1201 11:51:45.042261 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"774c8c4e-7ee4-4fb0-ad55-0894fa086b49","Type":"ContainerStarted","Data":"bad4e9c5b288393f2a028e6e77456bb83a62d581b1064e42719f35596f9fc70f"}
Dec 01 11:51:45 crc kubenswrapper[4958]: I1201 11:51:45.044033 4958 generic.go:334] "Generic (PLEG): container finished" podID="6b04597e-a60b-4828-a01b-9a8a2593e236" containerID="91b2431aa9034d1fa870354b08408dcb3283918220b5c3f59e68751aebfb4972" exitCode=0
Dec 01 11:51:45 crc kubenswrapper[4958]: I1201 11:51:45.044177 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjnrm" event={"ID":"6b04597e-a60b-4828-a01b-9a8a2593e236","Type":"ContainerDied","Data":"91b2431aa9034d1fa870354b08408dcb3283918220b5c3f59e68751aebfb4972"}
Dec 01 11:51:50 crc kubenswrapper[4958]: I1201 11:51:50.060443 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-pbb4g"]
Dec 01 11:51:50 crc kubenswrapper[4958]: I1201 11:51:50.071299 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-pbb4g"]
Dec 01 11:51:50 crc kubenswrapper[4958]: I1201 11:51:50.108168 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjnrm" event={"ID":"6b04597e-a60b-4828-a01b-9a8a2593e236","Type":"ContainerStarted","Data":"4de000d603758e73e6f3010b64884bc44544c8ea5b11ad907729540ff4c58c0e"}
Dec 01 11:51:51 crc kubenswrapper[4958]: I1201 11:51:51.817079 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f0913e-7c66-4a70-866c-6fa041c5a169" path="/var/lib/kubelet/pods/38f0913e-7c66-4a70-866c-6fa041c5a169/volumes"
Dec 01 11:51:52 crc kubenswrapper[4958]: I1201 11:51:52.465975 4958 generic.go:334] "Generic (PLEG): container finished" podID="6b04597e-a60b-4828-a01b-9a8a2593e236" containerID="4de000d603758e73e6f3010b64884bc44544c8ea5b11ad907729540ff4c58c0e" exitCode=0
Dec 01 11:51:52 crc kubenswrapper[4958]: I1201 11:51:52.466165 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjnrm" event={"ID":"6b04597e-a60b-4828-a01b-9a8a2593e236","Type":"ContainerDied","Data":"4de000d603758e73e6f3010b64884bc44544c8ea5b11ad907729540ff4c58c0e"}
Dec 01 11:51:53 crc kubenswrapper[4958]: I1201 11:51:53.487269 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"774c8c4e-7ee4-4fb0-ad55-0894fa086b49","Type":"ContainerStarted","Data":"933efa7c5241957d9c2b653d6c34612bb806c089c607070acf8f2e2abcebf1fe"}
Dec 01 11:51:53 crc kubenswrapper[4958]: I1201 11:51:53.530563 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.229237889 podStartE2EDuration="36.530546002s" podCreationTimestamp="2025-12-01 11:51:17 +0000 UTC" firstStartedPulling="2025-12-01 11:51:19.624410839 +0000 UTC m=+6727.133199876" lastFinishedPulling="2025-12-01 11:51:52.925718962 +0000 UTC m=+6760.434507989" observedRunningTime="2025-12-01 11:51:53.522935686 +0000 UTC m=+6761.031724723" watchObservedRunningTime="2025-12-01 11:51:53.530546002 +0000 UTC m=+6761.039335039"
Dec 01 11:51:53 crc kubenswrapper[4958]: I1201 11:51:53.891798 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Dec 01 11:51:54 crc kubenswrapper[4958]: I1201 11:51:54.498648 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjnrm" event={"ID":"6b04597e-a60b-4828-a01b-9a8a2593e236","Type":"ContainerStarted","Data":"b80901ced5b5efb8c447947f1c3b27a8c32073d18388aefd0f02034bbe804ee5"}
Dec 01 11:51:54 crc kubenswrapper[4958]: I1201 11:51:54.526283 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cjnrm" podStartSLOduration=4.231329274 podStartE2EDuration="12.526264208s" podCreationTimestamp="2025-12-01 11:51:42 +0000 UTC" firstStartedPulling="2025-12-01 11:51:45.046805058 +0000 UTC m=+6752.555594095" lastFinishedPulling="2025-12-01 11:51:53.341739992 +0000 UTC m=+6760.850529029" observedRunningTime="2025-12-01 11:51:54.515648477 +0000 UTC m=+6762.024437524" watchObservedRunningTime="2025-12-01 11:51:54.526264208 +0000 UTC m=+6762.035053245"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.021749 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.025769 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.028447 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.028712 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.045699 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.095532 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90e696af-5f31-45c4-8b98-68420641252f-log-httpd\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.095587 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.095636 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90e696af-5f31-45c4-8b98-68420641252f-run-httpd\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.095752 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-scripts\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.095776 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-config-data\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.095793 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.095831 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd872\" (UniqueName: \"kubernetes.io/projected/90e696af-5f31-45c4-8b98-68420641252f-kube-api-access-rd872\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.198522 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-scripts\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.198593 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-config-data\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.198622 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.198668 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd872\" (UniqueName: \"kubernetes.io/projected/90e696af-5f31-45c4-8b98-68420641252f-kube-api-access-rd872\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.198737 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90e696af-5f31-45c4-8b98-68420641252f-log-httpd\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.198784 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.198890 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90e696af-5f31-45c4-8b98-68420641252f-run-httpd\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.199678 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90e696af-5f31-45c4-8b98-68420641252f-run-httpd\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.200009 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90e696af-5f31-45c4-8b98-68420641252f-log-httpd\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.205483 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-config-data\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.205626 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-scripts\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.206960 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.207116 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.219222 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd872\" (UniqueName: \"kubernetes.io/projected/90e696af-5f31-45c4-8b98-68420641252f-kube-api-access-rd872\") pod \"ceilometer-0\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.351749 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 01 11:52:00 crc kubenswrapper[4958]: I1201 11:52:00.928258 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 01 11:52:01 crc kubenswrapper[4958]: I1201 11:52:01.578784 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90e696af-5f31-45c4-8b98-68420641252f","Type":"ContainerStarted","Data":"c8cb74343fe1207022fcbbf735f61494a162813fb73599700617afbd744ff289"}
Dec 01 11:52:02 crc kubenswrapper[4958]: I1201 11:52:02.600618 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90e696af-5f31-45c4-8b98-68420641252f","Type":"ContainerStarted","Data":"fa8e8c4d31aa83928bfb4e27f4f9c9749df29e07331eb95eb7322c24b838bf36"}
Dec 01 11:52:03 crc kubenswrapper[4958]: I1201 11:52:03.064243 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cjnrm"
Dec 01 11:52:03 crc kubenswrapper[4958]: I1201 11:52:03.065271 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cjnrm"
Dec 01 11:52:03 crc kubenswrapper[4958]: I1201 11:52:03.122352 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cjnrm"
Dec 01 11:52:03 crc kubenswrapper[4958]: I1201 11:52:03.621785 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90e696af-5f31-45c4-8b98-68420641252f","Type":"ContainerStarted","Data":"4f508e08192756f06d83b65dfa9aec2a82a6e205ba3c9c1257d82a939bb4d6f1"}
Dec 01 11:52:03 crc kubenswrapper[4958]: I1201 11:52:03.622159 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90e696af-5f31-45c4-8b98-68420641252f","Type":"ContainerStarted","Data":"8871a39945aca48b9b93f8a51449ca400c96d4f3396bbc975addf67407f36c7e"}
Dec 01 11:52:03 crc kubenswrapper[4958]: I1201 11:52:03.701819 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cjnrm"
Dec 01 11:52:03 crc kubenswrapper[4958]: I1201 11:52:03.765710 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cjnrm"]
Dec 01 11:52:03 crc kubenswrapper[4958]: I1201 11:52:03.901526 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Dec 01 11:52:03 crc kubenswrapper[4958]: I1201 11:52:03.905195 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Dec 01 11:52:04 crc kubenswrapper[4958]: I1201 11:52:04.632807 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Dec 01 11:52:05 crc kubenswrapper[4958]: I1201 11:52:05.669384 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90e696af-5f31-45c4-8b98-68420641252f","Type":"ContainerStarted","Data":"03ccd41ab0c3d3cb1266a9395a19c6283c68064627ff27b07dbd5c517ba166a8"}
Dec 01 11:52:05 crc kubenswrapper[4958]: I1201 11:52:05.669552 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cjnrm" podUID="6b04597e-a60b-4828-a01b-9a8a2593e236" containerName="registry-server" containerID="cri-o://b80901ced5b5efb8c447947f1c3b27a8c32073d18388aefd0f02034bbe804ee5" gracePeriod=2
Dec 01
11:52:05 crc kubenswrapper[4958]: I1201 11:52:05.671166 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 11:52:05 crc kubenswrapper[4958]: I1201 11:52:05.705919 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.3625221229999998 podStartE2EDuration="6.705893791s" podCreationTimestamp="2025-12-01 11:51:59 +0000 UTC" firstStartedPulling="2025-12-01 11:52:00.928314917 +0000 UTC m=+6768.437103954" lastFinishedPulling="2025-12-01 11:52:05.271686595 +0000 UTC m=+6772.780475622" observedRunningTime="2025-12-01 11:52:05.69282861 +0000 UTC m=+6773.201617647" watchObservedRunningTime="2025-12-01 11:52:05.705893791 +0000 UTC m=+6773.214682838" Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.224041 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cjnrm" Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.366959 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b04597e-a60b-4828-a01b-9a8a2593e236-catalog-content\") pod \"6b04597e-a60b-4828-a01b-9a8a2593e236\" (UID: \"6b04597e-a60b-4828-a01b-9a8a2593e236\") " Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.367296 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k26mw\" (UniqueName: \"kubernetes.io/projected/6b04597e-a60b-4828-a01b-9a8a2593e236-kube-api-access-k26mw\") pod \"6b04597e-a60b-4828-a01b-9a8a2593e236\" (UID: \"6b04597e-a60b-4828-a01b-9a8a2593e236\") " Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.367435 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b04597e-a60b-4828-a01b-9a8a2593e236-utilities\") pod \"6b04597e-a60b-4828-a01b-9a8a2593e236\" (UID: \"6b04597e-a60b-4828-a01b-9a8a2593e236\") " Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.368499 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b04597e-a60b-4828-a01b-9a8a2593e236-utilities" (OuterVolumeSpecName: "utilities") pod "6b04597e-a60b-4828-a01b-9a8a2593e236" (UID: "6b04597e-a60b-4828-a01b-9a8a2593e236"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.369204 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b04597e-a60b-4828-a01b-9a8a2593e236-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.374311 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b04597e-a60b-4828-a01b-9a8a2593e236-kube-api-access-k26mw" (OuterVolumeSpecName: "kube-api-access-k26mw") pod "6b04597e-a60b-4828-a01b-9a8a2593e236" (UID: "6b04597e-a60b-4828-a01b-9a8a2593e236"). InnerVolumeSpecName "kube-api-access-k26mw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.420196 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b04597e-a60b-4828-a01b-9a8a2593e236-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b04597e-a60b-4828-a01b-9a8a2593e236" (UID: "6b04597e-a60b-4828-a01b-9a8a2593e236"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.471081 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b04597e-a60b-4828-a01b-9a8a2593e236-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.471119 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k26mw\" (UniqueName: \"kubernetes.io/projected/6b04597e-a60b-4828-a01b-9a8a2593e236-kube-api-access-k26mw\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.684229 4958 generic.go:334] "Generic (PLEG): container finished" podID="6b04597e-a60b-4828-a01b-9a8a2593e236" containerID="b80901ced5b5efb8c447947f1c3b27a8c32073d18388aefd0f02034bbe804ee5" exitCode=0 Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.684977 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cjnrm" Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.684999 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjnrm" event={"ID":"6b04597e-a60b-4828-a01b-9a8a2593e236","Type":"ContainerDied","Data":"b80901ced5b5efb8c447947f1c3b27a8c32073d18388aefd0f02034bbe804ee5"} Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.685122 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjnrm" event={"ID":"6b04597e-a60b-4828-a01b-9a8a2593e236","Type":"ContainerDied","Data":"89be741231437cb3409b8ebd00cfad991ede3b5e3136901c45f1aa56cc46c3e9"} Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.685169 4958 scope.go:117] "RemoveContainer" containerID="b80901ced5b5efb8c447947f1c3b27a8c32073d18388aefd0f02034bbe804ee5" Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.719807 4958 scope.go:117] "RemoveContainer" containerID="4de000d603758e73e6f3010b64884bc44544c8ea5b11ad907729540ff4c58c0e" Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.744232 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cjnrm"] Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.755482 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cjnrm"] Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.783574 4958 scope.go:117] "RemoveContainer" containerID="91b2431aa9034d1fa870354b08408dcb3283918220b5c3f59e68751aebfb4972" Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.809835 4958 scope.go:117] "RemoveContainer" containerID="b80901ced5b5efb8c447947f1c3b27a8c32073d18388aefd0f02034bbe804ee5" Dec 01 11:52:06 crc kubenswrapper[4958]: E1201 11:52:06.811002 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80901ced5b5efb8c447947f1c3b27a8c32073d18388aefd0f02034bbe804ee5\": container with ID starting with b80901ced5b5efb8c447947f1c3b27a8c32073d18388aefd0f02034bbe804ee5 not found: ID does not exist" containerID="b80901ced5b5efb8c447947f1c3b27a8c32073d18388aefd0f02034bbe804ee5" Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.811053 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80901ced5b5efb8c447947f1c3b27a8c32073d18388aefd0f02034bbe804ee5"} err="failed to get container status 
\"b80901ced5b5efb8c447947f1c3b27a8c32073d18388aefd0f02034bbe804ee5\": rpc error: code = NotFound desc = could not find container \"b80901ced5b5efb8c447947f1c3b27a8c32073d18388aefd0f02034bbe804ee5\": container with ID starting with b80901ced5b5efb8c447947f1c3b27a8c32073d18388aefd0f02034bbe804ee5 not found: ID does not exist" Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.811083 4958 scope.go:117] "RemoveContainer" containerID="4de000d603758e73e6f3010b64884bc44544c8ea5b11ad907729540ff4c58c0e" Dec 01 11:52:06 crc kubenswrapper[4958]: E1201 11:52:06.811488 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4de000d603758e73e6f3010b64884bc44544c8ea5b11ad907729540ff4c58c0e\": container with ID starting with 4de000d603758e73e6f3010b64884bc44544c8ea5b11ad907729540ff4c58c0e not found: ID does not exist" containerID="4de000d603758e73e6f3010b64884bc44544c8ea5b11ad907729540ff4c58c0e" Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.811535 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4de000d603758e73e6f3010b64884bc44544c8ea5b11ad907729540ff4c58c0e"} err="failed to get container status \"4de000d603758e73e6f3010b64884bc44544c8ea5b11ad907729540ff4c58c0e\": rpc error: code = NotFound desc = could not find container \"4de000d603758e73e6f3010b64884bc44544c8ea5b11ad907729540ff4c58c0e\": container with ID starting with 4de000d603758e73e6f3010b64884bc44544c8ea5b11ad907729540ff4c58c0e not found: ID does not exist" Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.811568 4958 scope.go:117] "RemoveContainer" containerID="91b2431aa9034d1fa870354b08408dcb3283918220b5c3f59e68751aebfb4972" Dec 01 11:52:06 crc kubenswrapper[4958]: E1201 11:52:06.812003 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91b2431aa9034d1fa870354b08408dcb3283918220b5c3f59e68751aebfb4972\": container with ID starting with 91b2431aa9034d1fa870354b08408dcb3283918220b5c3f59e68751aebfb4972 not found: ID does not exist" containerID="91b2431aa9034d1fa870354b08408dcb3283918220b5c3f59e68751aebfb4972" Dec 01 11:52:06 crc kubenswrapper[4958]: I1201 11:52:06.812034 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91b2431aa9034d1fa870354b08408dcb3283918220b5c3f59e68751aebfb4972"} err="failed to get container status \"91b2431aa9034d1fa870354b08408dcb3283918220b5c3f59e68751aebfb4972\": rpc error: code = NotFound desc = could not find container \"91b2431aa9034d1fa870354b08408dcb3283918220b5c3f59e68751aebfb4972\": container with ID starting with 91b2431aa9034d1fa870354b08408dcb3283918220b5c3f59e68751aebfb4972 not found: ID does not exist" Dec 01 11:52:07 crc kubenswrapper[4958]: I1201 11:52:07.822263 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b04597e-a60b-4828-a01b-9a8a2593e236" path="/var/lib/kubelet/pods/6b04597e-a60b-4828-a01b-9a8a2593e236/volumes" Dec 01 11:52:11 crc kubenswrapper[4958]: I1201 11:52:11.844648 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-kl977"] Dec 01 11:52:11 crc kubenswrapper[4958]: E1201 11:52:11.846403 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b04597e-a60b-4828-a01b-9a8a2593e236" containerName="extract-content" Dec 01 11:52:11 crc kubenswrapper[4958]: I1201 11:52:11.846478 4958 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6b04597e-a60b-4828-a01b-9a8a2593e236" containerName="extract-content" Dec 01 11:52:11 crc kubenswrapper[4958]: E1201 11:52:11.846541 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b04597e-a60b-4828-a01b-9a8a2593e236" containerName="registry-server" Dec 01 11:52:11 crc kubenswrapper[4958]: I1201 11:52:11.846607 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b04597e-a60b-4828-a01b-9a8a2593e236" containerName="registry-server" Dec 01 11:52:11 crc kubenswrapper[4958]: E1201 11:52:11.846674 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b04597e-a60b-4828-a01b-9a8a2593e236" containerName="extract-utilities" Dec 01 11:52:11 crc kubenswrapper[4958]: I1201 11:52:11.846736 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b04597e-a60b-4828-a01b-9a8a2593e236" containerName="extract-utilities" Dec 01 11:52:11 crc kubenswrapper[4958]: I1201 11:52:11.847040 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b04597e-a60b-4828-a01b-9a8a2593e236" containerName="registry-server" Dec 01 11:52:11 crc kubenswrapper[4958]: I1201 11:52:11.847930 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-kl977" Dec 01 11:52:11 crc kubenswrapper[4958]: I1201 11:52:11.849996 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-kl977"] Dec 01 11:52:12 crc kubenswrapper[4958]: I1201 11:52:12.022353 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp9kj\" (UniqueName: \"kubernetes.io/projected/d5f5b51b-bc8c-472a-84ee-529d091dcd61-kube-api-access-gp9kj\") pod \"aodh-db-create-kl977\" (UID: \"d5f5b51b-bc8c-472a-84ee-529d091dcd61\") " pod="openstack/aodh-db-create-kl977" Dec 01 11:52:12 crc kubenswrapper[4958]: I1201 11:52:12.124924 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp9kj\" (UniqueName: \"kubernetes.io/projected/d5f5b51b-bc8c-472a-84ee-529d091dcd61-kube-api-access-gp9kj\") pod \"aodh-db-create-kl977\" (UID: \"d5f5b51b-bc8c-472a-84ee-529d091dcd61\") " pod="openstack/aodh-db-create-kl977" Dec 01 11:52:12 crc kubenswrapper[4958]: I1201 11:52:12.148501 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp9kj\" (UniqueName: \"kubernetes.io/projected/d5f5b51b-bc8c-472a-84ee-529d091dcd61-kube-api-access-gp9kj\") pod \"aodh-db-create-kl977\" (UID: \"d5f5b51b-bc8c-472a-84ee-529d091dcd61\") " pod="openstack/aodh-db-create-kl977" Dec 01 11:52:12 crc kubenswrapper[4958]: I1201 11:52:12.174834 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-kl977" Dec 01 11:52:12 crc kubenswrapper[4958]: I1201 11:52:12.671585 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-kl977"] Dec 01 11:52:12 crc kubenswrapper[4958]: W1201 11:52:12.677434 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5f5b51b_bc8c_472a_84ee_529d091dcd61.slice/crio-5bb0146bd2e5e2b577a10bfbff984375b1d3002d3007968b95152d8337fabe37 WatchSource:0}: Error finding container 5bb0146bd2e5e2b577a10bfbff984375b1d3002d3007968b95152d8337fabe37: Status 404 returned error can't find the container with id 5bb0146bd2e5e2b577a10bfbff984375b1d3002d3007968b95152d8337fabe37 Dec 01 11:52:12 crc kubenswrapper[4958]: I1201 11:52:12.786979 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-kl977" event={"ID":"d5f5b51b-bc8c-472a-84ee-529d091dcd61","Type":"ContainerStarted","Data":"5bb0146bd2e5e2b577a10bfbff984375b1d3002d3007968b95152d8337fabe37"} Dec 01 11:52:13 crc kubenswrapper[4958]: I1201 11:52:13.806275 4958 generic.go:334] "Generic (PLEG): container finished" podID="d5f5b51b-bc8c-472a-84ee-529d091dcd61" containerID="3ae44865d4f762ce64b42289c02331119f04af0c07027618ad2e239b4e497ce2" exitCode=0 Dec 01 11:52:13 crc kubenswrapper[4958]: I1201 11:52:13.820530 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-kl977" event={"ID":"d5f5b51b-bc8c-472a-84ee-529d091dcd61","Type":"ContainerDied","Data":"3ae44865d4f762ce64b42289c02331119f04af0c07027618ad2e239b4e497ce2"} Dec 01 11:52:15 crc kubenswrapper[4958]: I1201 11:52:15.317896 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-kl977" Dec 01 11:52:15 crc kubenswrapper[4958]: I1201 11:52:15.402730 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp9kj\" (UniqueName: \"kubernetes.io/projected/d5f5b51b-bc8c-472a-84ee-529d091dcd61-kube-api-access-gp9kj\") pod \"d5f5b51b-bc8c-472a-84ee-529d091dcd61\" (UID: \"d5f5b51b-bc8c-472a-84ee-529d091dcd61\") " Dec 01 11:52:15 crc kubenswrapper[4958]: I1201 11:52:15.408724 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5f5b51b-bc8c-472a-84ee-529d091dcd61-kube-api-access-gp9kj" (OuterVolumeSpecName: "kube-api-access-gp9kj") pod "d5f5b51b-bc8c-472a-84ee-529d091dcd61" (UID: "d5f5b51b-bc8c-472a-84ee-529d091dcd61"). InnerVolumeSpecName "kube-api-access-gp9kj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:52:15 crc kubenswrapper[4958]: I1201 11:52:15.506181 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp9kj\" (UniqueName: \"kubernetes.io/projected/d5f5b51b-bc8c-472a-84ee-529d091dcd61-kube-api-access-gp9kj\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:15 crc kubenswrapper[4958]: I1201 11:52:15.837979 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-kl977" event={"ID":"d5f5b51b-bc8c-472a-84ee-529d091dcd61","Type":"ContainerDied","Data":"5bb0146bd2e5e2b577a10bfbff984375b1d3002d3007968b95152d8337fabe37"} Dec 01 11:52:15 crc kubenswrapper[4958]: I1201 11:52:15.838329 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bb0146bd2e5e2b577a10bfbff984375b1d3002d3007968b95152d8337fabe37" Dec 01 11:52:15 crc kubenswrapper[4958]: I1201 11:52:15.838050 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-kl977" Dec 01 11:52:21 crc kubenswrapper[4958]: I1201 11:52:21.949063 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-a18f-account-create-vd96s"] Dec 01 11:52:21 crc kubenswrapper[4958]: E1201 11:52:21.950358 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f5b51b-bc8c-472a-84ee-529d091dcd61" containerName="mariadb-database-create" Dec 01 11:52:21 crc kubenswrapper[4958]: I1201 11:52:21.950376 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f5b51b-bc8c-472a-84ee-529d091dcd61" containerName="mariadb-database-create" Dec 01 11:52:21 crc kubenswrapper[4958]: I1201 11:52:21.950654 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f5b51b-bc8c-472a-84ee-529d091dcd61" containerName="mariadb-database-create" Dec 01 11:52:21 crc kubenswrapper[4958]: I1201 11:52:21.951672 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-a18f-account-create-vd96s" Dec 01 11:52:21 crc kubenswrapper[4958]: I1201 11:52:21.956301 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 01 11:52:21 crc kubenswrapper[4958]: I1201 11:52:21.967442 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-a18f-account-create-vd96s"] Dec 01 11:52:22 crc kubenswrapper[4958]: I1201 11:52:22.086860 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5k7x\" (UniqueName: \"kubernetes.io/projected/0286bf3f-7cae-4b16-8be5-d36b4e94198f-kube-api-access-k5k7x\") pod \"aodh-a18f-account-create-vd96s\" (UID: \"0286bf3f-7cae-4b16-8be5-d36b4e94198f\") " pod="openstack/aodh-a18f-account-create-vd96s" Dec 01 11:52:22 crc kubenswrapper[4958]: I1201 11:52:22.189278 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5k7x\" (UniqueName: \"kubernetes.io/projected/0286bf3f-7cae-4b16-8be5-d36b4e94198f-kube-api-access-k5k7x\") pod \"aodh-a18f-account-create-vd96s\" (UID: \"0286bf3f-7cae-4b16-8be5-d36b4e94198f\") " pod="openstack/aodh-a18f-account-create-vd96s" Dec 01 11:52:22 crc kubenswrapper[4958]: I1201 11:52:22.220961 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5k7x\" (UniqueName: \"kubernetes.io/projected/0286bf3f-7cae-4b16-8be5-d36b4e94198f-kube-api-access-k5k7x\") pod \"aodh-a18f-account-create-vd96s\" (UID: \"0286bf3f-7cae-4b16-8be5-d36b4e94198f\") " pod="openstack/aodh-a18f-account-create-vd96s" Dec 01 11:52:22 crc kubenswrapper[4958]: I1201 11:52:22.287582 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-a18f-account-create-vd96s" Dec 01 11:52:22 crc kubenswrapper[4958]: I1201 11:52:22.850417 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-a18f-account-create-vd96s"] Dec 01 11:52:22 crc kubenswrapper[4958]: W1201 11:52:22.854391 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0286bf3f_7cae_4b16_8be5_d36b4e94198f.slice/crio-5d99c9d81f8784c799fa646a899e118b5a9c207ba7bba0ee4bd925aee720b414 WatchSource:0}: Error finding container 5d99c9d81f8784c799fa646a899e118b5a9c207ba7bba0ee4bd925aee720b414: Status 404 returned error can't find the container with id 5d99c9d81f8784c799fa646a899e118b5a9c207ba7bba0ee4bd925aee720b414 Dec 01 11:52:22 crc kubenswrapper[4958]: I1201 11:52:22.916999 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-a18f-account-create-vd96s" event={"ID":"0286bf3f-7cae-4b16-8be5-d36b4e94198f","Type":"ContainerStarted","Data":"5d99c9d81f8784c799fa646a899e118b5a9c207ba7bba0ee4bd925aee720b414"} Dec 01 11:52:23 crc kubenswrapper[4958]: I1201 11:52:23.931335 4958 generic.go:334] "Generic (PLEG): container finished" podID="0286bf3f-7cae-4b16-8be5-d36b4e94198f" containerID="e5d08fbd2d3e26cb17444ed39d88772fbc97f46e6e3178f1f0602378251b0548" exitCode=0 Dec 01 11:52:23 crc kubenswrapper[4958]: I1201 11:52:23.931435 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-a18f-account-create-vd96s" event={"ID":"0286bf3f-7cae-4b16-8be5-d36b4e94198f","Type":"ContainerDied","Data":"e5d08fbd2d3e26cb17444ed39d88772fbc97f46e6e3178f1f0602378251b0548"} Dec 01 11:52:25 crc kubenswrapper[4958]: I1201 11:52:25.358684 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-a18f-account-create-vd96s" Dec 01 11:52:25 crc kubenswrapper[4958]: I1201 11:52:25.547831 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5k7x\" (UniqueName: \"kubernetes.io/projected/0286bf3f-7cae-4b16-8be5-d36b4e94198f-kube-api-access-k5k7x\") pod \"0286bf3f-7cae-4b16-8be5-d36b4e94198f\" (UID: \"0286bf3f-7cae-4b16-8be5-d36b4e94198f\") " Dec 01 11:52:25 crc kubenswrapper[4958]: I1201 11:52:25.553442 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0286bf3f-7cae-4b16-8be5-d36b4e94198f-kube-api-access-k5k7x" (OuterVolumeSpecName: "kube-api-access-k5k7x") pod "0286bf3f-7cae-4b16-8be5-d36b4e94198f" (UID: "0286bf3f-7cae-4b16-8be5-d36b4e94198f"). InnerVolumeSpecName "kube-api-access-k5k7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:52:25 crc kubenswrapper[4958]: I1201 11:52:25.651050 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5k7x\" (UniqueName: \"kubernetes.io/projected/0286bf3f-7cae-4b16-8be5-d36b4e94198f-kube-api-access-k5k7x\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:25 crc kubenswrapper[4958]: I1201 11:52:25.955882 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-a18f-account-create-vd96s" event={"ID":"0286bf3f-7cae-4b16-8be5-d36b4e94198f","Type":"ContainerDied","Data":"5d99c9d81f8784c799fa646a899e118b5a9c207ba7bba0ee4bd925aee720b414"} Dec 01 11:52:25 crc kubenswrapper[4958]: I1201 11:52:25.956268 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d99c9d81f8784c799fa646a899e118b5a9c207ba7bba0ee4bd925aee720b414" Dec 01 11:52:25 crc kubenswrapper[4958]: I1201 11:52:25.956108 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-a18f-account-create-vd96s" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.274817 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-p99r6"] Dec 01 11:52:27 crc kubenswrapper[4958]: E1201 11:52:27.275790 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0286bf3f-7cae-4b16-8be5-d36b4e94198f" containerName="mariadb-account-create" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.275808 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="0286bf3f-7cae-4b16-8be5-d36b4e94198f" containerName="mariadb-account-create" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.276070 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="0286bf3f-7cae-4b16-8be5-d36b4e94198f" containerName="mariadb-account-create" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.276876 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-p99r6" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.279929 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.280026 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.280207 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-t84zj" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.295791 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-p99r6"] Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.390909 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gtrx\" (UniqueName: \"kubernetes.io/projected/f466c4e0-ff69-4257-869c-e83acba8328e-kube-api-access-6gtrx\") pod \"aodh-db-sync-p99r6\" (UID: \"f466c4e0-ff69-4257-869c-e83acba8328e\") " pod="openstack/aodh-db-sync-p99r6" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.391212 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f466c4e0-ff69-4257-869c-e83acba8328e-combined-ca-bundle\") pod \"aodh-db-sync-p99r6\" (UID: \"f466c4e0-ff69-4257-869c-e83acba8328e\") " pod="openstack/aodh-db-sync-p99r6" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.391506 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f466c4e0-ff69-4257-869c-e83acba8328e-config-data\") pod \"aodh-db-sync-p99r6\" (UID: \"f466c4e0-ff69-4257-869c-e83acba8328e\") " pod="openstack/aodh-db-sync-p99r6" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.392068 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f466c4e0-ff69-4257-869c-e83acba8328e-scripts\") pod \"aodh-db-sync-p99r6\" (UID: \"f466c4e0-ff69-4257-869c-e83acba8328e\") " pod="openstack/aodh-db-sync-p99r6" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.494081 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gtrx\" (UniqueName: \"kubernetes.io/projected/f466c4e0-ff69-4257-869c-e83acba8328e-kube-api-access-6gtrx\") pod \"aodh-db-sync-p99r6\" (UID: \"f466c4e0-ff69-4257-869c-e83acba8328e\") " pod="openstack/aodh-db-sync-p99r6" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.494151 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f466c4e0-ff69-4257-869c-e83acba8328e-combined-ca-bundle\") pod \"aodh-db-sync-p99r6\" (UID: \"f466c4e0-ff69-4257-869c-e83acba8328e\") " pod="openstack/aodh-db-sync-p99r6" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.494187 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f466c4e0-ff69-4257-869c-e83acba8328e-config-data\") pod \"aodh-db-sync-p99r6\" (UID: \"f466c4e0-ff69-4257-869c-e83acba8328e\") " pod="openstack/aodh-db-sync-p99r6" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.494273 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f466c4e0-ff69-4257-869c-e83acba8328e-scripts\") pod \"aodh-db-sync-p99r6\" (UID: \"f466c4e0-ff69-4257-869c-e83acba8328e\") " pod="openstack/aodh-db-sync-p99r6" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.500827 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f466c4e0-ff69-4257-869c-e83acba8328e-scripts\") pod \"aodh-db-sync-p99r6\" (UID: \"f466c4e0-ff69-4257-869c-e83acba8328e\") " pod="openstack/aodh-db-sync-p99r6" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.500952 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f466c4e0-ff69-4257-869c-e83acba8328e-combined-ca-bundle\") pod \"aodh-db-sync-p99r6\" (UID: \"f466c4e0-ff69-4257-869c-e83acba8328e\") " pod="openstack/aodh-db-sync-p99r6" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.502696 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f466c4e0-ff69-4257-869c-e83acba8328e-config-data\") pod \"aodh-db-sync-p99r6\" (UID: \"f466c4e0-ff69-4257-869c-e83acba8328e\") " pod="openstack/aodh-db-sync-p99r6" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.515292 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gtrx\" (UniqueName: \"kubernetes.io/projected/f466c4e0-ff69-4257-869c-e83acba8328e-kube-api-access-6gtrx\") pod \"aodh-db-sync-p99r6\" (UID: \"f466c4e0-ff69-4257-869c-e83acba8328e\") " pod="openstack/aodh-db-sync-p99r6" Dec 01 11:52:27 crc kubenswrapper[4958]: I1201 11:52:27.597458 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-p99r6" Dec 01 11:52:28 crc kubenswrapper[4958]: I1201 11:52:28.367577 4958 scope.go:117] "RemoveContainer" containerID="91afaee0e12431a85fdc4e2a2e3fe78638367e7086d12a1eaeb966d9c084fdfc" Dec 01 11:52:28 crc kubenswrapper[4958]: I1201 11:52:28.453809 4958 scope.go:117] "RemoveContainer" containerID="253a387d6e2a43ba415e346df479b409acb30bdb5936a2e10b5acc64f6bac87c" Dec 01 11:52:28 crc kubenswrapper[4958]: I1201 11:52:28.483640 4958 scope.go:117] "RemoveContainer" containerID="433ee3aa21c9e05d2b5fc308fd14344d8440ba40d7125e0c83738b7900e9418e" Dec 01 11:52:28 crc kubenswrapper[4958]: I1201 11:52:28.723298 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-p99r6"] Dec 01 11:52:29 crc kubenswrapper[4958]: I1201 11:52:29.403236 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-p99r6" event={"ID":"f466c4e0-ff69-4257-869c-e83acba8328e","Type":"ContainerStarted","Data":"4742026c82506e33a8631101befc90737d84e52c251656e73773bec3006cc56d"} Dec 01 11:52:30 crc kubenswrapper[4958]: I1201 11:52:30.359620 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 11:52:34 crc kubenswrapper[4958]: I1201 11:52:34.458689 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-p99r6" event={"ID":"f466c4e0-ff69-4257-869c-e83acba8328e","Type":"ContainerStarted","Data":"3d48a1f636b4565c60ce545421ad662e2f30a907ee3ff71fb35e7864478bf38f"} Dec 01 11:52:34 crc kubenswrapper[4958]: I1201 11:52:34.486531 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-p99r6" podStartSLOduration=2.449137455 podStartE2EDuration="7.486512214s" podCreationTimestamp="2025-12-01 11:52:27 +0000 UTC" 
firstStartedPulling="2025-12-01 11:52:28.729949819 +0000 UTC m=+6796.238738856" lastFinishedPulling="2025-12-01 11:52:33.767324578 +0000 UTC m=+6801.276113615" observedRunningTime="2025-12-01 11:52:34.477879119 +0000 UTC m=+6801.986668176" watchObservedRunningTime="2025-12-01 11:52:34.486512214 +0000 UTC m=+6801.995301251" Dec 01 11:52:36 crc kubenswrapper[4958]: I1201 11:52:36.490917 4958 generic.go:334] "Generic (PLEG): container finished" podID="f466c4e0-ff69-4257-869c-e83acba8328e" containerID="3d48a1f636b4565c60ce545421ad662e2f30a907ee3ff71fb35e7864478bf38f" exitCode=0 Dec 01 11:52:36 crc kubenswrapper[4958]: I1201 11:52:36.491063 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-p99r6" event={"ID":"f466c4e0-ff69-4257-869c-e83acba8328e","Type":"ContainerDied","Data":"3d48a1f636b4565c60ce545421ad662e2f30a907ee3ff71fb35e7864478bf38f"} Dec 01 11:52:38 crc kubenswrapper[4958]: I1201 11:52:38.042595 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-p99r6" Dec 01 11:52:38 crc kubenswrapper[4958]: I1201 11:52:38.097653 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f466c4e0-ff69-4257-869c-e83acba8328e-config-data\") pod \"f466c4e0-ff69-4257-869c-e83acba8328e\" (UID: \"f466c4e0-ff69-4257-869c-e83acba8328e\") " Dec 01 11:52:38 crc kubenswrapper[4958]: I1201 11:52:38.097722 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gtrx\" (UniqueName: \"kubernetes.io/projected/f466c4e0-ff69-4257-869c-e83acba8328e-kube-api-access-6gtrx\") pod \"f466c4e0-ff69-4257-869c-e83acba8328e\" (UID: \"f466c4e0-ff69-4257-869c-e83acba8328e\") " Dec 01 11:52:38 crc kubenswrapper[4958]: I1201 11:52:38.097807 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f466c4e0-ff69-4257-869c-e83acba8328e-combined-ca-bundle\") pod \"f466c4e0-ff69-4257-869c-e83acba8328e\" (UID: \"f466c4e0-ff69-4257-869c-e83acba8328e\") " Dec 01 11:52:38 crc kubenswrapper[4958]: I1201 11:52:38.097866 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f466c4e0-ff69-4257-869c-e83acba8328e-scripts\") pod \"f466c4e0-ff69-4257-869c-e83acba8328e\" (UID: \"f466c4e0-ff69-4257-869c-e83acba8328e\") " Dec 01 11:52:38 crc kubenswrapper[4958]: I1201 11:52:38.104480 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f466c4e0-ff69-4257-869c-e83acba8328e-kube-api-access-6gtrx" (OuterVolumeSpecName: "kube-api-access-6gtrx") pod "f466c4e0-ff69-4257-869c-e83acba8328e" (UID: "f466c4e0-ff69-4257-869c-e83acba8328e"). InnerVolumeSpecName "kube-api-access-6gtrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:52:38 crc kubenswrapper[4958]: I1201 11:52:38.110031 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f466c4e0-ff69-4257-869c-e83acba8328e-scripts" (OuterVolumeSpecName: "scripts") pod "f466c4e0-ff69-4257-869c-e83acba8328e" (UID: "f466c4e0-ff69-4257-869c-e83acba8328e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:52:38 crc kubenswrapper[4958]: I1201 11:52:38.129625 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f466c4e0-ff69-4257-869c-e83acba8328e-config-data" (OuterVolumeSpecName: "config-data") pod "f466c4e0-ff69-4257-869c-e83acba8328e" (UID: "f466c4e0-ff69-4257-869c-e83acba8328e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:52:38 crc kubenswrapper[4958]: I1201 11:52:38.140193 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f466c4e0-ff69-4257-869c-e83acba8328e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f466c4e0-ff69-4257-869c-e83acba8328e" (UID: "f466c4e0-ff69-4257-869c-e83acba8328e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:52:38 crc kubenswrapper[4958]: I1201 11:52:38.199948 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f466c4e0-ff69-4257-869c-e83acba8328e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:38 crc kubenswrapper[4958]: I1201 11:52:38.199987 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gtrx\" (UniqueName: \"kubernetes.io/projected/f466c4e0-ff69-4257-869c-e83acba8328e-kube-api-access-6gtrx\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:38 crc kubenswrapper[4958]: I1201 11:52:38.199999 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f466c4e0-ff69-4257-869c-e83acba8328e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:38 crc kubenswrapper[4958]: I1201 11:52:38.200006 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f466c4e0-ff69-4257-869c-e83acba8328e-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:38 crc kubenswrapper[4958]: I1201 11:52:38.516659 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-p99r6" event={"ID":"f466c4e0-ff69-4257-869c-e83acba8328e","Type":"ContainerDied","Data":"4742026c82506e33a8631101befc90737d84e52c251656e73773bec3006cc56d"} Dec 01 11:52:38 crc kubenswrapper[4958]: I1201 11:52:38.517052 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4742026c82506e33a8631101befc90737d84e52c251656e73773bec3006cc56d" Dec 01 11:52:38 crc kubenswrapper[4958]: I1201 11:52:38.516781 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-p99r6" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.373981 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 01 11:52:42 crc kubenswrapper[4958]: E1201 11:52:42.375963 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f466c4e0-ff69-4257-869c-e83acba8328e" containerName="aodh-db-sync" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.376002 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f466c4e0-ff69-4257-869c-e83acba8328e" containerName="aodh-db-sync" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.376299 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f466c4e0-ff69-4257-869c-e83acba8328e" containerName="aodh-db-sync" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.379008 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.383287 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.383555 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.388586 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-t84zj" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.410794 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.497430 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920335c0-76db-45f4-abb1-4bbcdff5e47d-combined-ca-bundle\") pod \"aodh-0\" (UID: \"920335c0-76db-45f4-abb1-4bbcdff5e47d\") " pod="openstack/aodh-0" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.497605 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb5cz\" (UniqueName: \"kubernetes.io/projected/920335c0-76db-45f4-abb1-4bbcdff5e47d-kube-api-access-jb5cz\") pod \"aodh-0\" (UID: \"920335c0-76db-45f4-abb1-4bbcdff5e47d\") " pod="openstack/aodh-0" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.497645 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/920335c0-76db-45f4-abb1-4bbcdff5e47d-config-data\") pod \"aodh-0\" (UID: \"920335c0-76db-45f4-abb1-4bbcdff5e47d\") " pod="openstack/aodh-0" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.499016 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/920335c0-76db-45f4-abb1-4bbcdff5e47d-scripts\") pod \"aodh-0\" (UID: \"920335c0-76db-45f4-abb1-4bbcdff5e47d\") " pod="openstack/aodh-0" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.601096 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb5cz\" (UniqueName: \"kubernetes.io/projected/920335c0-76db-45f4-abb1-4bbcdff5e47d-kube-api-access-jb5cz\") pod \"aodh-0\" (UID: \"920335c0-76db-45f4-abb1-4bbcdff5e47d\") " pod="openstack/aodh-0" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.601172 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/920335c0-76db-45f4-abb1-4bbcdff5e47d-config-data\") pod \"aodh-0\" (UID: \"920335c0-76db-45f4-abb1-4bbcdff5e47d\") " pod="openstack/aodh-0" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.601330 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/920335c0-76db-45f4-abb1-4bbcdff5e47d-scripts\") pod \"aodh-0\" (UID: \"920335c0-76db-45f4-abb1-4bbcdff5e47d\") " pod="openstack/aodh-0" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.601357 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920335c0-76db-45f4-abb1-4bbcdff5e47d-combined-ca-bundle\") pod \"aodh-0\" (UID: \"920335c0-76db-45f4-abb1-4bbcdff5e47d\") " pod="openstack/aodh-0" Dec 01 11:52:42 crc kubenswrapper[4958]: 
I1201 11:52:42.607077 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/920335c0-76db-45f4-abb1-4bbcdff5e47d-scripts\") pod \"aodh-0\" (UID: \"920335c0-76db-45f4-abb1-4bbcdff5e47d\") " pod="openstack/aodh-0" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.607935 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920335c0-76db-45f4-abb1-4bbcdff5e47d-combined-ca-bundle\") pod \"aodh-0\" (UID: \"920335c0-76db-45f4-abb1-4bbcdff5e47d\") " pod="openstack/aodh-0" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.608990 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/920335c0-76db-45f4-abb1-4bbcdff5e47d-config-data\") pod \"aodh-0\" (UID: \"920335c0-76db-45f4-abb1-4bbcdff5e47d\") " pod="openstack/aodh-0" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.621829 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb5cz\" (UniqueName: \"kubernetes.io/projected/920335c0-76db-45f4-abb1-4bbcdff5e47d-kube-api-access-jb5cz\") pod \"aodh-0\" (UID: \"920335c0-76db-45f4-abb1-4bbcdff5e47d\") " pod="openstack/aodh-0" Dec 01 11:52:42 crc kubenswrapper[4958]: I1201 11:52:42.706721 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 01 11:52:43 crc kubenswrapper[4958]: I1201 11:52:43.279705 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 01 11:52:43 crc kubenswrapper[4958]: I1201 11:52:43.587359 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"920335c0-76db-45f4-abb1-4bbcdff5e47d","Type":"ContainerStarted","Data":"60551879e1c76cfc8d2fbc11b1131d8d43f4ff13bc4a106a9fcac7e9a0224aab"} Dec 01 11:52:43 crc kubenswrapper[4958]: I1201 11:52:43.605451 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 11:52:43 crc kubenswrapper[4958]: I1201 11:52:43.621144 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="90e696af-5f31-45c4-8b98-68420641252f" containerName="proxy-httpd" containerID="cri-o://03ccd41ab0c3d3cb1266a9395a19c6283c68064627ff27b07dbd5c517ba166a8" gracePeriod=30 Dec 01 11:52:43 crc kubenswrapper[4958]: I1201 11:52:43.621534 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="90e696af-5f31-45c4-8b98-68420641252f" containerName="sg-core" containerID="cri-o://4f508e08192756f06d83b65dfa9aec2a82a6e205ba3c9c1257d82a939bb4d6f1" gracePeriod=30 Dec 01 11:52:43 crc kubenswrapper[4958]: I1201 11:52:43.621615 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="90e696af-5f31-45c4-8b98-68420641252f" containerName="ceilometer-notification-agent" containerID="cri-o://8871a39945aca48b9b93f8a51449ca400c96d4f3396bbc975addf67407f36c7e" gracePeriod=30 Dec 01 11:52:43 crc kubenswrapper[4958]: I1201 11:52:43.624094 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="90e696af-5f31-45c4-8b98-68420641252f" containerName="ceilometer-central-agent" containerID="cri-o://fa8e8c4d31aa83928bfb4e27f4f9c9749df29e07331eb95eb7322c24b838bf36" gracePeriod=30 Dec 01 11:52:44 crc kubenswrapper[4958]: I1201 11:52:44.599481 4958 generic.go:334] "Generic (PLEG): container 
finished" podID="90e696af-5f31-45c4-8b98-68420641252f" containerID="03ccd41ab0c3d3cb1266a9395a19c6283c68064627ff27b07dbd5c517ba166a8" exitCode=0 Dec 01 11:52:44 crc kubenswrapper[4958]: I1201 11:52:44.599783 4958 generic.go:334] "Generic (PLEG): container finished" podID="90e696af-5f31-45c4-8b98-68420641252f" containerID="4f508e08192756f06d83b65dfa9aec2a82a6e205ba3c9c1257d82a939bb4d6f1" exitCode=2 Dec 01 11:52:44 crc kubenswrapper[4958]: I1201 11:52:44.599796 4958 generic.go:334] "Generic (PLEG): container finished" podID="90e696af-5f31-45c4-8b98-68420641252f" containerID="fa8e8c4d31aa83928bfb4e27f4f9c9749df29e07331eb95eb7322c24b838bf36" exitCode=0 Dec 01 11:52:44 crc kubenswrapper[4958]: I1201 11:52:44.599552 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90e696af-5f31-45c4-8b98-68420641252f","Type":"ContainerDied","Data":"03ccd41ab0c3d3cb1266a9395a19c6283c68064627ff27b07dbd5c517ba166a8"} Dec 01 11:52:44 crc kubenswrapper[4958]: I1201 11:52:44.599866 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90e696af-5f31-45c4-8b98-68420641252f","Type":"ContainerDied","Data":"4f508e08192756f06d83b65dfa9aec2a82a6e205ba3c9c1257d82a939bb4d6f1"} Dec 01 11:52:44 crc kubenswrapper[4958]: I1201 11:52:44.599884 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90e696af-5f31-45c4-8b98-68420641252f","Type":"ContainerDied","Data":"fa8e8c4d31aa83928bfb4e27f4f9c9749df29e07331eb95eb7322c24b838bf36"} Dec 01 11:52:45 crc kubenswrapper[4958]: I1201 11:52:45.634755 4958 generic.go:334] "Generic (PLEG): container finished" podID="90e696af-5f31-45c4-8b98-68420641252f" containerID="8871a39945aca48b9b93f8a51449ca400c96d4f3396bbc975addf67407f36c7e" exitCode=0 Dec 01 11:52:45 crc kubenswrapper[4958]: I1201 11:52:45.634901 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90e696af-5f31-45c4-8b98-68420641252f","Type":"ContainerDied","Data":"8871a39945aca48b9b93f8a51449ca400c96d4f3396bbc975addf67407f36c7e"} Dec 01 11:52:45 crc kubenswrapper[4958]: I1201 11:52:45.645372 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"920335c0-76db-45f4-abb1-4bbcdff5e47d","Type":"ContainerStarted","Data":"c1ada2e5ba97a7646fcd78ffaf20054b815b6e9ebd7684ccda883f191c67e60f"} Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.003292 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.091340 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-scripts\") pod \"90e696af-5f31-45c4-8b98-68420641252f\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.091419 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-config-data\") pod \"90e696af-5f31-45c4-8b98-68420641252f\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.091447 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-sg-core-conf-yaml\") pod \"90e696af-5f31-45c4-8b98-68420641252f\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.091495 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90e696af-5f31-45c4-8b98-68420641252f-log-httpd\") pod \"90e696af-5f31-45c4-8b98-68420641252f\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.091525 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-combined-ca-bundle\") pod \"90e696af-5f31-45c4-8b98-68420641252f\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.091615 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd872\" (UniqueName: \"kubernetes.io/projected/90e696af-5f31-45c4-8b98-68420641252f-kube-api-access-rd872\") pod \"90e696af-5f31-45c4-8b98-68420641252f\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.091740 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90e696af-5f31-45c4-8b98-68420641252f-run-httpd\") pod \"90e696af-5f31-45c4-8b98-68420641252f\" (UID: \"90e696af-5f31-45c4-8b98-68420641252f\") " Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.095060 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90e696af-5f31-45c4-8b98-68420641252f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "90e696af-5f31-45c4-8b98-68420641252f" (UID: "90e696af-5f31-45c4-8b98-68420641252f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.095398 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90e696af-5f31-45c4-8b98-68420641252f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "90e696af-5f31-45c4-8b98-68420641252f" (UID: "90e696af-5f31-45c4-8b98-68420641252f"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.099325 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e696af-5f31-45c4-8b98-68420641252f-kube-api-access-rd872" (OuterVolumeSpecName: "kube-api-access-rd872") pod "90e696af-5f31-45c4-8b98-68420641252f" (UID: "90e696af-5f31-45c4-8b98-68420641252f"). InnerVolumeSpecName "kube-api-access-rd872". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.102440 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-scripts" (OuterVolumeSpecName: "scripts") pod "90e696af-5f31-45c4-8b98-68420641252f" (UID: "90e696af-5f31-45c4-8b98-68420641252f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.136133 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "90e696af-5f31-45c4-8b98-68420641252f" (UID: "90e696af-5f31-45c4-8b98-68420641252f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.190019 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90e696af-5f31-45c4-8b98-68420641252f" (UID: "90e696af-5f31-45c4-8b98-68420641252f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.194697 4958 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.194723 4958 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90e696af-5f31-45c4-8b98-68420641252f-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.194732 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.194743 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd872\" (UniqueName: \"kubernetes.io/projected/90e696af-5f31-45c4-8b98-68420641252f-kube-api-access-rd872\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.194755 4958 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90e696af-5f31-45c4-8b98-68420641252f-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.194763 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.220496 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-config-data" (OuterVolumeSpecName: "config-data") pod "90e696af-5f31-45c4-8b98-68420641252f" (UID: "90e696af-5f31-45c4-8b98-68420641252f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.296881 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e696af-5f31-45c4-8b98-68420641252f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.662948 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"90e696af-5f31-45c4-8b98-68420641252f","Type":"ContainerDied","Data":"c8cb74343fe1207022fcbbf735f61494a162813fb73599700617afbd744ff289"} Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.663031 4958 scope.go:117] "RemoveContainer" containerID="03ccd41ab0c3d3cb1266a9395a19c6283c68064627ff27b07dbd5c517ba166a8" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.663065 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.690637 4958 scope.go:117] "RemoveContainer" containerID="4f508e08192756f06d83b65dfa9aec2a82a6e205ba3c9c1257d82a939bb4d6f1" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.732922 4958 scope.go:117] "RemoveContainer" containerID="8871a39945aca48b9b93f8a51449ca400c96d4f3396bbc975addf67407f36c7e" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.756386 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.789154 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.795887 4958 scope.go:117] "RemoveContainer" containerID="fa8e8c4d31aa83928bfb4e27f4f9c9749df29e07331eb95eb7322c24b838bf36" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.804859 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 11:52:46 crc kubenswrapper[4958]: E1201 11:52:46.805550 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e696af-5f31-45c4-8b98-68420641252f" containerName="ceilometer-notification-agent" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.805574 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e696af-5f31-45c4-8b98-68420641252f" containerName="ceilometer-notification-agent" Dec 01 11:52:46 crc kubenswrapper[4958]: E1201 11:52:46.805672 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e696af-5f31-45c4-8b98-68420641252f" containerName="ceilometer-central-agent" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.805687 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e696af-5f31-45c4-8b98-68420641252f" containerName="ceilometer-central-agent" Dec 01 11:52:46 crc kubenswrapper[4958]: E1201 11:52:46.805723 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e696af-5f31-45c4-8b98-68420641252f" containerName="proxy-httpd" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.805734 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e696af-5f31-45c4-8b98-68420641252f" containerName="proxy-httpd" Dec 01 11:52:46 crc kubenswrapper[4958]: E1201 11:52:46.805747 4958 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="90e696af-5f31-45c4-8b98-68420641252f" containerName="sg-core" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.805755 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e696af-5f31-45c4-8b98-68420641252f" containerName="sg-core" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.809199 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e696af-5f31-45c4-8b98-68420641252f" containerName="proxy-httpd" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.809258 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e696af-5f31-45c4-8b98-68420641252f" containerName="ceilometer-central-agent" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.809281 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e696af-5f31-45c4-8b98-68420641252f" containerName="sg-core" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.809305 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e696af-5f31-45c4-8b98-68420641252f" containerName="ceilometer-notification-agent" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.812018 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.817490 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.817817 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.826339 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.916650 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-scripts\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.916786 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.916888 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-config-data\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.916935 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-log-httpd\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.916996 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-run-httpd\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " 
pod="openstack/ceilometer-0" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.917088 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:46 crc kubenswrapper[4958]: I1201 11:52:46.917179 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbwxd\" (UniqueName: \"kubernetes.io/projected/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-kube-api-access-fbwxd\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:47 crc kubenswrapper[4958]: I1201 11:52:47.019570 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-scripts\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:47 crc kubenswrapper[4958]: I1201 11:52:47.019651 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:47 crc kubenswrapper[4958]: I1201 11:52:47.019710 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-config-data\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:47 crc kubenswrapper[4958]: I1201 11:52:47.019747 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-log-httpd\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:47 crc kubenswrapper[4958]: I1201 11:52:47.019817 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-run-httpd\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:47 crc kubenswrapper[4958]: I1201 11:52:47.019912 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:47 crc kubenswrapper[4958]: I1201 11:52:47.019994 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbwxd\" (UniqueName: \"kubernetes.io/projected/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-kube-api-access-fbwxd\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:47 crc kubenswrapper[4958]: I1201 11:52:47.020950 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-log-httpd\") pod \"ceilometer-0\" (UID: 
\"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:47 crc kubenswrapper[4958]: I1201 11:52:47.021056 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-run-httpd\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:47 crc kubenswrapper[4958]: I1201 11:52:47.029751 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:47 crc kubenswrapper[4958]: I1201 11:52:47.038196 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-config-data\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:47 crc kubenswrapper[4958]: I1201 11:52:47.038382 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:47 crc kubenswrapper[4958]: I1201 11:52:47.038969 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-scripts\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:47 crc kubenswrapper[4958]: I1201 11:52:47.053832 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbwxd\" (UniqueName: \"kubernetes.io/projected/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-kube-api-access-fbwxd\") pod \"ceilometer-0\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " pod="openstack/ceilometer-0" Dec 01 11:52:47 crc kubenswrapper[4958]: I1201 11:52:47.149363 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 11:52:47 crc kubenswrapper[4958]: I1201 11:52:47.666763 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 11:52:47 crc kubenswrapper[4958]: W1201 11:52:47.674373 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod491a5f1b_7744_4aa2_bafc_12fd3eb3f34b.slice/crio-aeb0d3f402d4965580222b6437188df10d49a04a03a33ac4fd03d684d7d60ac6 WatchSource:0}: Error finding container aeb0d3f402d4965580222b6437188df10d49a04a03a33ac4fd03d684d7d60ac6: Status 404 returned error can't find the container with id aeb0d3f402d4965580222b6437188df10d49a04a03a33ac4fd03d684d7d60ac6 Dec 01 11:52:47 crc kubenswrapper[4958]: I1201 11:52:47.688704 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"920335c0-76db-45f4-abb1-4bbcdff5e47d","Type":"ContainerStarted","Data":"99765c53e0a895a3711ce1a07cdfbb2037545e9f0bd3bffe7b8fc07d170a3a03"} Dec 01 11:52:47 crc kubenswrapper[4958]: I1201 11:52:47.819128 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90e696af-5f31-45c4-8b98-68420641252f" path="/var/lib/kubelet/pods/90e696af-5f31-45c4-8b98-68420641252f/volumes" Dec 01 11:52:48 crc kubenswrapper[4958]: I1201 11:52:48.711803 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"920335c0-76db-45f4-abb1-4bbcdff5e47d","Type":"ContainerStarted","Data":"89221e530fe77011a6726bc942e0be02bad6585dc3de9b0f56b02c8f118b92f2"} Dec 01 11:52:48 crc kubenswrapper[4958]: I1201 11:52:48.715787 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b","Type":"ContainerStarted","Data":"aeb0d3f402d4965580222b6437188df10d49a04a03a33ac4fd03d684d7d60ac6"} Dec 01 11:52:49 crc kubenswrapper[4958]: I1201 11:52:49.724494 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b","Type":"ContainerStarted","Data":"35a36952ed148e073ffd7d37463bb0ad6edb443d866abc800f1e2b522230e055"} Dec 01 11:52:50 crc kubenswrapper[4958]: I1201 11:52:50.735773 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b","Type":"ContainerStarted","Data":"03c6b29b1726a9911e58a6ba1e47b30e4b41a1c357399ded76437e23fd50e2ae"} Dec 01 11:52:50 crc kubenswrapper[4958]: I1201 11:52:50.739887 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"920335c0-76db-45f4-abb1-4bbcdff5e47d","Type":"ContainerStarted","Data":"6ddd3d5bba2a14c2d0f0ddc12877b89e4aabeaaec94bfd8beedef304c2d544de"} Dec 01 11:52:51 crc kubenswrapper[4958]: I1201 11:52:51.761444 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b","Type":"ContainerStarted","Data":"185d48e4a534dd3adefc724734af7615dfe4d05250b9d9420af9569f4de1987a"} Dec 01 11:52:53 crc kubenswrapper[4958]: I1201 11:52:53.828349 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b","Type":"ContainerStarted","Data":"3ae96a898072dcca1fb5c1ceebf9f5ef0d1e8b0ea86a0a50469c4275fd39daf9"} Dec 01 11:52:53 crc kubenswrapper[4958]: I1201 11:52:53.829041 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 11:52:53 crc 
kubenswrapper[4958]: I1201 11:52:53.854130 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=5.164244234 podStartE2EDuration="11.854107052s" podCreationTimestamp="2025-12-01 11:52:42 +0000 UTC" firstStartedPulling="2025-12-01 11:52:43.292235527 +0000 UTC m=+6810.801024564" lastFinishedPulling="2025-12-01 11:52:49.982098345 +0000 UTC m=+6817.490887382" observedRunningTime="2025-12-01 11:52:50.77414753 +0000 UTC m=+6818.282936567" watchObservedRunningTime="2025-12-01 11:52:53.854107052 +0000 UTC m=+6821.362896089" Dec 01 11:52:53 crc kubenswrapper[4958]: I1201 11:52:53.895691 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.066223785 podStartE2EDuration="7.895673142s" podCreationTimestamp="2025-12-01 11:52:46 +0000 UTC" firstStartedPulling="2025-12-01 11:52:47.676729202 +0000 UTC m=+6815.185518239" lastFinishedPulling="2025-12-01 11:52:52.506178559 +0000 UTC m=+6820.014967596" observedRunningTime="2025-12-01 11:52:53.884218247 +0000 UTC m=+6821.393007284" watchObservedRunningTime="2025-12-01 11:52:53.895673142 +0000 UTC m=+6821.404462179" Dec 01 11:52:56 crc kubenswrapper[4958]: I1201 11:52:56.428947 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-nxk9p"] Dec 01 11:52:56 crc kubenswrapper[4958]: I1201 11:52:56.431386 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-nxk9p" Dec 01 11:52:56 crc kubenswrapper[4958]: I1201 11:52:56.462090 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-nxk9p"] Dec 01 11:52:56 crc kubenswrapper[4958]: I1201 11:52:56.488441 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xnth\" (UniqueName: \"kubernetes.io/projected/53cce1d1-fae6-4fa7-89ec-1b47571a9933-kube-api-access-6xnth\") pod \"manila-db-create-nxk9p\" (UID: \"53cce1d1-fae6-4fa7-89ec-1b47571a9933\") " pod="openstack/manila-db-create-nxk9p" Dec 01 11:52:56 crc kubenswrapper[4958]: I1201 11:52:56.590529 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xnth\" (UniqueName: \"kubernetes.io/projected/53cce1d1-fae6-4fa7-89ec-1b47571a9933-kube-api-access-6xnth\") pod \"manila-db-create-nxk9p\" (UID: \"53cce1d1-fae6-4fa7-89ec-1b47571a9933\") " pod="openstack/manila-db-create-nxk9p" Dec 01 11:52:56 crc kubenswrapper[4958]: I1201 11:52:56.622619 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xnth\" (UniqueName: \"kubernetes.io/projected/53cce1d1-fae6-4fa7-89ec-1b47571a9933-kube-api-access-6xnth\") pod \"manila-db-create-nxk9p\" (UID: \"53cce1d1-fae6-4fa7-89ec-1b47571a9933\") " pod="openstack/manila-db-create-nxk9p" Dec 01 11:52:56 crc kubenswrapper[4958]: I1201 11:52:56.756669 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-nxk9p" Dec 01 11:52:57 crc kubenswrapper[4958]: W1201 11:52:57.293092 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53cce1d1_fae6_4fa7_89ec_1b47571a9933.slice/crio-3a0edae7079d35ece820ed71518601b8e0ceec6c8beed432682c3ae9cc6e628a WatchSource:0}: Error finding container 3a0edae7079d35ece820ed71518601b8e0ceec6c8beed432682c3ae9cc6e628a: Status 404 returned error can't find the container with id 3a0edae7079d35ece820ed71518601b8e0ceec6c8beed432682c3ae9cc6e628a Dec 01 11:52:57 crc kubenswrapper[4958]: I1201 11:52:57.301733 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-nxk9p"] Dec 01 11:52:57 crc kubenswrapper[4958]: I1201 11:52:57.891598 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-nxk9p" event={"ID":"53cce1d1-fae6-4fa7-89ec-1b47571a9933","Type":"ContainerDied","Data":"d2ef83e6d1751ccfa6ab221b3afdde7b734bc1ee8b4c2e0ac473c0787004c47d"} Dec 01 11:52:57 crc kubenswrapper[4958]: I1201 11:52:57.892292 4958 generic.go:334] "Generic (PLEG): container finished" podID="53cce1d1-fae6-4fa7-89ec-1b47571a9933" containerID="d2ef83e6d1751ccfa6ab221b3afdde7b734bc1ee8b4c2e0ac473c0787004c47d" exitCode=0 Dec 01 11:52:57 crc kubenswrapper[4958]: I1201 11:52:57.892378 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-nxk9p" event={"ID":"53cce1d1-fae6-4fa7-89ec-1b47571a9933","Type":"ContainerStarted","Data":"3a0edae7079d35ece820ed71518601b8e0ceec6c8beed432682c3ae9cc6e628a"} Dec 01 11:52:59 crc kubenswrapper[4958]: I1201 11:52:59.341667 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-nxk9p" Dec 01 11:52:59 crc kubenswrapper[4958]: I1201 11:52:59.468192 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xnth\" (UniqueName: \"kubernetes.io/projected/53cce1d1-fae6-4fa7-89ec-1b47571a9933-kube-api-access-6xnth\") pod \"53cce1d1-fae6-4fa7-89ec-1b47571a9933\" (UID: \"53cce1d1-fae6-4fa7-89ec-1b47571a9933\") " Dec 01 11:52:59 crc kubenswrapper[4958]: I1201 11:52:59.474602 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53cce1d1-fae6-4fa7-89ec-1b47571a9933-kube-api-access-6xnth" (OuterVolumeSpecName: "kube-api-access-6xnth") pod "53cce1d1-fae6-4fa7-89ec-1b47571a9933" (UID: "53cce1d1-fae6-4fa7-89ec-1b47571a9933"). InnerVolumeSpecName "kube-api-access-6xnth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:52:59 crc kubenswrapper[4958]: I1201 11:52:59.572435 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xnth\" (UniqueName: \"kubernetes.io/projected/53cce1d1-fae6-4fa7-89ec-1b47571a9933-kube-api-access-6xnth\") on node \"crc\" DevicePath \"\"" Dec 01 11:52:59 crc kubenswrapper[4958]: I1201 11:52:59.917177 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-nxk9p" event={"ID":"53cce1d1-fae6-4fa7-89ec-1b47571a9933","Type":"ContainerDied","Data":"3a0edae7079d35ece820ed71518601b8e0ceec6c8beed432682c3ae9cc6e628a"} Dec 01 11:52:59 crc kubenswrapper[4958]: I1201 11:52:59.917225 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a0edae7079d35ece820ed71518601b8e0ceec6c8beed432682c3ae9cc6e628a" Dec 01 11:52:59 crc kubenswrapper[4958]: I1201 11:52:59.917278 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-nxk9p" Dec 01 11:53:06 crc kubenswrapper[4958]: I1201 11:53:06.563398 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-bc9b-account-create-fjkr8"] Dec 01 11:53:06 crc kubenswrapper[4958]: E1201 11:53:06.564328 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cce1d1-fae6-4fa7-89ec-1b47571a9933" containerName="mariadb-database-create" Dec 01 11:53:06 crc kubenswrapper[4958]: I1201 11:53:06.564342 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cce1d1-fae6-4fa7-89ec-1b47571a9933" containerName="mariadb-database-create" Dec 01 11:53:06 crc kubenswrapper[4958]: I1201 11:53:06.564606 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="53cce1d1-fae6-4fa7-89ec-1b47571a9933" containerName="mariadb-database-create" Dec 01 11:53:06 crc kubenswrapper[4958]: I1201 11:53:06.565456 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-bc9b-account-create-fjkr8" Dec 01 11:53:06 crc kubenswrapper[4958]: I1201 11:53:06.568231 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Dec 01 11:53:06 crc kubenswrapper[4958]: I1201 11:53:06.589583 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-bc9b-account-create-fjkr8"] Dec 01 11:53:06 crc kubenswrapper[4958]: I1201 11:53:06.740272 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-896qh\" (UniqueName: \"kubernetes.io/projected/a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779-kube-api-access-896qh\") pod \"manila-bc9b-account-create-fjkr8\" (UID: \"a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779\") " pod="openstack/manila-bc9b-account-create-fjkr8" Dec 01 11:53:06 crc kubenswrapper[4958]: I1201 11:53:06.843457 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-896qh\" (UniqueName: \"kubernetes.io/projected/a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779-kube-api-access-896qh\") pod \"manila-bc9b-account-create-fjkr8\" (UID: \"a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779\") " pod="openstack/manila-bc9b-account-create-fjkr8" Dec 01 11:53:06 crc kubenswrapper[4958]: I1201 11:53:06.878423 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-896qh\" (UniqueName: \"kubernetes.io/projected/a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779-kube-api-access-896qh\") pod \"manila-bc9b-account-create-fjkr8\" (UID: \"a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779\") " pod="openstack/manila-bc9b-account-create-fjkr8" Dec 01 11:53:06 crc kubenswrapper[4958]: I1201 11:53:06.900145 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-bc9b-account-create-fjkr8" Dec 01 11:53:07 crc kubenswrapper[4958]: I1201 11:53:07.480382 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-bc9b-account-create-fjkr8"] Dec 01 11:53:07 crc kubenswrapper[4958]: W1201 11:53:07.484198 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1d8f3dc_cd11_4f1e_8b62_9e8ad0f92779.slice/crio-e1fa5640b277f539eb0c922be3b9ee945d2ee7db333028fdebb3da6c3fa100dc WatchSource:0}: Error finding container e1fa5640b277f539eb0c922be3b9ee945d2ee7db333028fdebb3da6c3fa100dc: Status 404 returned error can't find the container with id e1fa5640b277f539eb0c922be3b9ee945d2ee7db333028fdebb3da6c3fa100dc Dec 01 11:53:08 crc kubenswrapper[4958]: I1201 11:53:08.008316 4958 generic.go:334] "Generic (PLEG): container finished" podID="a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779" containerID="8899171db4db2e26e00e6d359ac2f8d8cfb881c6619d2c718a8c0ee53064de09" exitCode=0 Dec 01 11:53:08 crc kubenswrapper[4958]: I1201 11:53:08.008409 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-bc9b-account-create-fjkr8" event={"ID":"a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779","Type":"ContainerDied","Data":"8899171db4db2e26e00e6d359ac2f8d8cfb881c6619d2c718a8c0ee53064de09"} Dec 01 11:53:08 crc kubenswrapper[4958]: I1201 11:53:08.008920 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-bc9b-account-create-fjkr8" event={"ID":"a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779","Type":"ContainerStarted","Data":"e1fa5640b277f539eb0c922be3b9ee945d2ee7db333028fdebb3da6c3fa100dc"} Dec 01 11:53:09 crc kubenswrapper[4958]: I1201 11:53:09.512340 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-bc9b-account-create-fjkr8" Dec 01 11:53:09 crc kubenswrapper[4958]: I1201 11:53:09.648425 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-896qh\" (UniqueName: \"kubernetes.io/projected/a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779-kube-api-access-896qh\") pod \"a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779\" (UID: \"a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779\") " Dec 01 11:53:09 crc kubenswrapper[4958]: I1201 11:53:09.655977 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779-kube-api-access-896qh" (OuterVolumeSpecName: "kube-api-access-896qh") pod "a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779" (UID: "a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779"). InnerVolumeSpecName "kube-api-access-896qh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:53:09 crc kubenswrapper[4958]: I1201 11:53:09.751003 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-896qh\" (UniqueName: \"kubernetes.io/projected/a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779-kube-api-access-896qh\") on node \"crc\" DevicePath \"\"" Dec 01 11:53:10 crc kubenswrapper[4958]: I1201 11:53:10.038143 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-bc9b-account-create-fjkr8" event={"ID":"a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779","Type":"ContainerDied","Data":"e1fa5640b277f539eb0c922be3b9ee945d2ee7db333028fdebb3da6c3fa100dc"} Dec 01 11:53:10 crc kubenswrapper[4958]: I1201 11:53:10.038194 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-bc9b-account-create-fjkr8" Dec 01 11:53:10 crc kubenswrapper[4958]: I1201 11:53:10.038201 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1fa5640b277f539eb0c922be3b9ee945d2ee7db333028fdebb3da6c3fa100dc" Dec 01 11:53:11 crc kubenswrapper[4958]: I1201 11:53:11.880340 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-2fzfc"] Dec 01 11:53:11 crc kubenswrapper[4958]: E1201 11:53:11.881349 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779" containerName="mariadb-account-create" Dec 01 11:53:11 crc kubenswrapper[4958]: I1201 11:53:11.881367 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779" containerName="mariadb-account-create" Dec 01 11:53:11 crc kubenswrapper[4958]: I1201 11:53:11.881596 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779" containerName="mariadb-account-create" Dec 01 11:53:11 crc kubenswrapper[4958]: I1201 11:53:11.882471 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-2fzfc" Dec 01 11:53:11 crc kubenswrapper[4958]: I1201 11:53:11.884605 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-kpsck" Dec 01 11:53:11 crc kubenswrapper[4958]: I1201 11:53:11.884886 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 01 11:53:11 crc kubenswrapper[4958]: I1201 11:53:11.913123 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-2fzfc"] Dec 01 11:53:11 crc kubenswrapper[4958]: I1201 11:53:11.924796 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45598b35-f5b0-49b6-95e1-cae26b23a10c-combined-ca-bundle\") pod \"manila-db-sync-2fzfc\" (UID: \"45598b35-f5b0-49b6-95e1-cae26b23a10c\") " pod="openstack/manila-db-sync-2fzfc" Dec 01 11:53:11 crc kubenswrapper[4958]: I1201 11:53:11.924975 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45598b35-f5b0-49b6-95e1-cae26b23a10c-config-data\") pod \"manila-db-sync-2fzfc\" (UID: \"45598b35-f5b0-49b6-95e1-cae26b23a10c\") " pod="openstack/manila-db-sync-2fzfc" Dec 01 11:53:11 crc kubenswrapper[4958]: I1201 11:53:11.925055 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/45598b35-f5b0-49b6-95e1-cae26b23a10c-job-config-data\") pod \"manila-db-sync-2fzfc\" (UID: \"45598b35-f5b0-49b6-95e1-cae26b23a10c\") " pod="openstack/manila-db-sync-2fzfc" Dec 01 11:53:11 crc kubenswrapper[4958]: I1201 11:53:11.925292 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twmvq\" (UniqueName: \"kubernetes.io/projected/45598b35-f5b0-49b6-95e1-cae26b23a10c-kube-api-access-twmvq\") pod \"manila-db-sync-2fzfc\" (UID: \"45598b35-f5b0-49b6-95e1-cae26b23a10c\") " pod="openstack/manila-db-sync-2fzfc" Dec 01 11:53:12 crc kubenswrapper[4958]: I1201 11:53:12.027462 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/45598b35-f5b0-49b6-95e1-cae26b23a10c-job-config-data\") pod \"manila-db-sync-2fzfc\" (UID: \"45598b35-f5b0-49b6-95e1-cae26b23a10c\") " pod="openstack/manila-db-sync-2fzfc" Dec 01 11:53:12 crc kubenswrapper[4958]: I1201 11:53:12.027818 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twmvq\" (UniqueName: \"kubernetes.io/projected/45598b35-f5b0-49b6-95e1-cae26b23a10c-kube-api-access-twmvq\") pod \"manila-db-sync-2fzfc\" (UID: \"45598b35-f5b0-49b6-95e1-cae26b23a10c\") " pod="openstack/manila-db-sync-2fzfc" Dec 01 11:53:12 crc kubenswrapper[4958]: I1201 11:53:12.027891 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45598b35-f5b0-49b6-95e1-cae26b23a10c-combined-ca-bundle\") pod \"manila-db-sync-2fzfc\" (UID: \"45598b35-f5b0-49b6-95e1-cae26b23a10c\") " pod="openstack/manila-db-sync-2fzfc" Dec 01 11:53:12 crc kubenswrapper[4958]: I1201 11:53:12.027997 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45598b35-f5b0-49b6-95e1-cae26b23a10c-config-data\") pod \"manila-db-sync-2fzfc\" (UID: 
\"45598b35-f5b0-49b6-95e1-cae26b23a10c\") " pod="openstack/manila-db-sync-2fzfc" Dec 01 11:53:12 crc kubenswrapper[4958]: I1201 11:53:12.036185 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/45598b35-f5b0-49b6-95e1-cae26b23a10c-job-config-data\") pod \"manila-db-sync-2fzfc\" (UID: \"45598b35-f5b0-49b6-95e1-cae26b23a10c\") " pod="openstack/manila-db-sync-2fzfc" Dec 01 11:53:12 crc kubenswrapper[4958]: I1201 11:53:12.036445 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45598b35-f5b0-49b6-95e1-cae26b23a10c-combined-ca-bundle\") pod \"manila-db-sync-2fzfc\" (UID: \"45598b35-f5b0-49b6-95e1-cae26b23a10c\") " pod="openstack/manila-db-sync-2fzfc" Dec 01 11:53:12 crc kubenswrapper[4958]: I1201 11:53:12.036662 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45598b35-f5b0-49b6-95e1-cae26b23a10c-config-data\") pod \"manila-db-sync-2fzfc\" (UID: \"45598b35-f5b0-49b6-95e1-cae26b23a10c\") " pod="openstack/manila-db-sync-2fzfc" Dec 01 11:53:12 crc kubenswrapper[4958]: I1201 11:53:12.045826 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twmvq\" (UniqueName: \"kubernetes.io/projected/45598b35-f5b0-49b6-95e1-cae26b23a10c-kube-api-access-twmvq\") pod \"manila-db-sync-2fzfc\" (UID: \"45598b35-f5b0-49b6-95e1-cae26b23a10c\") " pod="openstack/manila-db-sync-2fzfc" Dec 01 11:53:12 crc kubenswrapper[4958]: I1201 11:53:12.222987 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-2fzfc" Dec 01 11:53:12 crc kubenswrapper[4958]: I1201 11:53:12.825713 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-2fzfc"] Dec 01 11:53:13 crc kubenswrapper[4958]: I1201 11:53:13.076031 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-2fzfc" event={"ID":"45598b35-f5b0-49b6-95e1-cae26b23a10c","Type":"ContainerStarted","Data":"70b87da7390fccbf1c276ab3e82caa78878c66fe90e0f1e5e93b75861625ad6b"} Dec 01 11:53:17 crc kubenswrapper[4958]: I1201 11:53:17.206021 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 11:53:20 crc kubenswrapper[4958]: I1201 11:53:20.180217 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-2fzfc" event={"ID":"45598b35-f5b0-49b6-95e1-cae26b23a10c","Type":"ContainerStarted","Data":"076e1c2d2b5cc2628c6ab726d2ac8328390e35f950db36dacc3ca8812b090c8d"} Dec 01 11:53:20 crc kubenswrapper[4958]: I1201 11:53:20.195894 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-2fzfc" podStartSLOduration=3.479404205 podStartE2EDuration="9.195875631s" podCreationTimestamp="2025-12-01 11:53:11 +0000 UTC" firstStartedPulling="2025-12-01 11:53:12.811650491 +0000 UTC m=+6840.320439538" lastFinishedPulling="2025-12-01 11:53:18.528121917 +0000 UTC m=+6846.036910964" observedRunningTime="2025-12-01 11:53:20.194789791 +0000 UTC m=+6847.703578838" watchObservedRunningTime="2025-12-01 11:53:20.195875631 +0000 UTC m=+6847.704664668" Dec 01 11:53:22 crc kubenswrapper[4958]: I1201 11:53:22.200483 4958 generic.go:334] "Generic (PLEG): container finished" podID="45598b35-f5b0-49b6-95e1-cae26b23a10c" containerID="076e1c2d2b5cc2628c6ab726d2ac8328390e35f950db36dacc3ca8812b090c8d" exitCode=0 Dec 01 11:53:22 
crc kubenswrapper[4958]: I1201 11:53:22.200898 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-2fzfc" event={"ID":"45598b35-f5b0-49b6-95e1-cae26b23a10c","Type":"ContainerDied","Data":"076e1c2d2b5cc2628c6ab726d2ac8328390e35f950db36dacc3ca8812b090c8d"} Dec 01 11:53:23 crc kubenswrapper[4958]: I1201 11:53:23.846776 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-2fzfc" Dec 01 11:53:23 crc kubenswrapper[4958]: I1201 11:53:23.982634 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twmvq\" (UniqueName: \"kubernetes.io/projected/45598b35-f5b0-49b6-95e1-cae26b23a10c-kube-api-access-twmvq\") pod \"45598b35-f5b0-49b6-95e1-cae26b23a10c\" (UID: \"45598b35-f5b0-49b6-95e1-cae26b23a10c\") " Dec 01 11:53:23 crc kubenswrapper[4958]: I1201 11:53:23.982704 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45598b35-f5b0-49b6-95e1-cae26b23a10c-config-data\") pod \"45598b35-f5b0-49b6-95e1-cae26b23a10c\" (UID: \"45598b35-f5b0-49b6-95e1-cae26b23a10c\") " Dec 01 11:53:23 crc kubenswrapper[4958]: I1201 11:53:23.982804 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/45598b35-f5b0-49b6-95e1-cae26b23a10c-job-config-data\") pod \"45598b35-f5b0-49b6-95e1-cae26b23a10c\" (UID: \"45598b35-f5b0-49b6-95e1-cae26b23a10c\") " Dec 01 11:53:23 crc kubenswrapper[4958]: I1201 11:53:23.982893 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45598b35-f5b0-49b6-95e1-cae26b23a10c-combined-ca-bundle\") pod \"45598b35-f5b0-49b6-95e1-cae26b23a10c\" (UID: \"45598b35-f5b0-49b6-95e1-cae26b23a10c\") " Dec 01 11:53:23 crc kubenswrapper[4958]: I1201 11:53:23.988678 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45598b35-f5b0-49b6-95e1-cae26b23a10c-kube-api-access-twmvq" (OuterVolumeSpecName: "kube-api-access-twmvq") pod "45598b35-f5b0-49b6-95e1-cae26b23a10c" (UID: "45598b35-f5b0-49b6-95e1-cae26b23a10c"). InnerVolumeSpecName "kube-api-access-twmvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:53:23 crc kubenswrapper[4958]: I1201 11:53:23.992640 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45598b35-f5b0-49b6-95e1-cae26b23a10c-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "45598b35-f5b0-49b6-95e1-cae26b23a10c" (UID: "45598b35-f5b0-49b6-95e1-cae26b23a10c"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:53:23 crc kubenswrapper[4958]: I1201 11:53:23.995622 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45598b35-f5b0-49b6-95e1-cae26b23a10c-config-data" (OuterVolumeSpecName: "config-data") pod "45598b35-f5b0-49b6-95e1-cae26b23a10c" (UID: "45598b35-f5b0-49b6-95e1-cae26b23a10c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.031169 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45598b35-f5b0-49b6-95e1-cae26b23a10c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45598b35-f5b0-49b6-95e1-cae26b23a10c" (UID: "45598b35-f5b0-49b6-95e1-cae26b23a10c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.095154 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45598b35-f5b0-49b6-95e1-cae26b23a10c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.095405 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twmvq\" (UniqueName: \"kubernetes.io/projected/45598b35-f5b0-49b6-95e1-cae26b23a10c-kube-api-access-twmvq\") on node \"crc\" DevicePath \"\"" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.095424 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45598b35-f5b0-49b6-95e1-cae26b23a10c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.095436 4958 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/45598b35-f5b0-49b6-95e1-cae26b23a10c-job-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.221336 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-2fzfc" event={"ID":"45598b35-f5b0-49b6-95e1-cae26b23a10c","Type":"ContainerDied","Data":"70b87da7390fccbf1c276ab3e82caa78878c66fe90e0f1e5e93b75861625ad6b"} Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.221381 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70b87da7390fccbf1c276ab3e82caa78878c66fe90e0f1e5e93b75861625ad6b" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.221399 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-2fzfc" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.545359 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Dec 01 11:53:24 crc kubenswrapper[4958]: E1201 11:53:24.545950 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45598b35-f5b0-49b6-95e1-cae26b23a10c" containerName="manila-db-sync" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.545965 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="45598b35-f5b0-49b6-95e1-cae26b23a10c" containerName="manila-db-sync" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.546220 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="45598b35-f5b0-49b6-95e1-cae26b23a10c" containerName="manila-db-sync" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.547544 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.555346 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.555548 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-kpsck" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.555713 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.555837 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.557261 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.571640 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.576134 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.588430 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.595365 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.674157 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-87ff8cfc-d9c6b"] Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.676356 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.697519 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-87ff8cfc-d9c6b"] Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.720319 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6clf6\" (UniqueName: \"kubernetes.io/projected/94742f94-808d-4028-b0b9-8ee30e690e1f-kube-api-access-6clf6\") pod \"manila-scheduler-0\" (UID: \"94742f94-808d-4028-b0b9-8ee30e690e1f\") " pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.720367 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94742f94-808d-4028-b0b9-8ee30e690e1f-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"94742f94-808d-4028-b0b9-8ee30e690e1f\") " pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.720407 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94742f94-808d-4028-b0b9-8ee30e690e1f-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"94742f94-808d-4028-b0b9-8ee30e690e1f\") " pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.720428 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-ceph\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.720452 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94742f94-808d-4028-b0b9-8ee30e690e1f-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"94742f94-808d-4028-b0b9-8ee30e690e1f\") " pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.720467 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94742f94-808d-4028-b0b9-8ee30e690e1f-config-data\") pod \"manila-scheduler-0\" (UID: \"94742f94-808d-4028-b0b9-8ee30e690e1f\") " pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.720522 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.720558 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-config-data\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.720589 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.720634 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94742f94-808d-4028-b0b9-8ee30e690e1f-scripts\") pod \"manila-scheduler-0\" (UID: \"94742f94-808d-4028-b0b9-8ee30e690e1f\") " pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.720650 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-scripts\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.720689 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.720703 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.720724 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2v6m\" (UniqueName: \"kubernetes.io/projected/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-kube-api-access-m2v6m\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.746031 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.750058 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.754203 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.755036 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.825150 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-ovsdbserver-sb\") pod \"dnsmasq-dns-87ff8cfc-d9c6b\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.825412 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-config-data\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.825457 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.825491 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-config\") pod \"dnsmasq-dns-87ff8cfc-d9c6b\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.825531 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94742f94-808d-4028-b0b9-8ee30e690e1f-scripts\") pod \"manila-scheduler-0\" (UID: \"94742f94-808d-4028-b0b9-8ee30e690e1f\") " pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.825550 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-scripts\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.825587 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.825603 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.825629 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2v6m\" (UniqueName: 
\"kubernetes.io/projected/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-kube-api-access-m2v6m\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.825652 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6clf6\" (UniqueName: \"kubernetes.io/projected/94742f94-808d-4028-b0b9-8ee30e690e1f-kube-api-access-6clf6\") pod \"manila-scheduler-0\" (UID: \"94742f94-808d-4028-b0b9-8ee30e690e1f\") " pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.825673 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdmws\" (UniqueName: \"kubernetes.io/projected/bf8e137e-83e0-455f-bafe-f761932eaa60-kube-api-access-pdmws\") pod \"dnsmasq-dns-87ff8cfc-d9c6b\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.825699 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94742f94-808d-4028-b0b9-8ee30e690e1f-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"94742f94-808d-4028-b0b9-8ee30e690e1f\") " pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.825736 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-ovsdbserver-nb\") pod \"dnsmasq-dns-87ff8cfc-d9c6b\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.825756 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94742f94-808d-4028-b0b9-8ee30e690e1f-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"94742f94-808d-4028-b0b9-8ee30e690e1f\") " pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.825777 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-ceph\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.825803 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94742f94-808d-4028-b0b9-8ee30e690e1f-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"94742f94-808d-4028-b0b9-8ee30e690e1f\") " pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.825833 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94742f94-808d-4028-b0b9-8ee30e690e1f-config-data\") pod \"manila-scheduler-0\" (UID: \"94742f94-808d-4028-b0b9-8ee30e690e1f\") " pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.826328 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-dns-svc\") pod \"dnsmasq-dns-87ff8cfc-d9c6b\" (UID: 
\"bf8e137e-83e0-455f-bafe-f761932eaa60\") " pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.826380 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.826459 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.826540 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94742f94-808d-4028-b0b9-8ee30e690e1f-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"94742f94-808d-4028-b0b9-8ee30e690e1f\") " pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.825829 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.829546 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94742f94-808d-4028-b0b9-8ee30e690e1f-scripts\") pod \"manila-scheduler-0\" (UID: \"94742f94-808d-4028-b0b9-8ee30e690e1f\") " pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.829936 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.831016 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94742f94-808d-4028-b0b9-8ee30e690e1f-config-data\") pod \"manila-scheduler-0\" (UID: \"94742f94-808d-4028-b0b9-8ee30e690e1f\") " pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.831097 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-config-data\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.831857 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94742f94-808d-4028-b0b9-8ee30e690e1f-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"94742f94-808d-4028-b0b9-8ee30e690e1f\") " pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.833111 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-ceph\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.834473 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-scripts\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.846215 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94742f94-808d-4028-b0b9-8ee30e690e1f-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"94742f94-808d-4028-b0b9-8ee30e690e1f\") " pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.846285 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.846603 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6clf6\" (UniqueName: \"kubernetes.io/projected/94742f94-808d-4028-b0b9-8ee30e690e1f-kube-api-access-6clf6\") pod \"manila-scheduler-0\" (UID: \"94742f94-808d-4028-b0b9-8ee30e690e1f\") " pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.847710 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2v6m\" (UniqueName: \"kubernetes.io/projected/2fb5bcc8-e1bd-4988-919d-d2dc05ed006b-kube-api-access-m2v6m\") pod \"manila-share-share1-0\" (UID: \"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b\") " pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.900711 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.916148 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.928338 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.928397 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-config\") pod \"dnsmasq-dns-87ff8cfc-d9c6b\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.928447 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-scripts\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.928488 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdmws\" (UniqueName: \"kubernetes.io/projected/bf8e137e-83e0-455f-bafe-f761932eaa60-kube-api-access-pdmws\") pod \"dnsmasq-dns-87ff8cfc-d9c6b\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.928519 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7dz2\" (UniqueName: \"kubernetes.io/projected/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-kube-api-access-s7dz2\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.928545 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-ovsdbserver-nb\") pod \"dnsmasq-dns-87ff8cfc-d9c6b\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.928596 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-config-data-custom\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.928618 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-logs\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.928649 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-dns-svc\") pod \"dnsmasq-dns-87ff8cfc-d9c6b\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.928678 4958 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-config-data\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.928717 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-etc-machine-id\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.928744 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-ovsdbserver-sb\") pod \"dnsmasq-dns-87ff8cfc-d9c6b\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.931243 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-ovsdbserver-sb\") pod \"dnsmasq-dns-87ff8cfc-d9c6b\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.931416 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-dns-svc\") pod \"dnsmasq-dns-87ff8cfc-d9c6b\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.931678 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-ovsdbserver-nb\") pod \"dnsmasq-dns-87ff8cfc-d9c6b\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.932060 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-config\") pod \"dnsmasq-dns-87ff8cfc-d9c6b\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:24 crc kubenswrapper[4958]: I1201 11:53:24.963165 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdmws\" (UniqueName: \"kubernetes.io/projected/bf8e137e-83e0-455f-bafe-f761932eaa60-kube-api-access-pdmws\") pod \"dnsmasq-dns-87ff8cfc-d9c6b\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.020564 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.030683 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-scripts\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.030798 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7dz2\" (UniqueName: \"kubernetes.io/projected/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-kube-api-access-s7dz2\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.030893 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-config-data-custom\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.030921 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-logs\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.030980 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-config-data\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.031016 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-etc-machine-id\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.031093 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.031185 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-etc-machine-id\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.031349 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-logs\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.034908 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-scripts\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" 
Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.035239 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.035801 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-config-data\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.054594 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-config-data-custom\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.059792 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7dz2\" (UniqueName: \"kubernetes.io/projected/a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7-kube-api-access-s7dz2\") pod \"manila-api-0\" (UID: \"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7\") " pod="openstack/manila-api-0" Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.090464 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.519065 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.847423 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 01 11:53:25 crc kubenswrapper[4958]: I1201 11:53:25.992258 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 01 11:53:25 crc kubenswrapper[4958]: W1201 11:53:25.998178 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0241e7b_f0bd_4b9e_9d82_edac7f5f70e7.slice/crio-2228f5ef39de4437e9e010a0fc969c4357f099e208a70935433e91ee112e6bfb WatchSource:0}: Error finding container 2228f5ef39de4437e9e010a0fc969c4357f099e208a70935433e91ee112e6bfb: Status 404 returned error can't find the container with id 2228f5ef39de4437e9e010a0fc969c4357f099e208a70935433e91ee112e6bfb Dec 01 11:53:26 crc kubenswrapper[4958]: I1201 11:53:26.034958 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-87ff8cfc-d9c6b"] Dec 01 11:53:26 crc kubenswrapper[4958]: I1201 11:53:26.249285 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" event={"ID":"bf8e137e-83e0-455f-bafe-f761932eaa60","Type":"ContainerStarted","Data":"9769c0d751628cfd43b60a8b123cce788f6ad2db506b68aa64b7a0dfad74c337"} Dec 01 11:53:26 crc kubenswrapper[4958]: I1201 11:53:26.253937 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b","Type":"ContainerStarted","Data":"21208eca489860928ca089f7d4dfbe6b47baf4af8e9905c82ec1b71a41e203c4"} Dec 01 11:53:26 crc kubenswrapper[4958]: I1201 11:53:26.255478 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"94742f94-808d-4028-b0b9-8ee30e690e1f","Type":"ContainerStarted","Data":"32dd45b5990b47b7bc115727733d99270d11105dd30df499c1d9917499194219"} Dec 01 11:53:26 crc kubenswrapper[4958]: I1201 11:53:26.257349 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7","Type":"ContainerStarted","Data":"2228f5ef39de4437e9e010a0fc969c4357f099e208a70935433e91ee112e6bfb"} Dec 01 11:53:27 crc kubenswrapper[4958]: I1201 11:53:27.378588 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"94742f94-808d-4028-b0b9-8ee30e690e1f","Type":"ContainerStarted","Data":"2a67cf93b5c0937ec3f43fdd09a302e940cae243895ff945ccf0c510a481dade"} Dec 01 11:53:27 crc kubenswrapper[4958]: I1201 11:53:27.386125 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7","Type":"ContainerStarted","Data":"0f37065e4f4f71e614c2d88573544a3ff7f19e1fdc55100bc13960e42bf2369e"} Dec 01 11:53:27 crc kubenswrapper[4958]: I1201 11:53:27.388931 4958 generic.go:334] "Generic (PLEG): container finished" podID="bf8e137e-83e0-455f-bafe-f761932eaa60" containerID="c77663b9e9b294d6d4905ed581df0ac56fa93f58c7d898b0e6f1ed88217daeef" exitCode=0 Dec 01 11:53:27 crc kubenswrapper[4958]: I1201 11:53:27.388972 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" event={"ID":"bf8e137e-83e0-455f-bafe-f761932eaa60","Type":"ContainerDied","Data":"c77663b9e9b294d6d4905ed581df0ac56fa93f58c7d898b0e6f1ed88217daeef"} Dec 01 11:53:28 crc kubenswrapper[4958]: I1201 11:53:28.212002 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:53:28 crc kubenswrapper[4958]: I1201 11:53:28.212287 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:53:28 crc kubenswrapper[4958]: I1201 11:53:28.405237 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"94742f94-808d-4028-b0b9-8ee30e690e1f","Type":"ContainerStarted","Data":"f4867743a3448437fb0482e75bd9e2f9a6641aace36c5251f2db95a647f804b0"} Dec 01 11:53:28 crc kubenswrapper[4958]: I1201 11:53:28.409547 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7","Type":"ContainerStarted","Data":"5d7526d76d631a4660476a72f17a40826abb8e61d95e24aed856477c25db799c"} Dec 01 11:53:28 crc kubenswrapper[4958]: I1201 11:53:28.410378 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Dec 01 11:53:28 crc kubenswrapper[4958]: I1201 11:53:28.414707 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" event={"ID":"bf8e137e-83e0-455f-bafe-f761932eaa60","Type":"ContainerStarted","Data":"ba144724df744fd1ed699e67f1140a699ffcf03dfe101374b8d6ee3c8d799e22"} Dec 01 11:53:28 crc kubenswrapper[4958]: I1201 11:53:28.414772 4958 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:28 crc kubenswrapper[4958]: I1201 11:53:28.462699 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.462684126 podStartE2EDuration="4.462684126s" podCreationTimestamp="2025-12-01 11:53:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:53:28.461966066 +0000 UTC m=+6855.970755103" watchObservedRunningTime="2025-12-01 11:53:28.462684126 +0000 UTC m=+6855.971473163" Dec 01 11:53:28 crc kubenswrapper[4958]: I1201 11:53:28.466563 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.711145962 podStartE2EDuration="4.466548646s" podCreationTimestamp="2025-12-01 11:53:24 +0000 UTC" firstStartedPulling="2025-12-01 11:53:25.51719213 +0000 UTC m=+6853.025981177" lastFinishedPulling="2025-12-01 11:53:26.272594824 +0000 UTC m=+6853.781383861" observedRunningTime="2025-12-01 11:53:28.440533867 +0000 UTC m=+6855.949322914" watchObservedRunningTime="2025-12-01 11:53:28.466548646 +0000 UTC m=+6855.975337683" Dec 01 11:53:28 crc kubenswrapper[4958]: I1201 11:53:28.491270 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" podStartSLOduration=4.491251267 podStartE2EDuration="4.491251267s" podCreationTimestamp="2025-12-01 11:53:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:53:28.484931098 +0000 UTC m=+6855.993720135" watchObservedRunningTime="2025-12-01 11:53:28.491251267 +0000 UTC m=+6856.000040304" Dec 01 11:53:34 crc kubenswrapper[4958]: I1201 11:53:34.550186 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b","Type":"ContainerStarted","Data":"dca658292dca1e3238ab179674c768caa8932864bed2d69eb487d8a666ad8a61"} Dec 01 11:53:34 crc kubenswrapper[4958]: I1201 11:53:34.550628 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2fb5bcc8-e1bd-4988-919d-d2dc05ed006b","Type":"ContainerStarted","Data":"ecbb002bebde619b9e32cb5b4d542b1c94fd0aa624e0b3b0d4a4964e8317162c"} Dec 01 11:53:34 crc kubenswrapper[4958]: I1201 11:53:34.585529 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=2.933490186 podStartE2EDuration="10.585512628s" podCreationTimestamp="2025-12-01 11:53:24 +0000 UTC" firstStartedPulling="2025-12-01 11:53:25.854577888 +0000 UTC m=+6853.363366915" lastFinishedPulling="2025-12-01 11:53:33.50660032 +0000 UTC m=+6861.015389357" observedRunningTime="2025-12-01 11:53:34.581772191 +0000 UTC m=+6862.090561228" watchObservedRunningTime="2025-12-01 11:53:34.585512628 +0000 UTC m=+6862.094301665" Dec 01 11:53:34 crc kubenswrapper[4958]: I1201 11:53:34.906105 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 01 11:53:34 crc kubenswrapper[4958]: I1201 11:53:34.916272 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 01 11:53:35 crc kubenswrapper[4958]: I1201 11:53:35.023189 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:53:35 crc kubenswrapper[4958]: I1201 11:53:35.106221 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c5cb9b577-kfwp4"] Dec 01 11:53:35 crc kubenswrapper[4958]: I1201 11:53:35.106451 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" podUID="aebcbc29-628a-4001-aa42-8434eec96ebc" containerName="dnsmasq-dns" containerID="cri-o://2f099fd612b4b532b2f319f5b523be15b81d5eff9b8750f149bfdabba83f98f6" gracePeriod=10 Dec 01 11:53:35 crc kubenswrapper[4958]: I1201 11:53:35.571542 4958 generic.go:334] "Generic (PLEG): container finished" podID="aebcbc29-628a-4001-aa42-8434eec96ebc" containerID="2f099fd612b4b532b2f319f5b523be15b81d5eff9b8750f149bfdabba83f98f6" exitCode=0 Dec 01 11:53:35 crc kubenswrapper[4958]: I1201 11:53:35.573081 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" event={"ID":"aebcbc29-628a-4001-aa42-8434eec96ebc","Type":"ContainerDied","Data":"2f099fd612b4b532b2f319f5b523be15b81d5eff9b8750f149bfdabba83f98f6"} Dec 01 11:53:35 crc kubenswrapper[4958]: I1201 11:53:35.741022 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:53:35 crc kubenswrapper[4958]: I1201 11:53:35.788113 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-config\") pod \"aebcbc29-628a-4001-aa42-8434eec96ebc\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " Dec 01 11:53:35 crc kubenswrapper[4958]: I1201 11:53:35.788239 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-ovsdbserver-nb\") pod \"aebcbc29-628a-4001-aa42-8434eec96ebc\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " Dec 01 11:53:35 crc kubenswrapper[4958]: I1201 11:53:35.788265 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-ovsdbserver-sb\") pod \"aebcbc29-628a-4001-aa42-8434eec96ebc\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " Dec 01 11:53:35 crc kubenswrapper[4958]: I1201 11:53:35.788323 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-dns-svc\") pod \"aebcbc29-628a-4001-aa42-8434eec96ebc\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " Dec 01 11:53:35 crc kubenswrapper[4958]: I1201 11:53:35.788352 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt6jz\" (UniqueName: \"kubernetes.io/projected/aebcbc29-628a-4001-aa42-8434eec96ebc-kube-api-access-jt6jz\") pod \"aebcbc29-628a-4001-aa42-8434eec96ebc\" (UID: \"aebcbc29-628a-4001-aa42-8434eec96ebc\") " Dec 01 11:53:35 crc kubenswrapper[4958]: I1201 11:53:35.837263 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aebcbc29-628a-4001-aa42-8434eec96ebc-kube-api-access-jt6jz" (OuterVolumeSpecName: "kube-api-access-jt6jz") pod "aebcbc29-628a-4001-aa42-8434eec96ebc" (UID: "aebcbc29-628a-4001-aa42-8434eec96ebc"). InnerVolumeSpecName "kube-api-access-jt6jz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:53:35 crc kubenswrapper[4958]: I1201 11:53:35.951086 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt6jz\" (UniqueName: \"kubernetes.io/projected/aebcbc29-628a-4001-aa42-8434eec96ebc-kube-api-access-jt6jz\") on node \"crc\" DevicePath \"\"" Dec 01 11:53:35 crc kubenswrapper[4958]: I1201 11:53:35.956202 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-config" (OuterVolumeSpecName: "config") pod "aebcbc29-628a-4001-aa42-8434eec96ebc" (UID: "aebcbc29-628a-4001-aa42-8434eec96ebc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:53:35 crc kubenswrapper[4958]: I1201 11:53:35.982872 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aebcbc29-628a-4001-aa42-8434eec96ebc" (UID: "aebcbc29-628a-4001-aa42-8434eec96ebc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:53:36 crc kubenswrapper[4958]: I1201 11:53:36.002815 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aebcbc29-628a-4001-aa42-8434eec96ebc" (UID: "aebcbc29-628a-4001-aa42-8434eec96ebc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:53:36 crc kubenswrapper[4958]: I1201 11:53:36.019587 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aebcbc29-628a-4001-aa42-8434eec96ebc" (UID: "aebcbc29-628a-4001-aa42-8434eec96ebc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:53:36 crc kubenswrapper[4958]: I1201 11:53:36.054290 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 11:53:36 crc kubenswrapper[4958]: I1201 11:53:36.054337 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-config\") on node \"crc\" DevicePath \"\"" Dec 01 11:53:36 crc kubenswrapper[4958]: I1201 11:53:36.054464 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 11:53:36 crc kubenswrapper[4958]: I1201 11:53:36.054481 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aebcbc29-628a-4001-aa42-8434eec96ebc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 11:53:36 crc kubenswrapper[4958]: I1201 11:53:36.586432 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" event={"ID":"aebcbc29-628a-4001-aa42-8434eec96ebc","Type":"ContainerDied","Data":"2e2828b0eb574efaf9c1ce9f71a05559e7c7a2043fafb1c7db32de64d5fe9da4"} Dec 01 11:53:36 crc kubenswrapper[4958]: I1201 11:53:36.586503 4958 scope.go:117] "RemoveContainer" containerID="2f099fd612b4b532b2f319f5b523be15b81d5eff9b8750f149bfdabba83f98f6" Dec 01 11:53:36 crc kubenswrapper[4958]: I1201 11:53:36.587150 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c5cb9b577-kfwp4" Dec 01 11:53:36 crc kubenswrapper[4958]: I1201 11:53:36.615196 4958 scope.go:117] "RemoveContainer" containerID="c5af407014dadd8ae4487e03a9c35ac6cea1a92816885311f998510384e331b6" Dec 01 11:53:36 crc kubenswrapper[4958]: I1201 11:53:36.628439 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c5cb9b577-kfwp4"] Dec 01 11:53:36 crc kubenswrapper[4958]: I1201 11:53:36.641564 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c5cb9b577-kfwp4"] Dec 01 11:53:37 crc kubenswrapper[4958]: I1201 11:53:37.842664 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aebcbc29-628a-4001-aa42-8434eec96ebc" path="/var/lib/kubelet/pods/aebcbc29-628a-4001-aa42-8434eec96ebc/volumes" Dec 01 11:53:37 crc kubenswrapper[4958]: I1201 11:53:37.926957 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 11:53:37 crc kubenswrapper[4958]: I1201 11:53:37.927254 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerName="ceilometer-central-agent" containerID="cri-o://35a36952ed148e073ffd7d37463bb0ad6edb443d866abc800f1e2b522230e055" gracePeriod=30 Dec 01 11:53:37 crc kubenswrapper[4958]: I1201 11:53:37.927344 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerName="sg-core" containerID="cri-o://185d48e4a534dd3adefc724734af7615dfe4d05250b9d9420af9569f4de1987a" gracePeriod=30 Dec 01 11:53:37 crc kubenswrapper[4958]: I1201 11:53:37.927378 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerName="proxy-httpd" containerID="cri-o://3ae96a898072dcca1fb5c1ceebf9f5ef0d1e8b0ea86a0a50469c4275fd39daf9" gracePeriod=30 Dec 01 11:53:37 crc kubenswrapper[4958]: I1201 11:53:37.927367 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerName="ceilometer-notification-agent" containerID="cri-o://03c6b29b1726a9911e58a6ba1e47b30e4b41a1c357399ded76437e23fd50e2ae" gracePeriod=30 Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.618093 4958 generic.go:334] "Generic (PLEG): container finished" podID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerID="3ae96a898072dcca1fb5c1ceebf9f5ef0d1e8b0ea86a0a50469c4275fd39daf9" exitCode=0 Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.618430 4958 generic.go:334] "Generic (PLEG): container finished" podID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerID="185d48e4a534dd3adefc724734af7615dfe4d05250b9d9420af9569f4de1987a" exitCode=2 Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.618438 4958 generic.go:334] "Generic (PLEG): container finished" podID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerID="03c6b29b1726a9911e58a6ba1e47b30e4b41a1c357399ded76437e23fd50e2ae" exitCode=0 Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.618446 4958 generic.go:334] "Generic (PLEG): container finished" podID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerID="35a36952ed148e073ffd7d37463bb0ad6edb443d866abc800f1e2b522230e055" exitCode=0 Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.618468 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b","Type":"ContainerDied","Data":"3ae96a898072dcca1fb5c1ceebf9f5ef0d1e8b0ea86a0a50469c4275fd39daf9"} Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.618498 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b","Type":"ContainerDied","Data":"185d48e4a534dd3adefc724734af7615dfe4d05250b9d9420af9569f4de1987a"} Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.618507 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b","Type":"ContainerDied","Data":"03c6b29b1726a9911e58a6ba1e47b30e4b41a1c357399ded76437e23fd50e2ae"} Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.618516 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b","Type":"ContainerDied","Data":"35a36952ed148e073ffd7d37463bb0ad6edb443d866abc800f1e2b522230e055"} Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.896781 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.979345 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbwxd\" (UniqueName: \"kubernetes.io/projected/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-kube-api-access-fbwxd\") pod \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.979441 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-scripts\") pod \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.979547 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-combined-ca-bundle\") pod \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.979639 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-run-httpd\") pod \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.979691 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-log-httpd\") pod \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.979743 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-config-data\") pod \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.979777 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-sg-core-conf-yaml\") pod \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\" (UID: \"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b\") " Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.983217 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" (UID: "491a5f1b-7744-4aa2-bafc-12fd3eb3f34b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.983650 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" (UID: "491a5f1b-7744-4aa2-bafc-12fd3eb3f34b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.985641 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-kube-api-access-fbwxd" (OuterVolumeSpecName: "kube-api-access-fbwxd") pod "491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" (UID: "491a5f1b-7744-4aa2-bafc-12fd3eb3f34b"). InnerVolumeSpecName "kube-api-access-fbwxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:53:38 crc kubenswrapper[4958]: I1201 11:53:38.988690 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-scripts" (OuterVolumeSpecName: "scripts") pod "491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" (UID: "491a5f1b-7744-4aa2-bafc-12fd3eb3f34b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.027086 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" (UID: "491a5f1b-7744-4aa2-bafc-12fd3eb3f34b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.083193 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbwxd\" (UniqueName: \"kubernetes.io/projected/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-kube-api-access-fbwxd\") on node \"crc\" DevicePath \"\"" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.083219 4958 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.083229 4958 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.083238 4958 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.083437 4958 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.099094 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" (UID: "491a5f1b-7744-4aa2-bafc-12fd3eb3f34b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.135319 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-config-data" (OuterVolumeSpecName: "config-data") pod "491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" (UID: "491a5f1b-7744-4aa2-bafc-12fd3eb3f34b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.186369 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.186398 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.630148 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"491a5f1b-7744-4aa2-bafc-12fd3eb3f34b","Type":"ContainerDied","Data":"aeb0d3f402d4965580222b6437188df10d49a04a03a33ac4fd03d684d7d60ac6"} Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.630205 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.630209 4958 scope.go:117] "RemoveContainer" containerID="3ae96a898072dcca1fb5c1ceebf9f5ef0d1e8b0ea86a0a50469c4275fd39daf9" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.655653 4958 scope.go:117] "RemoveContainer" containerID="185d48e4a534dd3adefc724734af7615dfe4d05250b9d9420af9569f4de1987a" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.691189 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.692995 4958 scope.go:117] "RemoveContainer" containerID="03c6b29b1726a9911e58a6ba1e47b30e4b41a1c357399ded76437e23fd50e2ae" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.714998 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.720499 4958 scope.go:117] "RemoveContainer" containerID="35a36952ed148e073ffd7d37463bb0ad6edb443d866abc800f1e2b522230e055" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.731199 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 11:53:39 crc kubenswrapper[4958]: E1201 11:53:39.731799 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aebcbc29-628a-4001-aa42-8434eec96ebc" containerName="dnsmasq-dns" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.731822 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="aebcbc29-628a-4001-aa42-8434eec96ebc" containerName="dnsmasq-dns" Dec 01 11:53:39 crc kubenswrapper[4958]: E1201 11:53:39.731855 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerName="ceilometer-notification-agent" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.731865 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerName="ceilometer-notification-agent" Dec 01 11:53:39 crc kubenswrapper[4958]: E1201 11:53:39.731892 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aebcbc29-628a-4001-aa42-8434eec96ebc" containerName="init" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.731900 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="aebcbc29-628a-4001-aa42-8434eec96ebc" containerName="init" Dec 01 11:53:39 crc kubenswrapper[4958]: E1201 11:53:39.731920 4958 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerName="ceilometer-central-agent" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.731927 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerName="ceilometer-central-agent" Dec 01 11:53:39 crc kubenswrapper[4958]: E1201 11:53:39.731950 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerName="proxy-httpd" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.731955 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerName="proxy-httpd" Dec 01 11:53:39 crc kubenswrapper[4958]: E1201 11:53:39.731975 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerName="sg-core" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.731981 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerName="sg-core" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.732205 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="aebcbc29-628a-4001-aa42-8434eec96ebc" containerName="dnsmasq-dns" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.732216 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerName="ceilometer-central-agent" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.732231 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerName="ceilometer-notification-agent" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.732244 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerName="proxy-httpd" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.732254 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" containerName="sg-core" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.735455 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.737946 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.742596 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.758100 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.814757 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="491a5f1b-7744-4aa2-bafc-12fd3eb3f34b" path="/var/lib/kubelet/pods/491a5f1b-7744-4aa2-bafc-12fd3eb3f34b/volumes" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.902580 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a06735d0-753b-40ab-96cb-130eb67f9225-scripts\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.902697 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pft8g\" (UniqueName: \"kubernetes.io/projected/a06735d0-753b-40ab-96cb-130eb67f9225-kube-api-access-pft8g\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.902824 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06735d0-753b-40ab-96cb-130eb67f9225-config-data\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.903005 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06735d0-753b-40ab-96cb-130eb67f9225-run-httpd\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.903240 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06735d0-753b-40ab-96cb-130eb67f9225-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.903274 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06735d0-753b-40ab-96cb-130eb67f9225-log-httpd\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:39 crc kubenswrapper[4958]: I1201 11:53:39.903294 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a06735d0-753b-40ab-96cb-130eb67f9225-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:40 crc kubenswrapper[4958]: I1201 11:53:40.005415 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/a06735d0-753b-40ab-96cb-130eb67f9225-run-httpd\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:40 crc kubenswrapper[4958]: I1201 11:53:40.005585 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06735d0-753b-40ab-96cb-130eb67f9225-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:40 crc kubenswrapper[4958]: I1201 11:53:40.005619 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06735d0-753b-40ab-96cb-130eb67f9225-log-httpd\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:40 crc kubenswrapper[4958]: I1201 11:53:40.005646 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a06735d0-753b-40ab-96cb-130eb67f9225-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:40 crc kubenswrapper[4958]: I1201 11:53:40.005738 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a06735d0-753b-40ab-96cb-130eb67f9225-scripts\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:40 crc kubenswrapper[4958]: I1201 11:53:40.005767 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pft8g\" (UniqueName: \"kubernetes.io/projected/a06735d0-753b-40ab-96cb-130eb67f9225-kube-api-access-pft8g\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:40 crc kubenswrapper[4958]: I1201 11:53:40.005820 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06735d0-753b-40ab-96cb-130eb67f9225-config-data\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:40 crc kubenswrapper[4958]: I1201 11:53:40.008321 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06735d0-753b-40ab-96cb-130eb67f9225-run-httpd\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:40 crc kubenswrapper[4958]: I1201 11:53:40.008645 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06735d0-753b-40ab-96cb-130eb67f9225-log-httpd\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:40 crc kubenswrapper[4958]: I1201 11:53:40.014344 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06735d0-753b-40ab-96cb-130eb67f9225-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:40 crc kubenswrapper[4958]: I1201 11:53:40.014756 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a06735d0-753b-40ab-96cb-130eb67f9225-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:40 crc kubenswrapper[4958]: I1201 11:53:40.015556 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06735d0-753b-40ab-96cb-130eb67f9225-config-data\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:40 crc kubenswrapper[4958]: I1201 11:53:40.027597 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a06735d0-753b-40ab-96cb-130eb67f9225-scripts\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:40 crc kubenswrapper[4958]: I1201 11:53:40.043210 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pft8g\" (UniqueName: \"kubernetes.io/projected/a06735d0-753b-40ab-96cb-130eb67f9225-kube-api-access-pft8g\") pod \"ceilometer-0\" (UID: \"a06735d0-753b-40ab-96cb-130eb67f9225\") " pod="openstack/ceilometer-0" Dec 01 11:53:40 crc kubenswrapper[4958]: I1201 11:53:40.089189 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 11:53:40 crc kubenswrapper[4958]: I1201 11:53:40.634509 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 11:53:40 crc kubenswrapper[4958]: W1201 11:53:40.639691 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda06735d0_753b_40ab_96cb_130eb67f9225.slice/crio-13e4f71430b2be03eba75473c659924130cf8eb2ed1bc2c97a590ff5e6aa6694 WatchSource:0}: Error finding container 13e4f71430b2be03eba75473c659924130cf8eb2ed1bc2c97a590ff5e6aa6694: Status 404 returned error can't find the container with id 13e4f71430b2be03eba75473c659924130cf8eb2ed1bc2c97a590ff5e6aa6694 Dec 01 11:53:41 crc kubenswrapper[4958]: I1201 11:53:41.673888 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a06735d0-753b-40ab-96cb-130eb67f9225","Type":"ContainerStarted","Data":"13e4f71430b2be03eba75473c659924130cf8eb2ed1bc2c97a590ff5e6aa6694"} Dec 01 11:53:42 crc kubenswrapper[4958]: I1201 11:53:42.687348 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a06735d0-753b-40ab-96cb-130eb67f9225","Type":"ContainerStarted","Data":"f6267747ee11d756cdd17b43bb3a6ceda556900e0788630a50efa4180e940eea"} Dec 01 11:53:42 crc kubenswrapper[4958]: I1201 11:53:42.687756 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a06735d0-753b-40ab-96cb-130eb67f9225","Type":"ContainerStarted","Data":"c22fcda6ee9c84be9bc35c225d1d4211e69243ab28f6a45cede10f56872cd028"} Dec 01 11:53:43 crc kubenswrapper[4958]: I1201 11:53:43.715911 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a06735d0-753b-40ab-96cb-130eb67f9225","Type":"ContainerStarted","Data":"0da5556cf3c94e450e81da7f01c543142acfe729595594fc1d7a63317a4fe1f9"} Dec 01 11:53:45 crc kubenswrapper[4958]: I1201 11:53:45.848907 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a06735d0-753b-40ab-96cb-130eb67f9225","Type":"ContainerStarted","Data":"adfa42eb93ce7296a43abd8a64e2b930783e35ad53e49c58aee7bc135daeca17"} Dec 01 11:53:45 crc kubenswrapper[4958]: I1201 11:53:45.851916 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 11:53:45 crc kubenswrapper[4958]: I1201 11:53:45.906180 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.680442825 podStartE2EDuration="6.906156763s" podCreationTimestamp="2025-12-01 11:53:39 +0000 UTC" firstStartedPulling="2025-12-01 11:53:40.646190845 +0000 UTC m=+6868.154979882" lastFinishedPulling="2025-12-01 11:53:44.871904783 +0000 UTC m=+6872.380693820" observedRunningTime="2025-12-01 11:53:45.88912538 +0000 UTC m=+6873.397914507" watchObservedRunningTime="2025-12-01 11:53:45.906156763 +0000 UTC m=+6873.414945800" Dec 01 11:53:46 crc kubenswrapper[4958]: I1201 11:53:46.644438 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 01 11:53:46 crc kubenswrapper[4958]: I1201 11:53:46.742947 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Dec 01 11:53:46 crc kubenswrapper[4958]: I1201 11:53:46.963160 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 01 11:53:58 crc kubenswrapper[4958]: I1201 11:53:58.210118 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:53:58 crc kubenswrapper[4958]: I1201 11:53:58.210954 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:54:10 crc kubenswrapper[4958]: I1201 11:54:10.095588 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 11:54:28 crc kubenswrapper[4958]: I1201 11:54:28.211122 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:54:28 crc kubenswrapper[4958]: I1201 11:54:28.211695 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:54:28 crc kubenswrapper[4958]: I1201 11:54:28.211754 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 11:54:28 crc kubenswrapper[4958]: I1201 11:54:28.212789 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"f2f66a236e3c207fc7e3f3fd8d2b5bb72d1175a1c971bcc5a6d41f218a93b853"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 11:54:28 crc kubenswrapper[4958]: I1201 11:54:28.212887 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://f2f66a236e3c207fc7e3f3fd8d2b5bb72d1175a1c971bcc5a6d41f218a93b853" gracePeriod=600 Dec 01 11:54:28 crc kubenswrapper[4958]: I1201 11:54:28.512610 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="f2f66a236e3c207fc7e3f3fd8d2b5bb72d1175a1c971bcc5a6d41f218a93b853" exitCode=0 Dec 01 11:54:28 crc kubenswrapper[4958]: I1201 11:54:28.513147 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"f2f66a236e3c207fc7e3f3fd8d2b5bb72d1175a1c971bcc5a6d41f218a93b853"} Dec 01 11:54:28 crc kubenswrapper[4958]: I1201 11:54:28.513190 4958 scope.go:117] "RemoveContainer" containerID="3be73af986bd1aeeec5f90854abf39b18185d3a55e94de8e8bb4c60131af7a28" Dec 01 11:54:28 crc kubenswrapper[4958]: I1201 11:54:28.686563 4958 scope.go:117] "RemoveContainer" containerID="3c1e412f7dbc976857fa1fcc765688ccfafe240da6ecd120429f9b1933a6a5d6" Dec 01 11:54:28 crc kubenswrapper[4958]: I1201 11:54:28.885991 4958 scope.go:117] "RemoveContainer" containerID="f8487340023bd36ca8fa7628550caf59b7e08a558adb39f3a52beeecc6323087" Dec 01 11:54:29 crc kubenswrapper[4958]: I1201 11:54:29.528447 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d"} Dec 01 11:54:34 crc kubenswrapper[4958]: I1201 11:54:34.956689 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86c5578c4c-24npf"] Dec 01 11:54:34 crc kubenswrapper[4958]: I1201 11:54:34.959479 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:34 crc kubenswrapper[4958]: I1201 11:54:34.961708 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Dec 01 11:54:34 crc kubenswrapper[4958]: I1201 11:54:34.975611 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86c5578c4c-24npf"] Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.144320 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-dns-svc\") pod \"dnsmasq-dns-86c5578c4c-24npf\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.144374 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-openstack-cell1\") pod \"dnsmasq-dns-86c5578c4c-24npf\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.144632 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-ovsdbserver-sb\") pod \"dnsmasq-dns-86c5578c4c-24npf\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.144697 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-config\") pod \"dnsmasq-dns-86c5578c4c-24npf\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.144827 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-ovsdbserver-nb\") pod \"dnsmasq-dns-86c5578c4c-24npf\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.145056 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn9rf\" (UniqueName: \"kubernetes.io/projected/0ad1feeb-42e9-4b7a-91f3-b9392d534309-kube-api-access-sn9rf\") pod \"dnsmasq-dns-86c5578c4c-24npf\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.246885 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-ovsdbserver-sb\") pod \"dnsmasq-dns-86c5578c4c-24npf\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.246947 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-config\") pod \"dnsmasq-dns-86c5578c4c-24npf\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " 
pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.247065 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-ovsdbserver-nb\") pod \"dnsmasq-dns-86c5578c4c-24npf\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.247144 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn9rf\" (UniqueName: \"kubernetes.io/projected/0ad1feeb-42e9-4b7a-91f3-b9392d534309-kube-api-access-sn9rf\") pod \"dnsmasq-dns-86c5578c4c-24npf\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.247252 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-dns-svc\") pod \"dnsmasq-dns-86c5578c4c-24npf\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.247289 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-openstack-cell1\") pod \"dnsmasq-dns-86c5578c4c-24npf\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.248035 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-config\") pod \"dnsmasq-dns-86c5578c4c-24npf\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.248635 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-openstack-cell1\") pod \"dnsmasq-dns-86c5578c4c-24npf\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.248681 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-ovsdbserver-sb\") pod \"dnsmasq-dns-86c5578c4c-24npf\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.248968 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-dns-svc\") pod \"dnsmasq-dns-86c5578c4c-24npf\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.249254 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-ovsdbserver-nb\") pod \"dnsmasq-dns-86c5578c4c-24npf\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.273314 4958 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn9rf\" (UniqueName: \"kubernetes.io/projected/0ad1feeb-42e9-4b7a-91f3-b9392d534309-kube-api-access-sn9rf\") pod \"dnsmasq-dns-86c5578c4c-24npf\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.290650 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:35 crc kubenswrapper[4958]: I1201 11:54:35.774245 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86c5578c4c-24npf"] Dec 01 11:54:35 crc kubenswrapper[4958]: W1201 11:54:35.777153 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ad1feeb_42e9_4b7a_91f3_b9392d534309.slice/crio-a6370556c016265e5e82ab3caf2f908cbc2738d58f9b7af0e1d8682c3f5da274 WatchSource:0}: Error finding container a6370556c016265e5e82ab3caf2f908cbc2738d58f9b7af0e1d8682c3f5da274: Status 404 returned error can't find the container with id a6370556c016265e5e82ab3caf2f908cbc2738d58f9b7af0e1d8682c3f5da274 Dec 01 11:54:36 crc kubenswrapper[4958]: I1201 11:54:36.040710 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-frscf"] Dec 01 11:54:36 crc kubenswrapper[4958]: I1201 11:54:36.055753 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-frscf"] Dec 01 11:54:36 crc kubenswrapper[4958]: I1201 11:54:36.619616 4958 generic.go:334] "Generic (PLEG): container finished" podID="0ad1feeb-42e9-4b7a-91f3-b9392d534309" containerID="a7c46410c0961d7b1bcaddccabfc3521fe012f7b4dff73135a2ec123fcc58dab" exitCode=0 Dec 01 11:54:36 crc kubenswrapper[4958]: I1201 11:54:36.619701 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c5578c4c-24npf" event={"ID":"0ad1feeb-42e9-4b7a-91f3-b9392d534309","Type":"ContainerDied","Data":"a7c46410c0961d7b1bcaddccabfc3521fe012f7b4dff73135a2ec123fcc58dab"} Dec 01 11:54:36 crc kubenswrapper[4958]: I1201 11:54:36.619761 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c5578c4c-24npf" event={"ID":"0ad1feeb-42e9-4b7a-91f3-b9392d534309","Type":"ContainerStarted","Data":"a6370556c016265e5e82ab3caf2f908cbc2738d58f9b7af0e1d8682c3f5da274"} Dec 01 11:54:37 crc kubenswrapper[4958]: I1201 11:54:37.632472 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c5578c4c-24npf" event={"ID":"0ad1feeb-42e9-4b7a-91f3-b9392d534309","Type":"ContainerStarted","Data":"e5a8b86ff66a67e65e759480ef22ade57649c83e90230daf79d1977feeac66bb"} Dec 01 11:54:37 crc kubenswrapper[4958]: I1201 11:54:37.633244 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:37 crc kubenswrapper[4958]: I1201 11:54:37.660953 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86c5578c4c-24npf" podStartSLOduration=3.660919175 podStartE2EDuration="3.660919175s" podCreationTimestamp="2025-12-01 11:54:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:54:37.649671486 +0000 UTC m=+6925.158460583" watchObservedRunningTime="2025-12-01 11:54:37.660919175 +0000 UTC m=+6925.169708252" Dec 01 11:54:37 crc kubenswrapper[4958]: I1201 11:54:37.813256 4958 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de61ffcf-b1db-4649-8522-eccbdf42869f" path="/var/lib/kubelet/pods/de61ffcf-b1db-4649-8522-eccbdf42869f/volumes" Dec 01 11:54:45 crc kubenswrapper[4958]: I1201 11:54:45.293083 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:45 crc kubenswrapper[4958]: I1201 11:54:45.441420 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-87ff8cfc-d9c6b"] Dec 01 11:54:45 crc kubenswrapper[4958]: I1201 11:54:45.441725 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" podUID="bf8e137e-83e0-455f-bafe-f761932eaa60" containerName="dnsmasq-dns" containerID="cri-o://ba144724df744fd1ed699e67f1140a699ffcf03dfe101374b8d6ee3c8d799e22" gracePeriod=10 Dec 01 11:54:45 crc kubenswrapper[4958]: I1201 11:54:45.754023 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69944d679-t6rxn"] Dec 01 11:54:45 crc kubenswrapper[4958]: I1201 11:54:45.759928 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:45 crc kubenswrapper[4958]: I1201 11:54:45.782063 4958 generic.go:334] "Generic (PLEG): container finished" podID="bf8e137e-83e0-455f-bafe-f761932eaa60" containerID="ba144724df744fd1ed699e67f1140a699ffcf03dfe101374b8d6ee3c8d799e22" exitCode=0 Dec 01 11:54:45 crc kubenswrapper[4958]: I1201 11:54:45.782119 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" event={"ID":"bf8e137e-83e0-455f-bafe-f761932eaa60","Type":"ContainerDied","Data":"ba144724df744fd1ed699e67f1140a699ffcf03dfe101374b8d6ee3c8d799e22"} Dec 01 11:54:45 crc kubenswrapper[4958]: I1201 11:54:45.784276 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69944d679-t6rxn"] Dec 01 11:54:45 crc kubenswrapper[4958]: I1201 11:54:45.947813 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8ee90ac-8bc9-4ec5-a679-c12313b61f96-ovsdbserver-nb\") pod \"dnsmasq-dns-69944d679-t6rxn\" (UID: \"e8ee90ac-8bc9-4ec5-a679-c12313b61f96\") " pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:45 crc kubenswrapper[4958]: I1201 11:54:45.947920 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8ee90ac-8bc9-4ec5-a679-c12313b61f96-ovsdbserver-sb\") pod \"dnsmasq-dns-69944d679-t6rxn\" (UID: \"e8ee90ac-8bc9-4ec5-a679-c12313b61f96\") " pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:45 crc kubenswrapper[4958]: I1201 11:54:45.947976 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8ee90ac-8bc9-4ec5-a679-c12313b61f96-dns-svc\") pod \"dnsmasq-dns-69944d679-t6rxn\" (UID: \"e8ee90ac-8bc9-4ec5-a679-c12313b61f96\") " pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:45 crc kubenswrapper[4958]: I1201 11:54:45.948040 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/e8ee90ac-8bc9-4ec5-a679-c12313b61f96-openstack-cell1\") pod \"dnsmasq-dns-69944d679-t6rxn\" (UID: \"e8ee90ac-8bc9-4ec5-a679-c12313b61f96\") " 
pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:45 crc kubenswrapper[4958]: I1201 11:54:45.948110 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxzv8\" (UniqueName: \"kubernetes.io/projected/e8ee90ac-8bc9-4ec5-a679-c12313b61f96-kube-api-access-qxzv8\") pod \"dnsmasq-dns-69944d679-t6rxn\" (UID: \"e8ee90ac-8bc9-4ec5-a679-c12313b61f96\") " pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:45 crc kubenswrapper[4958]: I1201 11:54:45.948143 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8ee90ac-8bc9-4ec5-a679-c12313b61f96-config\") pod \"dnsmasq-dns-69944d679-t6rxn\" (UID: \"e8ee90ac-8bc9-4ec5-a679-c12313b61f96\") " pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:46 crc kubenswrapper[4958]: I1201 11:54:46.050950 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8ee90ac-8bc9-4ec5-a679-c12313b61f96-ovsdbserver-sb\") pod \"dnsmasq-dns-69944d679-t6rxn\" (UID: \"e8ee90ac-8bc9-4ec5-a679-c12313b61f96\") " pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:46 crc kubenswrapper[4958]: I1201 11:54:46.051044 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8ee90ac-8bc9-4ec5-a679-c12313b61f96-dns-svc\") pod \"dnsmasq-dns-69944d679-t6rxn\" (UID: \"e8ee90ac-8bc9-4ec5-a679-c12313b61f96\") " pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:46 crc kubenswrapper[4958]: I1201 11:54:46.051139 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/e8ee90ac-8bc9-4ec5-a679-c12313b61f96-openstack-cell1\") pod \"dnsmasq-dns-69944d679-t6rxn\" (UID: \"e8ee90ac-8bc9-4ec5-a679-c12313b61f96\") " pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:46 crc kubenswrapper[4958]: I1201 11:54:46.051207 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxzv8\" (UniqueName: \"kubernetes.io/projected/e8ee90ac-8bc9-4ec5-a679-c12313b61f96-kube-api-access-qxzv8\") pod \"dnsmasq-dns-69944d679-t6rxn\" (UID: \"e8ee90ac-8bc9-4ec5-a679-c12313b61f96\") " pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:46 crc kubenswrapper[4958]: I1201 11:54:46.051244 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8ee90ac-8bc9-4ec5-a679-c12313b61f96-config\") pod \"dnsmasq-dns-69944d679-t6rxn\" (UID: \"e8ee90ac-8bc9-4ec5-a679-c12313b61f96\") " pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:46 crc kubenswrapper[4958]: I1201 11:54:46.051380 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8ee90ac-8bc9-4ec5-a679-c12313b61f96-ovsdbserver-nb\") pod \"dnsmasq-dns-69944d679-t6rxn\" (UID: \"e8ee90ac-8bc9-4ec5-a679-c12313b61f96\") " pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:46 crc kubenswrapper[4958]: I1201 11:54:46.053148 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8ee90ac-8bc9-4ec5-a679-c12313b61f96-ovsdbserver-sb\") pod \"dnsmasq-dns-69944d679-t6rxn\" (UID: \"e8ee90ac-8bc9-4ec5-a679-c12313b61f96\") " pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 
11:54:46 crc kubenswrapper[4958]: I1201 11:54:46.053057 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8ee90ac-8bc9-4ec5-a679-c12313b61f96-ovsdbserver-nb\") pod \"dnsmasq-dns-69944d679-t6rxn\" (UID: \"e8ee90ac-8bc9-4ec5-a679-c12313b61f96\") " pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:46 crc kubenswrapper[4958]: I1201 11:54:46.054593 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/e8ee90ac-8bc9-4ec5-a679-c12313b61f96-openstack-cell1\") pod \"dnsmasq-dns-69944d679-t6rxn\" (UID: \"e8ee90ac-8bc9-4ec5-a679-c12313b61f96\") " pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:46 crc kubenswrapper[4958]: I1201 11:54:46.056258 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8ee90ac-8bc9-4ec5-a679-c12313b61f96-dns-svc\") pod \"dnsmasq-dns-69944d679-t6rxn\" (UID: \"e8ee90ac-8bc9-4ec5-a679-c12313b61f96\") " pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:46 crc kubenswrapper[4958]: I1201 11:54:46.056285 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8ee90ac-8bc9-4ec5-a679-c12313b61f96-config\") pod \"dnsmasq-dns-69944d679-t6rxn\" (UID: \"e8ee90ac-8bc9-4ec5-a679-c12313b61f96\") " pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:46 crc kubenswrapper[4958]: I1201 11:54:46.087632 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxzv8\" (UniqueName: \"kubernetes.io/projected/e8ee90ac-8bc9-4ec5-a679-c12313b61f96-kube-api-access-qxzv8\") pod \"dnsmasq-dns-69944d679-t6rxn\" (UID: \"e8ee90ac-8bc9-4ec5-a679-c12313b61f96\") " pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:46 crc kubenswrapper[4958]: I1201 11:54:46.112671 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:46 crc kubenswrapper[4958]: I1201 11:54:46.653028 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69944d679-t6rxn"] Dec 01 11:54:46 crc kubenswrapper[4958]: I1201 11:54:46.798466 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69944d679-t6rxn" event={"ID":"e8ee90ac-8bc9-4ec5-a679-c12313b61f96","Type":"ContainerStarted","Data":"004275236f949220569aa6f5b0636c477ff09356a22f4f219515153439fdfce6"} Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.041047 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-00e0-account-create-27vdq"] Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.055903 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-00e0-account-create-27vdq"] Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.208434 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.338640 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdmws\" (UniqueName: \"kubernetes.io/projected/bf8e137e-83e0-455f-bafe-f761932eaa60-kube-api-access-pdmws\") pod \"bf8e137e-83e0-455f-bafe-f761932eaa60\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.338750 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-config\") pod \"bf8e137e-83e0-455f-bafe-f761932eaa60\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.338821 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-ovsdbserver-nb\") pod \"bf8e137e-83e0-455f-bafe-f761932eaa60\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.338870 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-dns-svc\") pod \"bf8e137e-83e0-455f-bafe-f761932eaa60\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.338910 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-ovsdbserver-sb\") pod \"bf8e137e-83e0-455f-bafe-f761932eaa60\" (UID: \"bf8e137e-83e0-455f-bafe-f761932eaa60\") " Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.344320 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf8e137e-83e0-455f-bafe-f761932eaa60-kube-api-access-pdmws" (OuterVolumeSpecName: "kube-api-access-pdmws") pod "bf8e137e-83e0-455f-bafe-f761932eaa60" (UID: "bf8e137e-83e0-455f-bafe-f761932eaa60"). InnerVolumeSpecName "kube-api-access-pdmws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.397128 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bf8e137e-83e0-455f-bafe-f761932eaa60" (UID: "bf8e137e-83e0-455f-bafe-f761932eaa60"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.400638 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bf8e137e-83e0-455f-bafe-f761932eaa60" (UID: "bf8e137e-83e0-455f-bafe-f761932eaa60"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.407000 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bf8e137e-83e0-455f-bafe-f761932eaa60" (UID: "bf8e137e-83e0-455f-bafe-f761932eaa60"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.433719 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-config" (OuterVolumeSpecName: "config") pod "bf8e137e-83e0-455f-bafe-f761932eaa60" (UID: "bf8e137e-83e0-455f-bafe-f761932eaa60"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.442670 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdmws\" (UniqueName: \"kubernetes.io/projected/bf8e137e-83e0-455f-bafe-f761932eaa60-kube-api-access-pdmws\") on node \"crc\" DevicePath \"\"" Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.442717 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-config\") on node \"crc\" DevicePath \"\"" Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.442731 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.442744 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.442756 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf8e137e-83e0-455f-bafe-f761932eaa60-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.811797 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54ef053b-018e-437a-845e-b8b7737e4ff1" path="/var/lib/kubelet/pods/54ef053b-018e-437a-845e-b8b7737e4ff1/volumes" Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.812546 4958 generic.go:334] "Generic (PLEG): container finished" podID="e8ee90ac-8bc9-4ec5-a679-c12313b61f96" containerID="091b103bca64a852d2993534b8b55e72931a20ae738578ce689ea5d8f9f24516" exitCode=0 Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.814206 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69944d679-t6rxn" event={"ID":"e8ee90ac-8bc9-4ec5-a679-c12313b61f96","Type":"ContainerDied","Data":"091b103bca64a852d2993534b8b55e72931a20ae738578ce689ea5d8f9f24516"} Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.821541 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" event={"ID":"bf8e137e-83e0-455f-bafe-f761932eaa60","Type":"ContainerDied","Data":"9769c0d751628cfd43b60a8b123cce788f6ad2db506b68aa64b7a0dfad74c337"} Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.821825 4958 scope.go:117] "RemoveContainer" containerID="ba144724df744fd1ed699e67f1140a699ffcf03dfe101374b8d6ee3c8d799e22" Dec 01 11:54:47 crc kubenswrapper[4958]: I1201 11:54:47.821632 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-87ff8cfc-d9c6b" Dec 01 11:54:48 crc kubenswrapper[4958]: I1201 11:54:48.002246 4958 scope.go:117] "RemoveContainer" containerID="c77663b9e9b294d6d4905ed581df0ac56fa93f58c7d898b0e6f1ed88217daeef" Dec 01 11:54:48 crc kubenswrapper[4958]: I1201 11:54:48.036649 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-87ff8cfc-d9c6b"] Dec 01 11:54:48 crc kubenswrapper[4958]: I1201 11:54:48.045871 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-87ff8cfc-d9c6b"] Dec 01 11:54:48 crc kubenswrapper[4958]: I1201 11:54:48.841022 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69944d679-t6rxn" event={"ID":"e8ee90ac-8bc9-4ec5-a679-c12313b61f96","Type":"ContainerStarted","Data":"f94ffd7a670311709eff6aab467a810c4932cefb04cb3801215742759967a2c2"} Dec 01 11:54:48 crc kubenswrapper[4958]: I1201 11:54:48.841581 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:48 crc kubenswrapper[4958]: I1201 11:54:48.867163 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69944d679-t6rxn" podStartSLOduration=3.867143215 podStartE2EDuration="3.867143215s" podCreationTimestamp="2025-12-01 11:54:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 11:54:48.86205797 +0000 UTC m=+6936.370847017" watchObservedRunningTime="2025-12-01 11:54:48.867143215 +0000 UTC m=+6936.375932242" Dec 01 11:54:49 crc kubenswrapper[4958]: I1201 11:54:49.810544 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf8e137e-83e0-455f-bafe-f761932eaa60" path="/var/lib/kubelet/pods/bf8e137e-83e0-455f-bafe-f761932eaa60/volumes" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.232549 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f"] Dec 01 11:54:52 crc kubenswrapper[4958]: E1201 11:54:52.233570 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf8e137e-83e0-455f-bafe-f761932eaa60" containerName="dnsmasq-dns" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.233591 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf8e137e-83e0-455f-bafe-f761932eaa60" containerName="dnsmasq-dns" Dec 01 11:54:52 crc kubenswrapper[4958]: E1201 11:54:52.233657 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf8e137e-83e0-455f-bafe-f761932eaa60" containerName="init" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.233668 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf8e137e-83e0-455f-bafe-f761932eaa60" containerName="init" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.234009 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf8e137e-83e0-455f-bafe-f761932eaa60" containerName="dnsmasq-dns" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.234895 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.238857 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.238857 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.241232 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.241287 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.246190 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f"] Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.256241 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.256341 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9qf5\" (UniqueName: \"kubernetes.io/projected/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-kube-api-access-s9qf5\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.256406 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.256472 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.256504 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.357462 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s9qf5\" (UniqueName: \"kubernetes.io/projected/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-kube-api-access-s9qf5\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.357528 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.357948 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.357979 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.358103 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.364541 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.364593 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.364794 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" Dec 01 11:54:52 crc 
kubenswrapper[4958]: I1201 11:54:52.371152 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.374082 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9qf5\" (UniqueName: \"kubernetes.io/projected/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-kube-api-access-s9qf5\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" Dec 01 11:54:52 crc kubenswrapper[4958]: I1201 11:54:52.569635 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" Dec 01 11:54:53 crc kubenswrapper[4958]: I1201 11:54:53.057984 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-9kqxk"] Dec 01 11:54:53 crc kubenswrapper[4958]: I1201 11:54:53.076923 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-9kqxk"] Dec 01 11:54:53 crc kubenswrapper[4958]: I1201 11:54:53.205303 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f"] Dec 01 11:54:53 crc kubenswrapper[4958]: W1201 11:54:53.210173 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62cd4241_dbd3_4dfd_a141_78fc72e8b7a4.slice/crio-ec9c0795fdb52de005cbf5eef75c95db4f5c615410605f39b9e1e6880210faab WatchSource:0}: Error finding container ec9c0795fdb52de005cbf5eef75c95db4f5c615410605f39b9e1e6880210faab: Status 404 returned error can't find the container with id ec9c0795fdb52de005cbf5eef75c95db4f5c615410605f39b9e1e6880210faab Dec 01 11:54:53 crc kubenswrapper[4958]: I1201 11:54:53.213708 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 11:54:53 crc kubenswrapper[4958]: I1201 11:54:53.835683 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b877e60a-c39e-4736-b596-4132c5c8eda8" path="/var/lib/kubelet/pods/b877e60a-c39e-4736-b596-4132c5c8eda8/volumes" Dec 01 11:54:53 crc kubenswrapper[4958]: I1201 11:54:53.904362 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" event={"ID":"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4","Type":"ContainerStarted","Data":"ec9c0795fdb52de005cbf5eef75c95db4f5c615410605f39b9e1e6880210faab"} Dec 01 11:54:56 crc kubenswrapper[4958]: I1201 11:54:56.114221 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69944d679-t6rxn" Dec 01 11:54:56 crc kubenswrapper[4958]: I1201 11:54:56.186643 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86c5578c4c-24npf"] Dec 01 11:54:56 crc kubenswrapper[4958]: I1201 11:54:56.186963 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86c5578c4c-24npf" podUID="0ad1feeb-42e9-4b7a-91f3-b9392d534309" containerName="dnsmasq-dns" 
containerID="cri-o://e5a8b86ff66a67e65e759480ef22ade57649c83e90230daf79d1977feeac66bb" gracePeriod=10 Dec 01 11:54:56 crc kubenswrapper[4958]: I1201 11:54:56.942939 4958 generic.go:334] "Generic (PLEG): container finished" podID="0ad1feeb-42e9-4b7a-91f3-b9392d534309" containerID="e5a8b86ff66a67e65e759480ef22ade57649c83e90230daf79d1977feeac66bb" exitCode=0 Dec 01 11:54:56 crc kubenswrapper[4958]: I1201 11:54:56.943033 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c5578c4c-24npf" event={"ID":"0ad1feeb-42e9-4b7a-91f3-b9392d534309","Type":"ContainerDied","Data":"e5a8b86ff66a67e65e759480ef22ade57649c83e90230daf79d1977feeac66bb"} Dec 01 11:54:58 crc kubenswrapper[4958]: I1201 11:54:58.996588 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c5578c4c-24npf" event={"ID":"0ad1feeb-42e9-4b7a-91f3-b9392d534309","Type":"ContainerDied","Data":"a6370556c016265e5e82ab3caf2f908cbc2738d58f9b7af0e1d8682c3f5da274"} Dec 01 11:54:58 crc kubenswrapper[4958]: I1201 11:54:58.997424 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6370556c016265e5e82ab3caf2f908cbc2738d58f9b7af0e1d8682c3f5da274" Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 11:54:59.047682 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 11:54:59.104666 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-ovsdbserver-sb\") pod \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 11:54:59.104887 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-ovsdbserver-nb\") pod \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 11:54:59.104933 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-dns-svc\") pod \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 11:54:59.104987 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn9rf\" (UniqueName: \"kubernetes.io/projected/0ad1feeb-42e9-4b7a-91f3-b9392d534309-kube-api-access-sn9rf\") pod \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 11:54:59.105110 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-openstack-cell1\") pod \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 11:54:59.105198 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-config\") pod \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\" (UID: \"0ad1feeb-42e9-4b7a-91f3-b9392d534309\") " Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 
11:54:59.134225 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ad1feeb-42e9-4b7a-91f3-b9392d534309-kube-api-access-sn9rf" (OuterVolumeSpecName: "kube-api-access-sn9rf") pod "0ad1feeb-42e9-4b7a-91f3-b9392d534309" (UID: "0ad1feeb-42e9-4b7a-91f3-b9392d534309"). InnerVolumeSpecName "kube-api-access-sn9rf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 11:54:59.172975 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ad1feeb-42e9-4b7a-91f3-b9392d534309" (UID: "0ad1feeb-42e9-4b7a-91f3-b9392d534309"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 11:54:59.178137 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0ad1feeb-42e9-4b7a-91f3-b9392d534309" (UID: "0ad1feeb-42e9-4b7a-91f3-b9392d534309"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 11:54:59.184667 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0ad1feeb-42e9-4b7a-91f3-b9392d534309" (UID: "0ad1feeb-42e9-4b7a-91f3-b9392d534309"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 11:54:59.192885 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "0ad1feeb-42e9-4b7a-91f3-b9392d534309" (UID: "0ad1feeb-42e9-4b7a-91f3-b9392d534309"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 11:54:59.207395 4958 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 11:54:59.207438 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 11:54:59.207450 4958 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 11:54:59.207465 4958 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 11:54:59.207478 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn9rf\" (UniqueName: \"kubernetes.io/projected/0ad1feeb-42e9-4b7a-91f3-b9392d534309-kube-api-access-sn9rf\") on node \"crc\" DevicePath \"\"" Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 11:54:59.211941 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-config" (OuterVolumeSpecName: "config") pod "0ad1feeb-42e9-4b7a-91f3-b9392d534309" (UID: "0ad1feeb-42e9-4b7a-91f3-b9392d534309"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 11:54:59 crc kubenswrapper[4958]: I1201 11:54:59.312108 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ad1feeb-42e9-4b7a-91f3-b9392d534309-config\") on node \"crc\" DevicePath \"\"" Dec 01 11:55:00 crc kubenswrapper[4958]: I1201 11:55:00.008645 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86c5578c4c-24npf" Dec 01 11:55:00 crc kubenswrapper[4958]: I1201 11:55:00.042644 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86c5578c4c-24npf"] Dec 01 11:55:00 crc kubenswrapper[4958]: I1201 11:55:00.055434 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86c5578c4c-24npf"] Dec 01 11:55:01 crc kubenswrapper[4958]: I1201 11:55:01.261694 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qt9cw"] Dec 01 11:55:01 crc kubenswrapper[4958]: E1201 11:55:01.262656 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ad1feeb-42e9-4b7a-91f3-b9392d534309" containerName="dnsmasq-dns" Dec 01 11:55:01 crc kubenswrapper[4958]: I1201 11:55:01.262673 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ad1feeb-42e9-4b7a-91f3-b9392d534309" containerName="dnsmasq-dns" Dec 01 11:55:01 crc kubenswrapper[4958]: E1201 11:55:01.262700 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ad1feeb-42e9-4b7a-91f3-b9392d534309" containerName="init" Dec 01 11:55:01 crc kubenswrapper[4958]: I1201 11:55:01.262706 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ad1feeb-42e9-4b7a-91f3-b9392d534309" containerName="init" Dec 01 11:55:01 crc kubenswrapper[4958]: I1201 11:55:01.262969 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ad1feeb-42e9-4b7a-91f3-b9392d534309" containerName="dnsmasq-dns" Dec 01 11:55:01 crc kubenswrapper[4958]: I1201 11:55:01.264689 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qt9cw" Dec 01 11:55:01 crc kubenswrapper[4958]: I1201 11:55:01.280361 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qt9cw"] Dec 01 11:55:01 crc kubenswrapper[4958]: I1201 11:55:01.356324 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf5lc\" (UniqueName: \"kubernetes.io/projected/d59a800d-8d38-4299-9f56-34238fc7746c-kube-api-access-cf5lc\") pod \"redhat-marketplace-qt9cw\" (UID: \"d59a800d-8d38-4299-9f56-34238fc7746c\") " pod="openshift-marketplace/redhat-marketplace-qt9cw" Dec 01 11:55:01 crc kubenswrapper[4958]: I1201 11:55:01.356607 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d59a800d-8d38-4299-9f56-34238fc7746c-utilities\") pod \"redhat-marketplace-qt9cw\" (UID: \"d59a800d-8d38-4299-9f56-34238fc7746c\") " pod="openshift-marketplace/redhat-marketplace-qt9cw" Dec 01 11:55:01 crc kubenswrapper[4958]: I1201 11:55:01.356799 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d59a800d-8d38-4299-9f56-34238fc7746c-catalog-content\") pod \"redhat-marketplace-qt9cw\" (UID: \"d59a800d-8d38-4299-9f56-34238fc7746c\") " pod="openshift-marketplace/redhat-marketplace-qt9cw" Dec 01 11:55:01 crc kubenswrapper[4958]: I1201 11:55:01.458817 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf5lc\" (UniqueName: \"kubernetes.io/projected/d59a800d-8d38-4299-9f56-34238fc7746c-kube-api-access-cf5lc\") pod \"redhat-marketplace-qt9cw\" (UID: \"d59a800d-8d38-4299-9f56-34238fc7746c\") " pod="openshift-marketplace/redhat-marketplace-qt9cw" Dec 01 
11:55:01 crc kubenswrapper[4958]: I1201 11:55:01.458989 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d59a800d-8d38-4299-9f56-34238fc7746c-utilities\") pod \"redhat-marketplace-qt9cw\" (UID: \"d59a800d-8d38-4299-9f56-34238fc7746c\") " pod="openshift-marketplace/redhat-marketplace-qt9cw" Dec 01 11:55:01 crc kubenswrapper[4958]: I1201 11:55:01.459076 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d59a800d-8d38-4299-9f56-34238fc7746c-catalog-content\") pod \"redhat-marketplace-qt9cw\" (UID: \"d59a800d-8d38-4299-9f56-34238fc7746c\") " pod="openshift-marketplace/redhat-marketplace-qt9cw" Dec 01 11:55:01 crc kubenswrapper[4958]: I1201 11:55:01.459586 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d59a800d-8d38-4299-9f56-34238fc7746c-catalog-content\") pod \"redhat-marketplace-qt9cw\" (UID: \"d59a800d-8d38-4299-9f56-34238fc7746c\") " pod="openshift-marketplace/redhat-marketplace-qt9cw" Dec 01 11:55:01 crc kubenswrapper[4958]: I1201 11:55:01.459592 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d59a800d-8d38-4299-9f56-34238fc7746c-utilities\") pod \"redhat-marketplace-qt9cw\" (UID: \"d59a800d-8d38-4299-9f56-34238fc7746c\") " pod="openshift-marketplace/redhat-marketplace-qt9cw" Dec 01 11:55:01 crc kubenswrapper[4958]: I1201 11:55:01.506604 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf5lc\" (UniqueName: \"kubernetes.io/projected/d59a800d-8d38-4299-9f56-34238fc7746c-kube-api-access-cf5lc\") pod \"redhat-marketplace-qt9cw\" (UID: \"d59a800d-8d38-4299-9f56-34238fc7746c\") " pod="openshift-marketplace/redhat-marketplace-qt9cw" Dec 01 11:55:01 crc kubenswrapper[4958]: I1201 11:55:01.612286 4958 util.go:30] "No sandbox for pod can be found. 
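
The kube-api-access-cf5lc volume mounted above is the auto-injected projected service-account volume: a bound token, the cluster CA bundle from kube-root-ca.crt, and the pod's namespace, merged into one read-only mount. A sketch of the equivalent corev1.Volume, assuming the k8s.io/api/core/v1 types; the 3607s expiry is the usual admission default, included here as an illustrative assumption:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        expiry := int64(3607) // illustrative; the kubelet rotates the token early

        // Roughly what a kube-api-access-* projected volume expands to.
        vol := corev1.Volume{
            Name: "kube-api-access-cf5lc",
            VolumeSource: corev1.VolumeSource{
                Projected: &corev1.ProjectedVolumeSource{
                    Sources: []corev1.VolumeProjection{
                        {ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
                            Path:              "token",
                            ExpirationSeconds: &expiry,
                        }},
                        {ConfigMap: &corev1.ConfigMapProjection{
                            LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
                        }},
                        {DownwardAPI: &corev1.DownwardAPIProjection{
                            Items: []corev1.DownwardAPIVolumeFile{{
                                Path:     "namespace",
                                FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"},
                            }},
                        }},
                    },
                },
            },
        }
        fmt.Println(vol.Name, "has", len(vol.VolumeSource.Projected.Sources), "projected sources")
    }
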
Dec 01 11:55:01 crc kubenswrapper[4958]: I1201 11:55:01.612286 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qt9cw"
Dec 01 11:55:01 crc kubenswrapper[4958]: I1201 11:55:01.814259 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ad1feeb-42e9-4b7a-91f3-b9392d534309" path="/var/lib/kubelet/pods/0ad1feeb-42e9-4b7a-91f3-b9392d534309/volumes"
Dec 01 11:55:04 crc kubenswrapper[4958]: I1201 11:55:04.040013 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-a8b4-account-create-x8xc2"]
Dec 01 11:55:04 crc kubenswrapper[4958]: I1201 11:55:04.056274 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-a8b4-account-create-x8xc2"]
Dec 01 11:55:05 crc kubenswrapper[4958]: I1201 11:55:05.811171 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f02d2c8-00f2-4441-b4e4-30c870db359d" path="/var/lib/kubelet/pods/3f02d2c8-00f2-4441-b4e4-30c870db359d/volumes"
Dec 01 11:55:07 crc kubenswrapper[4958]: W1201 11:55:07.338496 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd59a800d_8d38_4299_9f56_34238fc7746c.slice/crio-29e25d41b491635016175435348e7709a63c7c56ba19651881ebc25d43ceef35 WatchSource:0}: Error finding container 29e25d41b491635016175435348e7709a63c7c56ba19651881ebc25d43ceef35: Status 404 returned error can't find the container with id 29e25d41b491635016175435348e7709a63c7c56ba19651881ebc25d43ceef35
Dec 01 11:55:07 crc kubenswrapper[4958]: I1201 11:55:07.340686 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qt9cw"]
Dec 01 11:55:08 crc kubenswrapper[4958]: I1201 11:55:08.189655 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" event={"ID":"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4","Type":"ContainerStarted","Data":"59afc59d3d149d303d53cfa60aeb04a9280524263d9c8ae51a04b0e1e27994bb"}
Dec 01 11:55:08 crc kubenswrapper[4958]: I1201 11:55:08.192332 4958 generic.go:334] "Generic (PLEG): container finished" podID="d59a800d-8d38-4299-9f56-34238fc7746c" containerID="aa629670436672a55f32196260379290af279a9597986b85232b568b8f435dcc" exitCode=0
Dec 01 11:55:08 crc kubenswrapper[4958]: I1201 11:55:08.192397 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qt9cw" event={"ID":"d59a800d-8d38-4299-9f56-34238fc7746c","Type":"ContainerDied","Data":"aa629670436672a55f32196260379290af279a9597986b85232b568b8f435dcc"}
Dec 01 11:55:08 crc kubenswrapper[4958]: I1201 11:55:08.192432 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qt9cw" event={"ID":"d59a800d-8d38-4299-9f56-34238fc7746c","Type":"ContainerStarted","Data":"29e25d41b491635016175435348e7709a63c7c56ba19651881ebc25d43ceef35"}
Dec 01 11:55:08 crc kubenswrapper[4958]: I1201 11:55:08.224873 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" podStartSLOduration=2.351344188 podStartE2EDuration="16.224822352s" podCreationTimestamp="2025-12-01 11:54:52 +0000 UTC" firstStartedPulling="2025-12-01 11:54:53.213283161 +0000 UTC m=+6940.722072218" lastFinishedPulling="2025-12-01 11:55:07.086761345 +0000 UTC m=+6954.595550382" observedRunningTime="2025-12-01 11:55:08.222691812 +0000 UTC m=+6955.731480879" watchObservedRunningTime="2025-12-01 11:55:08.224822352 +0000 UTC m=+6955.733611389"
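
In the startup-latency record above, the two durations differ by exactly the image-pull window: podStartE2EDuration (16.224822352s, pod creation to observed running) minus the pull interval measured on the monotonic clock (m=+6954.595550382 - m=+6940.722072218 = 13.873478164s) gives podStartSLOduration = 2.351344188, consistent with the tracker excluding image-pull time from the SLO figure. A quick check of that arithmetic, with the values copied from the log record:

    package main

    import "fmt"

    func main() {
        // Monotonic clock offsets (the m=+... values) from the log record above.
        firstStartedPulling := 6940.722072218
        lastFinishedPulling := 6954.595550382
        podStartE2E := 16.224822352 // observedRunningTime - podCreationTimestamp

        pullWindow := lastFinishedPulling - firstStartedPulling
        fmt.Printf("image pull window:   %.9fs\n", pullWindow)              // 13.873478164s
        fmt.Printf("podStartSLOduration: %.9fs\n", podStartE2E-pullWindow)  // 2.351344188s, matching the log
    }
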
Dec 01 11:55:10 crc kubenswrapper[4958]: I1201 11:55:10.216702 4958 generic.go:334] "Generic (PLEG): container finished" podID="d59a800d-8d38-4299-9f56-34238fc7746c" containerID="26800d606917bacf3d9dc2548819fbf5a4561e7c3173390c7192f8184c0e782f" exitCode=0
Dec 01 11:55:10 crc kubenswrapper[4958]: I1201 11:55:10.217142 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qt9cw" event={"ID":"d59a800d-8d38-4299-9f56-34238fc7746c","Type":"ContainerDied","Data":"26800d606917bacf3d9dc2548819fbf5a4561e7c3173390c7192f8184c0e782f"}
Dec 01 11:55:11 crc kubenswrapper[4958]: I1201 11:55:11.234291 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qt9cw" event={"ID":"d59a800d-8d38-4299-9f56-34238fc7746c","Type":"ContainerStarted","Data":"321601a25804956c02f8a4277d8230fb1c7a014e81d52bf3ae59e77a653d1e53"}
Dec 01 11:55:11 crc kubenswrapper[4958]: I1201 11:55:11.292032 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qt9cw" podStartSLOduration=7.726936606 podStartE2EDuration="10.292008482s" podCreationTimestamp="2025-12-01 11:55:01 +0000 UTC" firstStartedPulling="2025-12-01 11:55:08.195122059 +0000 UTC m=+6955.703911096" lastFinishedPulling="2025-12-01 11:55:10.760193935 +0000 UTC m=+6958.268982972" observedRunningTime="2025-12-01 11:55:11.287272227 +0000 UTC m=+6958.796061264" watchObservedRunningTime="2025-12-01 11:55:11.292008482 +0000 UTC m=+6958.800797519"
Dec 01 11:55:11 crc kubenswrapper[4958]: I1201 11:55:11.613622 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qt9cw"
Dec 01 11:55:11 crc kubenswrapper[4958]: I1201 11:55:11.613674 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qt9cw"
Dec 01 11:55:12 crc kubenswrapper[4958]: I1201 11:55:12.670356 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-qt9cw" podUID="d59a800d-8d38-4299-9f56-34238fc7746c" containerName="registry-server" probeResult="failure" output=<
Dec 01 11:55:12 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s
Dec 01 11:55:12 crc kubenswrapper[4958]: >
Dec 01 11:55:17 crc kubenswrapper[4958]: I1201 11:55:17.975406 4958 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podbf8e137e-83e0-455f-bafe-f761932eaa60"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podbf8e137e-83e0-455f-bafe-f761932eaa60] : Timed out while waiting for systemd to remove kubepods-besteffort-podbf8e137e_83e0_455f_bafe_f761932eaa60.slice"
Dec 01 11:55:21 crc kubenswrapper[4958]: I1201 11:55:21.394176 4958 generic.go:334] "Generic (PLEG): container finished" podID="62cd4241-dbd3-4dfd-a141-78fc72e8b7a4" containerID="59afc59d3d149d303d53cfa60aeb04a9280524263d9c8ae51a04b0e1e27994bb" exitCode=0
Dec 01 11:55:21 crc kubenswrapper[4958]: I1201 11:55:21.394245 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" event={"ID":"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4","Type":"ContainerDied","Data":"59afc59d3d149d303d53cfa60aeb04a9280524263d9c8ae51a04b0e1e27994bb"}
Dec 01 11:55:21 crc kubenswrapper[4958]: I1201 11:55:21.683017 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qt9cw"
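
The startup-probe output above ("timeout: failed to connect service \":50051\" within 1s") is the message format of the grpc_health_probe-style check that marketplace catalog pods run against the registry-server gRPC port; it fails until the server has finished loading the catalog. An equivalent client-side check in Go, assuming the google.golang.org/grpc packages; only the address and the 1s timeout are taken from the log:

    package main

    import (
        "context"
        "fmt"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        healthpb "google.golang.org/grpc/health/grpc_health_v1"
    )

    func main() {
        ctx, cancel := context.WithTimeout(context.Background(), time.Second)
        defer cancel()

        // Dial the registry-server gRPC port; the probe allows 1s, as in the log.
        conn, err := grpc.DialContext(ctx, "localhost:50051",
            grpc.WithTransportCredentials(insecure.NewCredentials()),
            grpc.WithBlock())
        if err != nil {
            fmt.Println("timeout: failed to connect service :50051 within 1s:", err)
            return
        }
        defer conn.Close()

        // Standard gRPC health check, the same protocol grpc_health_probe speaks.
        resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
        if err != nil {
            fmt.Println("health check failed:", err)
            return
        }
        fmt.Println("status:", resp.GetStatus()) // SERVING once the catalog is loaded
    }
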
Dec 01 11:55:21 crc kubenswrapper[4958]: I1201 11:55:21.747171 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qt9cw"
Dec 01 11:55:21 crc kubenswrapper[4958]: I1201 11:55:21.925250 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qt9cw"]
Dec 01 11:55:22 crc kubenswrapper[4958]: I1201 11:55:22.888990 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f"
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.061158 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-inventory\") pod \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") "
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.061349 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9qf5\" (UniqueName: \"kubernetes.io/projected/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-kube-api-access-s9qf5\") pod \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") "
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.061415 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-ceph\") pod \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") "
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.061472 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-pre-adoption-validation-combined-ca-bundle\") pod \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") "
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.061536 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-ssh-key\") pod \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\" (UID: \"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4\") "
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.066941 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-kube-api-access-s9qf5" (OuterVolumeSpecName: "kube-api-access-s9qf5") pod "62cd4241-dbd3-4dfd-a141-78fc72e8b7a4" (UID: "62cd4241-dbd3-4dfd-a141-78fc72e8b7a4"). InnerVolumeSpecName "kube-api-access-s9qf5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.069564 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "62cd4241-dbd3-4dfd-a141-78fc72e8b7a4" (UID: "62cd4241-dbd3-4dfd-a141-78fc72e8b7a4"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.071216 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-ceph" (OuterVolumeSpecName: "ceph") pod "62cd4241-dbd3-4dfd-a141-78fc72e8b7a4" (UID: "62cd4241-dbd3-4dfd-a141-78fc72e8b7a4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.093758 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-inventory" (OuterVolumeSpecName: "inventory") pod "62cd4241-dbd3-4dfd-a141-78fc72e8b7a4" (UID: "62cd4241-dbd3-4dfd-a141-78fc72e8b7a4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.117948 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "62cd4241-dbd3-4dfd-a141-78fc72e8b7a4" (UID: "62cd4241-dbd3-4dfd-a141-78fc72e8b7a4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.164667 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-inventory\") on node \"crc\" DevicePath \"\""
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.164698 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9qf5\" (UniqueName: \"kubernetes.io/projected/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-kube-api-access-s9qf5\") on node \"crc\" DevicePath \"\""
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.164711 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-ceph\") on node \"crc\" DevicePath \"\""
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.164720 4958 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.164731 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62cd4241-dbd3-4dfd-a141-78fc72e8b7a4-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.425561 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f" event={"ID":"62cd4241-dbd3-4dfd-a141-78fc72e8b7a4","Type":"ContainerDied","Data":"ec9c0795fdb52de005cbf5eef75c95db4f5c615410605f39b9e1e6880210faab"}
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.425667 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec9c0795fdb52de005cbf5eef75c95db4f5c615410605f39b9e1e6880210faab"
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.429392 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qt9cw" podUID="d59a800d-8d38-4299-9f56-34238fc7746c" containerName="registry-server" containerID="cri-o://321601a25804956c02f8a4277d8230fb1c7a014e81d52bf3ae59e77a653d1e53" gracePeriod=2
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.429626 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f"
Dec 01 11:55:23 crc kubenswrapper[4958]: I1201 11:55:23.959019 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qt9cw"
Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.089610 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d59a800d-8d38-4299-9f56-34238fc7746c-utilities\") pod \"d59a800d-8d38-4299-9f56-34238fc7746c\" (UID: \"d59a800d-8d38-4299-9f56-34238fc7746c\") "
Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.089753 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf5lc\" (UniqueName: \"kubernetes.io/projected/d59a800d-8d38-4299-9f56-34238fc7746c-kube-api-access-cf5lc\") pod \"d59a800d-8d38-4299-9f56-34238fc7746c\" (UID: \"d59a800d-8d38-4299-9f56-34238fc7746c\") "
Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.089827 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d59a800d-8d38-4299-9f56-34238fc7746c-catalog-content\") pod \"d59a800d-8d38-4299-9f56-34238fc7746c\" (UID: \"d59a800d-8d38-4299-9f56-34238fc7746c\") "
Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.091062 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d59a800d-8d38-4299-9f56-34238fc7746c-utilities" (OuterVolumeSpecName: "utilities") pod "d59a800d-8d38-4299-9f56-34238fc7746c" (UID: "d59a800d-8d38-4299-9f56-34238fc7746c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.095591 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d59a800d-8d38-4299-9f56-34238fc7746c-kube-api-access-cf5lc" (OuterVolumeSpecName: "kube-api-access-cf5lc") pod "d59a800d-8d38-4299-9f56-34238fc7746c" (UID: "d59a800d-8d38-4299-9f56-34238fc7746c"). InnerVolumeSpecName "kube-api-access-cf5lc". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.194761 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d59a800d-8d38-4299-9f56-34238fc7746c-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.194804 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf5lc\" (UniqueName: \"kubernetes.io/projected/d59a800d-8d38-4299-9f56-34238fc7746c-kube-api-access-cf5lc\") on node \"crc\" DevicePath \"\"" Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.194836 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d59a800d-8d38-4299-9f56-34238fc7746c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.438885 4958 generic.go:334] "Generic (PLEG): container finished" podID="d59a800d-8d38-4299-9f56-34238fc7746c" containerID="321601a25804956c02f8a4277d8230fb1c7a014e81d52bf3ae59e77a653d1e53" exitCode=0 Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.438939 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qt9cw" event={"ID":"d59a800d-8d38-4299-9f56-34238fc7746c","Type":"ContainerDied","Data":"321601a25804956c02f8a4277d8230fb1c7a014e81d52bf3ae59e77a653d1e53"} Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.438948 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qt9cw" Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.438974 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qt9cw" event={"ID":"d59a800d-8d38-4299-9f56-34238fc7746c","Type":"ContainerDied","Data":"29e25d41b491635016175435348e7709a63c7c56ba19651881ebc25d43ceef35"} Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.438991 4958 scope.go:117] "RemoveContainer" containerID="321601a25804956c02f8a4277d8230fb1c7a014e81d52bf3ae59e77a653d1e53" Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.465085 4958 scope.go:117] "RemoveContainer" containerID="26800d606917bacf3d9dc2548819fbf5a4561e7c3173390c7192f8184c0e782f" Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.477019 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qt9cw"] Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.484652 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qt9cw"] Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.496456 4958 scope.go:117] "RemoveContainer" containerID="aa629670436672a55f32196260379290af279a9597986b85232b568b8f435dcc" Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.552017 4958 scope.go:117] "RemoveContainer" containerID="321601a25804956c02f8a4277d8230fb1c7a014e81d52bf3ae59e77a653d1e53" Dec 01 11:55:24 crc kubenswrapper[4958]: E1201 11:55:24.552893 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"321601a25804956c02f8a4277d8230fb1c7a014e81d52bf3ae59e77a653d1e53\": container with ID starting with 321601a25804956c02f8a4277d8230fb1c7a014e81d52bf3ae59e77a653d1e53 not found: ID does not exist" containerID="321601a25804956c02f8a4277d8230fb1c7a014e81d52bf3ae59e77a653d1e53" Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.552924 4958 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"321601a25804956c02f8a4277d8230fb1c7a014e81d52bf3ae59e77a653d1e53"} err="failed to get container status \"321601a25804956c02f8a4277d8230fb1c7a014e81d52bf3ae59e77a653d1e53\": rpc error: code = NotFound desc = could not find container \"321601a25804956c02f8a4277d8230fb1c7a014e81d52bf3ae59e77a653d1e53\": container with ID starting with 321601a25804956c02f8a4277d8230fb1c7a014e81d52bf3ae59e77a653d1e53 not found: ID does not exist" Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.552945 4958 scope.go:117] "RemoveContainer" containerID="26800d606917bacf3d9dc2548819fbf5a4561e7c3173390c7192f8184c0e782f" Dec 01 11:55:24 crc kubenswrapper[4958]: E1201 11:55:24.553219 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26800d606917bacf3d9dc2548819fbf5a4561e7c3173390c7192f8184c0e782f\": container with ID starting with 26800d606917bacf3d9dc2548819fbf5a4561e7c3173390c7192f8184c0e782f not found: ID does not exist" containerID="26800d606917bacf3d9dc2548819fbf5a4561e7c3173390c7192f8184c0e782f" Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.553242 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26800d606917bacf3d9dc2548819fbf5a4561e7c3173390c7192f8184c0e782f"} err="failed to get container status \"26800d606917bacf3d9dc2548819fbf5a4561e7c3173390c7192f8184c0e782f\": rpc error: code = NotFound desc = could not find container \"26800d606917bacf3d9dc2548819fbf5a4561e7c3173390c7192f8184c0e782f\": container with ID starting with 26800d606917bacf3d9dc2548819fbf5a4561e7c3173390c7192f8184c0e782f not found: ID does not exist" Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.553255 4958 scope.go:117] "RemoveContainer" containerID="aa629670436672a55f32196260379290af279a9597986b85232b568b8f435dcc" Dec 01 11:55:24 crc kubenswrapper[4958]: E1201 11:55:24.553484 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa629670436672a55f32196260379290af279a9597986b85232b568b8f435dcc\": container with ID starting with aa629670436672a55f32196260379290af279a9597986b85232b568b8f435dcc not found: ID does not exist" containerID="aa629670436672a55f32196260379290af279a9597986b85232b568b8f435dcc" Dec 01 11:55:24 crc kubenswrapper[4958]: I1201 11:55:24.553517 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa629670436672a55f32196260379290af279a9597986b85232b568b8f435dcc"} err="failed to get container status \"aa629670436672a55f32196260379290af279a9597986b85232b568b8f435dcc\": rpc error: code = NotFound desc = could not find container \"aa629670436672a55f32196260379290af279a9597986b85232b568b8f435dcc\": container with ID starting with aa629670436672a55f32196260379290af279a9597986b85232b568b8f435dcc not found: ID does not exist" Dec 01 11:55:25 crc kubenswrapper[4958]: I1201 11:55:25.834187 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d59a800d-8d38-4299-9f56-34238fc7746c" path="/var/lib/kubelet/pods/d59a800d-8d38-4299-9f56-34238fc7746c/volumes" Dec 01 11:55:29 crc kubenswrapper[4958]: I1201 11:55:29.000789 4958 scope.go:117] "RemoveContainer" containerID="5da01f982817a71d01781cea60a1a8e7a109f0bbdf407734abd63cc7b68723e6" Dec 01 11:55:29 crc kubenswrapper[4958]: I1201 11:55:29.248719 4958 scope.go:117] "RemoveContainer" 
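
The RemoveContainer / "ContainerStatus from runtime service failed" / "DeleteContainer returned error" triplets above are benign: the container was already removed, so the runtime answers gRPC NotFound, and the desired state (container gone) already holds. Cleanup paths typically treat that code as success; a sketch with the real google.golang.org/grpc status and codes packages (removeContainer itself is a hypothetical stand-in for the CRI call):

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeContainer stands in for a CRI call that can race with cleanup
    // done elsewhere and report the container as already gone.
    func removeContainer(id string) error {
        return status.Errorf(codes.NotFound, "could not find container %q", id)
    }

    // cleanup treats NotFound as success, exactly like the benign
    // "DeleteContainer returned error" entries in the log above.
    func cleanup(id string) error {
        if err := removeContainer(id); status.Code(err) != codes.NotFound {
            return err
        }
        return nil // already deleted; nothing left to do
    }

    func main() {
        if err := cleanup("321601a2"); err != nil { // hypothetical short ID
            fmt.Println("cleanup failed:", err)
            return
        }
        fmt.Println("cleanup ok (NotFound tolerated)")
    }
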
containerID="309c1f5bd8aaad2357baf8d68d146cd80a4a9e71e2ea3eec8b47fe85481cbe7d" Dec 01 11:55:29 crc kubenswrapper[4958]: I1201 11:55:29.281211 4958 scope.go:117] "RemoveContainer" containerID="3ca0a9a0ff87991e21d0dcccd5183fedb193bf38f593c5acebdc7410aefa91df" Dec 01 11:55:29 crc kubenswrapper[4958]: I1201 11:55:29.339456 4958 scope.go:117] "RemoveContainer" containerID="2580d952f013912946c78e2cbbc67769e55b1ec3539fe38e19a2a54eded105b5" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.117126 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p"] Dec 01 11:55:30 crc kubenswrapper[4958]: E1201 11:55:30.117886 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62cd4241-dbd3-4dfd-a141-78fc72e8b7a4" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.117915 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="62cd4241-dbd3-4dfd-a141-78fc72e8b7a4" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 01 11:55:30 crc kubenswrapper[4958]: E1201 11:55:30.117938 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d59a800d-8d38-4299-9f56-34238fc7746c" containerName="registry-server" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.117949 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d59a800d-8d38-4299-9f56-34238fc7746c" containerName="registry-server" Dec 01 11:55:30 crc kubenswrapper[4958]: E1201 11:55:30.117978 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d59a800d-8d38-4299-9f56-34238fc7746c" containerName="extract-utilities" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.117990 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d59a800d-8d38-4299-9f56-34238fc7746c" containerName="extract-utilities" Dec 01 11:55:30 crc kubenswrapper[4958]: E1201 11:55:30.118015 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d59a800d-8d38-4299-9f56-34238fc7746c" containerName="extract-content" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.118029 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d59a800d-8d38-4299-9f56-34238fc7746c" containerName="extract-content" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.118484 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="62cd4241-dbd3-4dfd-a141-78fc72e8b7a4" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.118530 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d59a800d-8d38-4299-9f56-34238fc7746c" containerName="registry-server" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.119915 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.151352 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.151673 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.151801 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.151935 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.158014 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p"] Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.283627 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.283759 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.283827 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.283872 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvl88\" (UniqueName: \"kubernetes.io/projected/47a028df-7124-41b2-b3ed-0f25905f265b-kube-api-access-jvl88\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.284039 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.386606 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-ceph\") pod 
\"tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.386730 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.386768 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.386809 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvl88\" (UniqueName: \"kubernetes.io/projected/47a028df-7124-41b2-b3ed-0f25905f265b-kube-api-access-jvl88\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.386887 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.393039 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.400439 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.401575 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.402801 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p\" (UID: 
\"47a028df-7124-41b2-b3ed-0f25905f265b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.404637 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvl88\" (UniqueName: \"kubernetes.io/projected/47a028df-7124-41b2-b3ed-0f25905f265b-kube-api-access-jvl88\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 11:55:30 crc kubenswrapper[4958]: I1201 11:55:30.477726 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 11:55:31 crc kubenswrapper[4958]: I1201 11:55:31.087529 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p"] Dec 01 11:55:31 crc kubenswrapper[4958]: I1201 11:55:31.547921 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" event={"ID":"47a028df-7124-41b2-b3ed-0f25905f265b","Type":"ContainerStarted","Data":"d44e358469cdda21b1bb3c1db1cbacbe78dd96a5803ecc79711af392534fec35"} Dec 01 11:55:31 crc kubenswrapper[4958]: I1201 11:55:31.548160 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" event={"ID":"47a028df-7124-41b2-b3ed-0f25905f265b","Type":"ContainerStarted","Data":"cdefc12d4116665e38d6edd33a61ae8cde911151e4ce70d39996ee28a05c7ac7"} Dec 01 11:55:31 crc kubenswrapper[4958]: I1201 11:55:31.568114 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" podStartSLOduration=1.420440918 podStartE2EDuration="1.56808905s" podCreationTimestamp="2025-12-01 11:55:30 +0000 UTC" firstStartedPulling="2025-12-01 11:55:31.103036238 +0000 UTC m=+6978.611825285" lastFinishedPulling="2025-12-01 11:55:31.25068438 +0000 UTC m=+6978.759473417" observedRunningTime="2025-12-01 11:55:31.563478769 +0000 UTC m=+6979.072267826" watchObservedRunningTime="2025-12-01 11:55:31.56808905 +0000 UTC m=+6979.076878087" Dec 01 11:55:41 crc kubenswrapper[4958]: I1201 11:55:41.061697 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-r9jsq"] Dec 01 11:55:41 crc kubenswrapper[4958]: I1201 11:55:41.070763 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-r9jsq"] Dec 01 11:55:41 crc kubenswrapper[4958]: I1201 11:55:41.815386 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2b2c2cb-9590-403a-9833-67d909640ca6" path="/var/lib/kubelet/pods/e2b2c2cb-9590-403a-9833-67d909640ca6/volumes" Dec 01 11:56:28 crc kubenswrapper[4958]: I1201 11:56:28.210447 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:56:28 crc kubenswrapper[4958]: I1201 11:56:28.212000 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 01 11:56:29 crc kubenswrapper[4958]: I1201 11:56:29.537868 4958 scope.go:117] "RemoveContainer" containerID="da1e2215781331137ea919ac2ae9b4f0b0e99d26696c917ebf7429cfa3ad8c89" Dec 01 11:56:29 crc kubenswrapper[4958]: I1201 11:56:29.587229 4958 scope.go:117] "RemoveContainer" containerID="f69d09c32395908db469fb1c0ae9e4ab0af16606705ba76e3fa9838518b539c6" Dec 01 11:56:58 crc kubenswrapper[4958]: I1201 11:56:58.210666 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:56:58 crc kubenswrapper[4958]: I1201 11:56:58.211287 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:57:28 crc kubenswrapper[4958]: I1201 11:57:28.210484 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 11:57:28 crc kubenswrapper[4958]: I1201 11:57:28.212642 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 11:57:28 crc kubenswrapper[4958]: I1201 11:57:28.212807 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 11:57:28 crc kubenswrapper[4958]: I1201 11:57:28.213965 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 11:57:28 crc kubenswrapper[4958]: I1201 11:57:28.214166 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" gracePeriod=600 Dec 01 11:57:28 crc kubenswrapper[4958]: E1201 11:57:28.354387 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:57:29 crc kubenswrapper[4958]: I1201 11:57:29.291469 4958 generic.go:334] "Generic (PLEG): container finished" 
podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" exitCode=0 Dec 01 11:57:29 crc kubenswrapper[4958]: I1201 11:57:29.291533 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d"} Dec 01 11:57:29 crc kubenswrapper[4958]: I1201 11:57:29.291762 4958 scope.go:117] "RemoveContainer" containerID="f2f66a236e3c207fc7e3f3fd8d2b5bb72d1175a1c971bcc5a6d41f218a93b853" Dec 01 11:57:29 crc kubenswrapper[4958]: I1201 11:57:29.293052 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 11:57:29 crc kubenswrapper[4958]: E1201 11:57:29.293784 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:57:43 crc kubenswrapper[4958]: I1201 11:57:43.815265 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 11:57:43 crc kubenswrapper[4958]: E1201 11:57:43.816155 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:57:54 crc kubenswrapper[4958]: I1201 11:57:54.797491 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 11:57:54 crc kubenswrapper[4958]: E1201 11:57:54.798513 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:58:06 crc kubenswrapper[4958]: I1201 11:58:06.797895 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 11:58:06 crc kubenswrapper[4958]: E1201 11:58:06.798808 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:58:21 crc kubenswrapper[4958]: I1201 11:58:21.798911 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 11:58:21 crc 
Dec 01 11:58:21 crc kubenswrapper[4958]: E1201 11:58:21.800311 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:58:36 crc kubenswrapper[4958]: I1201 11:58:36.799912 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d"
Dec 01 11:58:36 crc kubenswrapper[4958]: E1201 11:58:36.801325 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 11:58:51 crc kubenswrapper[4958]: I1201 11:58:51.133655 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-thl8k"]
Dec 01 11:58:51 crc kubenswrapper[4958]: I1201 11:58:51.137794 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-thl8k"
Dec 01 11:58:51 crc kubenswrapper[4958]: I1201 11:58:51.151879 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-thl8k"]
Dec 01 11:58:51 crc kubenswrapper[4958]: I1201 11:58:51.284548 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/177eb3e2-df29-49b4-b361-d37f54eac073-catalog-content\") pod \"redhat-operators-thl8k\" (UID: \"177eb3e2-df29-49b4-b361-d37f54eac073\") " pod="openshift-marketplace/redhat-operators-thl8k"
Dec 01 11:58:51 crc kubenswrapper[4958]: I1201 11:58:51.284778 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6bph\" (UniqueName: \"kubernetes.io/projected/177eb3e2-df29-49b4-b361-d37f54eac073-kube-api-access-m6bph\") pod \"redhat-operators-thl8k\" (UID: \"177eb3e2-df29-49b4-b361-d37f54eac073\") " pod="openshift-marketplace/redhat-operators-thl8k"
Dec 01 11:58:51 crc kubenswrapper[4958]: I1201 11:58:51.285080 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/177eb3e2-df29-49b4-b361-d37f54eac073-utilities\") pod \"redhat-operators-thl8k\" (UID: \"177eb3e2-df29-49b4-b361-d37f54eac073\") " pod="openshift-marketplace/redhat-operators-thl8k"
Dec 01 11:58:51 crc kubenswrapper[4958]: I1201 11:58:51.387309 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6bph\" (UniqueName: \"kubernetes.io/projected/177eb3e2-df29-49b4-b361-d37f54eac073-kube-api-access-m6bph\") pod \"redhat-operators-thl8k\" (UID: \"177eb3e2-df29-49b4-b361-d37f54eac073\") " pod="openshift-marketplace/redhat-operators-thl8k"
Dec 01 11:58:51 crc kubenswrapper[4958]: I1201 11:58:51.387545 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/177eb3e2-df29-49b4-b361-d37f54eac073-utilities\") pod \"redhat-operators-thl8k\" (UID: \"177eb3e2-df29-49b4-b361-d37f54eac073\") " pod="openshift-marketplace/redhat-operators-thl8k"
Dec 01 11:58:51 crc kubenswrapper[4958]: I1201 11:58:51.387764 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/177eb3e2-df29-49b4-b361-d37f54eac073-catalog-content\") pod \"redhat-operators-thl8k\" (UID: \"177eb3e2-df29-49b4-b361-d37f54eac073\") " pod="openshift-marketplace/redhat-operators-thl8k"
Dec 01 11:58:51 crc kubenswrapper[4958]: I1201 11:58:51.388415 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/177eb3e2-df29-49b4-b361-d37f54eac073-utilities\") pod \"redhat-operators-thl8k\" (UID: \"177eb3e2-df29-49b4-b361-d37f54eac073\") " pod="openshift-marketplace/redhat-operators-thl8k"
Dec 01 11:58:51 crc kubenswrapper[4958]: I1201 11:58:51.388465 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/177eb3e2-df29-49b4-b361-d37f54eac073-catalog-content\") pod \"redhat-operators-thl8k\" (UID: \"177eb3e2-df29-49b4-b361-d37f54eac073\") " pod="openshift-marketplace/redhat-operators-thl8k"
Dec 01 11:58:51 crc kubenswrapper[4958]: I1201 11:58:51.412889 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6bph\" (UniqueName: \"kubernetes.io/projected/177eb3e2-df29-49b4-b361-d37f54eac073-kube-api-access-m6bph\") pod \"redhat-operators-thl8k\" (UID: \"177eb3e2-df29-49b4-b361-d37f54eac073\") " pod="openshift-marketplace/redhat-operators-thl8k"
Need to start a new one" pod="openshift-marketplace/redhat-operators-thl8k" Dec 01 11:58:51 crc kubenswrapper[4958]: I1201 11:58:51.797562 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 11:58:51 crc kubenswrapper[4958]: E1201 11:58:51.798101 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:58:51 crc kubenswrapper[4958]: I1201 11:58:51.994590 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-thl8k"] Dec 01 11:58:52 crc kubenswrapper[4958]: I1201 11:58:52.384923 4958 generic.go:334] "Generic (PLEG): container finished" podID="177eb3e2-df29-49b4-b361-d37f54eac073" containerID="68f18533c8798de44cba9e509bf7af68aefa179c3d1ef7cfb04abc1195be8303" exitCode=0 Dec 01 11:58:52 crc kubenswrapper[4958]: I1201 11:58:52.384972 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-thl8k" event={"ID":"177eb3e2-df29-49b4-b361-d37f54eac073","Type":"ContainerDied","Data":"68f18533c8798de44cba9e509bf7af68aefa179c3d1ef7cfb04abc1195be8303"} Dec 01 11:58:52 crc kubenswrapper[4958]: I1201 11:58:52.385000 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-thl8k" event={"ID":"177eb3e2-df29-49b4-b361-d37f54eac073","Type":"ContainerStarted","Data":"df417b02ed47a43f61b2488c20db9541b6052b666edadcce067ebd826955fd53"} Dec 01 11:58:54 crc kubenswrapper[4958]: I1201 11:58:54.413381 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-thl8k" event={"ID":"177eb3e2-df29-49b4-b361-d37f54eac073","Type":"ContainerStarted","Data":"6bb417a823b4638883562272349bd66775d226d24008767bc1b2c389bea0cf1e"} Dec 01 11:58:57 crc kubenswrapper[4958]: I1201 11:58:57.451247 4958 generic.go:334] "Generic (PLEG): container finished" podID="177eb3e2-df29-49b4-b361-d37f54eac073" containerID="6bb417a823b4638883562272349bd66775d226d24008767bc1b2c389bea0cf1e" exitCode=0 Dec 01 11:58:57 crc kubenswrapper[4958]: I1201 11:58:57.451346 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-thl8k" event={"ID":"177eb3e2-df29-49b4-b361-d37f54eac073","Type":"ContainerDied","Data":"6bb417a823b4638883562272349bd66775d226d24008767bc1b2c389bea0cf1e"} Dec 01 11:58:59 crc kubenswrapper[4958]: I1201 11:58:59.481264 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-thl8k" event={"ID":"177eb3e2-df29-49b4-b361-d37f54eac073","Type":"ContainerStarted","Data":"8af3e5088af945047e4d7157b1c60121298af3300b171778298db5dd743409d2"} Dec 01 11:58:59 crc kubenswrapper[4958]: I1201 11:58:59.506932 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-thl8k" podStartSLOduration=2.742756487 podStartE2EDuration="8.506910682s" podCreationTimestamp="2025-12-01 11:58:51 +0000 UTC" firstStartedPulling="2025-12-01 11:58:52.38682929 +0000 UTC m=+7179.895618327" lastFinishedPulling="2025-12-01 11:58:58.150983485 +0000 UTC m=+7185.659772522" observedRunningTime="2025-12-01 11:58:59.50611765 +0000 UTC 
m=+7187.014906697" watchObservedRunningTime="2025-12-01 11:58:59.506910682 +0000 UTC m=+7187.015699719" Dec 01 11:59:01 crc kubenswrapper[4958]: I1201 11:59:01.468057 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-thl8k" Dec 01 11:59:01 crc kubenswrapper[4958]: I1201 11:59:01.468371 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-thl8k" Dec 01 11:59:02 crc kubenswrapper[4958]: I1201 11:59:02.519970 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-thl8k" podUID="177eb3e2-df29-49b4-b361-d37f54eac073" containerName="registry-server" probeResult="failure" output=< Dec 01 11:59:02 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s Dec 01 11:59:02 crc kubenswrapper[4958]: > Dec 01 11:59:05 crc kubenswrapper[4958]: I1201 11:59:05.797977 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 11:59:05 crc kubenswrapper[4958]: E1201 11:59:05.798934 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:59:11 crc kubenswrapper[4958]: I1201 11:59:11.554565 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-thl8k" Dec 01 11:59:11 crc kubenswrapper[4958]: I1201 11:59:11.627115 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-thl8k" Dec 01 11:59:11 crc kubenswrapper[4958]: I1201 11:59:11.814112 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-thl8k"] Dec 01 11:59:12 crc kubenswrapper[4958]: I1201 11:59:12.618862 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-thl8k" podUID="177eb3e2-df29-49b4-b361-d37f54eac073" containerName="registry-server" containerID="cri-o://8af3e5088af945047e4d7157b1c60121298af3300b171778298db5dd743409d2" gracePeriod=2 Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.141288 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-thl8k" Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.273659 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/177eb3e2-df29-49b4-b361-d37f54eac073-catalog-content\") pod \"177eb3e2-df29-49b4-b361-d37f54eac073\" (UID: \"177eb3e2-df29-49b4-b361-d37f54eac073\") " Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.273816 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6bph\" (UniqueName: \"kubernetes.io/projected/177eb3e2-df29-49b4-b361-d37f54eac073-kube-api-access-m6bph\") pod \"177eb3e2-df29-49b4-b361-d37f54eac073\" (UID: \"177eb3e2-df29-49b4-b361-d37f54eac073\") " Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.274999 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/177eb3e2-df29-49b4-b361-d37f54eac073-utilities\") pod \"177eb3e2-df29-49b4-b361-d37f54eac073\" (UID: \"177eb3e2-df29-49b4-b361-d37f54eac073\") " Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.276122 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/177eb3e2-df29-49b4-b361-d37f54eac073-utilities" (OuterVolumeSpecName: "utilities") pod "177eb3e2-df29-49b4-b361-d37f54eac073" (UID: "177eb3e2-df29-49b4-b361-d37f54eac073"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.285837 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/177eb3e2-df29-49b4-b361-d37f54eac073-kube-api-access-m6bph" (OuterVolumeSpecName: "kube-api-access-m6bph") pod "177eb3e2-df29-49b4-b361-d37f54eac073" (UID: "177eb3e2-df29-49b4-b361-d37f54eac073"). InnerVolumeSpecName "kube-api-access-m6bph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.378119 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6bph\" (UniqueName: \"kubernetes.io/projected/177eb3e2-df29-49b4-b361-d37f54eac073-kube-api-access-m6bph\") on node \"crc\" DevicePath \"\"" Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.378151 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/177eb3e2-df29-49b4-b361-d37f54eac073-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.443505 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/177eb3e2-df29-49b4-b361-d37f54eac073-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "177eb3e2-df29-49b4-b361-d37f54eac073" (UID: "177eb3e2-df29-49b4-b361-d37f54eac073"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.480143 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/177eb3e2-df29-49b4-b361-d37f54eac073-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.635102 4958 generic.go:334] "Generic (PLEG): container finished" podID="177eb3e2-df29-49b4-b361-d37f54eac073" containerID="8af3e5088af945047e4d7157b1c60121298af3300b171778298db5dd743409d2" exitCode=0 Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.635217 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-thl8k" event={"ID":"177eb3e2-df29-49b4-b361-d37f54eac073","Type":"ContainerDied","Data":"8af3e5088af945047e4d7157b1c60121298af3300b171778298db5dd743409d2"} Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.635551 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-thl8k" event={"ID":"177eb3e2-df29-49b4-b361-d37f54eac073","Type":"ContainerDied","Data":"df417b02ed47a43f61b2488c20db9541b6052b666edadcce067ebd826955fd53"} Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.635584 4958 scope.go:117] "RemoveContainer" containerID="8af3e5088af945047e4d7157b1c60121298af3300b171778298db5dd743409d2" Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.635275 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-thl8k" Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.673087 4958 scope.go:117] "RemoveContainer" containerID="6bb417a823b4638883562272349bd66775d226d24008767bc1b2c389bea0cf1e" Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.682044 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-thl8k"] Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.693824 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-thl8k"] Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.712121 4958 scope.go:117] "RemoveContainer" containerID="68f18533c8798de44cba9e509bf7af68aefa179c3d1ef7cfb04abc1195be8303" Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.771776 4958 scope.go:117] "RemoveContainer" containerID="8af3e5088af945047e4d7157b1c60121298af3300b171778298db5dd743409d2" Dec 01 11:59:13 crc kubenswrapper[4958]: E1201 11:59:13.772371 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8af3e5088af945047e4d7157b1c60121298af3300b171778298db5dd743409d2\": container with ID starting with 8af3e5088af945047e4d7157b1c60121298af3300b171778298db5dd743409d2 not found: ID does not exist" containerID="8af3e5088af945047e4d7157b1c60121298af3300b171778298db5dd743409d2" Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.772419 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af3e5088af945047e4d7157b1c60121298af3300b171778298db5dd743409d2"} err="failed to get container status \"8af3e5088af945047e4d7157b1c60121298af3300b171778298db5dd743409d2\": rpc error: code = NotFound desc = could not find container \"8af3e5088af945047e4d7157b1c60121298af3300b171778298db5dd743409d2\": container with ID starting with 8af3e5088af945047e4d7157b1c60121298af3300b171778298db5dd743409d2 not found: ID does not exist" Dec 01 11:59:13 crc 
kubenswrapper[4958]: I1201 11:59:13.772456 4958 scope.go:117] "RemoveContainer" containerID="6bb417a823b4638883562272349bd66775d226d24008767bc1b2c389bea0cf1e" Dec 01 11:59:13 crc kubenswrapper[4958]: E1201 11:59:13.772966 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bb417a823b4638883562272349bd66775d226d24008767bc1b2c389bea0cf1e\": container with ID starting with 6bb417a823b4638883562272349bd66775d226d24008767bc1b2c389bea0cf1e not found: ID does not exist" containerID="6bb417a823b4638883562272349bd66775d226d24008767bc1b2c389bea0cf1e" Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.772993 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bb417a823b4638883562272349bd66775d226d24008767bc1b2c389bea0cf1e"} err="failed to get container status \"6bb417a823b4638883562272349bd66775d226d24008767bc1b2c389bea0cf1e\": rpc error: code = NotFound desc = could not find container \"6bb417a823b4638883562272349bd66775d226d24008767bc1b2c389bea0cf1e\": container with ID starting with 6bb417a823b4638883562272349bd66775d226d24008767bc1b2c389bea0cf1e not found: ID does not exist" Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.773013 4958 scope.go:117] "RemoveContainer" containerID="68f18533c8798de44cba9e509bf7af68aefa179c3d1ef7cfb04abc1195be8303" Dec 01 11:59:13 crc kubenswrapper[4958]: E1201 11:59:13.773515 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68f18533c8798de44cba9e509bf7af68aefa179c3d1ef7cfb04abc1195be8303\": container with ID starting with 68f18533c8798de44cba9e509bf7af68aefa179c3d1ef7cfb04abc1195be8303 not found: ID does not exist" containerID="68f18533c8798de44cba9e509bf7af68aefa179c3d1ef7cfb04abc1195be8303" Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.773546 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f18533c8798de44cba9e509bf7af68aefa179c3d1ef7cfb04abc1195be8303"} err="failed to get container status \"68f18533c8798de44cba9e509bf7af68aefa179c3d1ef7cfb04abc1195be8303\": rpc error: code = NotFound desc = could not find container \"68f18533c8798de44cba9e509bf7af68aefa179c3d1ef7cfb04abc1195be8303\": container with ID starting with 68f18533c8798de44cba9e509bf7af68aefa179c3d1ef7cfb04abc1195be8303 not found: ID does not exist" Dec 01 11:59:13 crc kubenswrapper[4958]: I1201 11:59:13.816924 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="177eb3e2-df29-49b4-b361-d37f54eac073" path="/var/lib/kubelet/pods/177eb3e2-df29-49b4-b361-d37f54eac073/volumes" Dec 01 11:59:18 crc kubenswrapper[4958]: I1201 11:59:18.798369 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 11:59:18 crc kubenswrapper[4958]: E1201 11:59:18.799235 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:59:30 crc kubenswrapper[4958]: I1201 11:59:30.797374 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" 
Dec 01 11:59:30 crc kubenswrapper[4958]: E1201 11:59:30.798202 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:59:36 crc kubenswrapper[4958]: I1201 11:59:36.048040 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-s9sqt"] Dec 01 11:59:36 crc kubenswrapper[4958]: I1201 11:59:36.058788 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-s9sqt"] Dec 01 11:59:37 crc kubenswrapper[4958]: I1201 11:59:37.811731 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb73270a-b4ea-4b40-8c82-9e9ceb53a719" path="/var/lib/kubelet/pods/eb73270a-b4ea-4b40-8c82-9e9ceb53a719/volumes" Dec 01 11:59:41 crc kubenswrapper[4958]: I1201 11:59:41.798961 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 11:59:41 crc kubenswrapper[4958]: E1201 11:59:41.800398 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 11:59:47 crc kubenswrapper[4958]: I1201 11:59:47.036617 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-a906-account-create-kmtzz"] Dec 01 11:59:47 crc kubenswrapper[4958]: I1201 11:59:47.050151 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-a906-account-create-kmtzz"] Dec 01 11:59:47 crc kubenswrapper[4958]: I1201 11:59:47.820239 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d87b363-c828-4b73-8c31-b3dd67ebce08" path="/var/lib/kubelet/pods/1d87b363-c828-4b73-8c31-b3dd67ebce08/volumes" Dec 01 11:59:55 crc kubenswrapper[4958]: I1201 11:59:55.797616 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 11:59:55 crc kubenswrapper[4958]: E1201 11:59:55.798678 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.150316 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j"] Dec 01 12:00:00 crc kubenswrapper[4958]: E1201 12:00:00.151322 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177eb3e2-df29-49b4-b361-d37f54eac073" containerName="registry-server" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.151336 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="177eb3e2-df29-49b4-b361-d37f54eac073" 
containerName="registry-server" Dec 01 12:00:00 crc kubenswrapper[4958]: E1201 12:00:00.151356 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177eb3e2-df29-49b4-b361-d37f54eac073" containerName="extract-utilities" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.151362 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="177eb3e2-df29-49b4-b361-d37f54eac073" containerName="extract-utilities" Dec 01 12:00:00 crc kubenswrapper[4958]: E1201 12:00:00.151373 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177eb3e2-df29-49b4-b361-d37f54eac073" containerName="extract-content" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.151380 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="177eb3e2-df29-49b4-b361-d37f54eac073" containerName="extract-content" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.151604 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="177eb3e2-df29-49b4-b361-d37f54eac073" containerName="registry-server" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.152546 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.155359 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.155771 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.165268 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j"] Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.255209 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gj57\" (UniqueName: \"kubernetes.io/projected/9cf906b4-61cb-48da-b22b-20f5c88b66a5-kube-api-access-2gj57\") pod \"collect-profiles-29409840-rd54j\" (UID: \"9cf906b4-61cb-48da-b22b-20f5c88b66a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.255586 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9cf906b4-61cb-48da-b22b-20f5c88b66a5-config-volume\") pod \"collect-profiles-29409840-rd54j\" (UID: \"9cf906b4-61cb-48da-b22b-20f5c88b66a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.255615 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9cf906b4-61cb-48da-b22b-20f5c88b66a5-secret-volume\") pod \"collect-profiles-29409840-rd54j\" (UID: \"9cf906b4-61cb-48da-b22b-20f5c88b66a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.358218 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9cf906b4-61cb-48da-b22b-20f5c88b66a5-config-volume\") pod \"collect-profiles-29409840-rd54j\" (UID: \"9cf906b4-61cb-48da-b22b-20f5c88b66a5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.358283 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9cf906b4-61cb-48da-b22b-20f5c88b66a5-secret-volume\") pod \"collect-profiles-29409840-rd54j\" (UID: \"9cf906b4-61cb-48da-b22b-20f5c88b66a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.358382 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gj57\" (UniqueName: \"kubernetes.io/projected/9cf906b4-61cb-48da-b22b-20f5c88b66a5-kube-api-access-2gj57\") pod \"collect-profiles-29409840-rd54j\" (UID: \"9cf906b4-61cb-48da-b22b-20f5c88b66a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.359220 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9cf906b4-61cb-48da-b22b-20f5c88b66a5-config-volume\") pod \"collect-profiles-29409840-rd54j\" (UID: \"9cf906b4-61cb-48da-b22b-20f5c88b66a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.372151 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9cf906b4-61cb-48da-b22b-20f5c88b66a5-secret-volume\") pod \"collect-profiles-29409840-rd54j\" (UID: \"9cf906b4-61cb-48da-b22b-20f5c88b66a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.377424 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gj57\" (UniqueName: \"kubernetes.io/projected/9cf906b4-61cb-48da-b22b-20f5c88b66a5-kube-api-access-2gj57\") pod \"collect-profiles-29409840-rd54j\" (UID: \"9cf906b4-61cb-48da-b22b-20f5c88b66a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.481894 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j" Dec 01 12:00:00 crc kubenswrapper[4958]: I1201 12:00:00.956750 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j"] Dec 01 12:00:01 crc kubenswrapper[4958]: I1201 12:00:01.059787 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-xwjqn"] Dec 01 12:00:01 crc kubenswrapper[4958]: I1201 12:00:01.070437 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-xwjqn"] Dec 01 12:00:01 crc kubenswrapper[4958]: I1201 12:00:01.279737 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j" event={"ID":"9cf906b4-61cb-48da-b22b-20f5c88b66a5","Type":"ContainerStarted","Data":"f57e4206952138b8cfda8beb9a17ef367c600e721c393af5242c3bca011330fb"} Dec 01 12:00:01 crc kubenswrapper[4958]: I1201 12:00:01.279796 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j" event={"ID":"9cf906b4-61cb-48da-b22b-20f5c88b66a5","Type":"ContainerStarted","Data":"40e32fd83a60ff9919eb8c5c8b64c5f92cfb5dffcc37245945295e86783dd5a0"} Dec 01 12:00:01 crc kubenswrapper[4958]: I1201 12:00:01.311439 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j" podStartSLOduration=1.311414818 podStartE2EDuration="1.311414818s" podCreationTimestamp="2025-12-01 12:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 12:00:01.309103143 +0000 UTC m=+7248.817892190" watchObservedRunningTime="2025-12-01 12:00:01.311414818 +0000 UTC m=+7248.820203855" Dec 01 12:00:01 crc kubenswrapper[4958]: I1201 12:00:01.937067 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5616721a-846c-416c-91cf-9275a529232b" path="/var/lib/kubelet/pods/5616721a-846c-416c-91cf-9275a529232b/volumes" Dec 01 12:00:02 crc kubenswrapper[4958]: I1201 12:00:02.293355 4958 generic.go:334] "Generic (PLEG): container finished" podID="9cf906b4-61cb-48da-b22b-20f5c88b66a5" containerID="f57e4206952138b8cfda8beb9a17ef367c600e721c393af5242c3bca011330fb" exitCode=0 Dec 01 12:00:02 crc kubenswrapper[4958]: I1201 12:00:02.293411 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j" event={"ID":"9cf906b4-61cb-48da-b22b-20f5c88b66a5","Type":"ContainerDied","Data":"f57e4206952138b8cfda8beb9a17ef367c600e721c393af5242c3bca011330fb"} Dec 01 12:00:03 crc kubenswrapper[4958]: I1201 12:00:03.690300 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j" Dec 01 12:00:03 crc kubenswrapper[4958]: I1201 12:00:03.852828 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9cf906b4-61cb-48da-b22b-20f5c88b66a5-secret-volume\") pod \"9cf906b4-61cb-48da-b22b-20f5c88b66a5\" (UID: \"9cf906b4-61cb-48da-b22b-20f5c88b66a5\") " Dec 01 12:00:03 crc kubenswrapper[4958]: I1201 12:00:03.852908 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9cf906b4-61cb-48da-b22b-20f5c88b66a5-config-volume\") pod \"9cf906b4-61cb-48da-b22b-20f5c88b66a5\" (UID: \"9cf906b4-61cb-48da-b22b-20f5c88b66a5\") " Dec 01 12:00:03 crc kubenswrapper[4958]: I1201 12:00:03.852941 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gj57\" (UniqueName: \"kubernetes.io/projected/9cf906b4-61cb-48da-b22b-20f5c88b66a5-kube-api-access-2gj57\") pod \"9cf906b4-61cb-48da-b22b-20f5c88b66a5\" (UID: \"9cf906b4-61cb-48da-b22b-20f5c88b66a5\") " Dec 01 12:00:03 crc kubenswrapper[4958]: I1201 12:00:03.854052 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cf906b4-61cb-48da-b22b-20f5c88b66a5-config-volume" (OuterVolumeSpecName: "config-volume") pod "9cf906b4-61cb-48da-b22b-20f5c88b66a5" (UID: "9cf906b4-61cb-48da-b22b-20f5c88b66a5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 12:00:03 crc kubenswrapper[4958]: I1201 12:00:03.858748 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cf906b4-61cb-48da-b22b-20f5c88b66a5-kube-api-access-2gj57" (OuterVolumeSpecName: "kube-api-access-2gj57") pod "9cf906b4-61cb-48da-b22b-20f5c88b66a5" (UID: "9cf906b4-61cb-48da-b22b-20f5c88b66a5"). InnerVolumeSpecName "kube-api-access-2gj57". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:00:03 crc kubenswrapper[4958]: I1201 12:00:03.858783 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf906b4-61cb-48da-b22b-20f5c88b66a5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9cf906b4-61cb-48da-b22b-20f5c88b66a5" (UID: "9cf906b4-61cb-48da-b22b-20f5c88b66a5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:00:03 crc kubenswrapper[4958]: I1201 12:00:03.955792 4958 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9cf906b4-61cb-48da-b22b-20f5c88b66a5-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 12:00:03 crc kubenswrapper[4958]: I1201 12:00:03.955862 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9cf906b4-61cb-48da-b22b-20f5c88b66a5-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 12:00:03 crc kubenswrapper[4958]: I1201 12:00:03.955878 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gj57\" (UniqueName: \"kubernetes.io/projected/9cf906b4-61cb-48da-b22b-20f5c88b66a5-kube-api-access-2gj57\") on node \"crc\" DevicePath \"\"" Dec 01 12:00:04 crc kubenswrapper[4958]: I1201 12:00:04.319211 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j" event={"ID":"9cf906b4-61cb-48da-b22b-20f5c88b66a5","Type":"ContainerDied","Data":"40e32fd83a60ff9919eb8c5c8b64c5f92cfb5dffcc37245945295e86783dd5a0"} Dec 01 12:00:04 crc kubenswrapper[4958]: I1201 12:00:04.319280 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40e32fd83a60ff9919eb8c5c8b64c5f92cfb5dffcc37245945295e86783dd5a0" Dec 01 12:00:04 crc kubenswrapper[4958]: I1201 12:00:04.319364 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j" Dec 01 12:00:04 crc kubenswrapper[4958]: I1201 12:00:04.392304 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw"] Dec 01 12:00:04 crc kubenswrapper[4958]: I1201 12:00:04.404395 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409795-b44gw"] Dec 01 12:00:05 crc kubenswrapper[4958]: I1201 12:00:05.820119 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="033ea980-682d-4f50-911d-25b850724299" path="/var/lib/kubelet/pods/033ea980-682d-4f50-911d-25b850724299/volumes" Dec 01 12:00:07 crc kubenswrapper[4958]: I1201 12:00:07.798771 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 12:00:07 crc kubenswrapper[4958]: E1201 12:00:07.799607 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:00:19 crc kubenswrapper[4958]: I1201 12:00:19.799686 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 12:00:19 crc kubenswrapper[4958]: E1201 12:00:19.800802 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:00:29 crc kubenswrapper[4958]: I1201 12:00:29.782219 4958 scope.go:117] "RemoveContainer" containerID="4e28f6fcc54f78af81403bfbb65944317ce64821b656b453531b90863cbfa1a7" Dec 01 12:00:29 crc kubenswrapper[4958]: I1201 12:00:29.833482 4958 scope.go:117] "RemoveContainer" containerID="47452c900b775e2f7df3b71580d8f579f7d88d78ed88bdcd16ce6e99655045a1" Dec 01 12:00:29 crc kubenswrapper[4958]: I1201 12:00:29.906248 4958 scope.go:117] "RemoveContainer" containerID="416e999dd021c69d33b07c2df9d91a44d6d917f05144ed13ca27b0c284c5bd12" Dec 01 12:00:29 crc kubenswrapper[4958]: I1201 12:00:29.961582 4958 scope.go:117] "RemoveContainer" containerID="01445f920b6113475195b247d5fe27db4cec3d4ae1aa583c69d7942bd3fd3f92" Dec 01 12:00:34 crc kubenswrapper[4958]: I1201 12:00:34.798462 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 12:00:34 crc kubenswrapper[4958]: E1201 12:00:34.799913 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:00:48 crc kubenswrapper[4958]: I1201 12:00:48.798365 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 12:00:48 crc kubenswrapper[4958]: E1201 12:00:48.799934 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:01:00 crc kubenswrapper[4958]: I1201 12:01:00.159396 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29409841-b7glg"] Dec 01 12:01:00 crc kubenswrapper[4958]: E1201 12:01:00.160614 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cf906b4-61cb-48da-b22b-20f5c88b66a5" containerName="collect-profiles" Dec 01 12:01:00 crc kubenswrapper[4958]: I1201 12:01:00.160633 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cf906b4-61cb-48da-b22b-20f5c88b66a5" containerName="collect-profiles" Dec 01 12:01:00 crc kubenswrapper[4958]: I1201 12:01:00.160909 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cf906b4-61cb-48da-b22b-20f5c88b66a5" containerName="collect-profiles" Dec 01 12:01:00 crc kubenswrapper[4958]: I1201 12:01:00.162013 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29409841-b7glg" Dec 01 12:01:00 crc kubenswrapper[4958]: I1201 12:01:00.186970 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29409841-b7glg"] Dec 01 12:01:00 crc kubenswrapper[4958]: I1201 12:01:00.233165 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-fernet-keys\") pod \"keystone-cron-29409841-b7glg\" (UID: \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\") " pod="openstack/keystone-cron-29409841-b7glg" Dec 01 12:01:00 crc kubenswrapper[4958]: I1201 12:01:00.233423 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-combined-ca-bundle\") pod \"keystone-cron-29409841-b7glg\" (UID: \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\") " pod="openstack/keystone-cron-29409841-b7glg" Dec 01 12:01:00 crc kubenswrapper[4958]: I1201 12:01:00.233505 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmm77\" (UniqueName: \"kubernetes.io/projected/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-kube-api-access-fmm77\") pod \"keystone-cron-29409841-b7glg\" (UID: \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\") " pod="openstack/keystone-cron-29409841-b7glg" Dec 01 12:01:00 crc kubenswrapper[4958]: I1201 12:01:00.233700 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-config-data\") pod \"keystone-cron-29409841-b7glg\" (UID: \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\") " pod="openstack/keystone-cron-29409841-b7glg" Dec 01 12:01:00 crc kubenswrapper[4958]: I1201 12:01:00.336894 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-fernet-keys\") pod \"keystone-cron-29409841-b7glg\" (UID: \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\") " pod="openstack/keystone-cron-29409841-b7glg" Dec 01 12:01:00 crc kubenswrapper[4958]: I1201 12:01:00.337118 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-combined-ca-bundle\") pod \"keystone-cron-29409841-b7glg\" (UID: \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\") " pod="openstack/keystone-cron-29409841-b7glg" Dec 01 12:01:00 crc kubenswrapper[4958]: I1201 12:01:00.337169 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmm77\" (UniqueName: \"kubernetes.io/projected/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-kube-api-access-fmm77\") pod \"keystone-cron-29409841-b7glg\" (UID: \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\") " pod="openstack/keystone-cron-29409841-b7glg" Dec 01 12:01:00 crc kubenswrapper[4958]: I1201 12:01:00.337279 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-config-data\") pod \"keystone-cron-29409841-b7glg\" (UID: \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\") " pod="openstack/keystone-cron-29409841-b7glg" Dec 01 12:01:00 crc kubenswrapper[4958]: I1201 12:01:00.348873 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-fernet-keys\") pod \"keystone-cron-29409841-b7glg\" (UID: \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\") " pod="openstack/keystone-cron-29409841-b7glg" Dec 01 12:01:00 crc kubenswrapper[4958]: I1201 12:01:00.348892 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-combined-ca-bundle\") pod \"keystone-cron-29409841-b7glg\" (UID: \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\") " pod="openstack/keystone-cron-29409841-b7glg" Dec 01 12:01:00 crc kubenswrapper[4958]: I1201 12:01:00.348963 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-config-data\") pod \"keystone-cron-29409841-b7glg\" (UID: \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\") " pod="openstack/keystone-cron-29409841-b7glg" Dec 01 12:01:00 crc kubenswrapper[4958]: I1201 12:01:00.365431 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmm77\" (UniqueName: \"kubernetes.io/projected/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-kube-api-access-fmm77\") pod \"keystone-cron-29409841-b7glg\" (UID: \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\") " pod="openstack/keystone-cron-29409841-b7glg" Dec 01 12:01:00 crc kubenswrapper[4958]: I1201 12:01:00.488393 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29409841-b7glg" Dec 01 12:01:01 crc kubenswrapper[4958]: I1201 12:01:01.074684 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29409841-b7glg"] Dec 01 12:01:01 crc kubenswrapper[4958]: W1201 12:01:01.080267 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode88a19a5_40db_4ed9_a8e7_73e3ca8d797f.slice/crio-487bc192b6f220b9d4a4310f090a30a984f6d74b88085f38fa12e1b073e4e08c WatchSource:0}: Error finding container 487bc192b6f220b9d4a4310f090a30a984f6d74b88085f38fa12e1b073e4e08c: Status 404 returned error can't find the container with id 487bc192b6f220b9d4a4310f090a30a984f6d74b88085f38fa12e1b073e4e08c Dec 01 12:01:01 crc kubenswrapper[4958]: I1201 12:01:01.192310 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409841-b7glg" event={"ID":"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f","Type":"ContainerStarted","Data":"487bc192b6f220b9d4a4310f090a30a984f6d74b88085f38fa12e1b073e4e08c"} Dec 01 12:01:01 crc kubenswrapper[4958]: I1201 12:01:01.797740 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 12:01:01 crc kubenswrapper[4958]: E1201 12:01:01.798135 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:01:02 crc kubenswrapper[4958]: I1201 12:01:02.207105 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409841-b7glg" 
event={"ID":"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f","Type":"ContainerStarted","Data":"7a34a0ba87121eb95766f476937af832f4f04032c3b6098407b0c0b511fc3996"} Dec 01 12:01:02 crc kubenswrapper[4958]: I1201 12:01:02.239885 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29409841-b7glg" podStartSLOduration=2.239828644 podStartE2EDuration="2.239828644s" podCreationTimestamp="2025-12-01 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 12:01:02.236389267 +0000 UTC m=+7309.745178344" watchObservedRunningTime="2025-12-01 12:01:02.239828644 +0000 UTC m=+7309.748617701" Dec 01 12:01:04 crc kubenswrapper[4958]: I1201 12:01:04.238806 4958 generic.go:334] "Generic (PLEG): container finished" podID="e88a19a5-40db-4ed9-a8e7-73e3ca8d797f" containerID="7a34a0ba87121eb95766f476937af832f4f04032c3b6098407b0c0b511fc3996" exitCode=0 Dec 01 12:01:04 crc kubenswrapper[4958]: I1201 12:01:04.238960 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409841-b7glg" event={"ID":"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f","Type":"ContainerDied","Data":"7a34a0ba87121eb95766f476937af832f4f04032c3b6098407b0c0b511fc3996"} Dec 01 12:01:05 crc kubenswrapper[4958]: I1201 12:01:05.696589 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29409841-b7glg" Dec 01 12:01:05 crc kubenswrapper[4958]: I1201 12:01:05.810805 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-fernet-keys\") pod \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\" (UID: \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\") " Dec 01 12:01:05 crc kubenswrapper[4958]: I1201 12:01:05.810872 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-combined-ca-bundle\") pod \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\" (UID: \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\") " Dec 01 12:01:05 crc kubenswrapper[4958]: I1201 12:01:05.810911 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-config-data\") pod \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\" (UID: \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\") " Dec 01 12:01:05 crc kubenswrapper[4958]: I1201 12:01:05.810962 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmm77\" (UniqueName: \"kubernetes.io/projected/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-kube-api-access-fmm77\") pod \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\" (UID: \"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f\") " Dec 01 12:01:05 crc kubenswrapper[4958]: I1201 12:01:05.817111 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-kube-api-access-fmm77" (OuterVolumeSpecName: "kube-api-access-fmm77") pod "e88a19a5-40db-4ed9-a8e7-73e3ca8d797f" (UID: "e88a19a5-40db-4ed9-a8e7-73e3ca8d797f"). InnerVolumeSpecName "kube-api-access-fmm77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:01:05 crc kubenswrapper[4958]: I1201 12:01:05.818964 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e88a19a5-40db-4ed9-a8e7-73e3ca8d797f" (UID: "e88a19a5-40db-4ed9-a8e7-73e3ca8d797f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:01:05 crc kubenswrapper[4958]: I1201 12:01:05.845409 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e88a19a5-40db-4ed9-a8e7-73e3ca8d797f" (UID: "e88a19a5-40db-4ed9-a8e7-73e3ca8d797f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:01:05 crc kubenswrapper[4958]: I1201 12:01:05.884054 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-config-data" (OuterVolumeSpecName: "config-data") pod "e88a19a5-40db-4ed9-a8e7-73e3ca8d797f" (UID: "e88a19a5-40db-4ed9-a8e7-73e3ca8d797f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:01:05 crc kubenswrapper[4958]: I1201 12:01:05.913128 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmm77\" (UniqueName: \"kubernetes.io/projected/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-kube-api-access-fmm77\") on node \"crc\" DevicePath \"\"" Dec 01 12:01:05 crc kubenswrapper[4958]: I1201 12:01:05.913160 4958 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 12:01:05 crc kubenswrapper[4958]: I1201 12:01:05.913169 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:01:05 crc kubenswrapper[4958]: I1201 12:01:05.913177 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e88a19a5-40db-4ed9-a8e7-73e3ca8d797f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 12:01:06 crc kubenswrapper[4958]: I1201 12:01:06.267316 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409841-b7glg" event={"ID":"e88a19a5-40db-4ed9-a8e7-73e3ca8d797f","Type":"ContainerDied","Data":"487bc192b6f220b9d4a4310f090a30a984f6d74b88085f38fa12e1b073e4e08c"} Dec 01 12:01:06 crc kubenswrapper[4958]: I1201 12:01:06.267642 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="487bc192b6f220b9d4a4310f090a30a984f6d74b88085f38fa12e1b073e4e08c" Dec 01 12:01:06 crc kubenswrapper[4958]: I1201 12:01:06.267410 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29409841-b7glg" Dec 01 12:01:14 crc kubenswrapper[4958]: I1201 12:01:14.797824 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 12:01:14 crc kubenswrapper[4958]: E1201 12:01:14.798927 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:01:26 crc kubenswrapper[4958]: I1201 12:01:26.797779 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 12:01:26 crc kubenswrapper[4958]: E1201 12:01:26.799151 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:01:30 crc kubenswrapper[4958]: I1201 12:01:30.109488 4958 scope.go:117] "RemoveContainer" containerID="e5a8b86ff66a67e65e759480ef22ade57649c83e90230daf79d1977feeac66bb" Dec 01 12:01:30 crc kubenswrapper[4958]: I1201 12:01:30.140429 4958 scope.go:117] "RemoveContainer" containerID="a7c46410c0961d7b1bcaddccabfc3521fe012f7b4dff73135a2ec123fcc58dab" Dec 01 12:01:38 crc kubenswrapper[4958]: I1201 12:01:38.797762 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 12:01:38 crc kubenswrapper[4958]: E1201 12:01:38.798535 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:01:50 crc kubenswrapper[4958]: I1201 12:01:50.798370 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 12:01:50 crc kubenswrapper[4958]: E1201 12:01:50.799399 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:02:04 crc kubenswrapper[4958]: I1201 12:02:04.798385 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 12:02:04 crc kubenswrapper[4958]: E1201 12:02:04.799718 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:02:10 crc kubenswrapper[4958]: I1201 12:02:10.304008 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pkk85"] Dec 01 12:02:10 crc kubenswrapper[4958]: E1201 12:02:10.304960 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e88a19a5-40db-4ed9-a8e7-73e3ca8d797f" containerName="keystone-cron" Dec 01 12:02:10 crc kubenswrapper[4958]: I1201 12:02:10.304975 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e88a19a5-40db-4ed9-a8e7-73e3ca8d797f" containerName="keystone-cron" Dec 01 12:02:10 crc kubenswrapper[4958]: I1201 12:02:10.305211 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="e88a19a5-40db-4ed9-a8e7-73e3ca8d797f" containerName="keystone-cron" Dec 01 12:02:10 crc kubenswrapper[4958]: I1201 12:02:10.306826 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pkk85" Dec 01 12:02:10 crc kubenswrapper[4958]: I1201 12:02:10.316384 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pkk85"] Dec 01 12:02:10 crc kubenswrapper[4958]: I1201 12:02:10.372608 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t48dh\" (UniqueName: \"kubernetes.io/projected/2a05e35d-8d46-4672-80bb-60d13501836e-kube-api-access-t48dh\") pod \"community-operators-pkk85\" (UID: \"2a05e35d-8d46-4672-80bb-60d13501836e\") " pod="openshift-marketplace/community-operators-pkk85" Dec 01 12:02:10 crc kubenswrapper[4958]: I1201 12:02:10.372679 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a05e35d-8d46-4672-80bb-60d13501836e-utilities\") pod \"community-operators-pkk85\" (UID: \"2a05e35d-8d46-4672-80bb-60d13501836e\") " pod="openshift-marketplace/community-operators-pkk85" Dec 01 12:02:10 crc kubenswrapper[4958]: I1201 12:02:10.372764 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a05e35d-8d46-4672-80bb-60d13501836e-catalog-content\") pod \"community-operators-pkk85\" (UID: \"2a05e35d-8d46-4672-80bb-60d13501836e\") " pod="openshift-marketplace/community-operators-pkk85" Dec 01 12:02:10 crc kubenswrapper[4958]: I1201 12:02:10.475057 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a05e35d-8d46-4672-80bb-60d13501836e-catalog-content\") pod \"community-operators-pkk85\" (UID: \"2a05e35d-8d46-4672-80bb-60d13501836e\") " pod="openshift-marketplace/community-operators-pkk85" Dec 01 12:02:10 crc kubenswrapper[4958]: I1201 12:02:10.475228 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t48dh\" (UniqueName: \"kubernetes.io/projected/2a05e35d-8d46-4672-80bb-60d13501836e-kube-api-access-t48dh\") pod \"community-operators-pkk85\" (UID: \"2a05e35d-8d46-4672-80bb-60d13501836e\") " pod="openshift-marketplace/community-operators-pkk85" Dec 01 12:02:10 crc kubenswrapper[4958]: I1201 12:02:10.475270 4958 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a05e35d-8d46-4672-80bb-60d13501836e-utilities\") pod \"community-operators-pkk85\" (UID: \"2a05e35d-8d46-4672-80bb-60d13501836e\") " pod="openshift-marketplace/community-operators-pkk85" Dec 01 12:02:10 crc kubenswrapper[4958]: I1201 12:02:10.475783 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a05e35d-8d46-4672-80bb-60d13501836e-catalog-content\") pod \"community-operators-pkk85\" (UID: \"2a05e35d-8d46-4672-80bb-60d13501836e\") " pod="openshift-marketplace/community-operators-pkk85" Dec 01 12:02:10 crc kubenswrapper[4958]: I1201 12:02:10.475806 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a05e35d-8d46-4672-80bb-60d13501836e-utilities\") pod \"community-operators-pkk85\" (UID: \"2a05e35d-8d46-4672-80bb-60d13501836e\") " pod="openshift-marketplace/community-operators-pkk85" Dec 01 12:02:10 crc kubenswrapper[4958]: I1201 12:02:10.496647 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t48dh\" (UniqueName: \"kubernetes.io/projected/2a05e35d-8d46-4672-80bb-60d13501836e-kube-api-access-t48dh\") pod \"community-operators-pkk85\" (UID: \"2a05e35d-8d46-4672-80bb-60d13501836e\") " pod="openshift-marketplace/community-operators-pkk85" Dec 01 12:02:10 crc kubenswrapper[4958]: I1201 12:02:10.627820 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pkk85" Dec 01 12:02:11 crc kubenswrapper[4958]: I1201 12:02:11.178309 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pkk85"] Dec 01 12:02:12 crc kubenswrapper[4958]: I1201 12:02:12.169037 4958 generic.go:334] "Generic (PLEG): container finished" podID="2a05e35d-8d46-4672-80bb-60d13501836e" containerID="c911423ef86e85e6bed6df7db93c61a36a16195ba6523fda7428adb91779b500" exitCode=0 Dec 01 12:02:12 crc kubenswrapper[4958]: I1201 12:02:12.169130 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkk85" event={"ID":"2a05e35d-8d46-4672-80bb-60d13501836e","Type":"ContainerDied","Data":"c911423ef86e85e6bed6df7db93c61a36a16195ba6523fda7428adb91779b500"} Dec 01 12:02:12 crc kubenswrapper[4958]: I1201 12:02:12.169317 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkk85" event={"ID":"2a05e35d-8d46-4672-80bb-60d13501836e","Type":"ContainerStarted","Data":"061bde6a54ef26580145dffb4a2af61e82f11160ce006b6dc98993b513bdb474"} Dec 01 12:02:12 crc kubenswrapper[4958]: I1201 12:02:12.172105 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 12:02:14 crc kubenswrapper[4958]: I1201 12:02:14.200518 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkk85" event={"ID":"2a05e35d-8d46-4672-80bb-60d13501836e","Type":"ContainerStarted","Data":"eb95824a8508a9f41b5c8285a63fb267242ed38ca8fa493f47ac6063c62e827b"} Dec 01 12:02:15 crc kubenswrapper[4958]: I1201 12:02:15.213229 4958 generic.go:334] "Generic (PLEG): container finished" podID="2a05e35d-8d46-4672-80bb-60d13501836e" containerID="eb95824a8508a9f41b5c8285a63fb267242ed38ca8fa493f47ac6063c62e827b" exitCode=0 Dec 01 12:02:15 crc kubenswrapper[4958]: I1201 12:02:15.213497 4958 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkk85" event={"ID":"2a05e35d-8d46-4672-80bb-60d13501836e","Type":"ContainerDied","Data":"eb95824a8508a9f41b5c8285a63fb267242ed38ca8fa493f47ac6063c62e827b"} Dec 01 12:02:15 crc kubenswrapper[4958]: I1201 12:02:15.801508 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 12:02:15 crc kubenswrapper[4958]: E1201 12:02:15.802132 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:02:16 crc kubenswrapper[4958]: I1201 12:02:16.090741 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-kl977"] Dec 01 12:02:16 crc kubenswrapper[4958]: I1201 12:02:16.101459 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-kl977"] Dec 01 12:02:17 crc kubenswrapper[4958]: I1201 12:02:17.239091 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkk85" event={"ID":"2a05e35d-8d46-4672-80bb-60d13501836e","Type":"ContainerStarted","Data":"1eff9a3b6fe337b8bacbd003ba5485bb71c0a78a8169ca60085a843d8bcd79ae"} Dec 01 12:02:17 crc kubenswrapper[4958]: I1201 12:02:17.268172 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pkk85" podStartSLOduration=3.336362468 podStartE2EDuration="7.268150163s" podCreationTimestamp="2025-12-01 12:02:10 +0000 UTC" firstStartedPulling="2025-12-01 12:02:12.171728233 +0000 UTC m=+7379.680517280" lastFinishedPulling="2025-12-01 12:02:16.103515898 +0000 UTC m=+7383.612304975" observedRunningTime="2025-12-01 12:02:17.258826479 +0000 UTC m=+7384.767615526" watchObservedRunningTime="2025-12-01 12:02:17.268150163 +0000 UTC m=+7384.776939210" Dec 01 12:02:17 crc kubenswrapper[4958]: I1201 12:02:17.818348 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5f5b51b-bc8c-472a-84ee-529d091dcd61" path="/var/lib/kubelet/pods/d5f5b51b-bc8c-472a-84ee-529d091dcd61/volumes" Dec 01 12:02:20 crc kubenswrapper[4958]: I1201 12:02:20.630794 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pkk85" Dec 01 12:02:20 crc kubenswrapper[4958]: I1201 12:02:20.631414 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pkk85" Dec 01 12:02:20 crc kubenswrapper[4958]: I1201 12:02:20.713615 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pkk85" Dec 01 12:02:21 crc kubenswrapper[4958]: I1201 12:02:21.436454 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pkk85" Dec 01 12:02:21 crc kubenswrapper[4958]: I1201 12:02:21.505716 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pkk85"] Dec 01 12:02:23 crc kubenswrapper[4958]: I1201 12:02:23.385762 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pkk85" 
podUID="2a05e35d-8d46-4672-80bb-60d13501836e" containerName="registry-server" containerID="cri-o://1eff9a3b6fe337b8bacbd003ba5485bb71c0a78a8169ca60085a843d8bcd79ae" gracePeriod=2 Dec 01 12:02:23 crc kubenswrapper[4958]: I1201 12:02:23.906837 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pkk85" Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.032394 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a05e35d-8d46-4672-80bb-60d13501836e-utilities\") pod \"2a05e35d-8d46-4672-80bb-60d13501836e\" (UID: \"2a05e35d-8d46-4672-80bb-60d13501836e\") " Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.032705 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a05e35d-8d46-4672-80bb-60d13501836e-catalog-content\") pod \"2a05e35d-8d46-4672-80bb-60d13501836e\" (UID: \"2a05e35d-8d46-4672-80bb-60d13501836e\") " Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.032753 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t48dh\" (UniqueName: \"kubernetes.io/projected/2a05e35d-8d46-4672-80bb-60d13501836e-kube-api-access-t48dh\") pod \"2a05e35d-8d46-4672-80bb-60d13501836e\" (UID: \"2a05e35d-8d46-4672-80bb-60d13501836e\") " Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.033315 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a05e35d-8d46-4672-80bb-60d13501836e-utilities" (OuterVolumeSpecName: "utilities") pod "2a05e35d-8d46-4672-80bb-60d13501836e" (UID: "2a05e35d-8d46-4672-80bb-60d13501836e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.046201 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a05e35d-8d46-4672-80bb-60d13501836e-kube-api-access-t48dh" (OuterVolumeSpecName: "kube-api-access-t48dh") pod "2a05e35d-8d46-4672-80bb-60d13501836e" (UID: "2a05e35d-8d46-4672-80bb-60d13501836e"). InnerVolumeSpecName "kube-api-access-t48dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.093944 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a05e35d-8d46-4672-80bb-60d13501836e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a05e35d-8d46-4672-80bb-60d13501836e" (UID: "2a05e35d-8d46-4672-80bb-60d13501836e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.135558 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a05e35d-8d46-4672-80bb-60d13501836e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.135598 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t48dh\" (UniqueName: \"kubernetes.io/projected/2a05e35d-8d46-4672-80bb-60d13501836e-kube-api-access-t48dh\") on node \"crc\" DevicePath \"\"" Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.135612 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a05e35d-8d46-4672-80bb-60d13501836e-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.398919 4958 generic.go:334] "Generic (PLEG): container finished" podID="2a05e35d-8d46-4672-80bb-60d13501836e" containerID="1eff9a3b6fe337b8bacbd003ba5485bb71c0a78a8169ca60085a843d8bcd79ae" exitCode=0 Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.399052 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkk85" event={"ID":"2a05e35d-8d46-4672-80bb-60d13501836e","Type":"ContainerDied","Data":"1eff9a3b6fe337b8bacbd003ba5485bb71c0a78a8169ca60085a843d8bcd79ae"} Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.399467 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pkk85" event={"ID":"2a05e35d-8d46-4672-80bb-60d13501836e","Type":"ContainerDied","Data":"061bde6a54ef26580145dffb4a2af61e82f11160ce006b6dc98993b513bdb474"} Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.399497 4958 scope.go:117] "RemoveContainer" containerID="1eff9a3b6fe337b8bacbd003ba5485bb71c0a78a8169ca60085a843d8bcd79ae" Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.399122 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pkk85" Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.432808 4958 scope.go:117] "RemoveContainer" containerID="eb95824a8508a9f41b5c8285a63fb267242ed38ca8fa493f47ac6063c62e827b" Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.451373 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pkk85"] Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.467539 4958 scope.go:117] "RemoveContainer" containerID="c911423ef86e85e6bed6df7db93c61a36a16195ba6523fda7428adb91779b500" Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.469985 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pkk85"] Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.518477 4958 scope.go:117] "RemoveContainer" containerID="1eff9a3b6fe337b8bacbd003ba5485bb71c0a78a8169ca60085a843d8bcd79ae" Dec 01 12:02:24 crc kubenswrapper[4958]: E1201 12:02:24.518981 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eff9a3b6fe337b8bacbd003ba5485bb71c0a78a8169ca60085a843d8bcd79ae\": container with ID starting with 1eff9a3b6fe337b8bacbd003ba5485bb71c0a78a8169ca60085a843d8bcd79ae not found: ID does not exist" containerID="1eff9a3b6fe337b8bacbd003ba5485bb71c0a78a8169ca60085a843d8bcd79ae" Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.519042 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eff9a3b6fe337b8bacbd003ba5485bb71c0a78a8169ca60085a843d8bcd79ae"} err="failed to get container status \"1eff9a3b6fe337b8bacbd003ba5485bb71c0a78a8169ca60085a843d8bcd79ae\": rpc error: code = NotFound desc = could not find container \"1eff9a3b6fe337b8bacbd003ba5485bb71c0a78a8169ca60085a843d8bcd79ae\": container with ID starting with 1eff9a3b6fe337b8bacbd003ba5485bb71c0a78a8169ca60085a843d8bcd79ae not found: ID does not exist" Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.519107 4958 scope.go:117] "RemoveContainer" containerID="eb95824a8508a9f41b5c8285a63fb267242ed38ca8fa493f47ac6063c62e827b" Dec 01 12:02:24 crc kubenswrapper[4958]: E1201 12:02:24.519543 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb95824a8508a9f41b5c8285a63fb267242ed38ca8fa493f47ac6063c62e827b\": container with ID starting with eb95824a8508a9f41b5c8285a63fb267242ed38ca8fa493f47ac6063c62e827b not found: ID does not exist" containerID="eb95824a8508a9f41b5c8285a63fb267242ed38ca8fa493f47ac6063c62e827b" Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.519615 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb95824a8508a9f41b5c8285a63fb267242ed38ca8fa493f47ac6063c62e827b"} err="failed to get container status \"eb95824a8508a9f41b5c8285a63fb267242ed38ca8fa493f47ac6063c62e827b\": rpc error: code = NotFound desc = could not find container \"eb95824a8508a9f41b5c8285a63fb267242ed38ca8fa493f47ac6063c62e827b\": container with ID starting with eb95824a8508a9f41b5c8285a63fb267242ed38ca8fa493f47ac6063c62e827b not found: ID does not exist" Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.519668 4958 scope.go:117] "RemoveContainer" containerID="c911423ef86e85e6bed6df7db93c61a36a16195ba6523fda7428adb91779b500" Dec 01 12:02:24 crc kubenswrapper[4958]: E1201 12:02:24.520160 4958 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c911423ef86e85e6bed6df7db93c61a36a16195ba6523fda7428adb91779b500\": container with ID starting with c911423ef86e85e6bed6df7db93c61a36a16195ba6523fda7428adb91779b500 not found: ID does not exist" containerID="c911423ef86e85e6bed6df7db93c61a36a16195ba6523fda7428adb91779b500" Dec 01 12:02:24 crc kubenswrapper[4958]: I1201 12:02:24.520215 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c911423ef86e85e6bed6df7db93c61a36a16195ba6523fda7428adb91779b500"} err="failed to get container status \"c911423ef86e85e6bed6df7db93c61a36a16195ba6523fda7428adb91779b500\": rpc error: code = NotFound desc = could not find container \"c911423ef86e85e6bed6df7db93c61a36a16195ba6523fda7428adb91779b500\": container with ID starting with c911423ef86e85e6bed6df7db93c61a36a16195ba6523fda7428adb91779b500 not found: ID does not exist" Dec 01 12:02:25 crc kubenswrapper[4958]: I1201 12:02:25.817029 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a05e35d-8d46-4672-80bb-60d13501836e" path="/var/lib/kubelet/pods/2a05e35d-8d46-4672-80bb-60d13501836e/volumes" Dec 01 12:02:26 crc kubenswrapper[4958]: I1201 12:02:26.078762 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-a18f-account-create-vd96s"] Dec 01 12:02:26 crc kubenswrapper[4958]: I1201 12:02:26.100815 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-a18f-account-create-vd96s"] Dec 01 12:02:27 crc kubenswrapper[4958]: I1201 12:02:27.819911 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0286bf3f-7cae-4b16-8be5-d36b4e94198f" path="/var/lib/kubelet/pods/0286bf3f-7cae-4b16-8be5-d36b4e94198f/volumes" Dec 01 12:02:28 crc kubenswrapper[4958]: I1201 12:02:28.798621 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 12:02:29 crc kubenswrapper[4958]: I1201 12:02:29.466204 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"96dc7e42045ee9316b4e012dfce125a32a47cbfca9463ecd54f2bc641ba99e3c"} Dec 01 12:02:30 crc kubenswrapper[4958]: I1201 12:02:30.208598 4958 scope.go:117] "RemoveContainer" containerID="3ae44865d4f762ce64b42289c02331119f04af0c07027618ad2e239b4e497ce2" Dec 01 12:02:30 crc kubenswrapper[4958]: I1201 12:02:30.358726 4958 scope.go:117] "RemoveContainer" containerID="e5d08fbd2d3e26cb17444ed39d88772fbc97f46e6e3178f1f0602378251b0548" Dec 01 12:02:38 crc kubenswrapper[4958]: I1201 12:02:38.039999 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-p99r6"] Dec 01 12:02:38 crc kubenswrapper[4958]: I1201 12:02:38.049394 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-p99r6"] Dec 01 12:02:39 crc kubenswrapper[4958]: I1201 12:02:39.815266 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f466c4e0-ff69-4257-869c-e83acba8328e" path="/var/lib/kubelet/pods/f466c4e0-ff69-4257-869c-e83acba8328e/volumes" Dec 01 12:03:00 crc kubenswrapper[4958]: I1201 12:03:00.072121 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-nxk9p"] Dec 01 12:03:00 crc kubenswrapper[4958]: I1201 12:03:00.098290 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-nxk9p"] Dec 01 12:03:01 crc 
kubenswrapper[4958]: I1201 12:03:01.814931 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53cce1d1-fae6-4fa7-89ec-1b47571a9933" path="/var/lib/kubelet/pods/53cce1d1-fae6-4fa7-89ec-1b47571a9933/volumes" Dec 01 12:03:03 crc kubenswrapper[4958]: I1201 12:03:03.108961 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v75wc"] Dec 01 12:03:03 crc kubenswrapper[4958]: E1201 12:03:03.116013 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a05e35d-8d46-4672-80bb-60d13501836e" containerName="registry-server" Dec 01 12:03:03 crc kubenswrapper[4958]: I1201 12:03:03.116071 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a05e35d-8d46-4672-80bb-60d13501836e" containerName="registry-server" Dec 01 12:03:03 crc kubenswrapper[4958]: E1201 12:03:03.116144 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a05e35d-8d46-4672-80bb-60d13501836e" containerName="extract-content" Dec 01 12:03:03 crc kubenswrapper[4958]: I1201 12:03:03.116157 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a05e35d-8d46-4672-80bb-60d13501836e" containerName="extract-content" Dec 01 12:03:03 crc kubenswrapper[4958]: E1201 12:03:03.116285 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a05e35d-8d46-4672-80bb-60d13501836e" containerName="extract-utilities" Dec 01 12:03:03 crc kubenswrapper[4958]: I1201 12:03:03.116300 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a05e35d-8d46-4672-80bb-60d13501836e" containerName="extract-utilities" Dec 01 12:03:03 crc kubenswrapper[4958]: I1201 12:03:03.116922 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a05e35d-8d46-4672-80bb-60d13501836e" containerName="registry-server" Dec 01 12:03:03 crc kubenswrapper[4958]: I1201 12:03:03.119780 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v75wc" Dec 01 12:03:03 crc kubenswrapper[4958]: I1201 12:03:03.128749 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v75wc"] Dec 01 12:03:03 crc kubenswrapper[4958]: I1201 12:03:03.211154 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01cd9f10-7a3a-44ad-a771-76aa3bf6394a-catalog-content\") pod \"certified-operators-v75wc\" (UID: \"01cd9f10-7a3a-44ad-a771-76aa3bf6394a\") " pod="openshift-marketplace/certified-operators-v75wc" Dec 01 12:03:03 crc kubenswrapper[4958]: I1201 12:03:03.211276 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01cd9f10-7a3a-44ad-a771-76aa3bf6394a-utilities\") pod \"certified-operators-v75wc\" (UID: \"01cd9f10-7a3a-44ad-a771-76aa3bf6394a\") " pod="openshift-marketplace/certified-operators-v75wc" Dec 01 12:03:03 crc kubenswrapper[4958]: I1201 12:03:03.211301 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ck9f\" (UniqueName: \"kubernetes.io/projected/01cd9f10-7a3a-44ad-a771-76aa3bf6394a-kube-api-access-6ck9f\") pod \"certified-operators-v75wc\" (UID: \"01cd9f10-7a3a-44ad-a771-76aa3bf6394a\") " pod="openshift-marketplace/certified-operators-v75wc" Dec 01 12:03:03 crc kubenswrapper[4958]: I1201 12:03:03.313721 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01cd9f10-7a3a-44ad-a771-76aa3bf6394a-utilities\") pod \"certified-operators-v75wc\" (UID: \"01cd9f10-7a3a-44ad-a771-76aa3bf6394a\") " pod="openshift-marketplace/certified-operators-v75wc" Dec 01 12:03:03 crc kubenswrapper[4958]: I1201 12:03:03.313780 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ck9f\" (UniqueName: \"kubernetes.io/projected/01cd9f10-7a3a-44ad-a771-76aa3bf6394a-kube-api-access-6ck9f\") pod \"certified-operators-v75wc\" (UID: \"01cd9f10-7a3a-44ad-a771-76aa3bf6394a\") " pod="openshift-marketplace/certified-operators-v75wc" Dec 01 12:03:03 crc kubenswrapper[4958]: I1201 12:03:03.314028 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01cd9f10-7a3a-44ad-a771-76aa3bf6394a-catalog-content\") pod \"certified-operators-v75wc\" (UID: \"01cd9f10-7a3a-44ad-a771-76aa3bf6394a\") " pod="openshift-marketplace/certified-operators-v75wc" Dec 01 12:03:03 crc kubenswrapper[4958]: I1201 12:03:03.314390 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01cd9f10-7a3a-44ad-a771-76aa3bf6394a-utilities\") pod \"certified-operators-v75wc\" (UID: \"01cd9f10-7a3a-44ad-a771-76aa3bf6394a\") " pod="openshift-marketplace/certified-operators-v75wc" Dec 01 12:03:03 crc kubenswrapper[4958]: I1201 12:03:03.315135 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01cd9f10-7a3a-44ad-a771-76aa3bf6394a-catalog-content\") pod \"certified-operators-v75wc\" (UID: \"01cd9f10-7a3a-44ad-a771-76aa3bf6394a\") " pod="openshift-marketplace/certified-operators-v75wc" Dec 01 12:03:03 crc kubenswrapper[4958]: I1201 12:03:03.334596 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6ck9f\" (UniqueName: \"kubernetes.io/projected/01cd9f10-7a3a-44ad-a771-76aa3bf6394a-kube-api-access-6ck9f\") pod \"certified-operators-v75wc\" (UID: \"01cd9f10-7a3a-44ad-a771-76aa3bf6394a\") " pod="openshift-marketplace/certified-operators-v75wc" Dec 01 12:03:03 crc kubenswrapper[4958]: I1201 12:03:03.458452 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v75wc" Dec 01 12:03:04 crc kubenswrapper[4958]: I1201 12:03:04.057028 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v75wc"] Dec 01 12:03:04 crc kubenswrapper[4958]: W1201 12:03:04.058021 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01cd9f10_7a3a_44ad_a771_76aa3bf6394a.slice/crio-ce9ac98f859e6aa26ddcd8266af2a70e9eb4fc8f1d4dd650e9d74b85924ba8ee WatchSource:0}: Error finding container ce9ac98f859e6aa26ddcd8266af2a70e9eb4fc8f1d4dd650e9d74b85924ba8ee: Status 404 returned error can't find the container with id ce9ac98f859e6aa26ddcd8266af2a70e9eb4fc8f1d4dd650e9d74b85924ba8ee Dec 01 12:03:04 crc kubenswrapper[4958]: I1201 12:03:04.918629 4958 generic.go:334] "Generic (PLEG): container finished" podID="01cd9f10-7a3a-44ad-a771-76aa3bf6394a" containerID="06066b133e7183675a4b2eff46188fee0f6984fc4f530383c079f5dbf1f82da2" exitCode=0 Dec 01 12:03:04 crc kubenswrapper[4958]: I1201 12:03:04.918738 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75wc" event={"ID":"01cd9f10-7a3a-44ad-a771-76aa3bf6394a","Type":"ContainerDied","Data":"06066b133e7183675a4b2eff46188fee0f6984fc4f530383c079f5dbf1f82da2"} Dec 01 12:03:04 crc kubenswrapper[4958]: I1201 12:03:04.919192 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75wc" event={"ID":"01cd9f10-7a3a-44ad-a771-76aa3bf6394a","Type":"ContainerStarted","Data":"ce9ac98f859e6aa26ddcd8266af2a70e9eb4fc8f1d4dd650e9d74b85924ba8ee"} Dec 01 12:03:06 crc kubenswrapper[4958]: I1201 12:03:06.949635 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75wc" event={"ID":"01cd9f10-7a3a-44ad-a771-76aa3bf6394a","Type":"ContainerStarted","Data":"417a7a37d57f75a3c123769a424b7f2ce7d847047a49aed6e9db57d5cedcd031"} Dec 01 12:03:07 crc kubenswrapper[4958]: I1201 12:03:07.961774 4958 generic.go:334] "Generic (PLEG): container finished" podID="01cd9f10-7a3a-44ad-a771-76aa3bf6394a" containerID="417a7a37d57f75a3c123769a424b7f2ce7d847047a49aed6e9db57d5cedcd031" exitCode=0 Dec 01 12:03:07 crc kubenswrapper[4958]: I1201 12:03:07.961864 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75wc" event={"ID":"01cd9f10-7a3a-44ad-a771-76aa3bf6394a","Type":"ContainerDied","Data":"417a7a37d57f75a3c123769a424b7f2ce7d847047a49aed6e9db57d5cedcd031"} Dec 01 12:03:09 crc kubenswrapper[4958]: I1201 12:03:08.989105 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75wc" event={"ID":"01cd9f10-7a3a-44ad-a771-76aa3bf6394a","Type":"ContainerStarted","Data":"8a27d5e670d5e73cefbed2abba4ad313156466aa6fa23fa9e2b21d4d5ccd9860"} Dec 01 12:03:09 crc kubenswrapper[4958]: I1201 12:03:09.034250 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v75wc" 
podStartSLOduration=2.517164976 podStartE2EDuration="6.034230477s" podCreationTimestamp="2025-12-01 12:03:03 +0000 UTC" firstStartedPulling="2025-12-01 12:03:04.923322684 +0000 UTC m=+7432.432111751" lastFinishedPulling="2025-12-01 12:03:08.440388215 +0000 UTC m=+7435.949177252" observedRunningTime="2025-12-01 12:03:09.024922464 +0000 UTC m=+7436.533711521" watchObservedRunningTime="2025-12-01 12:03:09.034230477 +0000 UTC m=+7436.543019524" Dec 01 12:03:10 crc kubenswrapper[4958]: I1201 12:03:10.039648 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-bc9b-account-create-fjkr8"] Dec 01 12:03:10 crc kubenswrapper[4958]: I1201 12:03:10.051213 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-bc9b-account-create-fjkr8"] Dec 01 12:03:11 crc kubenswrapper[4958]: I1201 12:03:11.813612 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779" path="/var/lib/kubelet/pods/a1d8f3dc-cd11-4f1e-8b62-9e8ad0f92779/volumes" Dec 01 12:03:13 crc kubenswrapper[4958]: I1201 12:03:13.458674 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v75wc" Dec 01 12:03:13 crc kubenswrapper[4958]: I1201 12:03:13.459057 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v75wc" Dec 01 12:03:13 crc kubenswrapper[4958]: I1201 12:03:13.516896 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v75wc" Dec 01 12:03:14 crc kubenswrapper[4958]: I1201 12:03:14.121385 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v75wc" Dec 01 12:03:14 crc kubenswrapper[4958]: I1201 12:03:14.395833 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v75wc"] Dec 01 12:03:16 crc kubenswrapper[4958]: I1201 12:03:16.087153 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v75wc" podUID="01cd9f10-7a3a-44ad-a771-76aa3bf6394a" containerName="registry-server" containerID="cri-o://8a27d5e670d5e73cefbed2abba4ad313156466aa6fa23fa9e2b21d4d5ccd9860" gracePeriod=2 Dec 01 12:03:16 crc kubenswrapper[4958]: I1201 12:03:16.680770 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v75wc" Dec 01 12:03:16 crc kubenswrapper[4958]: I1201 12:03:16.849536 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ck9f\" (UniqueName: \"kubernetes.io/projected/01cd9f10-7a3a-44ad-a771-76aa3bf6394a-kube-api-access-6ck9f\") pod \"01cd9f10-7a3a-44ad-a771-76aa3bf6394a\" (UID: \"01cd9f10-7a3a-44ad-a771-76aa3bf6394a\") " Dec 01 12:03:16 crc kubenswrapper[4958]: I1201 12:03:16.849749 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01cd9f10-7a3a-44ad-a771-76aa3bf6394a-utilities\") pod \"01cd9f10-7a3a-44ad-a771-76aa3bf6394a\" (UID: \"01cd9f10-7a3a-44ad-a771-76aa3bf6394a\") " Dec 01 12:03:16 crc kubenswrapper[4958]: I1201 12:03:16.850104 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01cd9f10-7a3a-44ad-a771-76aa3bf6394a-catalog-content\") pod \"01cd9f10-7a3a-44ad-a771-76aa3bf6394a\" (UID: \"01cd9f10-7a3a-44ad-a771-76aa3bf6394a\") " Dec 01 12:03:16 crc kubenswrapper[4958]: I1201 12:03:16.851551 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01cd9f10-7a3a-44ad-a771-76aa3bf6394a-utilities" (OuterVolumeSpecName: "utilities") pod "01cd9f10-7a3a-44ad-a771-76aa3bf6394a" (UID: "01cd9f10-7a3a-44ad-a771-76aa3bf6394a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:03:16 crc kubenswrapper[4958]: I1201 12:03:16.853621 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01cd9f10-7a3a-44ad-a771-76aa3bf6394a-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:03:16 crc kubenswrapper[4958]: I1201 12:03:16.859562 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01cd9f10-7a3a-44ad-a771-76aa3bf6394a-kube-api-access-6ck9f" (OuterVolumeSpecName: "kube-api-access-6ck9f") pod "01cd9f10-7a3a-44ad-a771-76aa3bf6394a" (UID: "01cd9f10-7a3a-44ad-a771-76aa3bf6394a"). InnerVolumeSpecName "kube-api-access-6ck9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:03:16 crc kubenswrapper[4958]: I1201 12:03:16.947936 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01cd9f10-7a3a-44ad-a771-76aa3bf6394a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01cd9f10-7a3a-44ad-a771-76aa3bf6394a" (UID: "01cd9f10-7a3a-44ad-a771-76aa3bf6394a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:03:16 crc kubenswrapper[4958]: I1201 12:03:16.957787 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01cd9f10-7a3a-44ad-a771-76aa3bf6394a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:03:16 crc kubenswrapper[4958]: I1201 12:03:16.957865 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ck9f\" (UniqueName: \"kubernetes.io/projected/01cd9f10-7a3a-44ad-a771-76aa3bf6394a-kube-api-access-6ck9f\") on node \"crc\" DevicePath \"\"" Dec 01 12:03:17 crc kubenswrapper[4958]: I1201 12:03:17.108998 4958 generic.go:334] "Generic (PLEG): container finished" podID="01cd9f10-7a3a-44ad-a771-76aa3bf6394a" containerID="8a27d5e670d5e73cefbed2abba4ad313156466aa6fa23fa9e2b21d4d5ccd9860" exitCode=0 Dec 01 12:03:17 crc kubenswrapper[4958]: I1201 12:03:17.109059 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75wc" event={"ID":"01cd9f10-7a3a-44ad-a771-76aa3bf6394a","Type":"ContainerDied","Data":"8a27d5e670d5e73cefbed2abba4ad313156466aa6fa23fa9e2b21d4d5ccd9860"} Dec 01 12:03:17 crc kubenswrapper[4958]: I1201 12:03:17.109096 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75wc" event={"ID":"01cd9f10-7a3a-44ad-a771-76aa3bf6394a","Type":"ContainerDied","Data":"ce9ac98f859e6aa26ddcd8266af2a70e9eb4fc8f1d4dd650e9d74b85924ba8ee"} Dec 01 12:03:17 crc kubenswrapper[4958]: I1201 12:03:17.109117 4958 scope.go:117] "RemoveContainer" containerID="8a27d5e670d5e73cefbed2abba4ad313156466aa6fa23fa9e2b21d4d5ccd9860" Dec 01 12:03:17 crc kubenswrapper[4958]: I1201 12:03:17.109112 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v75wc" Dec 01 12:03:17 crc kubenswrapper[4958]: I1201 12:03:17.156273 4958 scope.go:117] "RemoveContainer" containerID="417a7a37d57f75a3c123769a424b7f2ce7d847047a49aed6e9db57d5cedcd031" Dec 01 12:03:17 crc kubenswrapper[4958]: I1201 12:03:17.175062 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v75wc"] Dec 01 12:03:17 crc kubenswrapper[4958]: I1201 12:03:17.207526 4958 scope.go:117] "RemoveContainer" containerID="06066b133e7183675a4b2eff46188fee0f6984fc4f530383c079f5dbf1f82da2" Dec 01 12:03:17 crc kubenswrapper[4958]: I1201 12:03:17.307521 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v75wc"] Dec 01 12:03:17 crc kubenswrapper[4958]: I1201 12:03:17.340610 4958 scope.go:117] "RemoveContainer" containerID="8a27d5e670d5e73cefbed2abba4ad313156466aa6fa23fa9e2b21d4d5ccd9860" Dec 01 12:03:17 crc kubenswrapper[4958]: E1201 12:03:17.341179 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a27d5e670d5e73cefbed2abba4ad313156466aa6fa23fa9e2b21d4d5ccd9860\": container with ID starting with 8a27d5e670d5e73cefbed2abba4ad313156466aa6fa23fa9e2b21d4d5ccd9860 not found: ID does not exist" containerID="8a27d5e670d5e73cefbed2abba4ad313156466aa6fa23fa9e2b21d4d5ccd9860" Dec 01 12:03:17 crc kubenswrapper[4958]: I1201 12:03:17.341267 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a27d5e670d5e73cefbed2abba4ad313156466aa6fa23fa9e2b21d4d5ccd9860"} err="failed to get container status \"8a27d5e670d5e73cefbed2abba4ad313156466aa6fa23fa9e2b21d4d5ccd9860\": rpc error: code = NotFound desc = could not find container \"8a27d5e670d5e73cefbed2abba4ad313156466aa6fa23fa9e2b21d4d5ccd9860\": container with ID starting with 8a27d5e670d5e73cefbed2abba4ad313156466aa6fa23fa9e2b21d4d5ccd9860 not found: ID does not exist" Dec 01 12:03:17 crc kubenswrapper[4958]: I1201 12:03:17.341299 4958 scope.go:117] "RemoveContainer" containerID="417a7a37d57f75a3c123769a424b7f2ce7d847047a49aed6e9db57d5cedcd031" Dec 01 12:03:17 crc kubenswrapper[4958]: E1201 12:03:17.341762 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"417a7a37d57f75a3c123769a424b7f2ce7d847047a49aed6e9db57d5cedcd031\": container with ID starting with 417a7a37d57f75a3c123769a424b7f2ce7d847047a49aed6e9db57d5cedcd031 not found: ID does not exist" containerID="417a7a37d57f75a3c123769a424b7f2ce7d847047a49aed6e9db57d5cedcd031" Dec 01 12:03:17 crc kubenswrapper[4958]: I1201 12:03:17.341793 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"417a7a37d57f75a3c123769a424b7f2ce7d847047a49aed6e9db57d5cedcd031"} err="failed to get container status \"417a7a37d57f75a3c123769a424b7f2ce7d847047a49aed6e9db57d5cedcd031\": rpc error: code = NotFound desc = could not find container \"417a7a37d57f75a3c123769a424b7f2ce7d847047a49aed6e9db57d5cedcd031\": container with ID starting with 417a7a37d57f75a3c123769a424b7f2ce7d847047a49aed6e9db57d5cedcd031 not found: ID does not exist" Dec 01 12:03:17 crc kubenswrapper[4958]: I1201 12:03:17.341817 4958 scope.go:117] "RemoveContainer" containerID="06066b133e7183675a4b2eff46188fee0f6984fc4f530383c079f5dbf1f82da2" Dec 01 12:03:17 crc kubenswrapper[4958]: E1201 12:03:17.342238 4958 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"06066b133e7183675a4b2eff46188fee0f6984fc4f530383c079f5dbf1f82da2\": container with ID starting with 06066b133e7183675a4b2eff46188fee0f6984fc4f530383c079f5dbf1f82da2 not found: ID does not exist" containerID="06066b133e7183675a4b2eff46188fee0f6984fc4f530383c079f5dbf1f82da2" Dec 01 12:03:17 crc kubenswrapper[4958]: I1201 12:03:17.342294 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06066b133e7183675a4b2eff46188fee0f6984fc4f530383c079f5dbf1f82da2"} err="failed to get container status \"06066b133e7183675a4b2eff46188fee0f6984fc4f530383c079f5dbf1f82da2\": rpc error: code = NotFound desc = could not find container \"06066b133e7183675a4b2eff46188fee0f6984fc4f530383c079f5dbf1f82da2\": container with ID starting with 06066b133e7183675a4b2eff46188fee0f6984fc4f530383c079f5dbf1f82da2 not found: ID does not exist" Dec 01 12:03:17 crc kubenswrapper[4958]: I1201 12:03:17.820308 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01cd9f10-7a3a-44ad-a771-76aa3bf6394a" path="/var/lib/kubelet/pods/01cd9f10-7a3a-44ad-a771-76aa3bf6394a/volumes" Dec 01 12:03:24 crc kubenswrapper[4958]: I1201 12:03:24.067939 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-2fzfc"] Dec 01 12:03:24 crc kubenswrapper[4958]: I1201 12:03:24.091880 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-2fzfc"] Dec 01 12:03:25 crc kubenswrapper[4958]: I1201 12:03:25.813568 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45598b35-f5b0-49b6-95e1-cae26b23a10c" path="/var/lib/kubelet/pods/45598b35-f5b0-49b6-95e1-cae26b23a10c/volumes" Dec 01 12:03:30 crc kubenswrapper[4958]: I1201 12:03:30.478294 4958 scope.go:117] "RemoveContainer" containerID="d2ef83e6d1751ccfa6ab221b3afdde7b734bc1ee8b4c2e0ac473c0787004c47d" Dec 01 12:03:30 crc kubenswrapper[4958]: I1201 12:03:30.746573 4958 scope.go:117] "RemoveContainer" containerID="3d48a1f636b4565c60ce545421ad662e2f30a907ee3ff71fb35e7864478bf38f" Dec 01 12:03:30 crc kubenswrapper[4958]: I1201 12:03:30.809984 4958 scope.go:117] "RemoveContainer" containerID="8899171db4db2e26e00e6d359ac2f8d8cfb881c6619d2c718a8c0ee53064de09" Dec 01 12:03:30 crc kubenswrapper[4958]: I1201 12:03:30.850537 4958 scope.go:117] "RemoveContainer" containerID="076e1c2d2b5cc2628c6ab726d2ac8328390e35f950db36dacc3ca8812b090c8d" Dec 01 12:04:58 crc kubenswrapper[4958]: I1201 12:04:58.210433 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:04:58 crc kubenswrapper[4958]: I1201 12:04:58.211342 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:05:28 crc kubenswrapper[4958]: I1201 12:05:28.211017 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 01 12:05:28 crc kubenswrapper[4958]: I1201 12:05:28.211677 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:05:58 crc kubenswrapper[4958]: I1201 12:05:58.210243 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:05:58 crc kubenswrapper[4958]: I1201 12:05:58.210808 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:05:58 crc kubenswrapper[4958]: I1201 12:05:58.210888 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 12:05:58 crc kubenswrapper[4958]: I1201 12:05:58.212208 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"96dc7e42045ee9316b4e012dfce125a32a47cbfca9463ecd54f2bc641ba99e3c"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 12:05:58 crc kubenswrapper[4958]: I1201 12:05:58.212278 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://96dc7e42045ee9316b4e012dfce125a32a47cbfca9463ecd54f2bc641ba99e3c" gracePeriod=600 Dec 01 12:05:59 crc kubenswrapper[4958]: I1201 12:05:59.120616 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="96dc7e42045ee9316b4e012dfce125a32a47cbfca9463ecd54f2bc641ba99e3c" exitCode=0 Dec 01 12:05:59 crc kubenswrapper[4958]: I1201 12:05:59.120859 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"96dc7e42045ee9316b4e012dfce125a32a47cbfca9463ecd54f2bc641ba99e3c"} Dec 01 12:05:59 crc kubenswrapper[4958]: I1201 12:05:59.121215 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46"} Dec 01 12:05:59 crc kubenswrapper[4958]: I1201 12:05:59.121235 4958 scope.go:117] "RemoveContainer" containerID="067840fb2d6bfb95377c2a03d80ed3debdc7773fec4c9834afb5c06e12a4963d" Dec 01 12:06:28 crc kubenswrapper[4958]: I1201 12:06:28.503334 4958 generic.go:334] "Generic (PLEG): container finished" podID="47a028df-7124-41b2-b3ed-0f25905f265b" containerID="d44e358469cdda21b1bb3c1db1cbacbe78dd96a5803ecc79711af392534fec35" exitCode=0 
Dec 01 12:06:28 crc kubenswrapper[4958]: I1201 12:06:28.503484 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" event={"ID":"47a028df-7124-41b2-b3ed-0f25905f265b","Type":"ContainerDied","Data":"d44e358469cdda21b1bb3c1db1cbacbe78dd96a5803ecc79711af392534fec35"} Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.082101 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.144916 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvl88\" (UniqueName: \"kubernetes.io/projected/47a028df-7124-41b2-b3ed-0f25905f265b-kube-api-access-jvl88\") pod \"47a028df-7124-41b2-b3ed-0f25905f265b\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.145045 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-inventory\") pod \"47a028df-7124-41b2-b3ed-0f25905f265b\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.145074 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-ceph\") pod \"47a028df-7124-41b2-b3ed-0f25905f265b\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.145113 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-tripleo-cleanup-combined-ca-bundle\") pod \"47a028df-7124-41b2-b3ed-0f25905f265b\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.145174 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-ssh-key\") pod \"47a028df-7124-41b2-b3ed-0f25905f265b\" (UID: \"47a028df-7124-41b2-b3ed-0f25905f265b\") " Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.152165 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47a028df-7124-41b2-b3ed-0f25905f265b-kube-api-access-jvl88" (OuterVolumeSpecName: "kube-api-access-jvl88") pod "47a028df-7124-41b2-b3ed-0f25905f265b" (UID: "47a028df-7124-41b2-b3ed-0f25905f265b"). InnerVolumeSpecName "kube-api-access-jvl88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.152539 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-ceph" (OuterVolumeSpecName: "ceph") pod "47a028df-7124-41b2-b3ed-0f25905f265b" (UID: "47a028df-7124-41b2-b3ed-0f25905f265b"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.155017 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "47a028df-7124-41b2-b3ed-0f25905f265b" (UID: "47a028df-7124-41b2-b3ed-0f25905f265b"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.184189 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-inventory" (OuterVolumeSpecName: "inventory") pod "47a028df-7124-41b2-b3ed-0f25905f265b" (UID: "47a028df-7124-41b2-b3ed-0f25905f265b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.185585 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "47a028df-7124-41b2-b3ed-0f25905f265b" (UID: "47a028df-7124-41b2-b3ed-0f25905f265b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.247632 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvl88\" (UniqueName: \"kubernetes.io/projected/47a028df-7124-41b2-b3ed-0f25905f265b-kube-api-access-jvl88\") on node \"crc\" DevicePath \"\"" Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.247676 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.247690 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.247703 4958 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.247717 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47a028df-7124-41b2-b3ed-0f25905f265b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.528903 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" event={"ID":"47a028df-7124-41b2-b3ed-0f25905f265b","Type":"ContainerDied","Data":"cdefc12d4116665e38d6edd33a61ae8cde911151e4ce70d39996ee28a05c7ac7"} Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.528969 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdefc12d4116665e38d6edd33a61ae8cde911151e4ce70d39996ee28a05c7ac7" Dec 01 12:06:30 crc kubenswrapper[4958]: I1201 12:06:30.529017 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p" Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.359092 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-pw7rf"] Dec 01 12:06:41 crc kubenswrapper[4958]: E1201 12:06:41.360070 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01cd9f10-7a3a-44ad-a771-76aa3bf6394a" containerName="extract-utilities" Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.360084 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="01cd9f10-7a3a-44ad-a771-76aa3bf6394a" containerName="extract-utilities" Dec 01 12:06:41 crc kubenswrapper[4958]: E1201 12:06:41.360101 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47a028df-7124-41b2-b3ed-0f25905f265b" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.360110 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a028df-7124-41b2-b3ed-0f25905f265b" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 01 12:06:41 crc kubenswrapper[4958]: E1201 12:06:41.360133 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01cd9f10-7a3a-44ad-a771-76aa3bf6394a" containerName="extract-content" Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.360140 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="01cd9f10-7a3a-44ad-a771-76aa3bf6394a" containerName="extract-content" Dec 01 12:06:41 crc kubenswrapper[4958]: E1201 12:06:41.360158 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01cd9f10-7a3a-44ad-a771-76aa3bf6394a" containerName="registry-server" Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.360164 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="01cd9f10-7a3a-44ad-a771-76aa3bf6394a" containerName="registry-server" Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.360380 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="01cd9f10-7a3a-44ad-a771-76aa3bf6394a" containerName="registry-server" Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.360398 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="47a028df-7124-41b2-b3ed-0f25905f265b" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.361290 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf" Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.364949 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.364991 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.365615 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.365632 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.374254 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-pw7rf"] Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.436631 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-inventory\") pod \"bootstrap-openstack-openstack-cell1-pw7rf\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf" Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.436721 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdx9t\" (UniqueName: \"kubernetes.io/projected/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-kube-api-access-jdx9t\") pod \"bootstrap-openstack-openstack-cell1-pw7rf\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf" Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.436997 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-pw7rf\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf" Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.437313 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-pw7rf\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf" Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.437823 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-ceph\") pod \"bootstrap-openstack-openstack-cell1-pw7rf\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf" Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.539669 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-pw7rf\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf" 
Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.539764 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-pw7rf\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf"
Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.539918 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-ceph\") pod \"bootstrap-openstack-openstack-cell1-pw7rf\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf"
Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.539971 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-inventory\") pod \"bootstrap-openstack-openstack-cell1-pw7rf\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf"
Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.540001 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdx9t\" (UniqueName: \"kubernetes.io/projected/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-kube-api-access-jdx9t\") pod \"bootstrap-openstack-openstack-cell1-pw7rf\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf"
Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.545802 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-pw7rf\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf"
Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.545990 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-inventory\") pod \"bootstrap-openstack-openstack-cell1-pw7rf\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf"
Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.548580 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-pw7rf\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf"
Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.554412 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-ceph\") pod \"bootstrap-openstack-openstack-cell1-pw7rf\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf"
Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.556364 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdx9t\" (UniqueName: \"kubernetes.io/projected/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-kube-api-access-jdx9t\") pod \"bootstrap-openstack-openstack-cell1-pw7rf\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf"
Dec 01 12:06:41 crc kubenswrapper[4958]: I1201 12:06:41.689360 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf"
Dec 01 12:06:42 crc kubenswrapper[4958]: I1201 12:06:42.347152 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-pw7rf"]
Dec 01 12:06:42 crc kubenswrapper[4958]: I1201 12:06:42.709910 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf" event={"ID":"f6d924a1-e8ba-427c-bcd0-7f171676c2d2","Type":"ContainerStarted","Data":"adec0ca7893d2cd0db5c3e3ded31b9a755439b5bacb9d7fa1f3f77dabb1d31f8"}
Dec 01 12:06:43 crc kubenswrapper[4958]: I1201 12:06:43.721203 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf" event={"ID":"f6d924a1-e8ba-427c-bcd0-7f171676c2d2","Type":"ContainerStarted","Data":"c93a2fdbe310423479688d5b2d5cd38d5c916469e26ba549ef1dac5911da092f"}
Dec 01 12:07:58 crc kubenswrapper[4958]: I1201 12:07:58.210420 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 12:07:58 crc kubenswrapper[4958]: I1201 12:07:58.211161 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 12:08:28 crc kubenswrapper[4958]: I1201 12:08:28.210759 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 12:08:28 crc kubenswrapper[4958]: I1201 12:08:28.211475 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 12:08:43 crc kubenswrapper[4958]: I1201 12:08:43.460536 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf" podStartSLOduration=122.262539787 podStartE2EDuration="2m2.460518229s" podCreationTimestamp="2025-12-01 12:06:41 +0000 UTC" firstStartedPulling="2025-12-01 12:06:42.355260636 +0000 UTC m=+7649.864049673" lastFinishedPulling="2025-12-01 12:06:42.553239078 +0000 UTC m=+7650.062028115" observedRunningTime="2025-12-01 12:06:43.749315762 +0000 UTC m=+7651.258104799" watchObservedRunningTime="2025-12-01 12:08:43.460518229 +0000 UTC m=+7770.969307266"
Dec 01 12:08:43 crc kubenswrapper[4958]: I1201 12:08:43.467888 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l5mf6"]
Dec 01 12:08:43 crc kubenswrapper[4958]: I1201 12:08:43.474356 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l5mf6"
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l5mf6" Dec 01 12:08:43 crc kubenswrapper[4958]: I1201 12:08:43.481496 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5mf6"] Dec 01 12:08:43 crc kubenswrapper[4958]: I1201 12:08:43.502057 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8lpq\" (UniqueName: \"kubernetes.io/projected/e32b0425-6c91-4a4f-9e93-541888363a16-kube-api-access-q8lpq\") pod \"redhat-marketplace-l5mf6\" (UID: \"e32b0425-6c91-4a4f-9e93-541888363a16\") " pod="openshift-marketplace/redhat-marketplace-l5mf6" Dec 01 12:08:43 crc kubenswrapper[4958]: I1201 12:08:43.502208 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e32b0425-6c91-4a4f-9e93-541888363a16-utilities\") pod \"redhat-marketplace-l5mf6\" (UID: \"e32b0425-6c91-4a4f-9e93-541888363a16\") " pod="openshift-marketplace/redhat-marketplace-l5mf6" Dec 01 12:08:43 crc kubenswrapper[4958]: I1201 12:08:43.502268 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e32b0425-6c91-4a4f-9e93-541888363a16-catalog-content\") pod \"redhat-marketplace-l5mf6\" (UID: \"e32b0425-6c91-4a4f-9e93-541888363a16\") " pod="openshift-marketplace/redhat-marketplace-l5mf6" Dec 01 12:08:43 crc kubenswrapper[4958]: I1201 12:08:43.603752 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8lpq\" (UniqueName: \"kubernetes.io/projected/e32b0425-6c91-4a4f-9e93-541888363a16-kube-api-access-q8lpq\") pod \"redhat-marketplace-l5mf6\" (UID: \"e32b0425-6c91-4a4f-9e93-541888363a16\") " pod="openshift-marketplace/redhat-marketplace-l5mf6" Dec 01 12:08:43 crc kubenswrapper[4958]: I1201 12:08:43.603929 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e32b0425-6c91-4a4f-9e93-541888363a16-utilities\") pod \"redhat-marketplace-l5mf6\" (UID: \"e32b0425-6c91-4a4f-9e93-541888363a16\") " pod="openshift-marketplace/redhat-marketplace-l5mf6" Dec 01 12:08:43 crc kubenswrapper[4958]: I1201 12:08:43.603983 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e32b0425-6c91-4a4f-9e93-541888363a16-catalog-content\") pod \"redhat-marketplace-l5mf6\" (UID: \"e32b0425-6c91-4a4f-9e93-541888363a16\") " pod="openshift-marketplace/redhat-marketplace-l5mf6" Dec 01 12:08:43 crc kubenswrapper[4958]: I1201 12:08:43.604479 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e32b0425-6c91-4a4f-9e93-541888363a16-catalog-content\") pod \"redhat-marketplace-l5mf6\" (UID: \"e32b0425-6c91-4a4f-9e93-541888363a16\") " pod="openshift-marketplace/redhat-marketplace-l5mf6" Dec 01 12:08:43 crc kubenswrapper[4958]: I1201 12:08:43.604729 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e32b0425-6c91-4a4f-9e93-541888363a16-utilities\") pod \"redhat-marketplace-l5mf6\" (UID: \"e32b0425-6c91-4a4f-9e93-541888363a16\") " pod="openshift-marketplace/redhat-marketplace-l5mf6" Dec 01 12:08:43 crc kubenswrapper[4958]: I1201 12:08:43.628908 4958 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-q8lpq\" (UniqueName: \"kubernetes.io/projected/e32b0425-6c91-4a4f-9e93-541888363a16-kube-api-access-q8lpq\") pod \"redhat-marketplace-l5mf6\" (UID: \"e32b0425-6c91-4a4f-9e93-541888363a16\") " pod="openshift-marketplace/redhat-marketplace-l5mf6" Dec 01 12:08:43 crc kubenswrapper[4958]: I1201 12:08:43.795851 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l5mf6" Dec 01 12:08:44 crc kubenswrapper[4958]: W1201 12:08:44.374047 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32b0425_6c91_4a4f_9e93_541888363a16.slice/crio-1b4efbf7382263f26487010f3b28ce2dd58d15831b6e5116adacd60d45f99665 WatchSource:0}: Error finding container 1b4efbf7382263f26487010f3b28ce2dd58d15831b6e5116adacd60d45f99665: Status 404 returned error can't find the container with id 1b4efbf7382263f26487010f3b28ce2dd58d15831b6e5116adacd60d45f99665 Dec 01 12:08:44 crc kubenswrapper[4958]: I1201 12:08:44.374769 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5mf6"] Dec 01 12:08:44 crc kubenswrapper[4958]: I1201 12:08:44.656670 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5mf6" event={"ID":"e32b0425-6c91-4a4f-9e93-541888363a16","Type":"ContainerStarted","Data":"43575940ae3a0925939b6a7102ee1a134d7b89d18dc0e26363fcbd951d942d91"} Dec 01 12:08:44 crc kubenswrapper[4958]: I1201 12:08:44.656948 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5mf6" event={"ID":"e32b0425-6c91-4a4f-9e93-541888363a16","Type":"ContainerStarted","Data":"1b4efbf7382263f26487010f3b28ce2dd58d15831b6e5116adacd60d45f99665"} Dec 01 12:08:45 crc kubenswrapper[4958]: I1201 12:08:45.697274 4958 generic.go:334] "Generic (PLEG): container finished" podID="e32b0425-6c91-4a4f-9e93-541888363a16" containerID="43575940ae3a0925939b6a7102ee1a134d7b89d18dc0e26363fcbd951d942d91" exitCode=0 Dec 01 12:08:45 crc kubenswrapper[4958]: I1201 12:08:45.697643 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5mf6" event={"ID":"e32b0425-6c91-4a4f-9e93-541888363a16","Type":"ContainerDied","Data":"43575940ae3a0925939b6a7102ee1a134d7b89d18dc0e26363fcbd951d942d91"} Dec 01 12:08:45 crc kubenswrapper[4958]: I1201 12:08:45.712262 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 12:08:46 crc kubenswrapper[4958]: I1201 12:08:46.727572 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5mf6" event={"ID":"e32b0425-6c91-4a4f-9e93-541888363a16","Type":"ContainerStarted","Data":"ed64081557db405409812af81b345d37b88e80d703611c4bbc9d99326cf67227"} Dec 01 12:08:47 crc kubenswrapper[4958]: I1201 12:08:47.752272 4958 generic.go:334] "Generic (PLEG): container finished" podID="e32b0425-6c91-4a4f-9e93-541888363a16" containerID="ed64081557db405409812af81b345d37b88e80d703611c4bbc9d99326cf67227" exitCode=0 Dec 01 12:08:47 crc kubenswrapper[4958]: I1201 12:08:47.752363 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5mf6" event={"ID":"e32b0425-6c91-4a4f-9e93-541888363a16","Type":"ContainerDied","Data":"ed64081557db405409812af81b345d37b88e80d703611c4bbc9d99326cf67227"} Dec 01 12:08:48 crc kubenswrapper[4958]: I1201 12:08:48.768971 4958 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5mf6" event={"ID":"e32b0425-6c91-4a4f-9e93-541888363a16","Type":"ContainerStarted","Data":"46d1e900f3d86f7ce8a873e3196ab659190906df505dfeb08eb0f1dd178b4159"} Dec 01 12:08:48 crc kubenswrapper[4958]: I1201 12:08:48.794794 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l5mf6" podStartSLOduration=2.980182106 podStartE2EDuration="5.794765339s" podCreationTimestamp="2025-12-01 12:08:43 +0000 UTC" firstStartedPulling="2025-12-01 12:08:45.711998028 +0000 UTC m=+7773.220787075" lastFinishedPulling="2025-12-01 12:08:48.526581271 +0000 UTC m=+7776.035370308" observedRunningTime="2025-12-01 12:08:48.79163792 +0000 UTC m=+7776.300426957" watchObservedRunningTime="2025-12-01 12:08:48.794765339 +0000 UTC m=+7776.303554406" Dec 01 12:08:53 crc kubenswrapper[4958]: I1201 12:08:53.797084 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l5mf6" Dec 01 12:08:53 crc kubenswrapper[4958]: I1201 12:08:53.834460 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l5mf6" Dec 01 12:08:53 crc kubenswrapper[4958]: I1201 12:08:53.883711 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l5mf6" Dec 01 12:08:54 crc kubenswrapper[4958]: I1201 12:08:54.956921 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l5mf6" Dec 01 12:08:57 crc kubenswrapper[4958]: I1201 12:08:57.469656 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5mf6"] Dec 01 12:08:57 crc kubenswrapper[4958]: I1201 12:08:57.471287 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l5mf6" podUID="e32b0425-6c91-4a4f-9e93-541888363a16" containerName="registry-server" containerID="cri-o://46d1e900f3d86f7ce8a873e3196ab659190906df505dfeb08eb0f1dd178b4159" gracePeriod=2 Dec 01 12:08:57 crc kubenswrapper[4958]: I1201 12:08:57.906921 4958 generic.go:334] "Generic (PLEG): container finished" podID="e32b0425-6c91-4a4f-9e93-541888363a16" containerID="46d1e900f3d86f7ce8a873e3196ab659190906df505dfeb08eb0f1dd178b4159" exitCode=0 Dec 01 12:08:57 crc kubenswrapper[4958]: I1201 12:08:57.907409 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5mf6" event={"ID":"e32b0425-6c91-4a4f-9e93-541888363a16","Type":"ContainerDied","Data":"46d1e900f3d86f7ce8a873e3196ab659190906df505dfeb08eb0f1dd178b4159"} Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.101110 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l5mf6" Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.170824 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8lpq\" (UniqueName: \"kubernetes.io/projected/e32b0425-6c91-4a4f-9e93-541888363a16-kube-api-access-q8lpq\") pod \"e32b0425-6c91-4a4f-9e93-541888363a16\" (UID: \"e32b0425-6c91-4a4f-9e93-541888363a16\") " Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.170945 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e32b0425-6c91-4a4f-9e93-541888363a16-catalog-content\") pod \"e32b0425-6c91-4a4f-9e93-541888363a16\" (UID: \"e32b0425-6c91-4a4f-9e93-541888363a16\") " Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.171020 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e32b0425-6c91-4a4f-9e93-541888363a16-utilities\") pod \"e32b0425-6c91-4a4f-9e93-541888363a16\" (UID: \"e32b0425-6c91-4a4f-9e93-541888363a16\") " Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.172282 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e32b0425-6c91-4a4f-9e93-541888363a16-utilities" (OuterVolumeSpecName: "utilities") pod "e32b0425-6c91-4a4f-9e93-541888363a16" (UID: "e32b0425-6c91-4a4f-9e93-541888363a16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.178584 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e32b0425-6c91-4a4f-9e93-541888363a16-kube-api-access-q8lpq" (OuterVolumeSpecName: "kube-api-access-q8lpq") pod "e32b0425-6c91-4a4f-9e93-541888363a16" (UID: "e32b0425-6c91-4a4f-9e93-541888363a16"). InnerVolumeSpecName "kube-api-access-q8lpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.197629 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e32b0425-6c91-4a4f-9e93-541888363a16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e32b0425-6c91-4a4f-9e93-541888363a16" (UID: "e32b0425-6c91-4a4f-9e93-541888363a16"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.211043 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.211372 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.211566 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.214210 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.214441 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" gracePeriod=600 Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.273873 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e32b0425-6c91-4a4f-9e93-541888363a16-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.274160 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8lpq\" (UniqueName: \"kubernetes.io/projected/e32b0425-6c91-4a4f-9e93-541888363a16-kube-api-access-q8lpq\") on node \"crc\" DevicePath \"\"" Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.274256 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e32b0425-6c91-4a4f-9e93-541888363a16-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:08:58 crc kubenswrapper[4958]: E1201 12:08:58.357075 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.920961 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" exitCode=0 Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.921011 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46"} Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.921433 4958 scope.go:117] "RemoveContainer" containerID="96dc7e42045ee9316b4e012dfce125a32a47cbfca9463ecd54f2bc641ba99e3c" Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.922290 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:08:58 crc kubenswrapper[4958]: E1201 12:08:58.922825 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.924929 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5mf6" event={"ID":"e32b0425-6c91-4a4f-9e93-541888363a16","Type":"ContainerDied","Data":"1b4efbf7382263f26487010f3b28ce2dd58d15831b6e5116adacd60d45f99665"} Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.925043 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l5mf6" Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.985342 4958 scope.go:117] "RemoveContainer" containerID="46d1e900f3d86f7ce8a873e3196ab659190906df505dfeb08eb0f1dd178b4159" Dec 01 12:08:58 crc kubenswrapper[4958]: I1201 12:08:58.992972 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5mf6"] Dec 01 12:08:59 crc kubenswrapper[4958]: I1201 12:08:59.004455 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5mf6"] Dec 01 12:08:59 crc kubenswrapper[4958]: I1201 12:08:59.010739 4958 scope.go:117] "RemoveContainer" containerID="ed64081557db405409812af81b345d37b88e80d703611c4bbc9d99326cf67227" Dec 01 12:08:59 crc kubenswrapper[4958]: I1201 12:08:59.044290 4958 scope.go:117] "RemoveContainer" containerID="43575940ae3a0925939b6a7102ee1a134d7b89d18dc0e26363fcbd951d942d91" Dec 01 12:08:59 crc kubenswrapper[4958]: I1201 12:08:59.823121 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e32b0425-6c91-4a4f-9e93-541888363a16" path="/var/lib/kubelet/pods/e32b0425-6c91-4a4f-9e93-541888363a16/volumes" Dec 01 12:09:10 crc kubenswrapper[4958]: I1201 12:09:10.797976 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:09:10 crc kubenswrapper[4958]: E1201 12:09:10.798968 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:09:21 crc kubenswrapper[4958]: I1201 12:09:21.798699 4958 scope.go:117] "RemoveContainer" 
containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:09:21 crc kubenswrapper[4958]: E1201 12:09:21.800460 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:09:35 crc kubenswrapper[4958]: I1201 12:09:35.798130 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:09:35 crc kubenswrapper[4958]: E1201 12:09:35.799226 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:09:48 crc kubenswrapper[4958]: I1201 12:09:48.798675 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:09:48 crc kubenswrapper[4958]: E1201 12:09:48.799739 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:10:01 crc kubenswrapper[4958]: I1201 12:10:01.797645 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:10:01 crc kubenswrapper[4958]: E1201 12:10:01.798424 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:10:04 crc kubenswrapper[4958]: I1201 12:10:04.806036 4958 generic.go:334] "Generic (PLEG): container finished" podID="f6d924a1-e8ba-427c-bcd0-7f171676c2d2" containerID="c93a2fdbe310423479688d5b2d5cd38d5c916469e26ba549ef1dac5911da092f" exitCode=0 Dec 01 12:10:04 crc kubenswrapper[4958]: I1201 12:10:04.806124 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf" event={"ID":"f6d924a1-e8ba-427c-bcd0-7f171676c2d2","Type":"ContainerDied","Data":"c93a2fdbe310423479688d5b2d5cd38d5c916469e26ba549ef1dac5911da092f"} Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.378050 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.517257 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdx9t\" (UniqueName: \"kubernetes.io/projected/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-kube-api-access-jdx9t\") pod \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.517377 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-bootstrap-combined-ca-bundle\") pod \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.517493 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-ssh-key\") pod \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.517534 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-ceph\") pod \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.517683 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-inventory\") pod \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\" (UID: \"f6d924a1-e8ba-427c-bcd0-7f171676c2d2\") " Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.526244 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "f6d924a1-e8ba-427c-bcd0-7f171676c2d2" (UID: "f6d924a1-e8ba-427c-bcd0-7f171676c2d2"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.526295 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-ceph" (OuterVolumeSpecName: "ceph") pod "f6d924a1-e8ba-427c-bcd0-7f171676c2d2" (UID: "f6d924a1-e8ba-427c-bcd0-7f171676c2d2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.540379 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-kube-api-access-jdx9t" (OuterVolumeSpecName: "kube-api-access-jdx9t") pod "f6d924a1-e8ba-427c-bcd0-7f171676c2d2" (UID: "f6d924a1-e8ba-427c-bcd0-7f171676c2d2"). InnerVolumeSpecName "kube-api-access-jdx9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.548527 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f6d924a1-e8ba-427c-bcd0-7f171676c2d2" (UID: "f6d924a1-e8ba-427c-bcd0-7f171676c2d2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.553787 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-inventory" (OuterVolumeSpecName: "inventory") pod "f6d924a1-e8ba-427c-bcd0-7f171676c2d2" (UID: "f6d924a1-e8ba-427c-bcd0-7f171676c2d2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.619746 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdx9t\" (UniqueName: \"kubernetes.io/projected/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-kube-api-access-jdx9t\") on node \"crc\" DevicePath \"\"" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.619782 4958 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.619792 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.619803 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.619811 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6d924a1-e8ba-427c-bcd0-7f171676c2d2-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.832703 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf" event={"ID":"f6d924a1-e8ba-427c-bcd0-7f171676c2d2","Type":"ContainerDied","Data":"adec0ca7893d2cd0db5c3e3ded31b9a755439b5bacb9d7fa1f3f77dabb1d31f8"} Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.832764 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adec0ca7893d2cd0db5c3e3ded31b9a755439b5bacb9d7fa1f3f77dabb1d31f8" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.832783 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-pw7rf" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.934146 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-wr775"] Dec 01 12:10:06 crc kubenswrapper[4958]: E1201 12:10:06.934870 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e32b0425-6c91-4a4f-9e93-541888363a16" containerName="extract-content" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.934890 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e32b0425-6c91-4a4f-9e93-541888363a16" containerName="extract-content" Dec 01 12:10:06 crc kubenswrapper[4958]: E1201 12:10:06.934933 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6d924a1-e8ba-427c-bcd0-7f171676c2d2" containerName="bootstrap-openstack-openstack-cell1" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.934942 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d924a1-e8ba-427c-bcd0-7f171676c2d2" containerName="bootstrap-openstack-openstack-cell1" Dec 01 12:10:06 crc kubenswrapper[4958]: E1201 12:10:06.934974 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e32b0425-6c91-4a4f-9e93-541888363a16" containerName="registry-server" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.934985 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e32b0425-6c91-4a4f-9e93-541888363a16" containerName="registry-server" Dec 01 12:10:06 crc kubenswrapper[4958]: E1201 12:10:06.935010 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e32b0425-6c91-4a4f-9e93-541888363a16" containerName="extract-utilities" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.935021 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e32b0425-6c91-4a4f-9e93-541888363a16" containerName="extract-utilities" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.935302 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6d924a1-e8ba-427c-bcd0-7f171676c2d2" containerName="bootstrap-openstack-openstack-cell1" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.935330 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="e32b0425-6c91-4a4f-9e93-541888363a16" containerName="registry-server" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.936481 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wr775" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.940860 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.941338 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.942041 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.942063 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 12:10:06 crc kubenswrapper[4958]: I1201 12:10:06.946488 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-wr775"] Dec 01 12:10:07 crc kubenswrapper[4958]: I1201 12:10:07.028913 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2785029f-0dd7-4be8-8e87-e362c8cc7b09-ssh-key\") pod \"download-cache-openstack-openstack-cell1-wr775\" (UID: \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\") " pod="openstack/download-cache-openstack-openstack-cell1-wr775" Dec 01 12:10:07 crc kubenswrapper[4958]: I1201 12:10:07.028976 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2785029f-0dd7-4be8-8e87-e362c8cc7b09-ceph\") pod \"download-cache-openstack-openstack-cell1-wr775\" (UID: \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\") " pod="openstack/download-cache-openstack-openstack-cell1-wr775" Dec 01 12:10:07 crc kubenswrapper[4958]: I1201 12:10:07.029064 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2785029f-0dd7-4be8-8e87-e362c8cc7b09-inventory\") pod \"download-cache-openstack-openstack-cell1-wr775\" (UID: \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\") " pod="openstack/download-cache-openstack-openstack-cell1-wr775" Dec 01 12:10:07 crc kubenswrapper[4958]: I1201 12:10:07.029379 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk4vb\" (UniqueName: \"kubernetes.io/projected/2785029f-0dd7-4be8-8e87-e362c8cc7b09-kube-api-access-vk4vb\") pod \"download-cache-openstack-openstack-cell1-wr775\" (UID: \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\") " pod="openstack/download-cache-openstack-openstack-cell1-wr775" Dec 01 12:10:07 crc kubenswrapper[4958]: I1201 12:10:07.131739 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2785029f-0dd7-4be8-8e87-e362c8cc7b09-ssh-key\") pod \"download-cache-openstack-openstack-cell1-wr775\" (UID: \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\") " pod="openstack/download-cache-openstack-openstack-cell1-wr775" Dec 01 12:10:07 crc kubenswrapper[4958]: I1201 12:10:07.131804 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2785029f-0dd7-4be8-8e87-e362c8cc7b09-ceph\") pod \"download-cache-openstack-openstack-cell1-wr775\" (UID: \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\") " pod="openstack/download-cache-openstack-openstack-cell1-wr775" Dec 01 12:10:07 crc kubenswrapper[4958]: 
I1201 12:10:07.131869 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2785029f-0dd7-4be8-8e87-e362c8cc7b09-inventory\") pod \"download-cache-openstack-openstack-cell1-wr775\" (UID: \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\") " pod="openstack/download-cache-openstack-openstack-cell1-wr775" Dec 01 12:10:07 crc kubenswrapper[4958]: I1201 12:10:07.131920 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk4vb\" (UniqueName: \"kubernetes.io/projected/2785029f-0dd7-4be8-8e87-e362c8cc7b09-kube-api-access-vk4vb\") pod \"download-cache-openstack-openstack-cell1-wr775\" (UID: \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\") " pod="openstack/download-cache-openstack-openstack-cell1-wr775" Dec 01 12:10:07 crc kubenswrapper[4958]: I1201 12:10:07.139690 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2785029f-0dd7-4be8-8e87-e362c8cc7b09-ceph\") pod \"download-cache-openstack-openstack-cell1-wr775\" (UID: \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\") " pod="openstack/download-cache-openstack-openstack-cell1-wr775" Dec 01 12:10:07 crc kubenswrapper[4958]: I1201 12:10:07.140730 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2785029f-0dd7-4be8-8e87-e362c8cc7b09-inventory\") pod \"download-cache-openstack-openstack-cell1-wr775\" (UID: \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\") " pod="openstack/download-cache-openstack-openstack-cell1-wr775" Dec 01 12:10:07 crc kubenswrapper[4958]: I1201 12:10:07.153366 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk4vb\" (UniqueName: \"kubernetes.io/projected/2785029f-0dd7-4be8-8e87-e362c8cc7b09-kube-api-access-vk4vb\") pod \"download-cache-openstack-openstack-cell1-wr775\" (UID: \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\") " pod="openstack/download-cache-openstack-openstack-cell1-wr775" Dec 01 12:10:07 crc kubenswrapper[4958]: I1201 12:10:07.153772 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2785029f-0dd7-4be8-8e87-e362c8cc7b09-ssh-key\") pod \"download-cache-openstack-openstack-cell1-wr775\" (UID: \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\") " pod="openstack/download-cache-openstack-openstack-cell1-wr775" Dec 01 12:10:07 crc kubenswrapper[4958]: I1201 12:10:07.267991 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wr775" Dec 01 12:10:07 crc kubenswrapper[4958]: I1201 12:10:07.943312 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-wr775"] Dec 01 12:10:07 crc kubenswrapper[4958]: W1201 12:10:07.950214 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2785029f_0dd7_4be8_8e87_e362c8cc7b09.slice/crio-f095fcaadf835e60304ad034d39941536768e5c1d661e73fbcb32f61fbb54cfc WatchSource:0}: Error finding container f095fcaadf835e60304ad034d39941536768e5c1d661e73fbcb32f61fbb54cfc: Status 404 returned error can't find the container with id f095fcaadf835e60304ad034d39941536768e5c1d661e73fbcb32f61fbb54cfc Dec 01 12:10:08 crc kubenswrapper[4958]: I1201 12:10:08.860496 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wr775" event={"ID":"2785029f-0dd7-4be8-8e87-e362c8cc7b09","Type":"ContainerStarted","Data":"730da93e0163686a78599778e45fca5441377f40c856407794e79bbde492ddaf"} Dec 01 12:10:08 crc kubenswrapper[4958]: I1201 12:10:08.861466 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wr775" event={"ID":"2785029f-0dd7-4be8-8e87-e362c8cc7b09","Type":"ContainerStarted","Data":"f095fcaadf835e60304ad034d39941536768e5c1d661e73fbcb32f61fbb54cfc"} Dec 01 12:10:08 crc kubenswrapper[4958]: I1201 12:10:08.896879 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-wr775" podStartSLOduration=2.647723191 podStartE2EDuration="2.89682943s" podCreationTimestamp="2025-12-01 12:10:06 +0000 UTC" firstStartedPulling="2025-12-01 12:10:07.952460987 +0000 UTC m=+7855.461250014" lastFinishedPulling="2025-12-01 12:10:08.201567176 +0000 UTC m=+7855.710356253" observedRunningTime="2025-12-01 12:10:08.884360227 +0000 UTC m=+7856.393149304" watchObservedRunningTime="2025-12-01 12:10:08.89682943 +0000 UTC m=+7856.405618507" Dec 01 12:10:14 crc kubenswrapper[4958]: I1201 12:10:14.799679 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:10:14 crc kubenswrapper[4958]: E1201 12:10:14.800761 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:10:26 crc kubenswrapper[4958]: I1201 12:10:26.866695 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:10:26 crc kubenswrapper[4958]: E1201 12:10:26.868401 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:10:38 crc kubenswrapper[4958]: I1201 12:10:38.326022 4958 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xg7g8"] Dec 01 12:10:38 crc kubenswrapper[4958]: I1201 12:10:38.333819 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xg7g8" Dec 01 12:10:38 crc kubenswrapper[4958]: I1201 12:10:38.370966 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xg7g8"] Dec 01 12:10:38 crc kubenswrapper[4958]: I1201 12:10:38.416112 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9daa6334-5ca7-493c-8131-12c38bb3657b-utilities\") pod \"redhat-operators-xg7g8\" (UID: \"9daa6334-5ca7-493c-8131-12c38bb3657b\") " pod="openshift-marketplace/redhat-operators-xg7g8" Dec 01 12:10:38 crc kubenswrapper[4958]: I1201 12:10:38.416458 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9daa6334-5ca7-493c-8131-12c38bb3657b-catalog-content\") pod \"redhat-operators-xg7g8\" (UID: \"9daa6334-5ca7-493c-8131-12c38bb3657b\") " pod="openshift-marketplace/redhat-operators-xg7g8" Dec 01 12:10:38 crc kubenswrapper[4958]: I1201 12:10:38.416627 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb789\" (UniqueName: \"kubernetes.io/projected/9daa6334-5ca7-493c-8131-12c38bb3657b-kube-api-access-nb789\") pod \"redhat-operators-xg7g8\" (UID: \"9daa6334-5ca7-493c-8131-12c38bb3657b\") " pod="openshift-marketplace/redhat-operators-xg7g8" Dec 01 12:10:38 crc kubenswrapper[4958]: I1201 12:10:38.519499 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9daa6334-5ca7-493c-8131-12c38bb3657b-utilities\") pod \"redhat-operators-xg7g8\" (UID: \"9daa6334-5ca7-493c-8131-12c38bb3657b\") " pod="openshift-marketplace/redhat-operators-xg7g8" Dec 01 12:10:38 crc kubenswrapper[4958]: I1201 12:10:38.519727 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9daa6334-5ca7-493c-8131-12c38bb3657b-catalog-content\") pod \"redhat-operators-xg7g8\" (UID: \"9daa6334-5ca7-493c-8131-12c38bb3657b\") " pod="openshift-marketplace/redhat-operators-xg7g8" Dec 01 12:10:38 crc kubenswrapper[4958]: I1201 12:10:38.519753 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb789\" (UniqueName: \"kubernetes.io/projected/9daa6334-5ca7-493c-8131-12c38bb3657b-kube-api-access-nb789\") pod \"redhat-operators-xg7g8\" (UID: \"9daa6334-5ca7-493c-8131-12c38bb3657b\") " pod="openshift-marketplace/redhat-operators-xg7g8" Dec 01 12:10:38 crc kubenswrapper[4958]: I1201 12:10:38.520531 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9daa6334-5ca7-493c-8131-12c38bb3657b-utilities\") pod \"redhat-operators-xg7g8\" (UID: \"9daa6334-5ca7-493c-8131-12c38bb3657b\") " pod="openshift-marketplace/redhat-operators-xg7g8" Dec 01 12:10:38 crc kubenswrapper[4958]: I1201 12:10:38.520970 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9daa6334-5ca7-493c-8131-12c38bb3657b-catalog-content\") pod \"redhat-operators-xg7g8\" (UID: \"9daa6334-5ca7-493c-8131-12c38bb3657b\") " 
pod="openshift-marketplace/redhat-operators-xg7g8" Dec 01 12:10:38 crc kubenswrapper[4958]: I1201 12:10:38.543086 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb789\" (UniqueName: \"kubernetes.io/projected/9daa6334-5ca7-493c-8131-12c38bb3657b-kube-api-access-nb789\") pod \"redhat-operators-xg7g8\" (UID: \"9daa6334-5ca7-493c-8131-12c38bb3657b\") " pod="openshift-marketplace/redhat-operators-xg7g8" Dec 01 12:10:38 crc kubenswrapper[4958]: I1201 12:10:38.710325 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xg7g8" Dec 01 12:10:39 crc kubenswrapper[4958]: I1201 12:10:39.263516 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xg7g8"] Dec 01 12:10:39 crc kubenswrapper[4958]: I1201 12:10:39.471911 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xg7g8" event={"ID":"9daa6334-5ca7-493c-8131-12c38bb3657b","Type":"ContainerStarted","Data":"917542e4c411cb81267dd2146c2170b5767e6d4366579430fae44ae3e982917f"} Dec 01 12:10:40 crc kubenswrapper[4958]: I1201 12:10:40.481943 4958 generic.go:334] "Generic (PLEG): container finished" podID="9daa6334-5ca7-493c-8131-12c38bb3657b" containerID="aa6dc2f405afabadb1ec4bcd68bef2e9915522fd0b91b10bdd15e3f29c44fdd9" exitCode=0 Dec 01 12:10:40 crc kubenswrapper[4958]: I1201 12:10:40.482024 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xg7g8" event={"ID":"9daa6334-5ca7-493c-8131-12c38bb3657b","Type":"ContainerDied","Data":"aa6dc2f405afabadb1ec4bcd68bef2e9915522fd0b91b10bdd15e3f29c44fdd9"} Dec 01 12:10:40 crc kubenswrapper[4958]: I1201 12:10:40.798275 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:10:40 crc kubenswrapper[4958]: E1201 12:10:40.798732 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:10:42 crc kubenswrapper[4958]: I1201 12:10:42.596198 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xg7g8" event={"ID":"9daa6334-5ca7-493c-8131-12c38bb3657b","Type":"ContainerStarted","Data":"d0ba374dd3348f805b5546786a0dab3be3d80655ff12bb40454b6f892af8c37f"} Dec 01 12:10:45 crc kubenswrapper[4958]: I1201 12:10:45.648961 4958 generic.go:334] "Generic (PLEG): container finished" podID="9daa6334-5ca7-493c-8131-12c38bb3657b" containerID="d0ba374dd3348f805b5546786a0dab3be3d80655ff12bb40454b6f892af8c37f" exitCode=0 Dec 01 12:10:45 crc kubenswrapper[4958]: I1201 12:10:45.649061 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xg7g8" event={"ID":"9daa6334-5ca7-493c-8131-12c38bb3657b","Type":"ContainerDied","Data":"d0ba374dd3348f805b5546786a0dab3be3d80655ff12bb40454b6f892af8c37f"} Dec 01 12:10:46 crc kubenswrapper[4958]: I1201 12:10:46.665341 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xg7g8" 
event={"ID":"9daa6334-5ca7-493c-8131-12c38bb3657b","Type":"ContainerStarted","Data":"716fb6e46e815e2269bc44cc11d08d7a2ca1dc6d30703ec621a3a001b1701d9d"} Dec 01 12:10:46 crc kubenswrapper[4958]: I1201 12:10:46.687009 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xg7g8" podStartSLOduration=3.027823061 podStartE2EDuration="8.686991695s" podCreationTimestamp="2025-12-01 12:10:38 +0000 UTC" firstStartedPulling="2025-12-01 12:10:40.484601809 +0000 UTC m=+7887.993390846" lastFinishedPulling="2025-12-01 12:10:46.143770443 +0000 UTC m=+7893.652559480" observedRunningTime="2025-12-01 12:10:46.681297564 +0000 UTC m=+7894.190086611" watchObservedRunningTime="2025-12-01 12:10:46.686991695 +0000 UTC m=+7894.195780732" Dec 01 12:10:48 crc kubenswrapper[4958]: I1201 12:10:48.710865 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xg7g8" Dec 01 12:10:48 crc kubenswrapper[4958]: I1201 12:10:48.711241 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xg7g8" Dec 01 12:10:49 crc kubenswrapper[4958]: I1201 12:10:49.788065 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xg7g8" podUID="9daa6334-5ca7-493c-8131-12c38bb3657b" containerName="registry-server" probeResult="failure" output=< Dec 01 12:10:49 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s Dec 01 12:10:49 crc kubenswrapper[4958]: > Dec 01 12:10:53 crc kubenswrapper[4958]: I1201 12:10:53.811283 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:10:53 crc kubenswrapper[4958]: E1201 12:10:53.812190 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:10:58 crc kubenswrapper[4958]: I1201 12:10:58.790772 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xg7g8" Dec 01 12:10:58 crc kubenswrapper[4958]: I1201 12:10:58.868065 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xg7g8" Dec 01 12:10:59 crc kubenswrapper[4958]: I1201 12:10:59.058013 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xg7g8"] Dec 01 12:10:59 crc kubenswrapper[4958]: I1201 12:10:59.921572 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xg7g8" podUID="9daa6334-5ca7-493c-8131-12c38bb3657b" containerName="registry-server" containerID="cri-o://716fb6e46e815e2269bc44cc11d08d7a2ca1dc6d30703ec621a3a001b1701d9d" gracePeriod=2 Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.544712 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xg7g8" Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.667274 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb789\" (UniqueName: \"kubernetes.io/projected/9daa6334-5ca7-493c-8131-12c38bb3657b-kube-api-access-nb789\") pod \"9daa6334-5ca7-493c-8131-12c38bb3657b\" (UID: \"9daa6334-5ca7-493c-8131-12c38bb3657b\") " Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.667495 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9daa6334-5ca7-493c-8131-12c38bb3657b-catalog-content\") pod \"9daa6334-5ca7-493c-8131-12c38bb3657b\" (UID: \"9daa6334-5ca7-493c-8131-12c38bb3657b\") " Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.667599 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9daa6334-5ca7-493c-8131-12c38bb3657b-utilities\") pod \"9daa6334-5ca7-493c-8131-12c38bb3657b\" (UID: \"9daa6334-5ca7-493c-8131-12c38bb3657b\") " Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.669513 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9daa6334-5ca7-493c-8131-12c38bb3657b-utilities" (OuterVolumeSpecName: "utilities") pod "9daa6334-5ca7-493c-8131-12c38bb3657b" (UID: "9daa6334-5ca7-493c-8131-12c38bb3657b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.705297 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9daa6334-5ca7-493c-8131-12c38bb3657b-kube-api-access-nb789" (OuterVolumeSpecName: "kube-api-access-nb789") pod "9daa6334-5ca7-493c-8131-12c38bb3657b" (UID: "9daa6334-5ca7-493c-8131-12c38bb3657b"). InnerVolumeSpecName "kube-api-access-nb789". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.772859 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9daa6334-5ca7-493c-8131-12c38bb3657b-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.772921 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb789\" (UniqueName: \"kubernetes.io/projected/9daa6334-5ca7-493c-8131-12c38bb3657b-kube-api-access-nb789\") on node \"crc\" DevicePath \"\"" Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.855432 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9daa6334-5ca7-493c-8131-12c38bb3657b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9daa6334-5ca7-493c-8131-12c38bb3657b" (UID: "9daa6334-5ca7-493c-8131-12c38bb3657b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.874621 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9daa6334-5ca7-493c-8131-12c38bb3657b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.931379 4958 generic.go:334] "Generic (PLEG): container finished" podID="9daa6334-5ca7-493c-8131-12c38bb3657b" containerID="716fb6e46e815e2269bc44cc11d08d7a2ca1dc6d30703ec621a3a001b1701d9d" exitCode=0 Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.931426 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xg7g8" event={"ID":"9daa6334-5ca7-493c-8131-12c38bb3657b","Type":"ContainerDied","Data":"716fb6e46e815e2269bc44cc11d08d7a2ca1dc6d30703ec621a3a001b1701d9d"} Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.931453 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xg7g8" event={"ID":"9daa6334-5ca7-493c-8131-12c38bb3657b","Type":"ContainerDied","Data":"917542e4c411cb81267dd2146c2170b5767e6d4366579430fae44ae3e982917f"} Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.931470 4958 scope.go:117] "RemoveContainer" containerID="716fb6e46e815e2269bc44cc11d08d7a2ca1dc6d30703ec621a3a001b1701d9d" Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.931467 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xg7g8" Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.950864 4958 scope.go:117] "RemoveContainer" containerID="d0ba374dd3348f805b5546786a0dab3be3d80655ff12bb40454b6f892af8c37f" Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.966156 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xg7g8"] Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.972804 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xg7g8"] Dec 01 12:11:00 crc kubenswrapper[4958]: I1201 12:11:00.988633 4958 scope.go:117] "RemoveContainer" containerID="aa6dc2f405afabadb1ec4bcd68bef2e9915522fd0b91b10bdd15e3f29c44fdd9" Dec 01 12:11:01 crc kubenswrapper[4958]: I1201 12:11:01.014346 4958 scope.go:117] "RemoveContainer" containerID="716fb6e46e815e2269bc44cc11d08d7a2ca1dc6d30703ec621a3a001b1701d9d" Dec 01 12:11:01 crc kubenswrapper[4958]: E1201 12:11:01.014755 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"716fb6e46e815e2269bc44cc11d08d7a2ca1dc6d30703ec621a3a001b1701d9d\": container with ID starting with 716fb6e46e815e2269bc44cc11d08d7a2ca1dc6d30703ec621a3a001b1701d9d not found: ID does not exist" containerID="716fb6e46e815e2269bc44cc11d08d7a2ca1dc6d30703ec621a3a001b1701d9d" Dec 01 12:11:01 crc kubenswrapper[4958]: I1201 12:11:01.014802 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"716fb6e46e815e2269bc44cc11d08d7a2ca1dc6d30703ec621a3a001b1701d9d"} err="failed to get container status \"716fb6e46e815e2269bc44cc11d08d7a2ca1dc6d30703ec621a3a001b1701d9d\": rpc error: code = NotFound desc = could not find container \"716fb6e46e815e2269bc44cc11d08d7a2ca1dc6d30703ec621a3a001b1701d9d\": container with ID starting with 716fb6e46e815e2269bc44cc11d08d7a2ca1dc6d30703ec621a3a001b1701d9d not found: ID does not exist" Dec 01 12:11:01 crc 
kubenswrapper[4958]: I1201 12:11:01.014833 4958 scope.go:117] "RemoveContainer" containerID="d0ba374dd3348f805b5546786a0dab3be3d80655ff12bb40454b6f892af8c37f" Dec 01 12:11:01 crc kubenswrapper[4958]: E1201 12:11:01.015185 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0ba374dd3348f805b5546786a0dab3be3d80655ff12bb40454b6f892af8c37f\": container with ID starting with d0ba374dd3348f805b5546786a0dab3be3d80655ff12bb40454b6f892af8c37f not found: ID does not exist" containerID="d0ba374dd3348f805b5546786a0dab3be3d80655ff12bb40454b6f892af8c37f" Dec 01 12:11:01 crc kubenswrapper[4958]: I1201 12:11:01.015209 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ba374dd3348f805b5546786a0dab3be3d80655ff12bb40454b6f892af8c37f"} err="failed to get container status \"d0ba374dd3348f805b5546786a0dab3be3d80655ff12bb40454b6f892af8c37f\": rpc error: code = NotFound desc = could not find container \"d0ba374dd3348f805b5546786a0dab3be3d80655ff12bb40454b6f892af8c37f\": container with ID starting with d0ba374dd3348f805b5546786a0dab3be3d80655ff12bb40454b6f892af8c37f not found: ID does not exist" Dec 01 12:11:01 crc kubenswrapper[4958]: I1201 12:11:01.015223 4958 scope.go:117] "RemoveContainer" containerID="aa6dc2f405afabadb1ec4bcd68bef2e9915522fd0b91b10bdd15e3f29c44fdd9" Dec 01 12:11:01 crc kubenswrapper[4958]: E1201 12:11:01.015486 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa6dc2f405afabadb1ec4bcd68bef2e9915522fd0b91b10bdd15e3f29c44fdd9\": container with ID starting with aa6dc2f405afabadb1ec4bcd68bef2e9915522fd0b91b10bdd15e3f29c44fdd9 not found: ID does not exist" containerID="aa6dc2f405afabadb1ec4bcd68bef2e9915522fd0b91b10bdd15e3f29c44fdd9" Dec 01 12:11:01 crc kubenswrapper[4958]: I1201 12:11:01.015505 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa6dc2f405afabadb1ec4bcd68bef2e9915522fd0b91b10bdd15e3f29c44fdd9"} err="failed to get container status \"aa6dc2f405afabadb1ec4bcd68bef2e9915522fd0b91b10bdd15e3f29c44fdd9\": rpc error: code = NotFound desc = could not find container \"aa6dc2f405afabadb1ec4bcd68bef2e9915522fd0b91b10bdd15e3f29c44fdd9\": container with ID starting with aa6dc2f405afabadb1ec4bcd68bef2e9915522fd0b91b10bdd15e3f29c44fdd9 not found: ID does not exist" Dec 01 12:11:01 crc kubenswrapper[4958]: I1201 12:11:01.815692 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9daa6334-5ca7-493c-8131-12c38bb3657b" path="/var/lib/kubelet/pods/9daa6334-5ca7-493c-8131-12c38bb3657b/volumes" Dec 01 12:11:04 crc kubenswrapper[4958]: I1201 12:11:04.798108 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:11:04 crc kubenswrapper[4958]: E1201 12:11:04.799266 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:11:16 crc kubenswrapper[4958]: I1201 12:11:16.797466 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" 
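
From here on, the machine-config-daemon pair repeats every 10-15 seconds: the sync loop asks to restart the failed container ("RemoveContainer" of the old instance) and pod_workers refuses because the pod is in CrashLoopBackOff. The "back-off 5m0s" in the error is the kubelet's restart backoff already at its ceiling; the delay doubles on each failed restart until it reaches that cap. A toy sketch of the schedule, assuming the kubelet's usual 10s base (the 5m cap is confirmed by the message itself):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed 10s base; the 5m ceiling comes from "back-off 5m0s" above.
	delay, maxDelay := 10*time.Second, 5*time.Minute
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("restart attempt %d waits %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
	// Prints 10s, 20s, 40s, 1m20s, 2m40s, then pinned at 5m0s.
}

Each E-line "Error syncing pod, skipping" is therefore expected noise until either the container stops crashing or the current backoff window expires and a new start is attempted.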
Dec 01 12:11:16 crc kubenswrapper[4958]: E1201 12:11:16.798366 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:11:30 crc kubenswrapper[4958]: I1201 12:11:30.798947 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:11:30 crc kubenswrapper[4958]: E1201 12:11:30.800450 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:11:42 crc kubenswrapper[4958]: I1201 12:11:42.799109 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:11:42 crc kubenswrapper[4958]: E1201 12:11:42.800224 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:11:48 crc kubenswrapper[4958]: I1201 12:11:48.641197 4958 generic.go:334] "Generic (PLEG): container finished" podID="2785029f-0dd7-4be8-8e87-e362c8cc7b09" containerID="730da93e0163686a78599778e45fca5441377f40c856407794e79bbde492ddaf" exitCode=0 Dec 01 12:11:48 crc kubenswrapper[4958]: I1201 12:11:48.641343 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wr775" event={"ID":"2785029f-0dd7-4be8-8e87-e362c8cc7b09","Type":"ContainerDied","Data":"730da93e0163686a78599778e45fca5441377f40c856407794e79bbde492ddaf"} Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.259802 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wr775" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.353134 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2785029f-0dd7-4be8-8e87-e362c8cc7b09-ceph\") pod \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\" (UID: \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\") " Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.353337 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk4vb\" (UniqueName: \"kubernetes.io/projected/2785029f-0dd7-4be8-8e87-e362c8cc7b09-kube-api-access-vk4vb\") pod \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\" (UID: \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\") " Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.353451 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2785029f-0dd7-4be8-8e87-e362c8cc7b09-ssh-key\") pod \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\" (UID: \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\") " Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.353469 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2785029f-0dd7-4be8-8e87-e362c8cc7b09-inventory\") pod \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\" (UID: \"2785029f-0dd7-4be8-8e87-e362c8cc7b09\") " Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.359388 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2785029f-0dd7-4be8-8e87-e362c8cc7b09-kube-api-access-vk4vb" (OuterVolumeSpecName: "kube-api-access-vk4vb") pod "2785029f-0dd7-4be8-8e87-e362c8cc7b09" (UID: "2785029f-0dd7-4be8-8e87-e362c8cc7b09"). InnerVolumeSpecName "kube-api-access-vk4vb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.370752 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2785029f-0dd7-4be8-8e87-e362c8cc7b09-ceph" (OuterVolumeSpecName: "ceph") pod "2785029f-0dd7-4be8-8e87-e362c8cc7b09" (UID: "2785029f-0dd7-4be8-8e87-e362c8cc7b09"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.392484 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2785029f-0dd7-4be8-8e87-e362c8cc7b09-inventory" (OuterVolumeSpecName: "inventory") pod "2785029f-0dd7-4be8-8e87-e362c8cc7b09" (UID: "2785029f-0dd7-4be8-8e87-e362c8cc7b09"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.407387 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2785029f-0dd7-4be8-8e87-e362c8cc7b09-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2785029f-0dd7-4be8-8e87-e362c8cc7b09" (UID: "2785029f-0dd7-4be8-8e87-e362c8cc7b09"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.457077 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk4vb\" (UniqueName: \"kubernetes.io/projected/2785029f-0dd7-4be8-8e87-e362c8cc7b09-kube-api-access-vk4vb\") on node \"crc\" DevicePath \"\"" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.457132 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2785029f-0dd7-4be8-8e87-e362c8cc7b09-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.457156 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2785029f-0dd7-4be8-8e87-e362c8cc7b09-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.457175 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2785029f-0dd7-4be8-8e87-e362c8cc7b09-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.677224 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wr775" event={"ID":"2785029f-0dd7-4be8-8e87-e362c8cc7b09","Type":"ContainerDied","Data":"f095fcaadf835e60304ad034d39941536768e5c1d661e73fbcb32f61fbb54cfc"} Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.677493 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f095fcaadf835e60304ad034d39941536768e5c1d661e73fbcb32f61fbb54cfc" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.677296 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wr775" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.791140 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-gn87g"] Dec 01 12:11:50 crc kubenswrapper[4958]: E1201 12:11:50.791792 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9daa6334-5ca7-493c-8131-12c38bb3657b" containerName="extract-content" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.791872 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9daa6334-5ca7-493c-8131-12c38bb3657b" containerName="extract-content" Dec 01 12:11:50 crc kubenswrapper[4958]: E1201 12:11:50.791912 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9daa6334-5ca7-493c-8131-12c38bb3657b" containerName="registry-server" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.791918 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9daa6334-5ca7-493c-8131-12c38bb3657b" containerName="registry-server" Dec 01 12:11:50 crc kubenswrapper[4958]: E1201 12:11:50.791942 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9daa6334-5ca7-493c-8131-12c38bb3657b" containerName="extract-utilities" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.791949 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9daa6334-5ca7-493c-8131-12c38bb3657b" containerName="extract-utilities" Dec 01 12:11:50 crc kubenswrapper[4958]: E1201 12:11:50.791977 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2785029f-0dd7-4be8-8e87-e362c8cc7b09" containerName="download-cache-openstack-openstack-cell1" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.791984 4958 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2785029f-0dd7-4be8-8e87-e362c8cc7b09" containerName="download-cache-openstack-openstack-cell1" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.792399 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="9daa6334-5ca7-493c-8131-12c38bb3657b" containerName="registry-server" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.792439 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="2785029f-0dd7-4be8-8e87-e362c8cc7b09" containerName="download-cache-openstack-openstack-cell1" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.793327 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-gn87g" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.800255 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.800435 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.800588 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.800909 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.804990 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-gn87g"] Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.973275 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9539b9c-0548-46ae-9c62-234f5bf45564-inventory\") pod \"configure-network-openstack-openstack-cell1-gn87g\" (UID: \"a9539b9c-0548-46ae-9c62-234f5bf45564\") " pod="openstack/configure-network-openstack-openstack-cell1-gn87g" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.974088 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9539b9c-0548-46ae-9c62-234f5bf45564-ssh-key\") pod \"configure-network-openstack-openstack-cell1-gn87g\" (UID: \"a9539b9c-0548-46ae-9c62-234f5bf45564\") " pod="openstack/configure-network-openstack-openstack-cell1-gn87g" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.975451 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9539b9c-0548-46ae-9c62-234f5bf45564-ceph\") pod \"configure-network-openstack-openstack-cell1-gn87g\" (UID: \"a9539b9c-0548-46ae-9c62-234f5bf45564\") " pod="openstack/configure-network-openstack-openstack-cell1-gn87g" Dec 01 12:11:50 crc kubenswrapper[4958]: I1201 12:11:50.975519 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tndwk\" (UniqueName: \"kubernetes.io/projected/a9539b9c-0548-46ae-9c62-234f5bf45564-kube-api-access-tndwk\") pod \"configure-network-openstack-openstack-cell1-gn87g\" (UID: \"a9539b9c-0548-46ae-9c62-234f5bf45564\") " pod="openstack/configure-network-openstack-openstack-cell1-gn87g" Dec 01 12:11:51 crc kubenswrapper[4958]: I1201 12:11:51.077707 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/a9539b9c-0548-46ae-9c62-234f5bf45564-ceph\") pod \"configure-network-openstack-openstack-cell1-gn87g\" (UID: \"a9539b9c-0548-46ae-9c62-234f5bf45564\") " pod="openstack/configure-network-openstack-openstack-cell1-gn87g" Dec 01 12:11:51 crc kubenswrapper[4958]: I1201 12:11:51.077792 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tndwk\" (UniqueName: \"kubernetes.io/projected/a9539b9c-0548-46ae-9c62-234f5bf45564-kube-api-access-tndwk\") pod \"configure-network-openstack-openstack-cell1-gn87g\" (UID: \"a9539b9c-0548-46ae-9c62-234f5bf45564\") " pod="openstack/configure-network-openstack-openstack-cell1-gn87g" Dec 01 12:11:51 crc kubenswrapper[4958]: I1201 12:11:51.077997 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9539b9c-0548-46ae-9c62-234f5bf45564-inventory\") pod \"configure-network-openstack-openstack-cell1-gn87g\" (UID: \"a9539b9c-0548-46ae-9c62-234f5bf45564\") " pod="openstack/configure-network-openstack-openstack-cell1-gn87g" Dec 01 12:11:51 crc kubenswrapper[4958]: I1201 12:11:51.078179 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9539b9c-0548-46ae-9c62-234f5bf45564-ssh-key\") pod \"configure-network-openstack-openstack-cell1-gn87g\" (UID: \"a9539b9c-0548-46ae-9c62-234f5bf45564\") " pod="openstack/configure-network-openstack-openstack-cell1-gn87g" Dec 01 12:11:51 crc kubenswrapper[4958]: I1201 12:11:51.082102 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9539b9c-0548-46ae-9c62-234f5bf45564-inventory\") pod \"configure-network-openstack-openstack-cell1-gn87g\" (UID: \"a9539b9c-0548-46ae-9c62-234f5bf45564\") " pod="openstack/configure-network-openstack-openstack-cell1-gn87g" Dec 01 12:11:51 crc kubenswrapper[4958]: I1201 12:11:51.088489 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9539b9c-0548-46ae-9c62-234f5bf45564-ssh-key\") pod \"configure-network-openstack-openstack-cell1-gn87g\" (UID: \"a9539b9c-0548-46ae-9c62-234f5bf45564\") " pod="openstack/configure-network-openstack-openstack-cell1-gn87g" Dec 01 12:11:51 crc kubenswrapper[4958]: I1201 12:11:51.090249 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9539b9c-0548-46ae-9c62-234f5bf45564-ceph\") pod \"configure-network-openstack-openstack-cell1-gn87g\" (UID: \"a9539b9c-0548-46ae-9c62-234f5bf45564\") " pod="openstack/configure-network-openstack-openstack-cell1-gn87g" Dec 01 12:11:51 crc kubenswrapper[4958]: I1201 12:11:51.096915 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tndwk\" (UniqueName: \"kubernetes.io/projected/a9539b9c-0548-46ae-9c62-234f5bf45564-kube-api-access-tndwk\") pod \"configure-network-openstack-openstack-cell1-gn87g\" (UID: \"a9539b9c-0548-46ae-9c62-234f5bf45564\") " pod="openstack/configure-network-openstack-openstack-cell1-gn87g" Dec 01 12:11:51 crc kubenswrapper[4958]: I1201 12:11:51.116090 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-gn87g" Dec 01 12:11:51 crc kubenswrapper[4958]: I1201 12:11:51.782406 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-gn87g"] Dec 01 12:11:51 crc kubenswrapper[4958]: W1201 12:11:51.797673 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9539b9c_0548_46ae_9c62_234f5bf45564.slice/crio-23849d10b5a1c7ba255c53a4c6b6e203c224d74059eabc59a5f9d8b9bb3b3fd9 WatchSource:0}: Error finding container 23849d10b5a1c7ba255c53a4c6b6e203c224d74059eabc59a5f9d8b9bb3b3fd9: Status 404 returned error can't find the container with id 23849d10b5a1c7ba255c53a4c6b6e203c224d74059eabc59a5f9d8b9bb3b3fd9 Dec 01 12:11:52 crc kubenswrapper[4958]: I1201 12:11:52.709423 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-gn87g" event={"ID":"a9539b9c-0548-46ae-9c62-234f5bf45564","Type":"ContainerStarted","Data":"a84121cb488019c711675d90ae69db0a0d7482a7013ce12921d75d9bd2b0015c"} Dec 01 12:11:52 crc kubenswrapper[4958]: I1201 12:11:52.710114 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-gn87g" event={"ID":"a9539b9c-0548-46ae-9c62-234f5bf45564","Type":"ContainerStarted","Data":"23849d10b5a1c7ba255c53a4c6b6e203c224d74059eabc59a5f9d8b9bb3b3fd9"} Dec 01 12:11:52 crc kubenswrapper[4958]: I1201 12:11:52.747331 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-gn87g" podStartSLOduration=2.521206594 podStartE2EDuration="2.74724116s" podCreationTimestamp="2025-12-01 12:11:50 +0000 UTC" firstStartedPulling="2025-12-01 12:11:51.800249323 +0000 UTC m=+7959.309038370" lastFinishedPulling="2025-12-01 12:11:52.026283889 +0000 UTC m=+7959.535072936" observedRunningTime="2025-12-01 12:11:52.734089357 +0000 UTC m=+7960.242878434" watchObservedRunningTime="2025-12-01 12:11:52.74724116 +0000 UTC m=+7960.256030227" Dec 01 12:11:56 crc kubenswrapper[4958]: I1201 12:11:56.799083 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:11:56 crc kubenswrapper[4958]: E1201 12:11:56.799913 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:12:10 crc kubenswrapper[4958]: I1201 12:12:10.797811 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:12:10 crc kubenswrapper[4958]: E1201 12:12:10.798598 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:12:21 crc kubenswrapper[4958]: I1201 12:12:21.800605 4958 
scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:12:21 crc kubenswrapper[4958]: E1201 12:12:21.801338 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:12:36 crc kubenswrapper[4958]: I1201 12:12:36.798179 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:12:36 crc kubenswrapper[4958]: E1201 12:12:36.799256 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:12:49 crc kubenswrapper[4958]: I1201 12:12:49.797799 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:12:49 crc kubenswrapper[4958]: E1201 12:12:49.798660 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:12:59 crc kubenswrapper[4958]: I1201 12:12:59.960767 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xs72w"] Dec 01 12:12:59 crc kubenswrapper[4958]: I1201 12:12:59.964072 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xs72w" Dec 01 12:13:00 crc kubenswrapper[4958]: I1201 12:13:00.007487 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xs72w"] Dec 01 12:13:00 crc kubenswrapper[4958]: I1201 12:13:00.104620 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa80853a-9295-463d-b116-bf87600935b8-catalog-content\") pod \"community-operators-xs72w\" (UID: \"fa80853a-9295-463d-b116-bf87600935b8\") " pod="openshift-marketplace/community-operators-xs72w" Dec 01 12:13:00 crc kubenswrapper[4958]: I1201 12:13:00.104697 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa80853a-9295-463d-b116-bf87600935b8-utilities\") pod \"community-operators-xs72w\" (UID: \"fa80853a-9295-463d-b116-bf87600935b8\") " pod="openshift-marketplace/community-operators-xs72w" Dec 01 12:13:00 crc kubenswrapper[4958]: I1201 12:13:00.104912 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7kkb\" (UniqueName: \"kubernetes.io/projected/fa80853a-9295-463d-b116-bf87600935b8-kube-api-access-z7kkb\") pod \"community-operators-xs72w\" (UID: \"fa80853a-9295-463d-b116-bf87600935b8\") " pod="openshift-marketplace/community-operators-xs72w" Dec 01 12:13:00 crc kubenswrapper[4958]: I1201 12:13:00.206298 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa80853a-9295-463d-b116-bf87600935b8-catalog-content\") pod \"community-operators-xs72w\" (UID: \"fa80853a-9295-463d-b116-bf87600935b8\") " pod="openshift-marketplace/community-operators-xs72w" Dec 01 12:13:00 crc kubenswrapper[4958]: I1201 12:13:00.206389 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa80853a-9295-463d-b116-bf87600935b8-utilities\") pod \"community-operators-xs72w\" (UID: \"fa80853a-9295-463d-b116-bf87600935b8\") " pod="openshift-marketplace/community-operators-xs72w" Dec 01 12:13:00 crc kubenswrapper[4958]: I1201 12:13:00.206552 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7kkb\" (UniqueName: \"kubernetes.io/projected/fa80853a-9295-463d-b116-bf87600935b8-kube-api-access-z7kkb\") pod \"community-operators-xs72w\" (UID: \"fa80853a-9295-463d-b116-bf87600935b8\") " pod="openshift-marketplace/community-operators-xs72w" Dec 01 12:13:00 crc kubenswrapper[4958]: I1201 12:13:00.207094 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa80853a-9295-463d-b116-bf87600935b8-catalog-content\") pod \"community-operators-xs72w\" (UID: \"fa80853a-9295-463d-b116-bf87600935b8\") " pod="openshift-marketplace/community-operators-xs72w" Dec 01 12:13:00 crc kubenswrapper[4958]: I1201 12:13:00.207192 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa80853a-9295-463d-b116-bf87600935b8-utilities\") pod \"community-operators-xs72w\" (UID: \"fa80853a-9295-463d-b116-bf87600935b8\") " pod="openshift-marketplace/community-operators-xs72w" Dec 01 12:13:00 crc kubenswrapper[4958]: I1201 12:13:00.235596 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-z7kkb\" (UniqueName: \"kubernetes.io/projected/fa80853a-9295-463d-b116-bf87600935b8-kube-api-access-z7kkb\") pod \"community-operators-xs72w\" (UID: \"fa80853a-9295-463d-b116-bf87600935b8\") " pod="openshift-marketplace/community-operators-xs72w" Dec 01 12:13:00 crc kubenswrapper[4958]: I1201 12:13:00.301243 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xs72w" Dec 01 12:13:00 crc kubenswrapper[4958]: I1201 12:13:00.886387 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xs72w"] Dec 01 12:13:01 crc kubenswrapper[4958]: I1201 12:13:01.777799 4958 generic.go:334] "Generic (PLEG): container finished" podID="fa80853a-9295-463d-b116-bf87600935b8" containerID="d760916fb8775baf7983d5ee923b52c25026d36ad4276eb91476c7014c12208b" exitCode=0 Dec 01 12:13:01 crc kubenswrapper[4958]: I1201 12:13:01.777932 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs72w" event={"ID":"fa80853a-9295-463d-b116-bf87600935b8","Type":"ContainerDied","Data":"d760916fb8775baf7983d5ee923b52c25026d36ad4276eb91476c7014c12208b"} Dec 01 12:13:01 crc kubenswrapper[4958]: I1201 12:13:01.778300 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs72w" event={"ID":"fa80853a-9295-463d-b116-bf87600935b8","Type":"ContainerStarted","Data":"71b208a95879213ad9c911986f3ce3acf31a2ed1068cf4b1d0f52096e2ccfed9"} Dec 01 12:13:01 crc kubenswrapper[4958]: I1201 12:13:01.799044 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:13:01 crc kubenswrapper[4958]: E1201 12:13:01.799589 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:13:04 crc kubenswrapper[4958]: I1201 12:13:04.825230 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs72w" event={"ID":"fa80853a-9295-463d-b116-bf87600935b8","Type":"ContainerStarted","Data":"20ecd1abd8223dcaa4dcc493e14f89bc15e7a6536da529068835e4897f0c77a7"} Dec 01 12:13:05 crc kubenswrapper[4958]: I1201 12:13:05.836602 4958 generic.go:334] "Generic (PLEG): container finished" podID="fa80853a-9295-463d-b116-bf87600935b8" containerID="20ecd1abd8223dcaa4dcc493e14f89bc15e7a6536da529068835e4897f0c77a7" exitCode=0 Dec 01 12:13:05 crc kubenswrapper[4958]: I1201 12:13:05.838589 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs72w" event={"ID":"fa80853a-9295-463d-b116-bf87600935b8","Type":"ContainerDied","Data":"20ecd1abd8223dcaa4dcc493e14f89bc15e7a6536da529068835e4897f0c77a7"} Dec 01 12:13:06 crc kubenswrapper[4958]: I1201 12:13:06.852376 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs72w" event={"ID":"fa80853a-9295-463d-b116-bf87600935b8","Type":"ContainerStarted","Data":"2e21ea81a2e6fee9abeeeb3f39ed84bf31860f2d145af7b8d9b74f51c9c3f8d7"} Dec 01 12:13:06 crc kubenswrapper[4958]: I1201 12:13:06.891008 4958 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xs72w" podStartSLOduration=3.328053786 podStartE2EDuration="7.890977351s" podCreationTimestamp="2025-12-01 12:12:59 +0000 UTC" firstStartedPulling="2025-12-01 12:13:01.782542601 +0000 UTC m=+8029.291331678" lastFinishedPulling="2025-12-01 12:13:06.345466166 +0000 UTC m=+8033.854255243" observedRunningTime="2025-12-01 12:13:06.877453138 +0000 UTC m=+8034.386242185" watchObservedRunningTime="2025-12-01 12:13:06.890977351 +0000 UTC m=+8034.399766428" Dec 01 12:13:10 crc kubenswrapper[4958]: I1201 12:13:10.301531 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xs72w" Dec 01 12:13:10 crc kubenswrapper[4958]: I1201 12:13:10.302252 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xs72w" Dec 01 12:13:10 crc kubenswrapper[4958]: I1201 12:13:10.370772 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xs72w" Dec 01 12:13:15 crc kubenswrapper[4958]: I1201 12:13:15.798060 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:13:15 crc kubenswrapper[4958]: E1201 12:13:15.801294 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:13:19 crc kubenswrapper[4958]: I1201 12:13:19.732383 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-978hc"] Dec 01 12:13:19 crc kubenswrapper[4958]: I1201 12:13:19.738276 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-978hc"
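
The two "Observed pod startup duration" entries in this section are internally consistent: podStartSLOduration is the end-to-end startup time minus the image-pull window, i.e. E2E - (lastFinishedPulling - firstStartedPulling). Using the monotonic m=+ offsets from the community-operators-xs72w entry above, 7.890977351 - (8033.854255243 - 8029.291331678) = 3.328053786, exactly the logged SLO value, and the redhat-operators-xg7g8 entry earlier checks out the same way. A short verification in Go, with the numbers copied from the entry:

package main

import "fmt"

func main() {
	// Values copied from the community-operators-xs72w entry above.
	e2e := 7.890977351          // podStartE2EDuration, in seconds
	firstPull := 8029.291331678 // firstStartedPulling, m=+ offset
	lastPull := 8033.854255243  // lastFinishedPulling, m=+ offset

	slo := e2e - (lastPull - firstPull)
	fmt.Printf("podStartSLOduration = %.9fs\n", slo) // 3.328053786s, as logged
}

In other words, the SLO metric deliberately excludes time spent pulling images, so a slow registry shows up as a gap between the two durations rather than as an SLO violation.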
Dec 01 12:13:19 crc kubenswrapper[4958]: I1201 12:13:19.754643 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-978hc"] Dec 01 12:13:19 crc kubenswrapper[4958]: I1201 12:13:19.876466 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58294615-8fe3-4e63-8f19-0dd63300431e-catalog-content\") pod \"certified-operators-978hc\" (UID: \"58294615-8fe3-4e63-8f19-0dd63300431e\") " pod="openshift-marketplace/certified-operators-978hc" Dec 01 12:13:19 crc kubenswrapper[4958]: I1201 12:13:19.876569 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dktb7\" (UniqueName: \"kubernetes.io/projected/58294615-8fe3-4e63-8f19-0dd63300431e-kube-api-access-dktb7\") pod \"certified-operators-978hc\" (UID: \"58294615-8fe3-4e63-8f19-0dd63300431e\") " pod="openshift-marketplace/certified-operators-978hc" Dec 01 12:13:19 crc kubenswrapper[4958]: I1201 12:13:19.876930 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58294615-8fe3-4e63-8f19-0dd63300431e-utilities\") pod \"certified-operators-978hc\" (UID: \"58294615-8fe3-4e63-8f19-0dd63300431e\") " pod="openshift-marketplace/certified-operators-978hc" Dec 01 12:13:19 crc kubenswrapper[4958]: I1201 12:13:19.980090 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58294615-8fe3-4e63-8f19-0dd63300431e-catalog-content\") pod \"certified-operators-978hc\" (UID: \"58294615-8fe3-4e63-8f19-0dd63300431e\") " pod="openshift-marketplace/certified-operators-978hc" Dec 01 12:13:19 crc kubenswrapper[4958]: I1201 12:13:19.980185 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dktb7\" (UniqueName: \"kubernetes.io/projected/58294615-8fe3-4e63-8f19-0dd63300431e-kube-api-access-dktb7\") pod \"certified-operators-978hc\" (UID: \"58294615-8fe3-4e63-8f19-0dd63300431e\") " pod="openshift-marketplace/certified-operators-978hc" Dec 01 12:13:19 crc kubenswrapper[4958]: I1201 12:13:19.980273 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58294615-8fe3-4e63-8f19-0dd63300431e-utilities\") pod \"certified-operators-978hc\" (UID: \"58294615-8fe3-4e63-8f19-0dd63300431e\") " pod="openshift-marketplace/certified-operators-978hc" Dec 01 12:13:19 crc kubenswrapper[4958]: I1201 12:13:19.980553 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58294615-8fe3-4e63-8f19-0dd63300431e-catalog-content\") pod \"certified-operators-978hc\" (UID: \"58294615-8fe3-4e63-8f19-0dd63300431e\") " pod="openshift-marketplace/certified-operators-978hc" Dec 01 12:13:19 crc kubenswrapper[4958]: I1201 12:13:19.980890 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58294615-8fe3-4e63-8f19-0dd63300431e-utilities\") pod \"certified-operators-978hc\" (UID: \"58294615-8fe3-4e63-8f19-0dd63300431e\") " pod="openshift-marketplace/certified-operators-978hc" Dec 01 12:13:20 crc kubenswrapper[4958]: I1201 12:13:20.008012 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dktb7\" (UniqueName: \"kubernetes.io/projected/58294615-8fe3-4e63-8f19-0dd63300431e-kube-api-access-dktb7\") pod \"certified-operators-978hc\" (UID: \"58294615-8fe3-4e63-8f19-0dd63300431e\") " pod="openshift-marketplace/certified-operators-978hc" Dec 01 12:13:20 crc kubenswrapper[4958]: I1201 12:13:20.095750 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-978hc" Dec 01 12:13:20 crc kubenswrapper[4958]: I1201 12:13:20.406297 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xs72w" Dec 01 12:13:20 crc kubenswrapper[4958]: I1201 12:13:20.705872 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-978hc"] Dec 01 12:13:21 crc kubenswrapper[4958]: I1201 12:13:21.043113 4958 generic.go:334] "Generic (PLEG): container finished" podID="58294615-8fe3-4e63-8f19-0dd63300431e" containerID="986c75fa1f7c3515d21c6a3f0f51b32beb429343a8ae54356ceef3f7ab0c5463" exitCode=0 Dec 01 12:13:21 crc kubenswrapper[4958]: I1201 12:13:21.043172 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-978hc" event={"ID":"58294615-8fe3-4e63-8f19-0dd63300431e","Type":"ContainerDied","Data":"986c75fa1f7c3515d21c6a3f0f51b32beb429343a8ae54356ceef3f7ab0c5463"} Dec 01 12:13:21 crc kubenswrapper[4958]: I1201 12:13:21.043561 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-978hc" event={"ID":"58294615-8fe3-4e63-8f19-0dd63300431e","Type":"ContainerStarted","Data":"3e95add4e8f851b4ea085eda4cea076f99b314cec01162b4ac905372c85e2382"} Dec 01 12:13:21 crc kubenswrapper[4958]: I1201 12:13:21.045733 4958 generic.go:334] "Generic (PLEG): container finished" podID="a9539b9c-0548-46ae-9c62-234f5bf45564" containerID="a84121cb488019c711675d90ae69db0a0d7482a7013ce12921d75d9bd2b0015c" exitCode=0 Dec 01 12:13:21 crc kubenswrapper[4958]: I1201 12:13:21.045778 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-gn87g" event={"ID":"a9539b9c-0548-46ae-9c62-234f5bf45564","Type":"ContainerDied","Data":"a84121cb488019c711675d90ae69db0a0d7482a7013ce12921d75d9bd2b0015c"} Dec 01 12:13:22 crc kubenswrapper[4958]: I1201 12:13:22.783685 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-gn87g" Dec 01 12:13:22 crc kubenswrapper[4958]: I1201 12:13:22.806959 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xs72w"] Dec 01 12:13:22 crc kubenswrapper[4958]: I1201 12:13:22.807248 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xs72w" podUID="fa80853a-9295-463d-b116-bf87600935b8" containerName="registry-server" containerID="cri-o://2e21ea81a2e6fee9abeeeb3f39ed84bf31860f2d145af7b8d9b74f51c9c3f8d7" gracePeriod=2 Dec 01 12:13:22 crc kubenswrapper[4958]: I1201 12:13:22.988665 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tndwk\" (UniqueName: \"kubernetes.io/projected/a9539b9c-0548-46ae-9c62-234f5bf45564-kube-api-access-tndwk\") pod \"a9539b9c-0548-46ae-9c62-234f5bf45564\" (UID: \"a9539b9c-0548-46ae-9c62-234f5bf45564\") " Dec 01 12:13:22 crc kubenswrapper[4958]: I1201 12:13:22.988835 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9539b9c-0548-46ae-9c62-234f5bf45564-inventory\") pod \"a9539b9c-0548-46ae-9c62-234f5bf45564\" (UID: \"a9539b9c-0548-46ae-9c62-234f5bf45564\") " Dec 01 12:13:22 crc kubenswrapper[4958]: I1201 12:13:22.988879 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9539b9c-0548-46ae-9c62-234f5bf45564-ssh-key\") pod \"a9539b9c-0548-46ae-9c62-234f5bf45564\" (UID: \"a9539b9c-0548-46ae-9c62-234f5bf45564\") " Dec 01 12:13:22 crc kubenswrapper[4958]: I1201 12:13:22.989017 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9539b9c-0548-46ae-9c62-234f5bf45564-ceph\") pod \"a9539b9c-0548-46ae-9c62-234f5bf45564\" (UID: \"a9539b9c-0548-46ae-9c62-234f5bf45564\") " Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.012309 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9539b9c-0548-46ae-9c62-234f5bf45564-ceph" (OuterVolumeSpecName: "ceph") pod "a9539b9c-0548-46ae-9c62-234f5bf45564" (UID: "a9539b9c-0548-46ae-9c62-234f5bf45564"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.021128 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9539b9c-0548-46ae-9c62-234f5bf45564-kube-api-access-tndwk" (OuterVolumeSpecName: "kube-api-access-tndwk") pod "a9539b9c-0548-46ae-9c62-234f5bf45564" (UID: "a9539b9c-0548-46ae-9c62-234f5bf45564"). InnerVolumeSpecName "kube-api-access-tndwk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.071680 4958 generic.go:334] "Generic (PLEG): container finished" podID="fa80853a-9295-463d-b116-bf87600935b8" containerID="2e21ea81a2e6fee9abeeeb3f39ed84bf31860f2d145af7b8d9b74f51c9c3f8d7" exitCode=0 Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.071739 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs72w" event={"ID":"fa80853a-9295-463d-b116-bf87600935b8","Type":"ContainerDied","Data":"2e21ea81a2e6fee9abeeeb3f39ed84bf31860f2d145af7b8d9b74f51c9c3f8d7"} Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.072784 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-gn87g" event={"ID":"a9539b9c-0548-46ae-9c62-234f5bf45564","Type":"ContainerDied","Data":"23849d10b5a1c7ba255c53a4c6b6e203c224d74059eabc59a5f9d8b9bb3b3fd9"} Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.072809 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23849d10b5a1c7ba255c53a4c6b6e203c224d74059eabc59a5f9d8b9bb3b3fd9" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.073007 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-gn87g" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.075106 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-978hc" event={"ID":"58294615-8fe3-4e63-8f19-0dd63300431e","Type":"ContainerStarted","Data":"36fca569d97f493308afd94b9c85cd44e161c6b8e8da31a244fc1d53078bfbb9"} Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.075238 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9539b9c-0548-46ae-9c62-234f5bf45564-inventory" (OuterVolumeSpecName: "inventory") pod "a9539b9c-0548-46ae-9c62-234f5bf45564" (UID: "a9539b9c-0548-46ae-9c62-234f5bf45564"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.093014 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9539b9c-0548-46ae-9c62-234f5bf45564-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.093059 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9539b9c-0548-46ae-9c62-234f5bf45564-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.093070 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tndwk\" (UniqueName: \"kubernetes.io/projected/a9539b9c-0548-46ae-9c62-234f5bf45564-kube-api-access-tndwk\") on node \"crc\" DevicePath \"\"" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.094514 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9539b9c-0548-46ae-9c62-234f5bf45564-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a9539b9c-0548-46ae-9c62-234f5bf45564" (UID: "a9539b9c-0548-46ae-9c62-234f5bf45564"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.183080 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-t2zft"] Dec 01 12:13:23 crc kubenswrapper[4958]: E1201 12:13:23.183557 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9539b9c-0548-46ae-9c62-234f5bf45564" containerName="configure-network-openstack-openstack-cell1" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.183576 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9539b9c-0548-46ae-9c62-234f5bf45564" containerName="configure-network-openstack-openstack-cell1" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.184100 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9539b9c-0548-46ae-9c62-234f5bf45564" containerName="configure-network-openstack-openstack-cell1" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.185219 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-t2zft" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.198533 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ceb759ad-1f62-4f94-8006-7cd251b3e36a-ceph\") pod \"validate-network-openstack-openstack-cell1-t2zft\" (UID: \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\") " pod="openstack/validate-network-openstack-openstack-cell1-t2zft" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.198588 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceb759ad-1f62-4f94-8006-7cd251b3e36a-inventory\") pod \"validate-network-openstack-openstack-cell1-t2zft\" (UID: \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\") " pod="openstack/validate-network-openstack-openstack-cell1-t2zft" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.198664 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ceb759ad-1f62-4f94-8006-7cd251b3e36a-ssh-key\") pod \"validate-network-openstack-openstack-cell1-t2zft\" (UID: \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\") " pod="openstack/validate-network-openstack-openstack-cell1-t2zft" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.198691 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqbwj\" (UniqueName: \"kubernetes.io/projected/ceb759ad-1f62-4f94-8006-7cd251b3e36a-kube-api-access-gqbwj\") pod \"validate-network-openstack-openstack-cell1-t2zft\" (UID: \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\") " pod="openstack/validate-network-openstack-openstack-cell1-t2zft" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.198771 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9539b9c-0548-46ae-9c62-234f5bf45564-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.211790 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-t2zft"] Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.300996 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ceb759ad-1f62-4f94-8006-7cd251b3e36a-ceph\") pod 
\"validate-network-openstack-openstack-cell1-t2zft\" (UID: \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\") " pod="openstack/validate-network-openstack-openstack-cell1-t2zft" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.301058 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceb759ad-1f62-4f94-8006-7cd251b3e36a-inventory\") pod \"validate-network-openstack-openstack-cell1-t2zft\" (UID: \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\") " pod="openstack/validate-network-openstack-openstack-cell1-t2zft" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.301130 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ceb759ad-1f62-4f94-8006-7cd251b3e36a-ssh-key\") pod \"validate-network-openstack-openstack-cell1-t2zft\" (UID: \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\") " pod="openstack/validate-network-openstack-openstack-cell1-t2zft" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.301578 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqbwj\" (UniqueName: \"kubernetes.io/projected/ceb759ad-1f62-4f94-8006-7cd251b3e36a-kube-api-access-gqbwj\") pod \"validate-network-openstack-openstack-cell1-t2zft\" (UID: \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\") " pod="openstack/validate-network-openstack-openstack-cell1-t2zft" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.311272 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceb759ad-1f62-4f94-8006-7cd251b3e36a-inventory\") pod \"validate-network-openstack-openstack-cell1-t2zft\" (UID: \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\") " pod="openstack/validate-network-openstack-openstack-cell1-t2zft" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.311717 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ceb759ad-1f62-4f94-8006-7cd251b3e36a-ssh-key\") pod \"validate-network-openstack-openstack-cell1-t2zft\" (UID: \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\") " pod="openstack/validate-network-openstack-openstack-cell1-t2zft" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.321483 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ceb759ad-1f62-4f94-8006-7cd251b3e36a-ceph\") pod \"validate-network-openstack-openstack-cell1-t2zft\" (UID: \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\") " pod="openstack/validate-network-openstack-openstack-cell1-t2zft" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.322348 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqbwj\" (UniqueName: \"kubernetes.io/projected/ceb759ad-1f62-4f94-8006-7cd251b3e36a-kube-api-access-gqbwj\") pod \"validate-network-openstack-openstack-cell1-t2zft\" (UID: \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\") " pod="openstack/validate-network-openstack-openstack-cell1-t2zft" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.481935 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xs72w" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.524559 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-t2zft" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.607328 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7kkb\" (UniqueName: \"kubernetes.io/projected/fa80853a-9295-463d-b116-bf87600935b8-kube-api-access-z7kkb\") pod \"fa80853a-9295-463d-b116-bf87600935b8\" (UID: \"fa80853a-9295-463d-b116-bf87600935b8\") " Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.607673 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa80853a-9295-463d-b116-bf87600935b8-utilities\") pod \"fa80853a-9295-463d-b116-bf87600935b8\" (UID: \"fa80853a-9295-463d-b116-bf87600935b8\") " Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.607738 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa80853a-9295-463d-b116-bf87600935b8-catalog-content\") pod \"fa80853a-9295-463d-b116-bf87600935b8\" (UID: \"fa80853a-9295-463d-b116-bf87600935b8\") " Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.610474 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa80853a-9295-463d-b116-bf87600935b8-utilities" (OuterVolumeSpecName: "utilities") pod "fa80853a-9295-463d-b116-bf87600935b8" (UID: "fa80853a-9295-463d-b116-bf87600935b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.612280 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa80853a-9295-463d-b116-bf87600935b8-kube-api-access-z7kkb" (OuterVolumeSpecName: "kube-api-access-z7kkb") pod "fa80853a-9295-463d-b116-bf87600935b8" (UID: "fa80853a-9295-463d-b116-bf87600935b8"). InnerVolumeSpecName "kube-api-access-z7kkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.696914 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa80853a-9295-463d-b116-bf87600935b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa80853a-9295-463d-b116-bf87600935b8" (UID: "fa80853a-9295-463d-b116-bf87600935b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.714448 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa80853a-9295-463d-b116-bf87600935b8-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.714486 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa80853a-9295-463d-b116-bf87600935b8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:13:23 crc kubenswrapper[4958]: I1201 12:13:23.714501 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7kkb\" (UniqueName: \"kubernetes.io/projected/fa80853a-9295-463d-b116-bf87600935b8-kube-api-access-z7kkb\") on node \"crc\" DevicePath \"\"" Dec 01 12:13:24 crc kubenswrapper[4958]: I1201 12:13:24.100116 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xs72w" Dec 01 12:13:24 crc kubenswrapper[4958]: I1201 12:13:24.100118 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs72w" event={"ID":"fa80853a-9295-463d-b116-bf87600935b8","Type":"ContainerDied","Data":"71b208a95879213ad9c911986f3ce3acf31a2ed1068cf4b1d0f52096e2ccfed9"} Dec 01 12:13:24 crc kubenswrapper[4958]: I1201 12:13:24.100394 4958 scope.go:117] "RemoveContainer" containerID="2e21ea81a2e6fee9abeeeb3f39ed84bf31860f2d145af7b8d9b74f51c9c3f8d7" Dec 01 12:13:24 crc kubenswrapper[4958]: W1201 12:13:24.101179 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podceb759ad_1f62_4f94_8006_7cd251b3e36a.slice/crio-c7f0d32d6fbbe2997e6219b628be56ce9efbf855b0a9c4e6bd58dc3791ef71d2 WatchSource:0}: Error finding container c7f0d32d6fbbe2997e6219b628be56ce9efbf855b0a9c4e6bd58dc3791ef71d2: Status 404 returned error can't find the container with id c7f0d32d6fbbe2997e6219b628be56ce9efbf855b0a9c4e6bd58dc3791ef71d2 Dec 01 12:13:24 crc kubenswrapper[4958]: I1201 12:13:24.103215 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-978hc" event={"ID":"58294615-8fe3-4e63-8f19-0dd63300431e","Type":"ContainerDied","Data":"36fca569d97f493308afd94b9c85cd44e161c6b8e8da31a244fc1d53078bfbb9"} Dec 01 12:13:24 crc kubenswrapper[4958]: I1201 12:13:24.103451 4958 generic.go:334] "Generic (PLEG): container finished" podID="58294615-8fe3-4e63-8f19-0dd63300431e" containerID="36fca569d97f493308afd94b9c85cd44e161c6b8e8da31a244fc1d53078bfbb9" exitCode=0 Dec 01 12:13:24 crc kubenswrapper[4958]: I1201 12:13:24.109414 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-t2zft"] Dec 01 12:13:24 crc kubenswrapper[4958]: I1201 12:13:24.134950 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xs72w"] Dec 01 12:13:24 crc kubenswrapper[4958]: I1201 12:13:24.144106 4958 scope.go:117] "RemoveContainer" containerID="20ecd1abd8223dcaa4dcc493e14f89bc15e7a6536da529068835e4897f0c77a7" Dec 01 12:13:24 crc kubenswrapper[4958]: I1201 12:13:24.145204 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xs72w"] Dec 01 12:13:24 crc kubenswrapper[4958]: I1201 12:13:24.236282 4958 scope.go:117] "RemoveContainer" containerID="d760916fb8775baf7983d5ee923b52c25026d36ad4276eb91476c7014c12208b" Dec 01 12:13:25 crc kubenswrapper[4958]: I1201 12:13:25.153514 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-978hc" event={"ID":"58294615-8fe3-4e63-8f19-0dd63300431e","Type":"ContainerStarted","Data":"ebdda9f4b6aecbd6cf5db60a6aef0dd228668d928e1b4d7465d62bde4bf9b05e"} Dec 01 12:13:25 crc kubenswrapper[4958]: I1201 12:13:25.162572 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-t2zft" event={"ID":"ceb759ad-1f62-4f94-8006-7cd251b3e36a","Type":"ContainerStarted","Data":"0de550441130ecb9eabfdeb2b2f0410a03ff2ffbcbec7e24a023bc05e6eb2f02"} Dec 01 12:13:25 crc kubenswrapper[4958]: I1201 12:13:25.162619 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-t2zft" 
event={"ID":"ceb759ad-1f62-4f94-8006-7cd251b3e36a","Type":"ContainerStarted","Data":"c7f0d32d6fbbe2997e6219b628be56ce9efbf855b0a9c4e6bd58dc3791ef71d2"} Dec 01 12:13:25 crc kubenswrapper[4958]: I1201 12:13:25.200376 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-978hc" podStartSLOduration=2.978375389 podStartE2EDuration="6.20034422s" podCreationTimestamp="2025-12-01 12:13:19 +0000 UTC" firstStartedPulling="2025-12-01 12:13:21.045338458 +0000 UTC m=+8048.554127495" lastFinishedPulling="2025-12-01 12:13:24.267307289 +0000 UTC m=+8051.776096326" observedRunningTime="2025-12-01 12:13:25.185588902 +0000 UTC m=+8052.694377939" watchObservedRunningTime="2025-12-01 12:13:25.20034422 +0000 UTC m=+8052.709133257" Dec 01 12:13:25 crc kubenswrapper[4958]: I1201 12:13:25.352171 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-t2zft" podStartSLOduration=2.136258416 podStartE2EDuration="2.352148595s" podCreationTimestamp="2025-12-01 12:13:23 +0000 UTC" firstStartedPulling="2025-12-01 12:13:24.109358209 +0000 UTC m=+8051.618147246" lastFinishedPulling="2025-12-01 12:13:24.325248388 +0000 UTC m=+8051.834037425" observedRunningTime="2025-12-01 12:13:25.345761094 +0000 UTC m=+8052.854550131" watchObservedRunningTime="2025-12-01 12:13:25.352148595 +0000 UTC m=+8052.860937632" Dec 01 12:13:25 crc kubenswrapper[4958]: I1201 12:13:25.815797 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa80853a-9295-463d-b116-bf87600935b8" path="/var/lib/kubelet/pods/fa80853a-9295-463d-b116-bf87600935b8/volumes" Dec 01 12:13:30 crc kubenswrapper[4958]: I1201 12:13:30.095995 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-978hc" Dec 01 12:13:30 crc kubenswrapper[4958]: I1201 12:13:30.099936 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-978hc" Dec 01 12:13:30 crc kubenswrapper[4958]: I1201 12:13:30.178767 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-978hc" Dec 01 12:13:30 crc kubenswrapper[4958]: I1201 12:13:30.288633 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-978hc" Dec 01 12:13:30 crc kubenswrapper[4958]: I1201 12:13:30.424386 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-978hc"] Dec 01 12:13:30 crc kubenswrapper[4958]: I1201 12:13:30.797640 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:13:30 crc kubenswrapper[4958]: E1201 12:13:30.797965 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:13:31 crc kubenswrapper[4958]: I1201 12:13:31.243334 4958 generic.go:334] "Generic (PLEG): container finished" podID="ceb759ad-1f62-4f94-8006-7cd251b3e36a" containerID="0de550441130ecb9eabfdeb2b2f0410a03ff2ffbcbec7e24a023bc05e6eb2f02" exitCode=0 Dec 01 
12:13:31 crc kubenswrapper[4958]: I1201 12:13:31.243383 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-t2zft" event={"ID":"ceb759ad-1f62-4f94-8006-7cd251b3e36a","Type":"ContainerDied","Data":"0de550441130ecb9eabfdeb2b2f0410a03ff2ffbcbec7e24a023bc05e6eb2f02"} Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.252755 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-978hc" podUID="58294615-8fe3-4e63-8f19-0dd63300431e" containerName="registry-server" containerID="cri-o://ebdda9f4b6aecbd6cf5db60a6aef0dd228668d928e1b4d7465d62bde4bf9b05e" gracePeriod=2 Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.826003 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-t2zft" Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.831110 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-978hc" Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.841299 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ceb759ad-1f62-4f94-8006-7cd251b3e36a-ceph\") pod \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\" (UID: \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\") " Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.841921 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqbwj\" (UniqueName: \"kubernetes.io/projected/ceb759ad-1f62-4f94-8006-7cd251b3e36a-kube-api-access-gqbwj\") pod \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\" (UID: \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\") " Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.843469 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ceb759ad-1f62-4f94-8006-7cd251b3e36a-ssh-key\") pod \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\" (UID: \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\") " Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.843565 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceb759ad-1f62-4f94-8006-7cd251b3e36a-inventory\") pod \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\" (UID: \"ceb759ad-1f62-4f94-8006-7cd251b3e36a\") " Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.856120 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceb759ad-1f62-4f94-8006-7cd251b3e36a-ceph" (OuterVolumeSpecName: "ceph") pod "ceb759ad-1f62-4f94-8006-7cd251b3e36a" (UID: "ceb759ad-1f62-4f94-8006-7cd251b3e36a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.856667 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceb759ad-1f62-4f94-8006-7cd251b3e36a-kube-api-access-gqbwj" (OuterVolumeSpecName: "kube-api-access-gqbwj") pod "ceb759ad-1f62-4f94-8006-7cd251b3e36a" (UID: "ceb759ad-1f62-4f94-8006-7cd251b3e36a"). InnerVolumeSpecName "kube-api-access-gqbwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.907820 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceb759ad-1f62-4f94-8006-7cd251b3e36a-inventory" (OuterVolumeSpecName: "inventory") pod "ceb759ad-1f62-4f94-8006-7cd251b3e36a" (UID: "ceb759ad-1f62-4f94-8006-7cd251b3e36a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.911096 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceb759ad-1f62-4f94-8006-7cd251b3e36a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ceb759ad-1f62-4f94-8006-7cd251b3e36a" (UID: "ceb759ad-1f62-4f94-8006-7cd251b3e36a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.946672 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58294615-8fe3-4e63-8f19-0dd63300431e-catalog-content\") pod \"58294615-8fe3-4e63-8f19-0dd63300431e\" (UID: \"58294615-8fe3-4e63-8f19-0dd63300431e\") " Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.946962 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58294615-8fe3-4e63-8f19-0dd63300431e-utilities\") pod \"58294615-8fe3-4e63-8f19-0dd63300431e\" (UID: \"58294615-8fe3-4e63-8f19-0dd63300431e\") " Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.947209 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dktb7\" (UniqueName: \"kubernetes.io/projected/58294615-8fe3-4e63-8f19-0dd63300431e-kube-api-access-dktb7\") pod \"58294615-8fe3-4e63-8f19-0dd63300431e\" (UID: \"58294615-8fe3-4e63-8f19-0dd63300431e\") " Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.947871 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58294615-8fe3-4e63-8f19-0dd63300431e-utilities" (OuterVolumeSpecName: "utilities") pod "58294615-8fe3-4e63-8f19-0dd63300431e" (UID: "58294615-8fe3-4e63-8f19-0dd63300431e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.947953 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqbwj\" (UniqueName: \"kubernetes.io/projected/ceb759ad-1f62-4f94-8006-7cd251b3e36a-kube-api-access-gqbwj\") on node \"crc\" DevicePath \"\"" Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.947969 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ceb759ad-1f62-4f94-8006-7cd251b3e36a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.947986 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceb759ad-1f62-4f94-8006-7cd251b3e36a-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.947996 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ceb759ad-1f62-4f94-8006-7cd251b3e36a-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:13:32 crc kubenswrapper[4958]: I1201 12:13:32.950370 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58294615-8fe3-4e63-8f19-0dd63300431e-kube-api-access-dktb7" (OuterVolumeSpecName: "kube-api-access-dktb7") pod "58294615-8fe3-4e63-8f19-0dd63300431e" (UID: "58294615-8fe3-4e63-8f19-0dd63300431e"). InnerVolumeSpecName "kube-api-access-dktb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.051295 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58294615-8fe3-4e63-8f19-0dd63300431e-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.051371 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dktb7\" (UniqueName: \"kubernetes.io/projected/58294615-8fe3-4e63-8f19-0dd63300431e-kube-api-access-dktb7\") on node \"crc\" DevicePath \"\"" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.204960 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58294615-8fe3-4e63-8f19-0dd63300431e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58294615-8fe3-4e63-8f19-0dd63300431e" (UID: "58294615-8fe3-4e63-8f19-0dd63300431e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.256908 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58294615-8fe3-4e63-8f19-0dd63300431e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.266233 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-t2zft" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.266253 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-t2zft" event={"ID":"ceb759ad-1f62-4f94-8006-7cd251b3e36a","Type":"ContainerDied","Data":"c7f0d32d6fbbe2997e6219b628be56ce9efbf855b0a9c4e6bd58dc3791ef71d2"} Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.266293 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7f0d32d6fbbe2997e6219b628be56ce9efbf855b0a9c4e6bd58dc3791ef71d2" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.270334 4958 generic.go:334] "Generic (PLEG): container finished" podID="58294615-8fe3-4e63-8f19-0dd63300431e" containerID="ebdda9f4b6aecbd6cf5db60a6aef0dd228668d928e1b4d7465d62bde4bf9b05e" exitCode=0 Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.270380 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-978hc" event={"ID":"58294615-8fe3-4e63-8f19-0dd63300431e","Type":"ContainerDied","Data":"ebdda9f4b6aecbd6cf5db60a6aef0dd228668d928e1b4d7465d62bde4bf9b05e"} Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.270400 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-978hc" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.270411 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-978hc" event={"ID":"58294615-8fe3-4e63-8f19-0dd63300431e","Type":"ContainerDied","Data":"3e95add4e8f851b4ea085eda4cea076f99b314cec01162b4ac905372c85e2382"} Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.270432 4958 scope.go:117] "RemoveContainer" containerID="ebdda9f4b6aecbd6cf5db60a6aef0dd228668d928e1b4d7465d62bde4bf9b05e" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.299906 4958 scope.go:117] "RemoveContainer" containerID="36fca569d97f493308afd94b9c85cd44e161c6b8e8da31a244fc1d53078bfbb9" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.348669 4958 scope.go:117] "RemoveContainer" containerID="986c75fa1f7c3515d21c6a3f0f51b32beb429343a8ae54356ceef3f7ab0c5463" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.356931 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-978hc"] Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.368527 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-978hc"] Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.384612 4958 scope.go:117] "RemoveContainer" containerID="ebdda9f4b6aecbd6cf5db60a6aef0dd228668d928e1b4d7465d62bde4bf9b05e" Dec 01 12:13:33 crc kubenswrapper[4958]: E1201 12:13:33.385930 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebdda9f4b6aecbd6cf5db60a6aef0dd228668d928e1b4d7465d62bde4bf9b05e\": container with ID starting with ebdda9f4b6aecbd6cf5db60a6aef0dd228668d928e1b4d7465d62bde4bf9b05e not found: ID does not exist" containerID="ebdda9f4b6aecbd6cf5db60a6aef0dd228668d928e1b4d7465d62bde4bf9b05e" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.385973 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebdda9f4b6aecbd6cf5db60a6aef0dd228668d928e1b4d7465d62bde4bf9b05e"} err="failed to get container status 
\"ebdda9f4b6aecbd6cf5db60a6aef0dd228668d928e1b4d7465d62bde4bf9b05e\": rpc error: code = NotFound desc = could not find container \"ebdda9f4b6aecbd6cf5db60a6aef0dd228668d928e1b4d7465d62bde4bf9b05e\": container with ID starting with ebdda9f4b6aecbd6cf5db60a6aef0dd228668d928e1b4d7465d62bde4bf9b05e not found: ID does not exist" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.386006 4958 scope.go:117] "RemoveContainer" containerID="36fca569d97f493308afd94b9c85cd44e161c6b8e8da31a244fc1d53078bfbb9" Dec 01 12:13:33 crc kubenswrapper[4958]: E1201 12:13:33.386402 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36fca569d97f493308afd94b9c85cd44e161c6b8e8da31a244fc1d53078bfbb9\": container with ID starting with 36fca569d97f493308afd94b9c85cd44e161c6b8e8da31a244fc1d53078bfbb9 not found: ID does not exist" containerID="36fca569d97f493308afd94b9c85cd44e161c6b8e8da31a244fc1d53078bfbb9" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.386432 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36fca569d97f493308afd94b9c85cd44e161c6b8e8da31a244fc1d53078bfbb9"} err="failed to get container status \"36fca569d97f493308afd94b9c85cd44e161c6b8e8da31a244fc1d53078bfbb9\": rpc error: code = NotFound desc = could not find container \"36fca569d97f493308afd94b9c85cd44e161c6b8e8da31a244fc1d53078bfbb9\": container with ID starting with 36fca569d97f493308afd94b9c85cd44e161c6b8e8da31a244fc1d53078bfbb9 not found: ID does not exist" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.386451 4958 scope.go:117] "RemoveContainer" containerID="986c75fa1f7c3515d21c6a3f0f51b32beb429343a8ae54356ceef3f7ab0c5463" Dec 01 12:13:33 crc kubenswrapper[4958]: E1201 12:13:33.386742 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"986c75fa1f7c3515d21c6a3f0f51b32beb429343a8ae54356ceef3f7ab0c5463\": container with ID starting with 986c75fa1f7c3515d21c6a3f0f51b32beb429343a8ae54356ceef3f7ab0c5463 not found: ID does not exist" containerID="986c75fa1f7c3515d21c6a3f0f51b32beb429343a8ae54356ceef3f7ab0c5463" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.386761 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"986c75fa1f7c3515d21c6a3f0f51b32beb429343a8ae54356ceef3f7ab0c5463"} err="failed to get container status \"986c75fa1f7c3515d21c6a3f0f51b32beb429343a8ae54356ceef3f7ab0c5463\": rpc error: code = NotFound desc = could not find container \"986c75fa1f7c3515d21c6a3f0f51b32beb429343a8ae54356ceef3f7ab0c5463\": container with ID starting with 986c75fa1f7c3515d21c6a3f0f51b32beb429343a8ae54356ceef3f7ab0c5463 not found: ID does not exist" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.421396 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-tq6pc"] Dec 01 12:13:33 crc kubenswrapper[4958]: E1201 12:13:33.422116 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58294615-8fe3-4e63-8f19-0dd63300431e" containerName="extract-utilities" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.422145 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="58294615-8fe3-4e63-8f19-0dd63300431e" containerName="extract-utilities" Dec 01 12:13:33 crc kubenswrapper[4958]: E1201 12:13:33.422189 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa80853a-9295-463d-b116-bf87600935b8" 
containerName="extract-utilities" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.422198 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa80853a-9295-463d-b116-bf87600935b8" containerName="extract-utilities" Dec 01 12:13:33 crc kubenswrapper[4958]: E1201 12:13:33.422211 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa80853a-9295-463d-b116-bf87600935b8" containerName="registry-server" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.422220 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa80853a-9295-463d-b116-bf87600935b8" containerName="registry-server" Dec 01 12:13:33 crc kubenswrapper[4958]: E1201 12:13:33.422232 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58294615-8fe3-4e63-8f19-0dd63300431e" containerName="registry-server" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.422240 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="58294615-8fe3-4e63-8f19-0dd63300431e" containerName="registry-server" Dec 01 12:13:33 crc kubenswrapper[4958]: E1201 12:13:33.422453 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58294615-8fe3-4e63-8f19-0dd63300431e" containerName="extract-content" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.422461 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="58294615-8fe3-4e63-8f19-0dd63300431e" containerName="extract-content" Dec 01 12:13:33 crc kubenswrapper[4958]: E1201 12:13:33.422488 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb759ad-1f62-4f94-8006-7cd251b3e36a" containerName="validate-network-openstack-openstack-cell1" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.422496 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb759ad-1f62-4f94-8006-7cd251b3e36a" containerName="validate-network-openstack-openstack-cell1" Dec 01 12:13:33 crc kubenswrapper[4958]: E1201 12:13:33.422510 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa80853a-9295-463d-b116-bf87600935b8" containerName="extract-content" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.422518 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa80853a-9295-463d-b116-bf87600935b8" containerName="extract-content" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.422836 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="58294615-8fe3-4e63-8f19-0dd63300431e" containerName="registry-server" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.422982 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceb759ad-1f62-4f94-8006-7cd251b3e36a" containerName="validate-network-openstack-openstack-cell1" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.423001 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa80853a-9295-463d-b116-bf87600935b8" containerName="registry-server" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.424380 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-tq6pc" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.428883 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.429163 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.429337 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.429507 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.436472 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-tq6pc"] Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.484263 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/752669c4-9b31-4db0-9e8b-27252be3fda5-inventory\") pod \"install-os-openstack-openstack-cell1-tq6pc\" (UID: \"752669c4-9b31-4db0-9e8b-27252be3fda5\") " pod="openstack/install-os-openstack-openstack-cell1-tq6pc" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.484586 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/752669c4-9b31-4db0-9e8b-27252be3fda5-ceph\") pod \"install-os-openstack-openstack-cell1-tq6pc\" (UID: \"752669c4-9b31-4db0-9e8b-27252be3fda5\") " pod="openstack/install-os-openstack-openstack-cell1-tq6pc" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.484648 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk8dc\" (UniqueName: \"kubernetes.io/projected/752669c4-9b31-4db0-9e8b-27252be3fda5-kube-api-access-gk8dc\") pod \"install-os-openstack-openstack-cell1-tq6pc\" (UID: \"752669c4-9b31-4db0-9e8b-27252be3fda5\") " pod="openstack/install-os-openstack-openstack-cell1-tq6pc" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.484839 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/752669c4-9b31-4db0-9e8b-27252be3fda5-ssh-key\") pod \"install-os-openstack-openstack-cell1-tq6pc\" (UID: \"752669c4-9b31-4db0-9e8b-27252be3fda5\") " pod="openstack/install-os-openstack-openstack-cell1-tq6pc" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.587517 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/752669c4-9b31-4db0-9e8b-27252be3fda5-ceph\") pod \"install-os-openstack-openstack-cell1-tq6pc\" (UID: \"752669c4-9b31-4db0-9e8b-27252be3fda5\") " pod="openstack/install-os-openstack-openstack-cell1-tq6pc" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.587606 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk8dc\" (UniqueName: \"kubernetes.io/projected/752669c4-9b31-4db0-9e8b-27252be3fda5-kube-api-access-gk8dc\") pod \"install-os-openstack-openstack-cell1-tq6pc\" (UID: \"752669c4-9b31-4db0-9e8b-27252be3fda5\") " pod="openstack/install-os-openstack-openstack-cell1-tq6pc" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.588177 
4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/752669c4-9b31-4db0-9e8b-27252be3fda5-ssh-key\") pod \"install-os-openstack-openstack-cell1-tq6pc\" (UID: \"752669c4-9b31-4db0-9e8b-27252be3fda5\") " pod="openstack/install-os-openstack-openstack-cell1-tq6pc" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.589052 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/752669c4-9b31-4db0-9e8b-27252be3fda5-inventory\") pod \"install-os-openstack-openstack-cell1-tq6pc\" (UID: \"752669c4-9b31-4db0-9e8b-27252be3fda5\") " pod="openstack/install-os-openstack-openstack-cell1-tq6pc" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.597378 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/752669c4-9b31-4db0-9e8b-27252be3fda5-ceph\") pod \"install-os-openstack-openstack-cell1-tq6pc\" (UID: \"752669c4-9b31-4db0-9e8b-27252be3fda5\") " pod="openstack/install-os-openstack-openstack-cell1-tq6pc" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.598403 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/752669c4-9b31-4db0-9e8b-27252be3fda5-ssh-key\") pod \"install-os-openstack-openstack-cell1-tq6pc\" (UID: \"752669c4-9b31-4db0-9e8b-27252be3fda5\") " pod="openstack/install-os-openstack-openstack-cell1-tq6pc" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.599280 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/752669c4-9b31-4db0-9e8b-27252be3fda5-inventory\") pod \"install-os-openstack-openstack-cell1-tq6pc\" (UID: \"752669c4-9b31-4db0-9e8b-27252be3fda5\") " pod="openstack/install-os-openstack-openstack-cell1-tq6pc" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.606708 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk8dc\" (UniqueName: \"kubernetes.io/projected/752669c4-9b31-4db0-9e8b-27252be3fda5-kube-api-access-gk8dc\") pod \"install-os-openstack-openstack-cell1-tq6pc\" (UID: \"752669c4-9b31-4db0-9e8b-27252be3fda5\") " pod="openstack/install-os-openstack-openstack-cell1-tq6pc" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.801935 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-tq6pc" Dec 01 12:13:33 crc kubenswrapper[4958]: I1201 12:13:33.817130 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58294615-8fe3-4e63-8f19-0dd63300431e" path="/var/lib/kubelet/pods/58294615-8fe3-4e63-8f19-0dd63300431e/volumes" Dec 01 12:13:34 crc kubenswrapper[4958]: I1201 12:13:34.244097 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-tq6pc"] Dec 01 12:13:34 crc kubenswrapper[4958]: W1201 12:13:34.247355 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod752669c4_9b31_4db0_9e8b_27252be3fda5.slice/crio-04c3648c92f45794ee56ca4cc65eab3e08e87d0706ae2b9962bcbccd639708bc WatchSource:0}: Error finding container 04c3648c92f45794ee56ca4cc65eab3e08e87d0706ae2b9962bcbccd639708bc: Status 404 returned error can't find the container with id 04c3648c92f45794ee56ca4cc65eab3e08e87d0706ae2b9962bcbccd639708bc Dec 01 12:13:34 crc kubenswrapper[4958]: I1201 12:13:34.280231 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-tq6pc" event={"ID":"752669c4-9b31-4db0-9e8b-27252be3fda5","Type":"ContainerStarted","Data":"04c3648c92f45794ee56ca4cc65eab3e08e87d0706ae2b9962bcbccd639708bc"} Dec 01 12:13:35 crc kubenswrapper[4958]: I1201 12:13:35.295457 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-tq6pc" event={"ID":"752669c4-9b31-4db0-9e8b-27252be3fda5","Type":"ContainerStarted","Data":"2a089b3a8d120a7bf3f95eb4d493a3828191054204cc9641cd84d4f7d5f34807"} Dec 01 12:13:35 crc kubenswrapper[4958]: I1201 12:13:35.329550 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-tq6pc" podStartSLOduration=2.119545067 podStartE2EDuration="2.329525359s" podCreationTimestamp="2025-12-01 12:13:33 +0000 UTC" firstStartedPulling="2025-12-01 12:13:34.249752406 +0000 UTC m=+8061.758541443" lastFinishedPulling="2025-12-01 12:13:34.459732688 +0000 UTC m=+8061.968521735" observedRunningTime="2025-12-01 12:13:35.328401007 +0000 UTC m=+8062.837190044" watchObservedRunningTime="2025-12-01 12:13:35.329525359 +0000 UTC m=+8062.838314436" Dec 01 12:13:42 crc kubenswrapper[4958]: I1201 12:13:42.798823 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:13:42 crc kubenswrapper[4958]: E1201 12:13:42.802319 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:13:53 crc kubenswrapper[4958]: I1201 12:13:53.811918 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:13:53 crc kubenswrapper[4958]: E1201 12:13:53.814170 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:14:05 crc kubenswrapper[4958]: I1201 12:14:05.798074 4958 scope.go:117] "RemoveContainer" containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:14:06 crc kubenswrapper[4958]: I1201 12:14:06.754518 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"acaaa06729a7596508194b634486b60d2d4a6b7ee287eab5327fcec162540334"} Dec 01 12:14:27 crc kubenswrapper[4958]: I1201 12:14:27.041589 4958 generic.go:334] "Generic (PLEG): container finished" podID="752669c4-9b31-4db0-9e8b-27252be3fda5" containerID="2a089b3a8d120a7bf3f95eb4d493a3828191054204cc9641cd84d4f7d5f34807" exitCode=0 Dec 01 12:14:27 crc kubenswrapper[4958]: I1201 12:14:27.041698 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-tq6pc" event={"ID":"752669c4-9b31-4db0-9e8b-27252be3fda5","Type":"ContainerDied","Data":"2a089b3a8d120a7bf3f95eb4d493a3828191054204cc9641cd84d4f7d5f34807"} Dec 01 12:14:29 crc kubenswrapper[4958]: I1201 12:14:29.468994 4958 patch_prober.go:28] interesting pod/router-default-5444994796-swd9j container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 12:14:29 crc kubenswrapper[4958]: I1201 12:14:29.469638 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-swd9j" podUID="5a8efcb2-480a-4577-af47-304c07876b28" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 12:14:29 crc kubenswrapper[4958]: I1201 12:14:29.485405 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-dr9h9" podUID="888fe73a-98b3-4b4e-891e-9d75f26c64af" containerName="registry-server" probeResult="failure" output=< Dec 01 12:14:29 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s Dec 01 12:14:29 crc kubenswrapper[4958]: > Dec 01 12:14:29 crc kubenswrapper[4958]: I1201 12:14:29.503282 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-dr9h9" podUID="888fe73a-98b3-4b4e-891e-9d75f26c64af" containerName="registry-server" probeResult="failure" output=< Dec 01 12:14:29 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s Dec 01 12:14:29 crc kubenswrapper[4958]: > Dec 01 12:14:30 crc kubenswrapper[4958]: I1201 12:14:30.067431 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-tq6pc" Dec 01 12:14:30 crc kubenswrapper[4958]: I1201 12:14:30.177333 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk8dc\" (UniqueName: \"kubernetes.io/projected/752669c4-9b31-4db0-9e8b-27252be3fda5-kube-api-access-gk8dc\") pod \"752669c4-9b31-4db0-9e8b-27252be3fda5\" (UID: \"752669c4-9b31-4db0-9e8b-27252be3fda5\") " Dec 01 12:14:30 crc kubenswrapper[4958]: I1201 12:14:30.177549 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/752669c4-9b31-4db0-9e8b-27252be3fda5-ceph\") pod \"752669c4-9b31-4db0-9e8b-27252be3fda5\" (UID: \"752669c4-9b31-4db0-9e8b-27252be3fda5\") " Dec 01 12:14:30 crc kubenswrapper[4958]: I1201 12:14:30.177572 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/752669c4-9b31-4db0-9e8b-27252be3fda5-ssh-key\") pod \"752669c4-9b31-4db0-9e8b-27252be3fda5\" (UID: \"752669c4-9b31-4db0-9e8b-27252be3fda5\") " Dec 01 12:14:30 crc kubenswrapper[4958]: I1201 12:14:30.177607 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/752669c4-9b31-4db0-9e8b-27252be3fda5-inventory\") pod \"752669c4-9b31-4db0-9e8b-27252be3fda5\" (UID: \"752669c4-9b31-4db0-9e8b-27252be3fda5\") " Dec 01 12:14:30 crc kubenswrapper[4958]: I1201 12:14:30.187206 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/752669c4-9b31-4db0-9e8b-27252be3fda5-ceph" (OuterVolumeSpecName: "ceph") pod "752669c4-9b31-4db0-9e8b-27252be3fda5" (UID: "752669c4-9b31-4db0-9e8b-27252be3fda5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:14:30 crc kubenswrapper[4958]: I1201 12:14:30.196381 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/752669c4-9b31-4db0-9e8b-27252be3fda5-kube-api-access-gk8dc" (OuterVolumeSpecName: "kube-api-access-gk8dc") pod "752669c4-9b31-4db0-9e8b-27252be3fda5" (UID: "752669c4-9b31-4db0-9e8b-27252be3fda5"). InnerVolumeSpecName "kube-api-access-gk8dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:14:30 crc kubenswrapper[4958]: I1201 12:14:30.224638 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/752669c4-9b31-4db0-9e8b-27252be3fda5-inventory" (OuterVolumeSpecName: "inventory") pod "752669c4-9b31-4db0-9e8b-27252be3fda5" (UID: "752669c4-9b31-4db0-9e8b-27252be3fda5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:14:30 crc kubenswrapper[4958]: I1201 12:14:30.233182 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/752669c4-9b31-4db0-9e8b-27252be3fda5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "752669c4-9b31-4db0-9e8b-27252be3fda5" (UID: "752669c4-9b31-4db0-9e8b-27252be3fda5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:14:30 crc kubenswrapper[4958]: I1201 12:14:30.281187 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/752669c4-9b31-4db0-9e8b-27252be3fda5-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:14:30 crc kubenswrapper[4958]: I1201 12:14:30.281233 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/752669c4-9b31-4db0-9e8b-27252be3fda5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:14:30 crc kubenswrapper[4958]: I1201 12:14:30.281284 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/752669c4-9b31-4db0-9e8b-27252be3fda5-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:14:30 crc kubenswrapper[4958]: I1201 12:14:30.281298 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk8dc\" (UniqueName: \"kubernetes.io/projected/752669c4-9b31-4db0-9e8b-27252be3fda5-kube-api-access-gk8dc\") on node \"crc\" DevicePath \"\"" Dec 01 12:14:30 crc kubenswrapper[4958]: I1201 12:14:30.475340 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-tq6pc" event={"ID":"752669c4-9b31-4db0-9e8b-27252be3fda5","Type":"ContainerDied","Data":"04c3648c92f45794ee56ca4cc65eab3e08e87d0706ae2b9962bcbccd639708bc"} Dec 01 12:14:30 crc kubenswrapper[4958]: I1201 12:14:30.475389 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04c3648c92f45794ee56ca4cc65eab3e08e87d0706ae2b9962bcbccd639708bc" Dec 01 12:14:30 crc kubenswrapper[4958]: I1201 12:14:30.475451 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-tq6pc" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.183370 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-7gfz2"] Dec 01 12:14:31 crc kubenswrapper[4958]: E1201 12:14:31.184055 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752669c4-9b31-4db0-9e8b-27252be3fda5" containerName="install-os-openstack-openstack-cell1" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.184073 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="752669c4-9b31-4db0-9e8b-27252be3fda5" containerName="install-os-openstack-openstack-cell1" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.184375 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="752669c4-9b31-4db0-9e8b-27252be3fda5" containerName="install-os-openstack-openstack-cell1" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.185130 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.187387 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.187651 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.187867 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.188220 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.194315 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-7gfz2"] Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.303464 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h4fp\" (UniqueName: \"kubernetes.io/projected/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-kube-api-access-4h4fp\") pod \"configure-os-openstack-openstack-cell1-7gfz2\" (UID: \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\") " pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.303883 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-inventory\") pod \"configure-os-openstack-openstack-cell1-7gfz2\" (UID: \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\") " pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.304043 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-ceph\") pod \"configure-os-openstack-openstack-cell1-7gfz2\" (UID: \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\") " pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.304161 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-ssh-key\") pod \"configure-os-openstack-openstack-cell1-7gfz2\" (UID: \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\") " pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.406461 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h4fp\" (UniqueName: \"kubernetes.io/projected/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-kube-api-access-4h4fp\") pod \"configure-os-openstack-openstack-cell1-7gfz2\" (UID: \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\") " pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.406964 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-inventory\") pod \"configure-os-openstack-openstack-cell1-7gfz2\" (UID: \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\") " pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" Dec 01 12:14:31 crc 
kubenswrapper[4958]: I1201 12:14:31.407109 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-ceph\") pod \"configure-os-openstack-openstack-cell1-7gfz2\" (UID: \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\") " pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.407202 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-ssh-key\") pod \"configure-os-openstack-openstack-cell1-7gfz2\" (UID: \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\") " pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.412080 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-inventory\") pod \"configure-os-openstack-openstack-cell1-7gfz2\" (UID: \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\") " pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.416486 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-ceph\") pod \"configure-os-openstack-openstack-cell1-7gfz2\" (UID: \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\") " pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.416636 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-ssh-key\") pod \"configure-os-openstack-openstack-cell1-7gfz2\" (UID: \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\") " pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.426669 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h4fp\" (UniqueName: \"kubernetes.io/projected/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-kube-api-access-4h4fp\") pod \"configure-os-openstack-openstack-cell1-7gfz2\" (UID: \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\") " pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" Dec 01 12:14:31 crc kubenswrapper[4958]: I1201 12:14:31.554037 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" Dec 01 12:14:32 crc kubenswrapper[4958]: I1201 12:14:32.139651 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-7gfz2"] Dec 01 12:14:32 crc kubenswrapper[4958]: W1201 12:14:32.142359 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2fa2a04_6d00_4af5_b501_4e3cc1d47544.slice/crio-7e5bf2c914f05a7c2e67ce742003ce2e2e3b40c6a678f53dd0c00c3e5248e0d9 WatchSource:0}: Error finding container 7e5bf2c914f05a7c2e67ce742003ce2e2e3b40c6a678f53dd0c00c3e5248e0d9: Status 404 returned error can't find the container with id 7e5bf2c914f05a7c2e67ce742003ce2e2e3b40c6a678f53dd0c00c3e5248e0d9 Dec 01 12:14:32 crc kubenswrapper[4958]: I1201 12:14:32.147627 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 12:14:32 crc kubenswrapper[4958]: I1201 12:14:32.501833 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" event={"ID":"d2fa2a04-6d00-4af5-b501-4e3cc1d47544","Type":"ContainerStarted","Data":"7e5bf2c914f05a7c2e67ce742003ce2e2e3b40c6a678f53dd0c00c3e5248e0d9"} Dec 01 12:14:33 crc kubenswrapper[4958]: I1201 12:14:33.517055 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" event={"ID":"d2fa2a04-6d00-4af5-b501-4e3cc1d47544","Type":"ContainerStarted","Data":"8aacde8c1a38d2e5f0745673d61e3c62ae3f19aa5c7a2c5143528c6c193880e6"} Dec 01 12:14:33 crc kubenswrapper[4958]: I1201 12:14:33.541838 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" podStartSLOduration=2.342603009 podStartE2EDuration="2.541816762s" podCreationTimestamp="2025-12-01 12:14:31 +0000 UTC" firstStartedPulling="2025-12-01 12:14:32.14715904 +0000 UTC m=+8119.655948117" lastFinishedPulling="2025-12-01 12:14:32.346372823 +0000 UTC m=+8119.855161870" observedRunningTime="2025-12-01 12:14:33.539820216 +0000 UTC m=+8121.048609253" watchObservedRunningTime="2025-12-01 12:14:33.541816762 +0000 UTC m=+8121.050605819" Dec 01 12:15:00 crc kubenswrapper[4958]: I1201 12:15:00.239775 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g"] Dec 01 12:15:00 crc kubenswrapper[4958]: I1201 12:15:00.242120 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g" Dec 01 12:15:00 crc kubenswrapper[4958]: I1201 12:15:00.243946 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 12:15:00 crc kubenswrapper[4958]: I1201 12:15:00.244187 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 12:15:00 crc kubenswrapper[4958]: I1201 12:15:00.248881 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g"] Dec 01 12:15:00 crc kubenswrapper[4958]: I1201 12:15:00.402425 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv5dr\" (UniqueName: \"kubernetes.io/projected/d0b4f776-f89f-4782-a998-e271d7f657f0-kube-api-access-jv5dr\") pod \"collect-profiles-29409855-mhv9g\" (UID: \"d0b4f776-f89f-4782-a998-e271d7f657f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g" Dec 01 12:15:00 crc kubenswrapper[4958]: I1201 12:15:00.403496 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0b4f776-f89f-4782-a998-e271d7f657f0-secret-volume\") pod \"collect-profiles-29409855-mhv9g\" (UID: \"d0b4f776-f89f-4782-a998-e271d7f657f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g" Dec 01 12:15:00 crc kubenswrapper[4958]: I1201 12:15:00.403589 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0b4f776-f89f-4782-a998-e271d7f657f0-config-volume\") pod \"collect-profiles-29409855-mhv9g\" (UID: \"d0b4f776-f89f-4782-a998-e271d7f657f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g" Dec 01 12:15:00 crc kubenswrapper[4958]: I1201 12:15:00.506070 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0b4f776-f89f-4782-a998-e271d7f657f0-secret-volume\") pod \"collect-profiles-29409855-mhv9g\" (UID: \"d0b4f776-f89f-4782-a998-e271d7f657f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g" Dec 01 12:15:00 crc kubenswrapper[4958]: I1201 12:15:00.506249 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0b4f776-f89f-4782-a998-e271d7f657f0-config-volume\") pod \"collect-profiles-29409855-mhv9g\" (UID: \"d0b4f776-f89f-4782-a998-e271d7f657f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g" Dec 01 12:15:00 crc kubenswrapper[4958]: I1201 12:15:00.506528 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv5dr\" (UniqueName: \"kubernetes.io/projected/d0b4f776-f89f-4782-a998-e271d7f657f0-kube-api-access-jv5dr\") pod \"collect-profiles-29409855-mhv9g\" (UID: \"d0b4f776-f89f-4782-a998-e271d7f657f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g" Dec 01 12:15:00 crc kubenswrapper[4958]: I1201 12:15:00.511696 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0b4f776-f89f-4782-a998-e271d7f657f0-config-volume\") pod 
\"collect-profiles-29409855-mhv9g\" (UID: \"d0b4f776-f89f-4782-a998-e271d7f657f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g" Dec 01 12:15:00 crc kubenswrapper[4958]: I1201 12:15:00.528708 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0b4f776-f89f-4782-a998-e271d7f657f0-secret-volume\") pod \"collect-profiles-29409855-mhv9g\" (UID: \"d0b4f776-f89f-4782-a998-e271d7f657f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g" Dec 01 12:15:00 crc kubenswrapper[4958]: I1201 12:15:00.540144 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv5dr\" (UniqueName: \"kubernetes.io/projected/d0b4f776-f89f-4782-a998-e271d7f657f0-kube-api-access-jv5dr\") pod \"collect-profiles-29409855-mhv9g\" (UID: \"d0b4f776-f89f-4782-a998-e271d7f657f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g" Dec 01 12:15:00 crc kubenswrapper[4958]: I1201 12:15:00.569014 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g" Dec 01 12:15:01 crc kubenswrapper[4958]: I1201 12:15:01.133703 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g"] Dec 01 12:15:01 crc kubenswrapper[4958]: I1201 12:15:01.940879 4958 generic.go:334] "Generic (PLEG): container finished" podID="d0b4f776-f89f-4782-a998-e271d7f657f0" containerID="b0e999efb6d32e294a6209c3f968abcd7e946a7e95a3fefac68948978dba083d" exitCode=0 Dec 01 12:15:01 crc kubenswrapper[4958]: I1201 12:15:01.941158 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g" event={"ID":"d0b4f776-f89f-4782-a998-e271d7f657f0","Type":"ContainerDied","Data":"b0e999efb6d32e294a6209c3f968abcd7e946a7e95a3fefac68948978dba083d"} Dec 01 12:15:01 crc kubenswrapper[4958]: I1201 12:15:01.941192 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g" event={"ID":"d0b4f776-f89f-4782-a998-e271d7f657f0","Type":"ContainerStarted","Data":"150b5216308bbf3a154ecd2b92be33ca7b01b69ae2f0f2f4f702b8305fcf65dd"} Dec 01 12:15:03 crc kubenswrapper[4958]: I1201 12:15:03.469072 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g" Dec 01 12:15:03 crc kubenswrapper[4958]: I1201 12:15:03.475210 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0b4f776-f89f-4782-a998-e271d7f657f0-config-volume\") pod \"d0b4f776-f89f-4782-a998-e271d7f657f0\" (UID: \"d0b4f776-f89f-4782-a998-e271d7f657f0\") " Dec 01 12:15:03 crc kubenswrapper[4958]: I1201 12:15:03.475348 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0b4f776-f89f-4782-a998-e271d7f657f0-secret-volume\") pod \"d0b4f776-f89f-4782-a998-e271d7f657f0\" (UID: \"d0b4f776-f89f-4782-a998-e271d7f657f0\") " Dec 01 12:15:03 crc kubenswrapper[4958]: I1201 12:15:03.475471 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv5dr\" (UniqueName: \"kubernetes.io/projected/d0b4f776-f89f-4782-a998-e271d7f657f0-kube-api-access-jv5dr\") pod \"d0b4f776-f89f-4782-a998-e271d7f657f0\" (UID: \"d0b4f776-f89f-4782-a998-e271d7f657f0\") " Dec 01 12:15:03 crc kubenswrapper[4958]: I1201 12:15:03.477693 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0b4f776-f89f-4782-a998-e271d7f657f0-config-volume" (OuterVolumeSpecName: "config-volume") pod "d0b4f776-f89f-4782-a998-e271d7f657f0" (UID: "d0b4f776-f89f-4782-a998-e271d7f657f0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 12:15:03 crc kubenswrapper[4958]: I1201 12:15:03.482578 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b4f776-f89f-4782-a998-e271d7f657f0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d0b4f776-f89f-4782-a998-e271d7f657f0" (UID: "d0b4f776-f89f-4782-a998-e271d7f657f0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:15:03 crc kubenswrapper[4958]: I1201 12:15:03.483682 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0b4f776-f89f-4782-a998-e271d7f657f0-kube-api-access-jv5dr" (OuterVolumeSpecName: "kube-api-access-jv5dr") pod "d0b4f776-f89f-4782-a998-e271d7f657f0" (UID: "d0b4f776-f89f-4782-a998-e271d7f657f0"). InnerVolumeSpecName "kube-api-access-jv5dr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:15:03 crc kubenswrapper[4958]: I1201 12:15:03.578724 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0b4f776-f89f-4782-a998-e271d7f657f0-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 12:15:03 crc kubenswrapper[4958]: I1201 12:15:03.579213 4958 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0b4f776-f89f-4782-a998-e271d7f657f0-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 12:15:03 crc kubenswrapper[4958]: I1201 12:15:03.579237 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv5dr\" (UniqueName: \"kubernetes.io/projected/d0b4f776-f89f-4782-a998-e271d7f657f0-kube-api-access-jv5dr\") on node \"crc\" DevicePath \"\"" Dec 01 12:15:03 crc kubenswrapper[4958]: I1201 12:15:03.970323 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g" event={"ID":"d0b4f776-f89f-4782-a998-e271d7f657f0","Type":"ContainerDied","Data":"150b5216308bbf3a154ecd2b92be33ca7b01b69ae2f0f2f4f702b8305fcf65dd"} Dec 01 12:15:03 crc kubenswrapper[4958]: I1201 12:15:03.970372 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="150b5216308bbf3a154ecd2b92be33ca7b01b69ae2f0f2f4f702b8305fcf65dd" Dec 01 12:15:03 crc kubenswrapper[4958]: I1201 12:15:03.970424 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g" Dec 01 12:15:04 crc kubenswrapper[4958]: I1201 12:15:04.585827 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf"] Dec 01 12:15:04 crc kubenswrapper[4958]: I1201 12:15:04.596814 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409810-l7nsf"] Dec 01 12:15:05 crc kubenswrapper[4958]: I1201 12:15:05.816452 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19ca6d04-613b-4a76-bcfd-bf3e86d635ce" path="/var/lib/kubelet/pods/19ca6d04-613b-4a76-bcfd-bf3e86d635ce/volumes" Dec 01 12:15:23 crc kubenswrapper[4958]: I1201 12:15:23.231611 4958 generic.go:334] "Generic (PLEG): container finished" podID="d2fa2a04-6d00-4af5-b501-4e3cc1d47544" containerID="8aacde8c1a38d2e5f0745673d61e3c62ae3f19aa5c7a2c5143528c6c193880e6" exitCode=0 Dec 01 12:15:23 crc kubenswrapper[4958]: I1201 12:15:23.231682 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" event={"ID":"d2fa2a04-6d00-4af5-b501-4e3cc1d47544","Type":"ContainerDied","Data":"8aacde8c1a38d2e5f0745673d61e3c62ae3f19aa5c7a2c5143528c6c193880e6"} Dec 01 12:15:24 crc kubenswrapper[4958]: I1201 12:15:24.750654 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" Dec 01 12:15:24 crc kubenswrapper[4958]: I1201 12:15:24.795006 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h4fp\" (UniqueName: \"kubernetes.io/projected/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-kube-api-access-4h4fp\") pod \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\" (UID: \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\") " Dec 01 12:15:24 crc kubenswrapper[4958]: I1201 12:15:24.795157 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-ceph\") pod \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\" (UID: \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\") " Dec 01 12:15:24 crc kubenswrapper[4958]: I1201 12:15:24.795201 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-ssh-key\") pod \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\" (UID: \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\") " Dec 01 12:15:24 crc kubenswrapper[4958]: I1201 12:15:24.795397 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-inventory\") pod \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\" (UID: \"d2fa2a04-6d00-4af5-b501-4e3cc1d47544\") " Dec 01 12:15:24 crc kubenswrapper[4958]: I1201 12:15:24.803620 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-ceph" (OuterVolumeSpecName: "ceph") pod "d2fa2a04-6d00-4af5-b501-4e3cc1d47544" (UID: "d2fa2a04-6d00-4af5-b501-4e3cc1d47544"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:15:24 crc kubenswrapper[4958]: I1201 12:15:24.804327 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-kube-api-access-4h4fp" (OuterVolumeSpecName: "kube-api-access-4h4fp") pod "d2fa2a04-6d00-4af5-b501-4e3cc1d47544" (UID: "d2fa2a04-6d00-4af5-b501-4e3cc1d47544"). InnerVolumeSpecName "kube-api-access-4h4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:15:24 crc kubenswrapper[4958]: I1201 12:15:24.846233 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d2fa2a04-6d00-4af5-b501-4e3cc1d47544" (UID: "d2fa2a04-6d00-4af5-b501-4e3cc1d47544"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:15:24 crc kubenswrapper[4958]: I1201 12:15:24.859155 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-inventory" (OuterVolumeSpecName: "inventory") pod "d2fa2a04-6d00-4af5-b501-4e3cc1d47544" (UID: "d2fa2a04-6d00-4af5-b501-4e3cc1d47544"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:15:24 crc kubenswrapper[4958]: I1201 12:15:24.898970 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h4fp\" (UniqueName: \"kubernetes.io/projected/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-kube-api-access-4h4fp\") on node \"crc\" DevicePath \"\"" Dec 01 12:15:24 crc kubenswrapper[4958]: I1201 12:15:24.899023 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:15:24 crc kubenswrapper[4958]: I1201 12:15:24.899043 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:15:24 crc kubenswrapper[4958]: I1201 12:15:24.899059 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2fa2a04-6d00-4af5-b501-4e3cc1d47544-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.256691 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" event={"ID":"d2fa2a04-6d00-4af5-b501-4e3cc1d47544","Type":"ContainerDied","Data":"7e5bf2c914f05a7c2e67ce742003ce2e2e3b40c6a678f53dd0c00c3e5248e0d9"} Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.256744 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e5bf2c914f05a7c2e67ce742003ce2e2e3b40c6a678f53dd0c00c3e5248e0d9" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.256763 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-7gfz2" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.357766 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-ddsrh"] Dec 01 12:15:25 crc kubenswrapper[4958]: E1201 12:15:25.358403 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b4f776-f89f-4782-a998-e271d7f657f0" containerName="collect-profiles" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.358425 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b4f776-f89f-4782-a998-e271d7f657f0" containerName="collect-profiles" Dec 01 12:15:25 crc kubenswrapper[4958]: E1201 12:15:25.358470 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2fa2a04-6d00-4af5-b501-4e3cc1d47544" containerName="configure-os-openstack-openstack-cell1" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.358484 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2fa2a04-6d00-4af5-b501-4e3cc1d47544" containerName="configure-os-openstack-openstack-cell1" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.359000 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2fa2a04-6d00-4af5-b501-4e3cc1d47544" containerName="configure-os-openstack-openstack-cell1" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.359034 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b4f776-f89f-4782-a998-e271d7f657f0" containerName="collect-profiles" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.360000 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-ddsrh" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.362666 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.363134 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.363393 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.363870 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.371545 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-ddsrh"] Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.409183 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2d92337c-ea91-499a-8876-c9bc25e3bb0d-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-ddsrh\" (UID: \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\") " pod="openstack/ssh-known-hosts-openstack-ddsrh" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.409343 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k8f8\" (UniqueName: \"kubernetes.io/projected/2d92337c-ea91-499a-8876-c9bc25e3bb0d-kube-api-access-5k8f8\") pod \"ssh-known-hosts-openstack-ddsrh\" (UID: \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\") " pod="openstack/ssh-known-hosts-openstack-ddsrh" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.409405 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d92337c-ea91-499a-8876-c9bc25e3bb0d-ceph\") pod \"ssh-known-hosts-openstack-ddsrh\" (UID: \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\") " pod="openstack/ssh-known-hosts-openstack-ddsrh" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.409477 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2d92337c-ea91-499a-8876-c9bc25e3bb0d-inventory-0\") pod \"ssh-known-hosts-openstack-ddsrh\" (UID: \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\") " pod="openstack/ssh-known-hosts-openstack-ddsrh" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.511567 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k8f8\" (UniqueName: \"kubernetes.io/projected/2d92337c-ea91-499a-8876-c9bc25e3bb0d-kube-api-access-5k8f8\") pod \"ssh-known-hosts-openstack-ddsrh\" (UID: \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\") " pod="openstack/ssh-known-hosts-openstack-ddsrh" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.512957 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d92337c-ea91-499a-8876-c9bc25e3bb0d-ceph\") pod \"ssh-known-hosts-openstack-ddsrh\" (UID: \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\") " pod="openstack/ssh-known-hosts-openstack-ddsrh" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.513160 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: 
\"kubernetes.io/secret/2d92337c-ea91-499a-8876-c9bc25e3bb0d-inventory-0\") pod \"ssh-known-hosts-openstack-ddsrh\" (UID: \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\") " pod="openstack/ssh-known-hosts-openstack-ddsrh" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.513338 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2d92337c-ea91-499a-8876-c9bc25e3bb0d-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-ddsrh\" (UID: \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\") " pod="openstack/ssh-known-hosts-openstack-ddsrh" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.517315 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2d92337c-ea91-499a-8876-c9bc25e3bb0d-inventory-0\") pod \"ssh-known-hosts-openstack-ddsrh\" (UID: \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\") " pod="openstack/ssh-known-hosts-openstack-ddsrh" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.518721 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2d92337c-ea91-499a-8876-c9bc25e3bb0d-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-ddsrh\" (UID: \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\") " pod="openstack/ssh-known-hosts-openstack-ddsrh" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.519235 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d92337c-ea91-499a-8876-c9bc25e3bb0d-ceph\") pod \"ssh-known-hosts-openstack-ddsrh\" (UID: \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\") " pod="openstack/ssh-known-hosts-openstack-ddsrh" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.543871 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k8f8\" (UniqueName: \"kubernetes.io/projected/2d92337c-ea91-499a-8876-c9bc25e3bb0d-kube-api-access-5k8f8\") pod \"ssh-known-hosts-openstack-ddsrh\" (UID: \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\") " pod="openstack/ssh-known-hosts-openstack-ddsrh" Dec 01 12:15:25 crc kubenswrapper[4958]: I1201 12:15:25.679683 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-ddsrh" Dec 01 12:15:26 crc kubenswrapper[4958]: W1201 12:15:26.276275 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d92337c_ea91_499a_8876_c9bc25e3bb0d.slice/crio-0e344fac18ad320dd48bf601665f0b7b13733dc0d630670ee90624a2b6c6c54d WatchSource:0}: Error finding container 0e344fac18ad320dd48bf601665f0b7b13733dc0d630670ee90624a2b6c6c54d: Status 404 returned error can't find the container with id 0e344fac18ad320dd48bf601665f0b7b13733dc0d630670ee90624a2b6c6c54d Dec 01 12:15:26 crc kubenswrapper[4958]: I1201 12:15:26.284807 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-ddsrh"] Dec 01 12:15:27 crc kubenswrapper[4958]: I1201 12:15:27.287423 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-ddsrh" event={"ID":"2d92337c-ea91-499a-8876-c9bc25e3bb0d","Type":"ContainerStarted","Data":"3b0aaf4fcb928bd6c09b6ddc54c56361c2b483802c6bcc6dc4000fa08bff80c4"} Dec 01 12:15:27 crc kubenswrapper[4958]: I1201 12:15:27.287944 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-ddsrh" event={"ID":"2d92337c-ea91-499a-8876-c9bc25e3bb0d","Type":"ContainerStarted","Data":"0e344fac18ad320dd48bf601665f0b7b13733dc0d630670ee90624a2b6c6c54d"} Dec 01 12:15:27 crc kubenswrapper[4958]: I1201 12:15:27.332772 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-ddsrh" podStartSLOduration=2.138442283 podStartE2EDuration="2.332739988s" podCreationTimestamp="2025-12-01 12:15:25 +0000 UTC" firstStartedPulling="2025-12-01 12:15:26.278886086 +0000 UTC m=+8173.787675133" lastFinishedPulling="2025-12-01 12:15:26.473183771 +0000 UTC m=+8173.981972838" observedRunningTime="2025-12-01 12:15:27.313451374 +0000 UTC m=+8174.822240441" watchObservedRunningTime="2025-12-01 12:15:27.332739988 +0000 UTC m=+8174.841529055" Dec 01 12:15:31 crc kubenswrapper[4958]: I1201 12:15:31.414433 4958 scope.go:117] "RemoveContainer" containerID="32e8cb3d01a8b2bec2d396a42c7b2f3f6c8db35b895ce9ad838796bcc321a8f9" Dec 01 12:15:36 crc kubenswrapper[4958]: I1201 12:15:36.393502 4958 generic.go:334] "Generic (PLEG): container finished" podID="2d92337c-ea91-499a-8876-c9bc25e3bb0d" containerID="3b0aaf4fcb928bd6c09b6ddc54c56361c2b483802c6bcc6dc4000fa08bff80c4" exitCode=0 Dec 01 12:15:36 crc kubenswrapper[4958]: I1201 12:15:36.393629 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-ddsrh" event={"ID":"2d92337c-ea91-499a-8876-c9bc25e3bb0d","Type":"ContainerDied","Data":"3b0aaf4fcb928bd6c09b6ddc54c56361c2b483802c6bcc6dc4000fa08bff80c4"} Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.047632 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-ddsrh" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.222755 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k8f8\" (UniqueName: \"kubernetes.io/projected/2d92337c-ea91-499a-8876-c9bc25e3bb0d-kube-api-access-5k8f8\") pod \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\" (UID: \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\") " Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.222823 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d92337c-ea91-499a-8876-c9bc25e3bb0d-ceph\") pod \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\" (UID: \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\") " Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.222982 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2d92337c-ea91-499a-8876-c9bc25e3bb0d-inventory-0\") pod \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\" (UID: \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\") " Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.223078 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2d92337c-ea91-499a-8876-c9bc25e3bb0d-ssh-key-openstack-cell1\") pod \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\" (UID: \"2d92337c-ea91-499a-8876-c9bc25e3bb0d\") " Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.229428 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d92337c-ea91-499a-8876-c9bc25e3bb0d-ceph" (OuterVolumeSpecName: "ceph") pod "2d92337c-ea91-499a-8876-c9bc25e3bb0d" (UID: "2d92337c-ea91-499a-8876-c9bc25e3bb0d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.242262 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d92337c-ea91-499a-8876-c9bc25e3bb0d-kube-api-access-5k8f8" (OuterVolumeSpecName: "kube-api-access-5k8f8") pod "2d92337c-ea91-499a-8876-c9bc25e3bb0d" (UID: "2d92337c-ea91-499a-8876-c9bc25e3bb0d"). InnerVolumeSpecName "kube-api-access-5k8f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.255571 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d92337c-ea91-499a-8876-c9bc25e3bb0d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "2d92337c-ea91-499a-8876-c9bc25e3bb0d" (UID: "2d92337c-ea91-499a-8876-c9bc25e3bb0d"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.287818 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d92337c-ea91-499a-8876-c9bc25e3bb0d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "2d92337c-ea91-499a-8876-c9bc25e3bb0d" (UID: "2d92337c-ea91-499a-8876-c9bc25e3bb0d"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.326093 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k8f8\" (UniqueName: \"kubernetes.io/projected/2d92337c-ea91-499a-8876-c9bc25e3bb0d-kube-api-access-5k8f8\") on node \"crc\" DevicePath \"\"" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.326458 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d92337c-ea91-499a-8876-c9bc25e3bb0d-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.326478 4958 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2d92337c-ea91-499a-8876-c9bc25e3bb0d-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.326498 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2d92337c-ea91-499a-8876-c9bc25e3bb0d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.421159 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-ddsrh" event={"ID":"2d92337c-ea91-499a-8876-c9bc25e3bb0d","Type":"ContainerDied","Data":"0e344fac18ad320dd48bf601665f0b7b13733dc0d630670ee90624a2b6c6c54d"} Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.421221 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-ddsrh" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.421228 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e344fac18ad320dd48bf601665f0b7b13733dc0d630670ee90624a2b6c6c54d" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.548052 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-8ttg8"] Dec 01 12:15:38 crc kubenswrapper[4958]: E1201 12:15:38.548713 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d92337c-ea91-499a-8876-c9bc25e3bb0d" containerName="ssh-known-hosts-openstack" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.548740 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d92337c-ea91-499a-8876-c9bc25e3bb0d" containerName="ssh-known-hosts-openstack" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.549061 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d92337c-ea91-499a-8876-c9bc25e3bb0d" containerName="ssh-known-hosts-openstack" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.550027 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-8ttg8" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.553082 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.553120 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.553426 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.559938 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-8ttg8"] Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.560732 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.637650 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-ssh-key\") pod \"run-os-openstack-openstack-cell1-8ttg8\" (UID: \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\") " pod="openstack/run-os-openstack-openstack-cell1-8ttg8" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.637864 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzt6h\" (UniqueName: \"kubernetes.io/projected/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-kube-api-access-xzt6h\") pod \"run-os-openstack-openstack-cell1-8ttg8\" (UID: \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\") " pod="openstack/run-os-openstack-openstack-cell1-8ttg8" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.638154 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-inventory\") pod \"run-os-openstack-openstack-cell1-8ttg8\" (UID: \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\") " pod="openstack/run-os-openstack-openstack-cell1-8ttg8" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.638260 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-ceph\") pod \"run-os-openstack-openstack-cell1-8ttg8\" (UID: \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\") " pod="openstack/run-os-openstack-openstack-cell1-8ttg8" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.740806 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-inventory\") pod \"run-os-openstack-openstack-cell1-8ttg8\" (UID: \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\") " pod="openstack/run-os-openstack-openstack-cell1-8ttg8" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.740919 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-ceph\") pod \"run-os-openstack-openstack-cell1-8ttg8\" (UID: \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\") " pod="openstack/run-os-openstack-openstack-cell1-8ttg8" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.741056 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-ssh-key\") pod \"run-os-openstack-openstack-cell1-8ttg8\" (UID: \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\") " pod="openstack/run-os-openstack-openstack-cell1-8ttg8" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.741117 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzt6h\" (UniqueName: \"kubernetes.io/projected/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-kube-api-access-xzt6h\") pod \"run-os-openstack-openstack-cell1-8ttg8\" (UID: \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\") " pod="openstack/run-os-openstack-openstack-cell1-8ttg8" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.745873 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-ssh-key\") pod \"run-os-openstack-openstack-cell1-8ttg8\" (UID: \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\") " pod="openstack/run-os-openstack-openstack-cell1-8ttg8" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.745967 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-inventory\") pod \"run-os-openstack-openstack-cell1-8ttg8\" (UID: \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\") " pod="openstack/run-os-openstack-openstack-cell1-8ttg8" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.746278 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-ceph\") pod \"run-os-openstack-openstack-cell1-8ttg8\" (UID: \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\") " pod="openstack/run-os-openstack-openstack-cell1-8ttg8" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.777924 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzt6h\" (UniqueName: \"kubernetes.io/projected/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-kube-api-access-xzt6h\") pod \"run-os-openstack-openstack-cell1-8ttg8\" (UID: \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\") " pod="openstack/run-os-openstack-openstack-cell1-8ttg8" Dec 01 12:15:38 crc kubenswrapper[4958]: I1201 12:15:38.877061 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-8ttg8" Dec 01 12:15:39 crc kubenswrapper[4958]: W1201 12:15:39.503699 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd3bfb4b_6143_47b7_8415_610c1fbeaeaf.slice/crio-6281a64f8ca946117d8426c3cac5ce171a6080d9e5dab850c6f3f1eaf5bc7418 WatchSource:0}: Error finding container 6281a64f8ca946117d8426c3cac5ce171a6080d9e5dab850c6f3f1eaf5bc7418: Status 404 returned error can't find the container with id 6281a64f8ca946117d8426c3cac5ce171a6080d9e5dab850c6f3f1eaf5bc7418 Dec 01 12:15:39 crc kubenswrapper[4958]: I1201 12:15:39.505053 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-8ttg8"] Dec 01 12:15:40 crc kubenswrapper[4958]: I1201 12:15:40.452579 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-8ttg8" event={"ID":"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf","Type":"ContainerStarted","Data":"692ec28ea07c3a9ba79beadafcdb061f53602e733481a9af21dbb454c1658f54"} Dec 01 12:15:40 crc kubenswrapper[4958]: I1201 12:15:40.452966 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-8ttg8" event={"ID":"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf","Type":"ContainerStarted","Data":"6281a64f8ca946117d8426c3cac5ce171a6080d9e5dab850c6f3f1eaf5bc7418"} Dec 01 12:15:40 crc kubenswrapper[4958]: I1201 12:15:40.476911 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-8ttg8" podStartSLOduration=2.2877130279999998 podStartE2EDuration="2.476892868s" podCreationTimestamp="2025-12-01 12:15:38 +0000 UTC" firstStartedPulling="2025-12-01 12:15:39.508206217 +0000 UTC m=+8187.016995264" lastFinishedPulling="2025-12-01 12:15:39.697386067 +0000 UTC m=+8187.206175104" observedRunningTime="2025-12-01 12:15:40.468875562 +0000 UTC m=+8187.977664599" watchObservedRunningTime="2025-12-01 12:15:40.476892868 +0000 UTC m=+8187.985681905" Dec 01 12:15:49 crc kubenswrapper[4958]: I1201 12:15:49.556580 4958 generic.go:334] "Generic (PLEG): container finished" podID="cd3bfb4b-6143-47b7-8415-610c1fbeaeaf" containerID="692ec28ea07c3a9ba79beadafcdb061f53602e733481a9af21dbb454c1658f54" exitCode=0 Dec 01 12:15:49 crc kubenswrapper[4958]: I1201 12:15:49.556732 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-8ttg8" event={"ID":"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf","Type":"ContainerDied","Data":"692ec28ea07c3a9ba79beadafcdb061f53602e733481a9af21dbb454c1658f54"} Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.240813 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-8ttg8" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.359754 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-ceph\") pod \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\" (UID: \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\") " Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.359862 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-inventory\") pod \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\" (UID: \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\") " Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.359966 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-ssh-key\") pod \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\" (UID: \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\") " Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.360118 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzt6h\" (UniqueName: \"kubernetes.io/projected/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-kube-api-access-xzt6h\") pod \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\" (UID: \"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf\") " Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.385113 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-ceph" (OuterVolumeSpecName: "ceph") pod "cd3bfb4b-6143-47b7-8415-610c1fbeaeaf" (UID: "cd3bfb4b-6143-47b7-8415-610c1fbeaeaf"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.387953 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-kube-api-access-xzt6h" (OuterVolumeSpecName: "kube-api-access-xzt6h") pod "cd3bfb4b-6143-47b7-8415-610c1fbeaeaf" (UID: "cd3bfb4b-6143-47b7-8415-610c1fbeaeaf"). InnerVolumeSpecName "kube-api-access-xzt6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.415598 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-inventory" (OuterVolumeSpecName: "inventory") pod "cd3bfb4b-6143-47b7-8415-610c1fbeaeaf" (UID: "cd3bfb4b-6143-47b7-8415-610c1fbeaeaf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.432108 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cd3bfb4b-6143-47b7-8415-610c1fbeaeaf" (UID: "cd3bfb4b-6143-47b7-8415-610c1fbeaeaf"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.462740 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.462775 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.462798 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.462808 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzt6h\" (UniqueName: \"kubernetes.io/projected/cd3bfb4b-6143-47b7-8415-610c1fbeaeaf-kube-api-access-xzt6h\") on node \"crc\" DevicePath \"\"" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.582645 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-8ttg8" event={"ID":"cd3bfb4b-6143-47b7-8415-610c1fbeaeaf","Type":"ContainerDied","Data":"6281a64f8ca946117d8426c3cac5ce171a6080d9e5dab850c6f3f1eaf5bc7418"} Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.582677 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-8ttg8" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.582684 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6281a64f8ca946117d8426c3cac5ce171a6080d9e5dab850c6f3f1eaf5bc7418" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.679730 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-zsrwt"] Dec 01 12:15:51 crc kubenswrapper[4958]: E1201 12:15:51.680198 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd3bfb4b-6143-47b7-8415-610c1fbeaeaf" containerName="run-os-openstack-openstack-cell1" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.680215 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd3bfb4b-6143-47b7-8415-610c1fbeaeaf" containerName="run-os-openstack-openstack-cell1" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.680455 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd3bfb4b-6143-47b7-8415-610c1fbeaeaf" containerName="run-os-openstack-openstack-cell1" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.681220 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.684171 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.684342 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.684500 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.685830 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.693131 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-zsrwt"] Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.771395 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-inventory\") pod \"reboot-os-openstack-openstack-cell1-zsrwt\" (UID: \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\") " pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.771574 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-ceph\") pod \"reboot-os-openstack-openstack-cell1-zsrwt\" (UID: \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\") " pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.771596 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-zsrwt\" (UID: \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\") " pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.771775 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wqzd\" (UniqueName: \"kubernetes.io/projected/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-kube-api-access-5wqzd\") pod \"reboot-os-openstack-openstack-cell1-zsrwt\" (UID: \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\") " pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.874982 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-inventory\") pod \"reboot-os-openstack-openstack-cell1-zsrwt\" (UID: \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\") " pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.876285 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-ceph\") pod \"reboot-os-openstack-openstack-cell1-zsrwt\" (UID: \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\") " pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.876431 4958 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-zsrwt\" (UID: \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\") " pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.876788 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wqzd\" (UniqueName: \"kubernetes.io/projected/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-kube-api-access-5wqzd\") pod \"reboot-os-openstack-openstack-cell1-zsrwt\" (UID: \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\") " pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.880955 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-zsrwt\" (UID: \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\") " pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.883217 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-ceph\") pod \"reboot-os-openstack-openstack-cell1-zsrwt\" (UID: \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\") " pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.888792 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-inventory\") pod \"reboot-os-openstack-openstack-cell1-zsrwt\" (UID: \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\") " pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" Dec 01 12:15:51 crc kubenswrapper[4958]: I1201 12:15:51.901632 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wqzd\" (UniqueName: \"kubernetes.io/projected/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-kube-api-access-5wqzd\") pod \"reboot-os-openstack-openstack-cell1-zsrwt\" (UID: \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\") " pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" Dec 01 12:15:52 crc kubenswrapper[4958]: I1201 12:15:52.048075 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" Dec 01 12:15:52 crc kubenswrapper[4958]: I1201 12:15:52.864589 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-zsrwt"] Dec 01 12:15:52 crc kubenswrapper[4958]: W1201 12:15:52.865085 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b84f1ca_719f_41ac_a6ba_3e88a52a5402.slice/crio-7a519a9adddd82263750af04f24aad34272849d8f6cc0a259830fb58fdabc8b1 WatchSource:0}: Error finding container 7a519a9adddd82263750af04f24aad34272849d8f6cc0a259830fb58fdabc8b1: Status 404 returned error can't find the container with id 7a519a9adddd82263750af04f24aad34272849d8f6cc0a259830fb58fdabc8b1 Dec 01 12:15:53 crc kubenswrapper[4958]: I1201 12:15:53.605166 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" event={"ID":"9b84f1ca-719f-41ac-a6ba-3e88a52a5402","Type":"ContainerStarted","Data":"e593267d554bf6139d46be524ff83f047d72d4cb72c24d7e3e086522b1f818dc"} Dec 01 12:15:53 crc kubenswrapper[4958]: I1201 12:15:53.605753 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" event={"ID":"9b84f1ca-719f-41ac-a6ba-3e88a52a5402","Type":"ContainerStarted","Data":"7a519a9adddd82263750af04f24aad34272849d8f6cc0a259830fb58fdabc8b1"} Dec 01 12:15:53 crc kubenswrapper[4958]: I1201 12:15:53.633805 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" podStartSLOduration=2.427497527 podStartE2EDuration="2.633769388s" podCreationTimestamp="2025-12-01 12:15:51 +0000 UTC" firstStartedPulling="2025-12-01 12:15:52.869391812 +0000 UTC m=+8200.378180889" lastFinishedPulling="2025-12-01 12:15:53.075663673 +0000 UTC m=+8200.584452750" observedRunningTime="2025-12-01 12:15:53.621983726 +0000 UTC m=+8201.130772773" watchObservedRunningTime="2025-12-01 12:15:53.633769388 +0000 UTC m=+8201.142558445" Dec 01 12:16:10 crc kubenswrapper[4958]: I1201 12:16:10.835474 4958 generic.go:334] "Generic (PLEG): container finished" podID="9b84f1ca-719f-41ac-a6ba-3e88a52a5402" containerID="e593267d554bf6139d46be524ff83f047d72d4cb72c24d7e3e086522b1f818dc" exitCode=0 Dec 01 12:16:10 crc kubenswrapper[4958]: I1201 12:16:10.835567 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" event={"ID":"9b84f1ca-719f-41ac-a6ba-3e88a52a5402","Type":"ContainerDied","Data":"e593267d554bf6139d46be524ff83f047d72d4cb72c24d7e3e086522b1f818dc"} Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.483512 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.617649 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-inventory\") pod \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\" (UID: \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\") " Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.617880 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-ssh-key\") pod \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\" (UID: \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\") " Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.617975 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wqzd\" (UniqueName: \"kubernetes.io/projected/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-kube-api-access-5wqzd\") pod \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\" (UID: \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\") " Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.618246 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-ceph\") pod \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\" (UID: \"9b84f1ca-719f-41ac-a6ba-3e88a52a5402\") " Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.624242 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-ceph" (OuterVolumeSpecName: "ceph") pod "9b84f1ca-719f-41ac-a6ba-3e88a52a5402" (UID: "9b84f1ca-719f-41ac-a6ba-3e88a52a5402"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.626133 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-kube-api-access-5wqzd" (OuterVolumeSpecName: "kube-api-access-5wqzd") pod "9b84f1ca-719f-41ac-a6ba-3e88a52a5402" (UID: "9b84f1ca-719f-41ac-a6ba-3e88a52a5402"). InnerVolumeSpecName "kube-api-access-5wqzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.661217 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-inventory" (OuterVolumeSpecName: "inventory") pod "9b84f1ca-719f-41ac-a6ba-3e88a52a5402" (UID: "9b84f1ca-719f-41ac-a6ba-3e88a52a5402"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.663276 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9b84f1ca-719f-41ac-a6ba-3e88a52a5402" (UID: "9b84f1ca-719f-41ac-a6ba-3e88a52a5402"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.722440 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.722481 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.722496 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wqzd\" (UniqueName: \"kubernetes.io/projected/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-kube-api-access-5wqzd\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.722509 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b84f1ca-719f-41ac-a6ba-3e88a52a5402-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.863300 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" event={"ID":"9b84f1ca-719f-41ac-a6ba-3e88a52a5402","Type":"ContainerDied","Data":"7a519a9adddd82263750af04f24aad34272849d8f6cc0a259830fb58fdabc8b1"} Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.863386 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a519a9adddd82263750af04f24aad34272849d8f6cc0a259830fb58fdabc8b1" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.863476 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-zsrwt" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.957493 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-rjxb4"] Dec 01 12:16:12 crc kubenswrapper[4958]: E1201 12:16:12.958152 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b84f1ca-719f-41ac-a6ba-3e88a52a5402" containerName="reboot-os-openstack-openstack-cell1" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.958176 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b84f1ca-719f-41ac-a6ba-3e88a52a5402" containerName="reboot-os-openstack-openstack-cell1" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.958379 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b84f1ca-719f-41ac-a6ba-3e88a52a5402" containerName="reboot-os-openstack-openstack-cell1" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.959245 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.960962 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.965322 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.965356 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.965493 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 12:16:12 crc kubenswrapper[4958]: I1201 12:16:12.967704 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-rjxb4"] Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.131781 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-ssh-key\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.132053 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.132333 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.132415 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x4zr\" (UniqueName: \"kubernetes.io/projected/13460e8d-e2a5-48ef-8d39-da6e900da0e5-kube-api-access-9x4zr\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.132516 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.132762 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-bootstrap-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.132809 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-inventory\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.132969 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.133032 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.133071 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-ceph\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.133109 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.133228 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.235683 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.235740 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x4zr\" (UniqueName: 
\"kubernetes.io/projected/13460e8d-e2a5-48ef-8d39-da6e900da0e5-kube-api-access-9x4zr\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.235773 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.235836 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.235880 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-inventory\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.236403 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.236433 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.236533 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-ceph\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.236564 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.236602 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.236626 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-ssh-key\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.236706 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.245730 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-ceph\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.245829 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-inventory\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.245930 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.246496 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.246862 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.247479 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-ovn-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.247508 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-ssh-key\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.253444 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.253959 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.256037 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.256674 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.260032 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x4zr\" (UniqueName: \"kubernetes.io/projected/13460e8d-e2a5-48ef-8d39-da6e900da0e5-kube-api-access-9x4zr\") pod \"install-certs-openstack-openstack-cell1-rjxb4\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.287648 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:13 crc kubenswrapper[4958]: I1201 12:16:13.990769 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-rjxb4"] Dec 01 12:16:14 crc kubenswrapper[4958]: I1201 12:16:14.148425 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 12:16:14 crc kubenswrapper[4958]: I1201 12:16:14.891275 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" event={"ID":"13460e8d-e2a5-48ef-8d39-da6e900da0e5","Type":"ContainerStarted","Data":"b3e261233c85a74b9e2d3ca190d307df971eb80ac5a785e8b4d8f07bcc3d926e"} Dec 01 12:16:14 crc kubenswrapper[4958]: I1201 12:16:14.891574 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" event={"ID":"13460e8d-e2a5-48ef-8d39-da6e900da0e5","Type":"ContainerStarted","Data":"5fa79f67ab6783e94985d302a301acf00113154738b0c7608dff32eff057b7fb"} Dec 01 12:16:14 crc kubenswrapper[4958]: I1201 12:16:14.914418 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" podStartSLOduration=2.767316662 podStartE2EDuration="2.914395146s" podCreationTimestamp="2025-12-01 12:16:12 +0000 UTC" firstStartedPulling="2025-12-01 12:16:13.99881669 +0000 UTC m=+8221.507605757" lastFinishedPulling="2025-12-01 12:16:14.145895204 +0000 UTC m=+8221.654684241" observedRunningTime="2025-12-01 12:16:14.908813399 +0000 UTC m=+8222.417602446" watchObservedRunningTime="2025-12-01 12:16:14.914395146 +0000 UTC m=+8222.423184193" Dec 01 12:16:28 crc kubenswrapper[4958]: I1201 12:16:28.210531 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:16:28 crc kubenswrapper[4958]: I1201 12:16:28.211248 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:16:36 crc kubenswrapper[4958]: I1201 12:16:36.226268 4958 generic.go:334] "Generic (PLEG): container finished" podID="13460e8d-e2a5-48ef-8d39-da6e900da0e5" containerID="b3e261233c85a74b9e2d3ca190d307df971eb80ac5a785e8b4d8f07bcc3d926e" exitCode=0 Dec 01 12:16:36 crc kubenswrapper[4958]: I1201 12:16:36.226376 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" event={"ID":"13460e8d-e2a5-48ef-8d39-da6e900da0e5","Type":"ContainerDied","Data":"b3e261233c85a74b9e2d3ca190d307df971eb80ac5a785e8b4d8f07bcc3d926e"} Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.799878 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.858258 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-bootstrap-combined-ca-bundle\") pod \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.858420 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-ceph\") pod \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.858476 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-neutron-dhcp-combined-ca-bundle\") pod \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.858519 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-neutron-sriov-combined-ca-bundle\") pod \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.858673 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-libvirt-combined-ca-bundle\") pod \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.858772 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-neutron-metadata-combined-ca-bundle\") pod \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.858833 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-ssh-key\") pod \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.858906 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x4zr\" (UniqueName: \"kubernetes.io/projected/13460e8d-e2a5-48ef-8d39-da6e900da0e5-kube-api-access-9x4zr\") pod \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.858981 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-telemetry-combined-ca-bundle\") pod \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.859110 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-ovn-combined-ca-bundle\") pod \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.859163 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-inventory\") pod \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.859218 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-nova-combined-ca-bundle\") pod \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\" (UID: \"13460e8d-e2a5-48ef-8d39-da6e900da0e5\") " Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.879497 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "13460e8d-e2a5-48ef-8d39-da6e900da0e5" (UID: "13460e8d-e2a5-48ef-8d39-da6e900da0e5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.879551 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-ceph" (OuterVolumeSpecName: "ceph") pod "13460e8d-e2a5-48ef-8d39-da6e900da0e5" (UID: "13460e8d-e2a5-48ef-8d39-da6e900da0e5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.879601 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "13460e8d-e2a5-48ef-8d39-da6e900da0e5" (UID: "13460e8d-e2a5-48ef-8d39-da6e900da0e5"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.887299 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "13460e8d-e2a5-48ef-8d39-da6e900da0e5" (UID: "13460e8d-e2a5-48ef-8d39-da6e900da0e5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.890965 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "13460e8d-e2a5-48ef-8d39-da6e900da0e5" (UID: "13460e8d-e2a5-48ef-8d39-da6e900da0e5"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.893914 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13460e8d-e2a5-48ef-8d39-da6e900da0e5-kube-api-access-9x4zr" (OuterVolumeSpecName: "kube-api-access-9x4zr") pod "13460e8d-e2a5-48ef-8d39-da6e900da0e5" (UID: "13460e8d-e2a5-48ef-8d39-da6e900da0e5"). InnerVolumeSpecName "kube-api-access-9x4zr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.894348 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "13460e8d-e2a5-48ef-8d39-da6e900da0e5" (UID: "13460e8d-e2a5-48ef-8d39-da6e900da0e5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.894450 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "13460e8d-e2a5-48ef-8d39-da6e900da0e5" (UID: "13460e8d-e2a5-48ef-8d39-da6e900da0e5"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.895435 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "13460e8d-e2a5-48ef-8d39-da6e900da0e5" (UID: "13460e8d-e2a5-48ef-8d39-da6e900da0e5"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.908022 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "13460e8d-e2a5-48ef-8d39-da6e900da0e5" (UID: "13460e8d-e2a5-48ef-8d39-da6e900da0e5"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.927654 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "13460e8d-e2a5-48ef-8d39-da6e900da0e5" (UID: "13460e8d-e2a5-48ef-8d39-da6e900da0e5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.950001 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-inventory" (OuterVolumeSpecName: "inventory") pod "13460e8d-e2a5-48ef-8d39-da6e900da0e5" (UID: "13460e8d-e2a5-48ef-8d39-da6e900da0e5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.962732 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.962767 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x4zr\" (UniqueName: \"kubernetes.io/projected/13460e8d-e2a5-48ef-8d39-da6e900da0e5-kube-api-access-9x4zr\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.962781 4958 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.962791 4958 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.962803 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.962812 4958 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.962820 4958 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.962828 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.962838 4958 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.962859 4958 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.962868 4958 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:37 crc kubenswrapper[4958]: I1201 12:16:37.962878 4958 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13460e8d-e2a5-48ef-8d39-da6e900da0e5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.248491 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" event={"ID":"13460e8d-e2a5-48ef-8d39-da6e900da0e5","Type":"ContainerDied","Data":"5fa79f67ab6783e94985d302a301acf00113154738b0c7608dff32eff057b7fb"} Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.248537 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fa79f67ab6783e94985d302a301acf00113154738b0c7608dff32eff057b7fb" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.248595 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-rjxb4" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.372592 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-zmmvc"] Dec 01 12:16:38 crc kubenswrapper[4958]: E1201 12:16:38.373458 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13460e8d-e2a5-48ef-8d39-da6e900da0e5" containerName="install-certs-openstack-openstack-cell1" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.373574 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="13460e8d-e2a5-48ef-8d39-da6e900da0e5" containerName="install-certs-openstack-openstack-cell1" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.373953 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="13460e8d-e2a5-48ef-8d39-da6e900da0e5" containerName="install-certs-openstack-openstack-cell1" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.375409 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.378625 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.379138 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.379334 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.379507 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.407649 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-zmmvc"] Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.477079 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cd94035-7ae3-4f86-995f-45f75856c2bd-ceph\") pod \"ceph-client-openstack-openstack-cell1-zmmvc\" (UID: \"2cd94035-7ae3-4f86-995f-45f75856c2bd\") " pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.477830 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cd94035-7ae3-4f86-995f-45f75856c2bd-inventory\") pod \"ceph-client-openstack-openstack-cell1-zmmvc\" (UID: \"2cd94035-7ae3-4f86-995f-45f75856c2bd\") " pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.477969 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cd94035-7ae3-4f86-995f-45f75856c2bd-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-zmmvc\" (UID: \"2cd94035-7ae3-4f86-995f-45f75856c2bd\") " pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.478800 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl5tf\" (UniqueName: \"kubernetes.io/projected/2cd94035-7ae3-4f86-995f-45f75856c2bd-kube-api-access-fl5tf\") pod \"ceph-client-openstack-openstack-cell1-zmmvc\" (UID: \"2cd94035-7ae3-4f86-995f-45f75856c2bd\") " pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.581486 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cd94035-7ae3-4f86-995f-45f75856c2bd-inventory\") pod \"ceph-client-openstack-openstack-cell1-zmmvc\" (UID: \"2cd94035-7ae3-4f86-995f-45f75856c2bd\") " pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.581596 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cd94035-7ae3-4f86-995f-45f75856c2bd-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-zmmvc\" (UID: \"2cd94035-7ae3-4f86-995f-45f75856c2bd\") " pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.581722 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl5tf\" (UniqueName: \"kubernetes.io/projected/2cd94035-7ae3-4f86-995f-45f75856c2bd-kube-api-access-fl5tf\") pod \"ceph-client-openstack-openstack-cell1-zmmvc\" (UID: \"2cd94035-7ae3-4f86-995f-45f75856c2bd\") " pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.581833 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cd94035-7ae3-4f86-995f-45f75856c2bd-ceph\") pod \"ceph-client-openstack-openstack-cell1-zmmvc\" (UID: \"2cd94035-7ae3-4f86-995f-45f75856c2bd\") " pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.589949 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cd94035-7ae3-4f86-995f-45f75856c2bd-inventory\") pod \"ceph-client-openstack-openstack-cell1-zmmvc\" (UID: \"2cd94035-7ae3-4f86-995f-45f75856c2bd\") " pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.595189 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cd94035-7ae3-4f86-995f-45f75856c2bd-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-zmmvc\" (UID: \"2cd94035-7ae3-4f86-995f-45f75856c2bd\") " pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.595242 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cd94035-7ae3-4f86-995f-45f75856c2bd-ceph\") pod \"ceph-client-openstack-openstack-cell1-zmmvc\" (UID: \"2cd94035-7ae3-4f86-995f-45f75856c2bd\") " pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" Dec 01 12:16:38 crc 
kubenswrapper[4958]: I1201 12:16:38.610981 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl5tf\" (UniqueName: \"kubernetes.io/projected/2cd94035-7ae3-4f86-995f-45f75856c2bd-kube-api-access-fl5tf\") pod \"ceph-client-openstack-openstack-cell1-zmmvc\" (UID: \"2cd94035-7ae3-4f86-995f-45f75856c2bd\") " pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" Dec 01 12:16:38 crc kubenswrapper[4958]: I1201 12:16:38.705670 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" Dec 01 12:16:39 crc kubenswrapper[4958]: I1201 12:16:39.361530 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-zmmvc"] Dec 01 12:16:40 crc kubenswrapper[4958]: I1201 12:16:40.269833 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" event={"ID":"2cd94035-7ae3-4f86-995f-45f75856c2bd","Type":"ContainerStarted","Data":"89f2607c19bf0c0b5474c8fab0e713ed0264e99e42969a559054d960ebe5300b"} Dec 01 12:16:40 crc kubenswrapper[4958]: I1201 12:16:40.270451 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" event={"ID":"2cd94035-7ae3-4f86-995f-45f75856c2bd","Type":"ContainerStarted","Data":"0dffef24dab765c707cfa73a02cdcebd05e4bae5fdef77865893941d631bd89e"} Dec 01 12:16:40 crc kubenswrapper[4958]: I1201 12:16:40.291497 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" podStartSLOduration=2.029507764 podStartE2EDuration="2.291476895s" podCreationTimestamp="2025-12-01 12:16:38 +0000 UTC" firstStartedPulling="2025-12-01 12:16:39.370491917 +0000 UTC m=+8246.879280964" lastFinishedPulling="2025-12-01 12:16:39.632461018 +0000 UTC m=+8247.141250095" observedRunningTime="2025-12-01 12:16:40.288634515 +0000 UTC m=+8247.797423572" watchObservedRunningTime="2025-12-01 12:16:40.291476895 +0000 UTC m=+8247.800265932" Dec 01 12:16:46 crc kubenswrapper[4958]: I1201 12:16:46.346157 4958 generic.go:334] "Generic (PLEG): container finished" podID="2cd94035-7ae3-4f86-995f-45f75856c2bd" containerID="89f2607c19bf0c0b5474c8fab0e713ed0264e99e42969a559054d960ebe5300b" exitCode=0 Dec 01 12:16:46 crc kubenswrapper[4958]: I1201 12:16:46.346243 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" event={"ID":"2cd94035-7ae3-4f86-995f-45f75856c2bd","Type":"ContainerDied","Data":"89f2607c19bf0c0b5474c8fab0e713ed0264e99e42969a559054d960ebe5300b"} Dec 01 12:16:47 crc kubenswrapper[4958]: I1201 12:16:47.992775 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.044636 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cd94035-7ae3-4f86-995f-45f75856c2bd-ssh-key\") pod \"2cd94035-7ae3-4f86-995f-45f75856c2bd\" (UID: \"2cd94035-7ae3-4f86-995f-45f75856c2bd\") " Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.044931 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cd94035-7ae3-4f86-995f-45f75856c2bd-inventory\") pod \"2cd94035-7ae3-4f86-995f-45f75856c2bd\" (UID: \"2cd94035-7ae3-4f86-995f-45f75856c2bd\") " Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.045012 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl5tf\" (UniqueName: \"kubernetes.io/projected/2cd94035-7ae3-4f86-995f-45f75856c2bd-kube-api-access-fl5tf\") pod \"2cd94035-7ae3-4f86-995f-45f75856c2bd\" (UID: \"2cd94035-7ae3-4f86-995f-45f75856c2bd\") " Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.045056 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cd94035-7ae3-4f86-995f-45f75856c2bd-ceph\") pod \"2cd94035-7ae3-4f86-995f-45f75856c2bd\" (UID: \"2cd94035-7ae3-4f86-995f-45f75856c2bd\") " Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.053002 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd94035-7ae3-4f86-995f-45f75856c2bd-ceph" (OuterVolumeSpecName: "ceph") pod "2cd94035-7ae3-4f86-995f-45f75856c2bd" (UID: "2cd94035-7ae3-4f86-995f-45f75856c2bd"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.063178 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd94035-7ae3-4f86-995f-45f75856c2bd-kube-api-access-fl5tf" (OuterVolumeSpecName: "kube-api-access-fl5tf") pod "2cd94035-7ae3-4f86-995f-45f75856c2bd" (UID: "2cd94035-7ae3-4f86-995f-45f75856c2bd"). InnerVolumeSpecName "kube-api-access-fl5tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.087330 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd94035-7ae3-4f86-995f-45f75856c2bd-inventory" (OuterVolumeSpecName: "inventory") pod "2cd94035-7ae3-4f86-995f-45f75856c2bd" (UID: "2cd94035-7ae3-4f86-995f-45f75856c2bd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.123311 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd94035-7ae3-4f86-995f-45f75856c2bd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2cd94035-7ae3-4f86-995f-45f75856c2bd" (UID: "2cd94035-7ae3-4f86-995f-45f75856c2bd"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.148142 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl5tf\" (UniqueName: \"kubernetes.io/projected/2cd94035-7ae3-4f86-995f-45f75856c2bd-kube-api-access-fl5tf\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.148185 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cd94035-7ae3-4f86-995f-45f75856c2bd-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.148196 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cd94035-7ae3-4f86-995f-45f75856c2bd-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.148204 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cd94035-7ae3-4f86-995f-45f75856c2bd-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.378325 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" event={"ID":"2cd94035-7ae3-4f86-995f-45f75856c2bd","Type":"ContainerDied","Data":"0dffef24dab765c707cfa73a02cdcebd05e4bae5fdef77865893941d631bd89e"} Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.378405 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dffef24dab765c707cfa73a02cdcebd05e4bae5fdef77865893941d631bd89e" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.378826 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-zmmvc" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.473885 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-s5btl"] Dec 01 12:16:48 crc kubenswrapper[4958]: E1201 12:16:48.474344 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd94035-7ae3-4f86-995f-45f75856c2bd" containerName="ceph-client-openstack-openstack-cell1" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.474363 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd94035-7ae3-4f86-995f-45f75856c2bd" containerName="ceph-client-openstack-openstack-cell1" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.474608 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd94035-7ae3-4f86-995f-45f75856c2bd" containerName="ceph-client-openstack-openstack-cell1" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.475609 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.480135 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.482772 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.482796 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.483029 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.483040 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.508747 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-s5btl"] Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.560490 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-ceph\") pod \"ovn-openstack-openstack-cell1-s5btl\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.560571 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-ssh-key\") pod \"ovn-openstack-openstack-cell1-s5btl\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.560732 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4htnz\" (UniqueName: \"kubernetes.io/projected/b4388057-4ba6-4c0b-a333-e9e95768cea6-kube-api-access-4htnz\") pod \"ovn-openstack-openstack-cell1-s5btl\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.560766 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-s5btl\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.560952 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-inventory\") pod \"ovn-openstack-openstack-cell1-s5btl\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.561052 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b4388057-4ba6-4c0b-a333-e9e95768cea6-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-s5btl\" (UID: 
\"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.663131 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-inventory\") pod \"ovn-openstack-openstack-cell1-s5btl\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.663217 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b4388057-4ba6-4c0b-a333-e9e95768cea6-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-s5btl\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.664122 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-ceph\") pod \"ovn-openstack-openstack-cell1-s5btl\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.664369 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-ssh-key\") pod \"ovn-openstack-openstack-cell1-s5btl\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.664643 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4htnz\" (UniqueName: \"kubernetes.io/projected/b4388057-4ba6-4c0b-a333-e9e95768cea6-kube-api-access-4htnz\") pod \"ovn-openstack-openstack-cell1-s5btl\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.664750 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-s5btl\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.666341 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b4388057-4ba6-4c0b-a333-e9e95768cea6-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-s5btl\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.670665 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-ceph\") pod \"ovn-openstack-openstack-cell1-s5btl\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.674038 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-ovn-combined-ca-bundle\") pod 
\"ovn-openstack-openstack-cell1-s5btl\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.674220 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-inventory\") pod \"ovn-openstack-openstack-cell1-s5btl\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.681764 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-ssh-key\") pod \"ovn-openstack-openstack-cell1-s5btl\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.701215 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4htnz\" (UniqueName: \"kubernetes.io/projected/b4388057-4ba6-4c0b-a333-e9e95768cea6-kube-api-access-4htnz\") pod \"ovn-openstack-openstack-cell1-s5btl\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:48 crc kubenswrapper[4958]: I1201 12:16:48.799293 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:16:49 crc kubenswrapper[4958]: I1201 12:16:49.480534 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-s5btl"] Dec 01 12:16:49 crc kubenswrapper[4958]: W1201 12:16:49.493314 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4388057_4ba6_4c0b_a333_e9e95768cea6.slice/crio-e82d8adafb621e66089f0c6dc56e3fc18cb6beb99149b50f77a0a878bd7fb728 WatchSource:0}: Error finding container e82d8adafb621e66089f0c6dc56e3fc18cb6beb99149b50f77a0a878bd7fb728: Status 404 returned error can't find the container with id e82d8adafb621e66089f0c6dc56e3fc18cb6beb99149b50f77a0a878bd7fb728 Dec 01 12:16:50 crc kubenswrapper[4958]: I1201 12:16:50.402855 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-s5btl" event={"ID":"b4388057-4ba6-4c0b-a333-e9e95768cea6","Type":"ContainerStarted","Data":"342ea250f203d6437158f6d65f88d69ba7de30aeff168d06a2418b6b95a4eb20"} Dec 01 12:16:50 crc kubenswrapper[4958]: I1201 12:16:50.402899 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-s5btl" event={"ID":"b4388057-4ba6-4c0b-a333-e9e95768cea6","Type":"ContainerStarted","Data":"e82d8adafb621e66089f0c6dc56e3fc18cb6beb99149b50f77a0a878bd7fb728"} Dec 01 12:16:50 crc kubenswrapper[4958]: I1201 12:16:50.434759 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-s5btl" podStartSLOduration=2.24736944 podStartE2EDuration="2.434739339s" podCreationTimestamp="2025-12-01 12:16:48 +0000 UTC" firstStartedPulling="2025-12-01 12:16:49.496918156 +0000 UTC m=+8257.005707203" lastFinishedPulling="2025-12-01 12:16:49.684288065 +0000 UTC m=+8257.193077102" observedRunningTime="2025-12-01 12:16:50.430692725 +0000 UTC m=+8257.939481802" watchObservedRunningTime="2025-12-01 12:16:50.434739339 +0000 UTC m=+8257.943528376" Dec 01 12:16:58 crc kubenswrapper[4958]: I1201 12:16:58.210427 
4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:16:58 crc kubenswrapper[4958]: I1201 12:16:58.210972 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:17:28 crc kubenswrapper[4958]: I1201 12:17:28.210340 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:17:28 crc kubenswrapper[4958]: I1201 12:17:28.212124 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:17:28 crc kubenswrapper[4958]: I1201 12:17:28.212213 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 12:17:28 crc kubenswrapper[4958]: I1201 12:17:28.213376 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"acaaa06729a7596508194b634486b60d2d4a6b7ee287eab5327fcec162540334"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 12:17:28 crc kubenswrapper[4958]: I1201 12:17:28.213447 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://acaaa06729a7596508194b634486b60d2d4a6b7ee287eab5327fcec162540334" gracePeriod=600 Dec 01 12:17:29 crc kubenswrapper[4958]: I1201 12:17:29.020368 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="acaaa06729a7596508194b634486b60d2d4a6b7ee287eab5327fcec162540334" exitCode=0 Dec 01 12:17:29 crc kubenswrapper[4958]: I1201 12:17:29.020444 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"acaaa06729a7596508194b634486b60d2d4a6b7ee287eab5327fcec162540334"} Dec 01 12:17:29 crc kubenswrapper[4958]: I1201 12:17:29.021498 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3"} Dec 01 12:17:29 crc kubenswrapper[4958]: I1201 12:17:29.021538 4958 scope.go:117] "RemoveContainer" 
containerID="7b9619cb8603de93d5fb446aed8911986450bf3d8058e350ec5a4e927fa87c46" Dec 01 12:18:04 crc kubenswrapper[4958]: I1201 12:18:04.497603 4958 generic.go:334] "Generic (PLEG): container finished" podID="b4388057-4ba6-4c0b-a333-e9e95768cea6" containerID="342ea250f203d6437158f6d65f88d69ba7de30aeff168d06a2418b6b95a4eb20" exitCode=0 Dec 01 12:18:04 crc kubenswrapper[4958]: I1201 12:18:04.498409 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-s5btl" event={"ID":"b4388057-4ba6-4c0b-a333-e9e95768cea6","Type":"ContainerDied","Data":"342ea250f203d6437158f6d65f88d69ba7de30aeff168d06a2418b6b95a4eb20"} Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.026349 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.130290 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-ovn-combined-ca-bundle\") pod \"b4388057-4ba6-4c0b-a333-e9e95768cea6\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.130653 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-ssh-key\") pod \"b4388057-4ba6-4c0b-a333-e9e95768cea6\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.130881 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4htnz\" (UniqueName: \"kubernetes.io/projected/b4388057-4ba6-4c0b-a333-e9e95768cea6-kube-api-access-4htnz\") pod \"b4388057-4ba6-4c0b-a333-e9e95768cea6\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.133074 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-inventory\") pod \"b4388057-4ba6-4c0b-a333-e9e95768cea6\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.133157 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-ceph\") pod \"b4388057-4ba6-4c0b-a333-e9e95768cea6\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.133243 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b4388057-4ba6-4c0b-a333-e9e95768cea6-ovncontroller-config-0\") pod \"b4388057-4ba6-4c0b-a333-e9e95768cea6\" (UID: \"b4388057-4ba6-4c0b-a333-e9e95768cea6\") " Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.138817 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-ceph" (OuterVolumeSpecName: "ceph") pod "b4388057-4ba6-4c0b-a333-e9e95768cea6" (UID: "b4388057-4ba6-4c0b-a333-e9e95768cea6"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.140561 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "b4388057-4ba6-4c0b-a333-e9e95768cea6" (UID: "b4388057-4ba6-4c0b-a333-e9e95768cea6"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.156930 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4388057-4ba6-4c0b-a333-e9e95768cea6-kube-api-access-4htnz" (OuterVolumeSpecName: "kube-api-access-4htnz") pod "b4388057-4ba6-4c0b-a333-e9e95768cea6" (UID: "b4388057-4ba6-4c0b-a333-e9e95768cea6"). InnerVolumeSpecName "kube-api-access-4htnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.167760 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b4388057-4ba6-4c0b-a333-e9e95768cea6" (UID: "b4388057-4ba6-4c0b-a333-e9e95768cea6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.187639 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4388057-4ba6-4c0b-a333-e9e95768cea6-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "b4388057-4ba6-4c0b-a333-e9e95768cea6" (UID: "b4388057-4ba6-4c0b-a333-e9e95768cea6"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.197687 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-inventory" (OuterVolumeSpecName: "inventory") pod "b4388057-4ba6-4c0b-a333-e9e95768cea6" (UID: "b4388057-4ba6-4c0b-a333-e9e95768cea6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.235800 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4htnz\" (UniqueName: \"kubernetes.io/projected/b4388057-4ba6-4c0b-a333-e9e95768cea6-kube-api-access-4htnz\") on node \"crc\" DevicePath \"\"" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.235836 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.235859 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.235867 4958 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b4388057-4ba6-4c0b-a333-e9e95768cea6-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.235898 4958 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.235907 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b4388057-4ba6-4c0b-a333-e9e95768cea6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.523819 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-s5btl" event={"ID":"b4388057-4ba6-4c0b-a333-e9e95768cea6","Type":"ContainerDied","Data":"e82d8adafb621e66089f0c6dc56e3fc18cb6beb99149b50f77a0a878bd7fb728"} Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.523873 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e82d8adafb621e66089f0c6dc56e3fc18cb6beb99149b50f77a0a878bd7fb728" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.523904 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-s5btl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.673604 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-9g5fl"] Dec 01 12:18:06 crc kubenswrapper[4958]: E1201 12:18:06.674207 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4388057-4ba6-4c0b-a333-e9e95768cea6" containerName="ovn-openstack-openstack-cell1" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.674231 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4388057-4ba6-4c0b-a333-e9e95768cea6" containerName="ovn-openstack-openstack-cell1" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.674560 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4388057-4ba6-4c0b-a333-e9e95768cea6" containerName="ovn-openstack-openstack-cell1" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.675604 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.678236 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.680068 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.681175 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.682344 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.685254 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.698750 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-9g5fl"] Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.712936 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.848113 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76w2w\" (UniqueName: \"kubernetes.io/projected/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-kube-api-access-76w2w\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.848162 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.848194 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.848455 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.848503 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: 
\"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.848607 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.848643 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.951041 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.951089 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.951137 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.951157 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.951299 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76w2w\" (UniqueName: \"kubernetes.io/projected/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-kube-api-access-76w2w\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.951325 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.951356 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.955610 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.956811 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.959411 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.959647 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.960310 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.967478 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:06 crc kubenswrapper[4958]: I1201 12:18:06.968116 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76w2w\" (UniqueName: \"kubernetes.io/projected/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-kube-api-access-76w2w\") pod \"neutron-metadata-openstack-openstack-cell1-9g5fl\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:07 crc kubenswrapper[4958]: I1201 12:18:07.005181 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:18:07 crc kubenswrapper[4958]: I1201 12:18:07.438981 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-9g5fl"] Dec 01 12:18:07 crc kubenswrapper[4958]: I1201 12:18:07.534384 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" event={"ID":"a69fcd96-bb79-447e-aafb-40a30d9c0f1c","Type":"ContainerStarted","Data":"4a155eadab300a961aff1273e4b4355efaf948701d2466a8f58391503f17ce93"} Dec 01 12:18:08 crc kubenswrapper[4958]: I1201 12:18:08.552262 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" event={"ID":"a69fcd96-bb79-447e-aafb-40a30d9c0f1c","Type":"ContainerStarted","Data":"4af2d6e1f67bcbd0725f438fbbf57b580f22d31072f5f6f0c27efa715aa60dc5"} Dec 01 12:18:08 crc kubenswrapper[4958]: I1201 12:18:08.617964 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" podStartSLOduration=2.451386294 podStartE2EDuration="2.617930826s" podCreationTimestamp="2025-12-01 12:18:06 +0000 UTC" firstStartedPulling="2025-12-01 12:18:07.428620789 +0000 UTC m=+8334.937409826" lastFinishedPulling="2025-12-01 12:18:07.595165321 +0000 UTC m=+8335.103954358" observedRunningTime="2025-12-01 12:18:08.592022146 +0000 UTC m=+8336.100811193" watchObservedRunningTime="2025-12-01 12:18:08.617930826 +0000 UTC m=+8336.126719873" Dec 01 12:19:07 crc kubenswrapper[4958]: I1201 12:19:07.421812 4958 generic.go:334] "Generic (PLEG): container finished" podID="a69fcd96-bb79-447e-aafb-40a30d9c0f1c" containerID="4af2d6e1f67bcbd0725f438fbbf57b580f22d31072f5f6f0c27efa715aa60dc5" exitCode=0 Dec 01 12:19:07 crc kubenswrapper[4958]: I1201 12:19:07.421888 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" event={"ID":"a69fcd96-bb79-447e-aafb-40a30d9c0f1c","Type":"ContainerDied","Data":"4af2d6e1f67bcbd0725f438fbbf57b580f22d31072f5f6f0c27efa715aa60dc5"} Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.139409 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.277043 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.277440 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-neutron-metadata-combined-ca-bundle\") pod \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.277747 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-nova-metadata-neutron-config-0\") pod \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.277962 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-inventory\") pod \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.278230 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76w2w\" (UniqueName: \"kubernetes.io/projected/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-kube-api-access-76w2w\") pod \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.278385 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-ceph\") pod \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.278589 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-ssh-key\") pod \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\" (UID: \"a69fcd96-bb79-447e-aafb-40a30d9c0f1c\") " Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.283107 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "a69fcd96-bb79-447e-aafb-40a30d9c0f1c" (UID: "a69fcd96-bb79-447e-aafb-40a30d9c0f1c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.285666 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-ceph" (OuterVolumeSpecName: "ceph") pod "a69fcd96-bb79-447e-aafb-40a30d9c0f1c" (UID: "a69fcd96-bb79-447e-aafb-40a30d9c0f1c"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.290256 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-kube-api-access-76w2w" (OuterVolumeSpecName: "kube-api-access-76w2w") pod "a69fcd96-bb79-447e-aafb-40a30d9c0f1c" (UID: "a69fcd96-bb79-447e-aafb-40a30d9c0f1c"). InnerVolumeSpecName "kube-api-access-76w2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.320422 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a69fcd96-bb79-447e-aafb-40a30d9c0f1c" (UID: "a69fcd96-bb79-447e-aafb-40a30d9c0f1c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.327656 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-inventory" (OuterVolumeSpecName: "inventory") pod "a69fcd96-bb79-447e-aafb-40a30d9c0f1c" (UID: "a69fcd96-bb79-447e-aafb-40a30d9c0f1c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.331930 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "a69fcd96-bb79-447e-aafb-40a30d9c0f1c" (UID: "a69fcd96-bb79-447e-aafb-40a30d9c0f1c"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.337011 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "a69fcd96-bb79-447e-aafb-40a30d9c0f1c" (UID: "a69fcd96-bb79-447e-aafb-40a30d9c0f1c"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.381628 4958 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.381673 4958 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.381688 4958 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.381702 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.381713 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76w2w\" (UniqueName: \"kubernetes.io/projected/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-kube-api-access-76w2w\") on node \"crc\" DevicePath \"\"" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.381730 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.381740 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a69fcd96-bb79-447e-aafb-40a30d9c0f1c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.453776 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" event={"ID":"a69fcd96-bb79-447e-aafb-40a30d9c0f1c","Type":"ContainerDied","Data":"4a155eadab300a961aff1273e4b4355efaf948701d2466a8f58391503f17ce93"} Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.453829 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a155eadab300a961aff1273e4b4355efaf948701d2466a8f58391503f17ce93" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.453825 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-9g5fl" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.582227 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-lp72z"] Dec 01 12:19:09 crc kubenswrapper[4958]: E1201 12:19:09.587619 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69fcd96-bb79-447e-aafb-40a30d9c0f1c" containerName="neutron-metadata-openstack-openstack-cell1" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.587680 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69fcd96-bb79-447e-aafb-40a30d9c0f1c" containerName="neutron-metadata-openstack-openstack-cell1" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.588282 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69fcd96-bb79-447e-aafb-40a30d9c0f1c" containerName="neutron-metadata-openstack-openstack-cell1" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.589802 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.595285 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.595915 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.596161 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.596681 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.598794 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.617541 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-lp72z"] Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.691795 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-lp72z\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.692319 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-inventory\") pod \"libvirt-openstack-openstack-cell1-lp72z\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.692419 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-ceph\") pod \"libvirt-openstack-openstack-cell1-lp72z\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.692700 4958 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-lp72z\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.692981 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-ssh-key\") pod \"libvirt-openstack-openstack-cell1-lp72z\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.693176 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cdjc\" (UniqueName: \"kubernetes.io/projected/5411c83e-afd4-4c5b-a751-13fb79850c06-kube-api-access-9cdjc\") pod \"libvirt-openstack-openstack-cell1-lp72z\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.795901 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cdjc\" (UniqueName: \"kubernetes.io/projected/5411c83e-afd4-4c5b-a751-13fb79850c06-kube-api-access-9cdjc\") pod \"libvirt-openstack-openstack-cell1-lp72z\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.795986 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-lp72z\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.796110 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-inventory\") pod \"libvirt-openstack-openstack-cell1-lp72z\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.796141 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-ceph\") pod \"libvirt-openstack-openstack-cell1-lp72z\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.796184 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-lp72z\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.796241 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-ssh-key\") pod \"libvirt-openstack-openstack-cell1-lp72z\" (UID: 
\"5411c83e-afd4-4c5b-a751-13fb79850c06\") " pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.806434 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-lp72z\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.806649 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-ssh-key\") pod \"libvirt-openstack-openstack-cell1-lp72z\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.807833 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-lp72z\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.810973 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-ceph\") pod \"libvirt-openstack-openstack-cell1-lp72z\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.814029 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-inventory\") pod \"libvirt-openstack-openstack-cell1-lp72z\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.834831 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cdjc\" (UniqueName: \"kubernetes.io/projected/5411c83e-afd4-4c5b-a751-13fb79850c06-kube-api-access-9cdjc\") pod \"libvirt-openstack-openstack-cell1-lp72z\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:09 crc kubenswrapper[4958]: I1201 12:19:09.912262 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:19:10 crc kubenswrapper[4958]: I1201 12:19:10.235796 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kqnml"] Dec 01 12:19:10 crc kubenswrapper[4958]: I1201 12:19:10.238599 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqnml" Dec 01 12:19:10 crc kubenswrapper[4958]: I1201 12:19:10.254928 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqnml"] Dec 01 12:19:10 crc kubenswrapper[4958]: I1201 12:19:10.413886 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgsfj\" (UniqueName: \"kubernetes.io/projected/1ad97a23-6e23-41a1-822a-15509f751031-kube-api-access-rgsfj\") pod \"redhat-marketplace-kqnml\" (UID: \"1ad97a23-6e23-41a1-822a-15509f751031\") " pod="openshift-marketplace/redhat-marketplace-kqnml" Dec 01 12:19:10 crc kubenswrapper[4958]: I1201 12:19:10.414266 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ad97a23-6e23-41a1-822a-15509f751031-catalog-content\") pod \"redhat-marketplace-kqnml\" (UID: \"1ad97a23-6e23-41a1-822a-15509f751031\") " pod="openshift-marketplace/redhat-marketplace-kqnml" Dec 01 12:19:10 crc kubenswrapper[4958]: I1201 12:19:10.414301 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ad97a23-6e23-41a1-822a-15509f751031-utilities\") pod \"redhat-marketplace-kqnml\" (UID: \"1ad97a23-6e23-41a1-822a-15509f751031\") " pod="openshift-marketplace/redhat-marketplace-kqnml" Dec 01 12:19:10 crc kubenswrapper[4958]: I1201 12:19:10.517447 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ad97a23-6e23-41a1-822a-15509f751031-catalog-content\") pod \"redhat-marketplace-kqnml\" (UID: \"1ad97a23-6e23-41a1-822a-15509f751031\") " pod="openshift-marketplace/redhat-marketplace-kqnml" Dec 01 12:19:10 crc kubenswrapper[4958]: I1201 12:19:10.517574 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ad97a23-6e23-41a1-822a-15509f751031-utilities\") pod \"redhat-marketplace-kqnml\" (UID: \"1ad97a23-6e23-41a1-822a-15509f751031\") " pod="openshift-marketplace/redhat-marketplace-kqnml" Dec 01 12:19:10 crc kubenswrapper[4958]: I1201 12:19:10.517944 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgsfj\" (UniqueName: \"kubernetes.io/projected/1ad97a23-6e23-41a1-822a-15509f751031-kube-api-access-rgsfj\") pod \"redhat-marketplace-kqnml\" (UID: \"1ad97a23-6e23-41a1-822a-15509f751031\") " pod="openshift-marketplace/redhat-marketplace-kqnml" Dec 01 12:19:10 crc kubenswrapper[4958]: I1201 12:19:10.518144 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ad97a23-6e23-41a1-822a-15509f751031-catalog-content\") pod \"redhat-marketplace-kqnml\" (UID: \"1ad97a23-6e23-41a1-822a-15509f751031\") " pod="openshift-marketplace/redhat-marketplace-kqnml" Dec 01 12:19:10 crc kubenswrapper[4958]: I1201 12:19:10.518243 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ad97a23-6e23-41a1-822a-15509f751031-utilities\") pod \"redhat-marketplace-kqnml\" (UID: \"1ad97a23-6e23-41a1-822a-15509f751031\") " pod="openshift-marketplace/redhat-marketplace-kqnml" Dec 01 12:19:10 crc kubenswrapper[4958]: I1201 12:19:10.539498 4958 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rgsfj\" (UniqueName: \"kubernetes.io/projected/1ad97a23-6e23-41a1-822a-15509f751031-kube-api-access-rgsfj\") pod \"redhat-marketplace-kqnml\" (UID: \"1ad97a23-6e23-41a1-822a-15509f751031\") " pod="openshift-marketplace/redhat-marketplace-kqnml" Dec 01 12:19:10 crc kubenswrapper[4958]: I1201 12:19:10.567953 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqnml" Dec 01 12:19:10 crc kubenswrapper[4958]: I1201 12:19:10.653185 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-lp72z"] Dec 01 12:19:11 crc kubenswrapper[4958]: I1201 12:19:11.109295 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqnml"] Dec 01 12:19:11 crc kubenswrapper[4958]: I1201 12:19:11.486906 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-lp72z" event={"ID":"5411c83e-afd4-4c5b-a751-13fb79850c06","Type":"ContainerStarted","Data":"e1a0e12fac926bed38f29b07967333cf64b07a52c93c95b526e00bb5c5717bdf"} Dec 01 12:19:11 crc kubenswrapper[4958]: I1201 12:19:11.488220 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-lp72z" event={"ID":"5411c83e-afd4-4c5b-a751-13fb79850c06","Type":"ContainerStarted","Data":"8f9db40d71ef625372ddb4afc7a7e5e2e5f05c0914938b59650f8373086cb0cb"} Dec 01 12:19:11 crc kubenswrapper[4958]: I1201 12:19:11.507554 4958 generic.go:334] "Generic (PLEG): container finished" podID="1ad97a23-6e23-41a1-822a-15509f751031" containerID="d0d7c51ef62857b65754ba35007b090d7887946b013847e359da4d85e91ce4b2" exitCode=0 Dec 01 12:19:11 crc kubenswrapper[4958]: I1201 12:19:11.507606 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqnml" event={"ID":"1ad97a23-6e23-41a1-822a-15509f751031","Type":"ContainerDied","Data":"d0d7c51ef62857b65754ba35007b090d7887946b013847e359da4d85e91ce4b2"} Dec 01 12:19:11 crc kubenswrapper[4958]: I1201 12:19:11.507638 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqnml" event={"ID":"1ad97a23-6e23-41a1-822a-15509f751031","Type":"ContainerStarted","Data":"3bff83a3618b39d8744d99450d8d4a9378209fcfa5823bf459ce072c81d1ee42"} Dec 01 12:19:11 crc kubenswrapper[4958]: I1201 12:19:11.683400 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-lp72z" podStartSLOduration=2.429849657 podStartE2EDuration="2.68337622s" podCreationTimestamp="2025-12-01 12:19:09 +0000 UTC" firstStartedPulling="2025-12-01 12:19:10.680256199 +0000 UTC m=+8398.189045236" lastFinishedPulling="2025-12-01 12:19:10.933782762 +0000 UTC m=+8398.442571799" observedRunningTime="2025-12-01 12:19:11.514873924 +0000 UTC m=+8399.023663291" watchObservedRunningTime="2025-12-01 12:19:11.68337622 +0000 UTC m=+8399.192165257" Dec 01 12:19:13 crc kubenswrapper[4958]: I1201 12:19:13.537492 4958 generic.go:334] "Generic (PLEG): container finished" podID="1ad97a23-6e23-41a1-822a-15509f751031" containerID="7a16cb76ac82f144f7fcad3833e49b628faf9a8a613ff4a1231637a3d02aecf7" exitCode=0 Dec 01 12:19:13 crc kubenswrapper[4958]: I1201 12:19:13.537587 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqnml" 
event={"ID":"1ad97a23-6e23-41a1-822a-15509f751031","Type":"ContainerDied","Data":"7a16cb76ac82f144f7fcad3833e49b628faf9a8a613ff4a1231637a3d02aecf7"} Dec 01 12:19:14 crc kubenswrapper[4958]: I1201 12:19:14.552762 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqnml" event={"ID":"1ad97a23-6e23-41a1-822a-15509f751031","Type":"ContainerStarted","Data":"a298d62d85b623bd2ed36493a8ee0e00ee18c8ec061ca81818e51cfb9ee58bf6"} Dec 01 12:19:14 crc kubenswrapper[4958]: I1201 12:19:14.603048 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kqnml" podStartSLOduration=2.050395541 podStartE2EDuration="4.603024938s" podCreationTimestamp="2025-12-01 12:19:10 +0000 UTC" firstStartedPulling="2025-12-01 12:19:11.510197312 +0000 UTC m=+8399.018986379" lastFinishedPulling="2025-12-01 12:19:14.062826749 +0000 UTC m=+8401.571615776" observedRunningTime="2025-12-01 12:19:14.584924798 +0000 UTC m=+8402.093713875" watchObservedRunningTime="2025-12-01 12:19:14.603024938 +0000 UTC m=+8402.111813975" Dec 01 12:19:20 crc kubenswrapper[4958]: I1201 12:19:20.568964 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kqnml" Dec 01 12:19:20 crc kubenswrapper[4958]: I1201 12:19:20.569697 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kqnml" Dec 01 12:19:20 crc kubenswrapper[4958]: I1201 12:19:20.673762 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kqnml" Dec 01 12:19:20 crc kubenswrapper[4958]: I1201 12:19:20.737356 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kqnml" Dec 01 12:19:20 crc kubenswrapper[4958]: I1201 12:19:20.921489 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqnml"] Dec 01 12:19:22 crc kubenswrapper[4958]: I1201 12:19:22.667735 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kqnml" podUID="1ad97a23-6e23-41a1-822a-15509f751031" containerName="registry-server" containerID="cri-o://a298d62d85b623bd2ed36493a8ee0e00ee18c8ec061ca81818e51cfb9ee58bf6" gracePeriod=2 Dec 01 12:19:22 crc kubenswrapper[4958]: E1201 12:19:22.980672 4958 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ad97a23_6e23_41a1_822a_15509f751031.slice/crio-conmon-a298d62d85b623bd2ed36493a8ee0e00ee18c8ec061ca81818e51cfb9ee58bf6.scope\": RecentStats: unable to find data in memory cache]" Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.254542 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqnml" Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.421554 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ad97a23-6e23-41a1-822a-15509f751031-catalog-content\") pod \"1ad97a23-6e23-41a1-822a-15509f751031\" (UID: \"1ad97a23-6e23-41a1-822a-15509f751031\") " Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.422214 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ad97a23-6e23-41a1-822a-15509f751031-utilities\") pod \"1ad97a23-6e23-41a1-822a-15509f751031\" (UID: \"1ad97a23-6e23-41a1-822a-15509f751031\") " Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.422404 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgsfj\" (UniqueName: \"kubernetes.io/projected/1ad97a23-6e23-41a1-822a-15509f751031-kube-api-access-rgsfj\") pod \"1ad97a23-6e23-41a1-822a-15509f751031\" (UID: \"1ad97a23-6e23-41a1-822a-15509f751031\") " Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.423038 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ad97a23-6e23-41a1-822a-15509f751031-utilities" (OuterVolumeSpecName: "utilities") pod "1ad97a23-6e23-41a1-822a-15509f751031" (UID: "1ad97a23-6e23-41a1-822a-15509f751031"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.430935 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ad97a23-6e23-41a1-822a-15509f751031-kube-api-access-rgsfj" (OuterVolumeSpecName: "kube-api-access-rgsfj") pod "1ad97a23-6e23-41a1-822a-15509f751031" (UID: "1ad97a23-6e23-41a1-822a-15509f751031"). InnerVolumeSpecName "kube-api-access-rgsfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.438663 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ad97a23-6e23-41a1-822a-15509f751031-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ad97a23-6e23-41a1-822a-15509f751031" (UID: "1ad97a23-6e23-41a1-822a-15509f751031"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.525721 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ad97a23-6e23-41a1-822a-15509f751031-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.525786 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgsfj\" (UniqueName: \"kubernetes.io/projected/1ad97a23-6e23-41a1-822a-15509f751031-kube-api-access-rgsfj\") on node \"crc\" DevicePath \"\"" Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.525810 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ad97a23-6e23-41a1-822a-15509f751031-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.686507 4958 generic.go:334] "Generic (PLEG): container finished" podID="1ad97a23-6e23-41a1-822a-15509f751031" containerID="a298d62d85b623bd2ed36493a8ee0e00ee18c8ec061ca81818e51cfb9ee58bf6" exitCode=0 Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.686563 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqnml" event={"ID":"1ad97a23-6e23-41a1-822a-15509f751031","Type":"ContainerDied","Data":"a298d62d85b623bd2ed36493a8ee0e00ee18c8ec061ca81818e51cfb9ee58bf6"} Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.686641 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqnml" Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.686673 4958 scope.go:117] "RemoveContainer" containerID="a298d62d85b623bd2ed36493a8ee0e00ee18c8ec061ca81818e51cfb9ee58bf6" Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.686652 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqnml" event={"ID":"1ad97a23-6e23-41a1-822a-15509f751031","Type":"ContainerDied","Data":"3bff83a3618b39d8744d99450d8d4a9378209fcfa5823bf459ce072c81d1ee42"} Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.736013 4958 scope.go:117] "RemoveContainer" containerID="7a16cb76ac82f144f7fcad3833e49b628faf9a8a613ff4a1231637a3d02aecf7" Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.763640 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqnml"] Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.782634 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqnml"] Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.785184 4958 scope.go:117] "RemoveContainer" containerID="d0d7c51ef62857b65754ba35007b090d7887946b013847e359da4d85e91ce4b2" Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.813025 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ad97a23-6e23-41a1-822a-15509f751031" path="/var/lib/kubelet/pods/1ad97a23-6e23-41a1-822a-15509f751031/volumes" Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.833727 4958 scope.go:117] "RemoveContainer" containerID="a298d62d85b623bd2ed36493a8ee0e00ee18c8ec061ca81818e51cfb9ee58bf6" Dec 01 12:19:23 crc kubenswrapper[4958]: E1201 12:19:23.834225 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a298d62d85b623bd2ed36493a8ee0e00ee18c8ec061ca81818e51cfb9ee58bf6\": container with ID 
starting with a298d62d85b623bd2ed36493a8ee0e00ee18c8ec061ca81818e51cfb9ee58bf6 not found: ID does not exist" containerID="a298d62d85b623bd2ed36493a8ee0e00ee18c8ec061ca81818e51cfb9ee58bf6" Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.834259 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a298d62d85b623bd2ed36493a8ee0e00ee18c8ec061ca81818e51cfb9ee58bf6"} err="failed to get container status \"a298d62d85b623bd2ed36493a8ee0e00ee18c8ec061ca81818e51cfb9ee58bf6\": rpc error: code = NotFound desc = could not find container \"a298d62d85b623bd2ed36493a8ee0e00ee18c8ec061ca81818e51cfb9ee58bf6\": container with ID starting with a298d62d85b623bd2ed36493a8ee0e00ee18c8ec061ca81818e51cfb9ee58bf6 not found: ID does not exist" Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.834284 4958 scope.go:117] "RemoveContainer" containerID="7a16cb76ac82f144f7fcad3833e49b628faf9a8a613ff4a1231637a3d02aecf7" Dec 01 12:19:23 crc kubenswrapper[4958]: E1201 12:19:23.834763 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a16cb76ac82f144f7fcad3833e49b628faf9a8a613ff4a1231637a3d02aecf7\": container with ID starting with 7a16cb76ac82f144f7fcad3833e49b628faf9a8a613ff4a1231637a3d02aecf7 not found: ID does not exist" containerID="7a16cb76ac82f144f7fcad3833e49b628faf9a8a613ff4a1231637a3d02aecf7" Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.834830 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a16cb76ac82f144f7fcad3833e49b628faf9a8a613ff4a1231637a3d02aecf7"} err="failed to get container status \"7a16cb76ac82f144f7fcad3833e49b628faf9a8a613ff4a1231637a3d02aecf7\": rpc error: code = NotFound desc = could not find container \"7a16cb76ac82f144f7fcad3833e49b628faf9a8a613ff4a1231637a3d02aecf7\": container with ID starting with 7a16cb76ac82f144f7fcad3833e49b628faf9a8a613ff4a1231637a3d02aecf7 not found: ID does not exist" Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.834887 4958 scope.go:117] "RemoveContainer" containerID="d0d7c51ef62857b65754ba35007b090d7887946b013847e359da4d85e91ce4b2" Dec 01 12:19:23 crc kubenswrapper[4958]: E1201 12:19:23.835243 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0d7c51ef62857b65754ba35007b090d7887946b013847e359da4d85e91ce4b2\": container with ID starting with d0d7c51ef62857b65754ba35007b090d7887946b013847e359da4d85e91ce4b2 not found: ID does not exist" containerID="d0d7c51ef62857b65754ba35007b090d7887946b013847e359da4d85e91ce4b2" Dec 01 12:19:23 crc kubenswrapper[4958]: I1201 12:19:23.835294 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0d7c51ef62857b65754ba35007b090d7887946b013847e359da4d85e91ce4b2"} err="failed to get container status \"d0d7c51ef62857b65754ba35007b090d7887946b013847e359da4d85e91ce4b2\": rpc error: code = NotFound desc = could not find container \"d0d7c51ef62857b65754ba35007b090d7887946b013847e359da4d85e91ce4b2\": container with ID starting with d0d7c51ef62857b65754ba35007b090d7887946b013847e359da4d85e91ce4b2 not found: ID does not exist" Dec 01 12:19:28 crc kubenswrapper[4958]: I1201 12:19:28.210587 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:19:28 crc kubenswrapper[4958]: I1201 12:19:28.212104 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:19:58 crc kubenswrapper[4958]: I1201 12:19:58.210624 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:19:58 crc kubenswrapper[4958]: I1201 12:19:58.211366 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:20:28 crc kubenswrapper[4958]: I1201 12:20:28.210972 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:20:28 crc kubenswrapper[4958]: I1201 12:20:28.211632 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:20:28 crc kubenswrapper[4958]: I1201 12:20:28.211707 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 12:20:28 crc kubenswrapper[4958]: I1201 12:20:28.212994 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 12:20:28 crc kubenswrapper[4958]: I1201 12:20:28.213104 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" gracePeriod=600 Dec 01 12:20:28 crc kubenswrapper[4958]: E1201 12:20:28.337593 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:20:29 crc kubenswrapper[4958]: I1201 12:20:29.052560 4958 
generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" exitCode=0 Dec 01 12:20:29 crc kubenswrapper[4958]: I1201 12:20:29.052643 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3"} Dec 01 12:20:29 crc kubenswrapper[4958]: I1201 12:20:29.052707 4958 scope.go:117] "RemoveContainer" containerID="acaaa06729a7596508194b634486b60d2d4a6b7ee287eab5327fcec162540334" Dec 01 12:20:29 crc kubenswrapper[4958]: I1201 12:20:29.053637 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:20:29 crc kubenswrapper[4958]: E1201 12:20:29.054214 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:20:42 crc kubenswrapper[4958]: I1201 12:20:42.798517 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:20:42 crc kubenswrapper[4958]: E1201 12:20:42.800139 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:20:43 crc kubenswrapper[4958]: I1201 12:20:43.600373 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gqf4k"] Dec 01 12:20:43 crc kubenswrapper[4958]: E1201 12:20:43.601115 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ad97a23-6e23-41a1-822a-15509f751031" containerName="extract-content" Dec 01 12:20:43 crc kubenswrapper[4958]: I1201 12:20:43.601152 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad97a23-6e23-41a1-822a-15509f751031" containerName="extract-content" Dec 01 12:20:43 crc kubenswrapper[4958]: E1201 12:20:43.601202 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ad97a23-6e23-41a1-822a-15509f751031" containerName="extract-utilities" Dec 01 12:20:43 crc kubenswrapper[4958]: I1201 12:20:43.601217 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad97a23-6e23-41a1-822a-15509f751031" containerName="extract-utilities" Dec 01 12:20:43 crc kubenswrapper[4958]: E1201 12:20:43.601273 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ad97a23-6e23-41a1-822a-15509f751031" containerName="registry-server" Dec 01 12:20:43 crc kubenswrapper[4958]: I1201 12:20:43.601288 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad97a23-6e23-41a1-822a-15509f751031" containerName="registry-server" Dec 01 12:20:43 crc kubenswrapper[4958]: I1201 12:20:43.601661 4958 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1ad97a23-6e23-41a1-822a-15509f751031" containerName="registry-server" Dec 01 12:20:43 crc kubenswrapper[4958]: I1201 12:20:43.604986 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gqf4k" Dec 01 12:20:43 crc kubenswrapper[4958]: I1201 12:20:43.613121 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gqf4k"] Dec 01 12:20:43 crc kubenswrapper[4958]: I1201 12:20:43.785407 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxs2j\" (UniqueName: \"kubernetes.io/projected/9128a79c-75b6-46f5-ac8e-ec173e044839-kube-api-access-nxs2j\") pod \"redhat-operators-gqf4k\" (UID: \"9128a79c-75b6-46f5-ac8e-ec173e044839\") " pod="openshift-marketplace/redhat-operators-gqf4k" Dec 01 12:20:43 crc kubenswrapper[4958]: I1201 12:20:43.786292 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9128a79c-75b6-46f5-ac8e-ec173e044839-utilities\") pod \"redhat-operators-gqf4k\" (UID: \"9128a79c-75b6-46f5-ac8e-ec173e044839\") " pod="openshift-marketplace/redhat-operators-gqf4k" Dec 01 12:20:43 crc kubenswrapper[4958]: I1201 12:20:43.786497 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9128a79c-75b6-46f5-ac8e-ec173e044839-catalog-content\") pod \"redhat-operators-gqf4k\" (UID: \"9128a79c-75b6-46f5-ac8e-ec173e044839\") " pod="openshift-marketplace/redhat-operators-gqf4k" Dec 01 12:20:43 crc kubenswrapper[4958]: I1201 12:20:43.889822 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9128a79c-75b6-46f5-ac8e-ec173e044839-utilities\") pod \"redhat-operators-gqf4k\" (UID: \"9128a79c-75b6-46f5-ac8e-ec173e044839\") " pod="openshift-marketplace/redhat-operators-gqf4k" Dec 01 12:20:43 crc kubenswrapper[4958]: I1201 12:20:43.889928 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9128a79c-75b6-46f5-ac8e-ec173e044839-utilities\") pod \"redhat-operators-gqf4k\" (UID: \"9128a79c-75b6-46f5-ac8e-ec173e044839\") " pod="openshift-marketplace/redhat-operators-gqf4k" Dec 01 12:20:43 crc kubenswrapper[4958]: I1201 12:20:43.890053 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9128a79c-75b6-46f5-ac8e-ec173e044839-catalog-content\") pod \"redhat-operators-gqf4k\" (UID: \"9128a79c-75b6-46f5-ac8e-ec173e044839\") " pod="openshift-marketplace/redhat-operators-gqf4k" Dec 01 12:20:43 crc kubenswrapper[4958]: I1201 12:20:43.890617 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9128a79c-75b6-46f5-ac8e-ec173e044839-catalog-content\") pod \"redhat-operators-gqf4k\" (UID: \"9128a79c-75b6-46f5-ac8e-ec173e044839\") " pod="openshift-marketplace/redhat-operators-gqf4k" Dec 01 12:20:43 crc kubenswrapper[4958]: I1201 12:20:43.891084 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxs2j\" (UniqueName: \"kubernetes.io/projected/9128a79c-75b6-46f5-ac8e-ec173e044839-kube-api-access-nxs2j\") pod \"redhat-operators-gqf4k\" (UID: \"9128a79c-75b6-46f5-ac8e-ec173e044839\") " 
pod="openshift-marketplace/redhat-operators-gqf4k" Dec 01 12:20:43 crc kubenswrapper[4958]: I1201 12:20:43.913716 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxs2j\" (UniqueName: \"kubernetes.io/projected/9128a79c-75b6-46f5-ac8e-ec173e044839-kube-api-access-nxs2j\") pod \"redhat-operators-gqf4k\" (UID: \"9128a79c-75b6-46f5-ac8e-ec173e044839\") " pod="openshift-marketplace/redhat-operators-gqf4k" Dec 01 12:20:43 crc kubenswrapper[4958]: I1201 12:20:43.941476 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gqf4k" Dec 01 12:20:44 crc kubenswrapper[4958]: I1201 12:20:44.528965 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gqf4k"] Dec 01 12:20:45 crc kubenswrapper[4958]: I1201 12:20:45.302897 4958 generic.go:334] "Generic (PLEG): container finished" podID="9128a79c-75b6-46f5-ac8e-ec173e044839" containerID="a2598eea5501cf76e0d35fce8f1a1c32ae613a1155498f5a226487185bb16dba" exitCode=0 Dec 01 12:20:45 crc kubenswrapper[4958]: I1201 12:20:45.303009 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqf4k" event={"ID":"9128a79c-75b6-46f5-ac8e-ec173e044839","Type":"ContainerDied","Data":"a2598eea5501cf76e0d35fce8f1a1c32ae613a1155498f5a226487185bb16dba"} Dec 01 12:20:45 crc kubenswrapper[4958]: I1201 12:20:45.303352 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqf4k" event={"ID":"9128a79c-75b6-46f5-ac8e-ec173e044839","Type":"ContainerStarted","Data":"2335ae795eee802e9979eda573bab2e64a26312d330f083cb681c57d58e14a74"} Dec 01 12:20:45 crc kubenswrapper[4958]: I1201 12:20:45.308547 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 12:20:46 crc kubenswrapper[4958]: I1201 12:20:46.319991 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqf4k" event={"ID":"9128a79c-75b6-46f5-ac8e-ec173e044839","Type":"ContainerStarted","Data":"322835258c64e3cce946ea984284313eaabfaba9c46056a4b725d5d340cfe842"} Dec 01 12:20:49 crc kubenswrapper[4958]: I1201 12:20:49.372294 4958 generic.go:334] "Generic (PLEG): container finished" podID="9128a79c-75b6-46f5-ac8e-ec173e044839" containerID="322835258c64e3cce946ea984284313eaabfaba9c46056a4b725d5d340cfe842" exitCode=0 Dec 01 12:20:49 crc kubenswrapper[4958]: I1201 12:20:49.372428 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqf4k" event={"ID":"9128a79c-75b6-46f5-ac8e-ec173e044839","Type":"ContainerDied","Data":"322835258c64e3cce946ea984284313eaabfaba9c46056a4b725d5d340cfe842"} Dec 01 12:20:51 crc kubenswrapper[4958]: I1201 12:20:51.402996 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqf4k" event={"ID":"9128a79c-75b6-46f5-ac8e-ec173e044839","Type":"ContainerStarted","Data":"8602c239aaae9d0201bb2ef73b9a152bf5870c04c2da1b831f7f406f57872a2b"} Dec 01 12:20:51 crc kubenswrapper[4958]: I1201 12:20:51.439487 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gqf4k" podStartSLOduration=3.509082063 podStartE2EDuration="8.439465471s" podCreationTimestamp="2025-12-01 12:20:43 +0000 UTC" firstStartedPulling="2025-12-01 12:20:45.308011844 +0000 UTC m=+8492.816800921" lastFinishedPulling="2025-12-01 12:20:50.238395242 +0000 UTC m=+8497.747184329" 
observedRunningTime="2025-12-01 12:20:51.426877246 +0000 UTC m=+8498.935666293" watchObservedRunningTime="2025-12-01 12:20:51.439465471 +0000 UTC m=+8498.948254508" Dec 01 12:20:53 crc kubenswrapper[4958]: I1201 12:20:53.811232 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:20:53 crc kubenswrapper[4958]: E1201 12:20:53.812318 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:20:53 crc kubenswrapper[4958]: I1201 12:20:53.942469 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gqf4k" Dec 01 12:20:53 crc kubenswrapper[4958]: I1201 12:20:53.942527 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gqf4k" Dec 01 12:20:55 crc kubenswrapper[4958]: I1201 12:20:55.008800 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gqf4k" podUID="9128a79c-75b6-46f5-ac8e-ec173e044839" containerName="registry-server" probeResult="failure" output=< Dec 01 12:20:55 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s Dec 01 12:20:55 crc kubenswrapper[4958]: > Dec 01 12:21:04 crc kubenswrapper[4958]: I1201 12:21:04.030376 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gqf4k" Dec 01 12:21:04 crc kubenswrapper[4958]: I1201 12:21:04.123069 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gqf4k" Dec 01 12:21:04 crc kubenswrapper[4958]: I1201 12:21:04.283917 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gqf4k"] Dec 01 12:21:05 crc kubenswrapper[4958]: I1201 12:21:05.596875 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gqf4k" podUID="9128a79c-75b6-46f5-ac8e-ec173e044839" containerName="registry-server" containerID="cri-o://8602c239aaae9d0201bb2ef73b9a152bf5870c04c2da1b831f7f406f57872a2b" gracePeriod=2 Dec 01 12:21:05 crc kubenswrapper[4958]: I1201 12:21:05.824516 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:21:05 crc kubenswrapper[4958]: E1201 12:21:05.824817 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.260505 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gqf4k" Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.357794 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9128a79c-75b6-46f5-ac8e-ec173e044839-catalog-content\") pod \"9128a79c-75b6-46f5-ac8e-ec173e044839\" (UID: \"9128a79c-75b6-46f5-ac8e-ec173e044839\") " Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.357901 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9128a79c-75b6-46f5-ac8e-ec173e044839-utilities\") pod \"9128a79c-75b6-46f5-ac8e-ec173e044839\" (UID: \"9128a79c-75b6-46f5-ac8e-ec173e044839\") " Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.358019 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxs2j\" (UniqueName: \"kubernetes.io/projected/9128a79c-75b6-46f5-ac8e-ec173e044839-kube-api-access-nxs2j\") pod \"9128a79c-75b6-46f5-ac8e-ec173e044839\" (UID: \"9128a79c-75b6-46f5-ac8e-ec173e044839\") " Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.359553 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9128a79c-75b6-46f5-ac8e-ec173e044839-utilities" (OuterVolumeSpecName: "utilities") pod "9128a79c-75b6-46f5-ac8e-ec173e044839" (UID: "9128a79c-75b6-46f5-ac8e-ec173e044839"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.374194 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9128a79c-75b6-46f5-ac8e-ec173e044839-kube-api-access-nxs2j" (OuterVolumeSpecName: "kube-api-access-nxs2j") pod "9128a79c-75b6-46f5-ac8e-ec173e044839" (UID: "9128a79c-75b6-46f5-ac8e-ec173e044839"). InnerVolumeSpecName "kube-api-access-nxs2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.476693 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9128a79c-75b6-46f5-ac8e-ec173e044839-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.477307 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxs2j\" (UniqueName: \"kubernetes.io/projected/9128a79c-75b6-46f5-ac8e-ec173e044839-kube-api-access-nxs2j\") on node \"crc\" DevicePath \"\"" Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.539485 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9128a79c-75b6-46f5-ac8e-ec173e044839-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9128a79c-75b6-46f5-ac8e-ec173e044839" (UID: "9128a79c-75b6-46f5-ac8e-ec173e044839"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.579755 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9128a79c-75b6-46f5-ac8e-ec173e044839-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.612671 4958 generic.go:334] "Generic (PLEG): container finished" podID="9128a79c-75b6-46f5-ac8e-ec173e044839" containerID="8602c239aaae9d0201bb2ef73b9a152bf5870c04c2da1b831f7f406f57872a2b" exitCode=0 Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.612754 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqf4k" event={"ID":"9128a79c-75b6-46f5-ac8e-ec173e044839","Type":"ContainerDied","Data":"8602c239aaae9d0201bb2ef73b9a152bf5870c04c2da1b831f7f406f57872a2b"} Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.612884 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gqf4k" Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.613027 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqf4k" event={"ID":"9128a79c-75b6-46f5-ac8e-ec173e044839","Type":"ContainerDied","Data":"2335ae795eee802e9979eda573bab2e64a26312d330f083cb681c57d58e14a74"} Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.613107 4958 scope.go:117] "RemoveContainer" containerID="8602c239aaae9d0201bb2ef73b9a152bf5870c04c2da1b831f7f406f57872a2b" Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.650264 4958 scope.go:117] "RemoveContainer" containerID="322835258c64e3cce946ea984284313eaabfaba9c46056a4b725d5d340cfe842" Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.671041 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gqf4k"] Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.681114 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gqf4k"] Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.696378 4958 scope.go:117] "RemoveContainer" containerID="a2598eea5501cf76e0d35fce8f1a1c32ae613a1155498f5a226487185bb16dba" Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.762324 4958 scope.go:117] "RemoveContainer" containerID="8602c239aaae9d0201bb2ef73b9a152bf5870c04c2da1b831f7f406f57872a2b" Dec 01 12:21:06 crc kubenswrapper[4958]: E1201 12:21:06.762853 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8602c239aaae9d0201bb2ef73b9a152bf5870c04c2da1b831f7f406f57872a2b\": container with ID starting with 8602c239aaae9d0201bb2ef73b9a152bf5870c04c2da1b831f7f406f57872a2b not found: ID does not exist" containerID="8602c239aaae9d0201bb2ef73b9a152bf5870c04c2da1b831f7f406f57872a2b" Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.762893 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8602c239aaae9d0201bb2ef73b9a152bf5870c04c2da1b831f7f406f57872a2b"} err="failed to get container status \"8602c239aaae9d0201bb2ef73b9a152bf5870c04c2da1b831f7f406f57872a2b\": rpc error: code = NotFound desc = could not find container \"8602c239aaae9d0201bb2ef73b9a152bf5870c04c2da1b831f7f406f57872a2b\": container with ID starting with 8602c239aaae9d0201bb2ef73b9a152bf5870c04c2da1b831f7f406f57872a2b not found: ID does not exist" Dec 01 12:21:06 crc 
kubenswrapper[4958]: I1201 12:21:06.762947 4958 scope.go:117] "RemoveContainer" containerID="322835258c64e3cce946ea984284313eaabfaba9c46056a4b725d5d340cfe842" Dec 01 12:21:06 crc kubenswrapper[4958]: E1201 12:21:06.763910 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"322835258c64e3cce946ea984284313eaabfaba9c46056a4b725d5d340cfe842\": container with ID starting with 322835258c64e3cce946ea984284313eaabfaba9c46056a4b725d5d340cfe842 not found: ID does not exist" containerID="322835258c64e3cce946ea984284313eaabfaba9c46056a4b725d5d340cfe842" Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.764015 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322835258c64e3cce946ea984284313eaabfaba9c46056a4b725d5d340cfe842"} err="failed to get container status \"322835258c64e3cce946ea984284313eaabfaba9c46056a4b725d5d340cfe842\": rpc error: code = NotFound desc = could not find container \"322835258c64e3cce946ea984284313eaabfaba9c46056a4b725d5d340cfe842\": container with ID starting with 322835258c64e3cce946ea984284313eaabfaba9c46056a4b725d5d340cfe842 not found: ID does not exist" Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.764095 4958 scope.go:117] "RemoveContainer" containerID="a2598eea5501cf76e0d35fce8f1a1c32ae613a1155498f5a226487185bb16dba" Dec 01 12:21:06 crc kubenswrapper[4958]: E1201 12:21:06.764671 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2598eea5501cf76e0d35fce8f1a1c32ae613a1155498f5a226487185bb16dba\": container with ID starting with a2598eea5501cf76e0d35fce8f1a1c32ae613a1155498f5a226487185bb16dba not found: ID does not exist" containerID="a2598eea5501cf76e0d35fce8f1a1c32ae613a1155498f5a226487185bb16dba" Dec 01 12:21:06 crc kubenswrapper[4958]: I1201 12:21:06.764749 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2598eea5501cf76e0d35fce8f1a1c32ae613a1155498f5a226487185bb16dba"} err="failed to get container status \"a2598eea5501cf76e0d35fce8f1a1c32ae613a1155498f5a226487185bb16dba\": rpc error: code = NotFound desc = could not find container \"a2598eea5501cf76e0d35fce8f1a1c32ae613a1155498f5a226487185bb16dba\": container with ID starting with a2598eea5501cf76e0d35fce8f1a1c32ae613a1155498f5a226487185bb16dba not found: ID does not exist" Dec 01 12:21:07 crc kubenswrapper[4958]: I1201 12:21:07.819445 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9128a79c-75b6-46f5-ac8e-ec173e044839" path="/var/lib/kubelet/pods/9128a79c-75b6-46f5-ac8e-ec173e044839/volumes" Dec 01 12:21:17 crc kubenswrapper[4958]: I1201 12:21:17.797693 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:21:17 crc kubenswrapper[4958]: E1201 12:21:17.798718 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:21:31 crc kubenswrapper[4958]: I1201 12:21:31.799247 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" 
Dec 01 12:21:31 crc kubenswrapper[4958]: E1201 12:21:31.800555 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:21:44 crc kubenswrapper[4958]: I1201 12:21:44.805949 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:21:44 crc kubenswrapper[4958]: E1201 12:21:44.807247 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:21:57 crc kubenswrapper[4958]: I1201 12:21:57.798418 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:21:57 crc kubenswrapper[4958]: E1201 12:21:57.800037 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:22:10 crc kubenswrapper[4958]: I1201 12:22:10.798249 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:22:10 crc kubenswrapper[4958]: E1201 12:22:10.799275 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:22:21 crc kubenswrapper[4958]: I1201 12:22:21.797295 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:22:21 crc kubenswrapper[4958]: E1201 12:22:21.798396 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:22:36 crc kubenswrapper[4958]: I1201 12:22:36.798965 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:22:36 crc kubenswrapper[4958]: E1201 12:22:36.800177 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:22:51 crc kubenswrapper[4958]: I1201 12:22:51.798371 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:22:51 crc kubenswrapper[4958]: E1201 12:22:51.799045 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:23:03 crc kubenswrapper[4958]: I1201 12:23:03.807268 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:23:03 crc kubenswrapper[4958]: E1201 12:23:03.808044 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:23:17 crc kubenswrapper[4958]: I1201 12:23:17.798378 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:23:17 crc kubenswrapper[4958]: E1201 12:23:17.799123 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:23:29 crc kubenswrapper[4958]: I1201 12:23:29.798818 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:23:29 crc kubenswrapper[4958]: E1201 12:23:29.799869 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:23:41 crc kubenswrapper[4958]: I1201 12:23:41.797766 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:23:41 crc kubenswrapper[4958]: E1201 12:23:41.798582 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:23:54 crc kubenswrapper[4958]: I1201 12:23:54.798500 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:23:54 crc kubenswrapper[4958]: E1201 12:23:54.799651 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:24:01 crc kubenswrapper[4958]: I1201 12:24:01.258923 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l5rfj"] Dec 01 12:24:01 crc kubenswrapper[4958]: E1201 12:24:01.260191 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9128a79c-75b6-46f5-ac8e-ec173e044839" containerName="registry-server" Dec 01 12:24:01 crc kubenswrapper[4958]: I1201 12:24:01.260209 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9128a79c-75b6-46f5-ac8e-ec173e044839" containerName="registry-server" Dec 01 12:24:01 crc kubenswrapper[4958]: E1201 12:24:01.260233 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9128a79c-75b6-46f5-ac8e-ec173e044839" containerName="extract-content" Dec 01 12:24:01 crc kubenswrapper[4958]: I1201 12:24:01.260245 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9128a79c-75b6-46f5-ac8e-ec173e044839" containerName="extract-content" Dec 01 12:24:01 crc kubenswrapper[4958]: E1201 12:24:01.260320 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9128a79c-75b6-46f5-ac8e-ec173e044839" containerName="extract-utilities" Dec 01 12:24:01 crc kubenswrapper[4958]: I1201 12:24:01.260331 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9128a79c-75b6-46f5-ac8e-ec173e044839" containerName="extract-utilities" Dec 01 12:24:01 crc kubenswrapper[4958]: I1201 12:24:01.260619 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="9128a79c-75b6-46f5-ac8e-ec173e044839" containerName="registry-server" Dec 01 12:24:01 crc kubenswrapper[4958]: I1201 12:24:01.262769 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l5rfj" Dec 01 12:24:01 crc kubenswrapper[4958]: I1201 12:24:01.271435 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l5rfj"] Dec 01 12:24:01 crc kubenswrapper[4958]: I1201 12:24:01.368371 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ded94c05-9622-4410-810f-b4e673bf1366-catalog-content\") pod \"certified-operators-l5rfj\" (UID: \"ded94c05-9622-4410-810f-b4e673bf1366\") " pod="openshift-marketplace/certified-operators-l5rfj" Dec 01 12:24:01 crc kubenswrapper[4958]: I1201 12:24:01.369054 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tthdw\" (UniqueName: \"kubernetes.io/projected/ded94c05-9622-4410-810f-b4e673bf1366-kube-api-access-tthdw\") pod \"certified-operators-l5rfj\" (UID: \"ded94c05-9622-4410-810f-b4e673bf1366\") " pod="openshift-marketplace/certified-operators-l5rfj" Dec 01 12:24:01 crc kubenswrapper[4958]: I1201 12:24:01.369177 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ded94c05-9622-4410-810f-b4e673bf1366-utilities\") pod \"certified-operators-l5rfj\" (UID: \"ded94c05-9622-4410-810f-b4e673bf1366\") " pod="openshift-marketplace/certified-operators-l5rfj" Dec 01 12:24:01 crc kubenswrapper[4958]: I1201 12:24:01.471145 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tthdw\" (UniqueName: \"kubernetes.io/projected/ded94c05-9622-4410-810f-b4e673bf1366-kube-api-access-tthdw\") pod \"certified-operators-l5rfj\" (UID: \"ded94c05-9622-4410-810f-b4e673bf1366\") " pod="openshift-marketplace/certified-operators-l5rfj" Dec 01 12:24:01 crc kubenswrapper[4958]: I1201 12:24:01.471265 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ded94c05-9622-4410-810f-b4e673bf1366-utilities\") pod \"certified-operators-l5rfj\" (UID: \"ded94c05-9622-4410-810f-b4e673bf1366\") " pod="openshift-marketplace/certified-operators-l5rfj" Dec 01 12:24:01 crc kubenswrapper[4958]: I1201 12:24:01.471603 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ded94c05-9622-4410-810f-b4e673bf1366-catalog-content\") pod \"certified-operators-l5rfj\" (UID: \"ded94c05-9622-4410-810f-b4e673bf1366\") " pod="openshift-marketplace/certified-operators-l5rfj" Dec 01 12:24:01 crc kubenswrapper[4958]: I1201 12:24:01.472231 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ded94c05-9622-4410-810f-b4e673bf1366-utilities\") pod \"certified-operators-l5rfj\" (UID: \"ded94c05-9622-4410-810f-b4e673bf1366\") " pod="openshift-marketplace/certified-operators-l5rfj" Dec 01 12:24:01 crc kubenswrapper[4958]: I1201 12:24:01.472415 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ded94c05-9622-4410-810f-b4e673bf1366-catalog-content\") pod \"certified-operators-l5rfj\" (UID: \"ded94c05-9622-4410-810f-b4e673bf1366\") " pod="openshift-marketplace/certified-operators-l5rfj" Dec 01 12:24:01 crc kubenswrapper[4958]: I1201 12:24:01.506424 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tthdw\" (UniqueName: \"kubernetes.io/projected/ded94c05-9622-4410-810f-b4e673bf1366-kube-api-access-tthdw\") pod \"certified-operators-l5rfj\" (UID: \"ded94c05-9622-4410-810f-b4e673bf1366\") " pod="openshift-marketplace/certified-operators-l5rfj" Dec 01 12:24:01 crc kubenswrapper[4958]: I1201 12:24:01.591530 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l5rfj" Dec 01 12:24:02 crc kubenswrapper[4958]: I1201 12:24:02.204293 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l5rfj"] Dec 01 12:24:03 crc kubenswrapper[4958]: I1201 12:24:03.082479 4958 generic.go:334] "Generic (PLEG): container finished" podID="ded94c05-9622-4410-810f-b4e673bf1366" containerID="e24d6961b784fdf8979da5ef459eefc1b7bdaa34d1d82476837d2ab9a2468edb" exitCode=0 Dec 01 12:24:03 crc kubenswrapper[4958]: I1201 12:24:03.082679 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5rfj" event={"ID":"ded94c05-9622-4410-810f-b4e673bf1366","Type":"ContainerDied","Data":"e24d6961b784fdf8979da5ef459eefc1b7bdaa34d1d82476837d2ab9a2468edb"} Dec 01 12:24:03 crc kubenswrapper[4958]: I1201 12:24:03.083455 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5rfj" event={"ID":"ded94c05-9622-4410-810f-b4e673bf1366","Type":"ContainerStarted","Data":"adbdd718dccc844df6ec62909dba88458536f2192c2737eb4efd2fcfbcc47e5d"} Dec 01 12:24:03 crc kubenswrapper[4958]: I1201 12:24:03.252497 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cqt5j"] Dec 01 12:24:03 crc kubenswrapper[4958]: I1201 12:24:03.255635 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cqt5j" Dec 01 12:24:03 crc kubenswrapper[4958]: I1201 12:24:03.276163 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cqt5j"] Dec 01 12:24:03 crc kubenswrapper[4958]: I1201 12:24:03.338099 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11e55b88-6fb1-4186-bf0d-fe0c865268ce-utilities\") pod \"community-operators-cqt5j\" (UID: \"11e55b88-6fb1-4186-bf0d-fe0c865268ce\") " pod="openshift-marketplace/community-operators-cqt5j" Dec 01 12:24:03 crc kubenswrapper[4958]: I1201 12:24:03.338281 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11e55b88-6fb1-4186-bf0d-fe0c865268ce-catalog-content\") pod \"community-operators-cqt5j\" (UID: \"11e55b88-6fb1-4186-bf0d-fe0c865268ce\") " pod="openshift-marketplace/community-operators-cqt5j" Dec 01 12:24:03 crc kubenswrapper[4958]: I1201 12:24:03.338493 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpp7g\" (UniqueName: \"kubernetes.io/projected/11e55b88-6fb1-4186-bf0d-fe0c865268ce-kube-api-access-zpp7g\") pod \"community-operators-cqt5j\" (UID: \"11e55b88-6fb1-4186-bf0d-fe0c865268ce\") " pod="openshift-marketplace/community-operators-cqt5j" Dec 01 12:24:03 crc kubenswrapper[4958]: I1201 12:24:03.441493 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11e55b88-6fb1-4186-bf0d-fe0c865268ce-utilities\") pod \"community-operators-cqt5j\" (UID: \"11e55b88-6fb1-4186-bf0d-fe0c865268ce\") " pod="openshift-marketplace/community-operators-cqt5j" Dec 01 12:24:03 crc kubenswrapper[4958]: I1201 12:24:03.441732 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11e55b88-6fb1-4186-bf0d-fe0c865268ce-catalog-content\") pod \"community-operators-cqt5j\" (UID: \"11e55b88-6fb1-4186-bf0d-fe0c865268ce\") " pod="openshift-marketplace/community-operators-cqt5j" Dec 01 12:24:03 crc kubenswrapper[4958]: I1201 12:24:03.442108 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpp7g\" (UniqueName: \"kubernetes.io/projected/11e55b88-6fb1-4186-bf0d-fe0c865268ce-kube-api-access-zpp7g\") pod \"community-operators-cqt5j\" (UID: \"11e55b88-6fb1-4186-bf0d-fe0c865268ce\") " pod="openshift-marketplace/community-operators-cqt5j" Dec 01 12:24:03 crc kubenswrapper[4958]: I1201 12:24:03.443940 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11e55b88-6fb1-4186-bf0d-fe0c865268ce-catalog-content\") pod \"community-operators-cqt5j\" (UID: \"11e55b88-6fb1-4186-bf0d-fe0c865268ce\") " pod="openshift-marketplace/community-operators-cqt5j" Dec 01 12:24:03 crc kubenswrapper[4958]: I1201 12:24:03.445692 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11e55b88-6fb1-4186-bf0d-fe0c865268ce-utilities\") pod \"community-operators-cqt5j\" (UID: \"11e55b88-6fb1-4186-bf0d-fe0c865268ce\") " pod="openshift-marketplace/community-operators-cqt5j" Dec 01 12:24:03 crc kubenswrapper[4958]: I1201 12:24:03.476442 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zpp7g\" (UniqueName: \"kubernetes.io/projected/11e55b88-6fb1-4186-bf0d-fe0c865268ce-kube-api-access-zpp7g\") pod \"community-operators-cqt5j\" (UID: \"11e55b88-6fb1-4186-bf0d-fe0c865268ce\") " pod="openshift-marketplace/community-operators-cqt5j" Dec 01 12:24:03 crc kubenswrapper[4958]: I1201 12:24:03.592946 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cqt5j" Dec 01 12:24:04 crc kubenswrapper[4958]: I1201 12:24:04.195105 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cqt5j"] Dec 01 12:24:05 crc kubenswrapper[4958]: I1201 12:24:05.116713 4958 generic.go:334] "Generic (PLEG): container finished" podID="ded94c05-9622-4410-810f-b4e673bf1366" containerID="f1d6e78d604b19c3c73ffca20ce5d39b81caaad8b8de82a52958dbc1dd05e10b" exitCode=0 Dec 01 12:24:05 crc kubenswrapper[4958]: I1201 12:24:05.116828 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5rfj" event={"ID":"ded94c05-9622-4410-810f-b4e673bf1366","Type":"ContainerDied","Data":"f1d6e78d604b19c3c73ffca20ce5d39b81caaad8b8de82a52958dbc1dd05e10b"} Dec 01 12:24:05 crc kubenswrapper[4958]: I1201 12:24:05.121540 4958 generic.go:334] "Generic (PLEG): container finished" podID="11e55b88-6fb1-4186-bf0d-fe0c865268ce" containerID="63e1912d577ff78ea2159980ad1cbcfa6c8d5f39dfc81d1e0b4db6e73c999a79" exitCode=0 Dec 01 12:24:05 crc kubenswrapper[4958]: I1201 12:24:05.121630 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqt5j" event={"ID":"11e55b88-6fb1-4186-bf0d-fe0c865268ce","Type":"ContainerDied","Data":"63e1912d577ff78ea2159980ad1cbcfa6c8d5f39dfc81d1e0b4db6e73c999a79"} Dec 01 12:24:05 crc kubenswrapper[4958]: I1201 12:24:05.121679 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqt5j" event={"ID":"11e55b88-6fb1-4186-bf0d-fe0c865268ce","Type":"ContainerStarted","Data":"1e73215e979f167c426768dfcd9299c7269ca6e260a3a23618ae52314c6a94c3"} Dec 01 12:24:07 crc kubenswrapper[4958]: I1201 12:24:07.149671 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5rfj" event={"ID":"ded94c05-9622-4410-810f-b4e673bf1366","Type":"ContainerStarted","Data":"7337108b946e665730389a38bd5bbdcff843e7502562f744e6b30e2fec3cfc56"} Dec 01 12:24:07 crc kubenswrapper[4958]: I1201 12:24:07.152217 4958 generic.go:334] "Generic (PLEG): container finished" podID="11e55b88-6fb1-4186-bf0d-fe0c865268ce" containerID="3ece0f1bb3cc66403d6040aa7119a901b1639d73f12c2e68956b2196c32327a9" exitCode=0 Dec 01 12:24:07 crc kubenswrapper[4958]: I1201 12:24:07.152258 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqt5j" event={"ID":"11e55b88-6fb1-4186-bf0d-fe0c865268ce","Type":"ContainerDied","Data":"3ece0f1bb3cc66403d6040aa7119a901b1639d73f12c2e68956b2196c32327a9"} Dec 01 12:24:07 crc kubenswrapper[4958]: I1201 12:24:07.175732 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l5rfj" podStartSLOduration=3.01387504 podStartE2EDuration="6.175713321s" podCreationTimestamp="2025-12-01 12:24:01 +0000 UTC" firstStartedPulling="2025-12-01 12:24:03.086908244 +0000 UTC m=+8690.595697321" lastFinishedPulling="2025-12-01 12:24:06.248746565 +0000 UTC m=+8693.757535602" observedRunningTime="2025-12-01 
12:24:07.166816101 +0000 UTC m=+8694.675605138" watchObservedRunningTime="2025-12-01 12:24:07.175713321 +0000 UTC m=+8694.684502359" Dec 01 12:24:08 crc kubenswrapper[4958]: I1201 12:24:08.798187 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:24:08 crc kubenswrapper[4958]: E1201 12:24:08.800559 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:24:09 crc kubenswrapper[4958]: I1201 12:24:09.189997 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqt5j" event={"ID":"11e55b88-6fb1-4186-bf0d-fe0c865268ce","Type":"ContainerStarted","Data":"f4e7cc7605ce092576f087d0038bf7d8f83bcd6abc7509a9ea8127b2a67668f1"} Dec 01 12:24:09 crc kubenswrapper[4958]: I1201 12:24:09.250602 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cqt5j" podStartSLOduration=3.278615326 podStartE2EDuration="6.250570668s" podCreationTimestamp="2025-12-01 12:24:03 +0000 UTC" firstStartedPulling="2025-12-01 12:24:05.130914091 +0000 UTC m=+8692.639703158" lastFinishedPulling="2025-12-01 12:24:08.102869463 +0000 UTC m=+8695.611658500" observedRunningTime="2025-12-01 12:24:09.233379373 +0000 UTC m=+8696.742168440" watchObservedRunningTime="2025-12-01 12:24:09.250570668 +0000 UTC m=+8696.759359745" Dec 01 12:24:11 crc kubenswrapper[4958]: I1201 12:24:11.593229 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l5rfj" Dec 01 12:24:11 crc kubenswrapper[4958]: I1201 12:24:11.593666 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l5rfj" Dec 01 12:24:11 crc kubenswrapper[4958]: I1201 12:24:11.705079 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l5rfj" Dec 01 12:24:12 crc kubenswrapper[4958]: I1201 12:24:12.323233 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l5rfj" Dec 01 12:24:13 crc kubenswrapper[4958]: I1201 12:24:13.248515 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l5rfj"] Dec 01 12:24:13 crc kubenswrapper[4958]: I1201 12:24:13.594008 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cqt5j" Dec 01 12:24:13 crc kubenswrapper[4958]: I1201 12:24:13.594368 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cqt5j" Dec 01 12:24:13 crc kubenswrapper[4958]: I1201 12:24:13.685041 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cqt5j" Dec 01 12:24:14 crc kubenswrapper[4958]: I1201 12:24:14.261187 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l5rfj" podUID="ded94c05-9622-4410-810f-b4e673bf1366" containerName="registry-server" 
containerID="cri-o://7337108b946e665730389a38bd5bbdcff843e7502562f744e6b30e2fec3cfc56" gracePeriod=2 Dec 01 12:24:14 crc kubenswrapper[4958]: I1201 12:24:14.358929 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cqt5j" Dec 01 12:24:14 crc kubenswrapper[4958]: I1201 12:24:14.834198 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l5rfj" Dec 01 12:24:14 crc kubenswrapper[4958]: I1201 12:24:14.937806 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tthdw\" (UniqueName: \"kubernetes.io/projected/ded94c05-9622-4410-810f-b4e673bf1366-kube-api-access-tthdw\") pod \"ded94c05-9622-4410-810f-b4e673bf1366\" (UID: \"ded94c05-9622-4410-810f-b4e673bf1366\") " Dec 01 12:24:14 crc kubenswrapper[4958]: I1201 12:24:14.937917 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ded94c05-9622-4410-810f-b4e673bf1366-utilities\") pod \"ded94c05-9622-4410-810f-b4e673bf1366\" (UID: \"ded94c05-9622-4410-810f-b4e673bf1366\") " Dec 01 12:24:14 crc kubenswrapper[4958]: I1201 12:24:14.938015 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ded94c05-9622-4410-810f-b4e673bf1366-catalog-content\") pod \"ded94c05-9622-4410-810f-b4e673bf1366\" (UID: \"ded94c05-9622-4410-810f-b4e673bf1366\") " Dec 01 12:24:14 crc kubenswrapper[4958]: I1201 12:24:14.939082 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ded94c05-9622-4410-810f-b4e673bf1366-utilities" (OuterVolumeSpecName: "utilities") pod "ded94c05-9622-4410-810f-b4e673bf1366" (UID: "ded94c05-9622-4410-810f-b4e673bf1366"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:24:14 crc kubenswrapper[4958]: I1201 12:24:14.944777 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ded94c05-9622-4410-810f-b4e673bf1366-kube-api-access-tthdw" (OuterVolumeSpecName: "kube-api-access-tthdw") pod "ded94c05-9622-4410-810f-b4e673bf1366" (UID: "ded94c05-9622-4410-810f-b4e673bf1366"). InnerVolumeSpecName "kube-api-access-tthdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:24:14 crc kubenswrapper[4958]: I1201 12:24:14.999707 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ded94c05-9622-4410-810f-b4e673bf1366-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ded94c05-9622-4410-810f-b4e673bf1366" (UID: "ded94c05-9622-4410-810f-b4e673bf1366"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.040629 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ded94c05-9622-4410-810f-b4e673bf1366-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.040673 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tthdw\" (UniqueName: \"kubernetes.io/projected/ded94c05-9622-4410-810f-b4e673bf1366-kube-api-access-tthdw\") on node \"crc\" DevicePath \"\"" Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.040690 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ded94c05-9622-4410-810f-b4e673bf1366-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.272673 4958 generic.go:334] "Generic (PLEG): container finished" podID="ded94c05-9622-4410-810f-b4e673bf1366" containerID="7337108b946e665730389a38bd5bbdcff843e7502562f744e6b30e2fec3cfc56" exitCode=0 Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.272792 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5rfj" event={"ID":"ded94c05-9622-4410-810f-b4e673bf1366","Type":"ContainerDied","Data":"7337108b946e665730389a38bd5bbdcff843e7502562f744e6b30e2fec3cfc56"} Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.272868 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5rfj" event={"ID":"ded94c05-9622-4410-810f-b4e673bf1366","Type":"ContainerDied","Data":"adbdd718dccc844df6ec62909dba88458536f2192c2737eb4efd2fcfbcc47e5d"} Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.272896 4958 scope.go:117] "RemoveContainer" containerID="7337108b946e665730389a38bd5bbdcff843e7502562f744e6b30e2fec3cfc56" Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.274009 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l5rfj" Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.309589 4958 scope.go:117] "RemoveContainer" containerID="f1d6e78d604b19c3c73ffca20ce5d39b81caaad8b8de82a52958dbc1dd05e10b" Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.315257 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l5rfj"] Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.327475 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l5rfj"] Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.340455 4958 scope.go:117] "RemoveContainer" containerID="e24d6961b784fdf8979da5ef459eefc1b7bdaa34d1d82476837d2ab9a2468edb" Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.404199 4958 scope.go:117] "RemoveContainer" containerID="7337108b946e665730389a38bd5bbdcff843e7502562f744e6b30e2fec3cfc56" Dec 01 12:24:15 crc kubenswrapper[4958]: E1201 12:24:15.404751 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7337108b946e665730389a38bd5bbdcff843e7502562f744e6b30e2fec3cfc56\": container with ID starting with 7337108b946e665730389a38bd5bbdcff843e7502562f744e6b30e2fec3cfc56 not found: ID does not exist" containerID="7337108b946e665730389a38bd5bbdcff843e7502562f744e6b30e2fec3cfc56" Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.404804 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7337108b946e665730389a38bd5bbdcff843e7502562f744e6b30e2fec3cfc56"} err="failed to get container status \"7337108b946e665730389a38bd5bbdcff843e7502562f744e6b30e2fec3cfc56\": rpc error: code = NotFound desc = could not find container \"7337108b946e665730389a38bd5bbdcff843e7502562f744e6b30e2fec3cfc56\": container with ID starting with 7337108b946e665730389a38bd5bbdcff843e7502562f744e6b30e2fec3cfc56 not found: ID does not exist" Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.404839 4958 scope.go:117] "RemoveContainer" containerID="f1d6e78d604b19c3c73ffca20ce5d39b81caaad8b8de82a52958dbc1dd05e10b" Dec 01 12:24:15 crc kubenswrapper[4958]: E1201 12:24:15.405365 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1d6e78d604b19c3c73ffca20ce5d39b81caaad8b8de82a52958dbc1dd05e10b\": container with ID starting with f1d6e78d604b19c3c73ffca20ce5d39b81caaad8b8de82a52958dbc1dd05e10b not found: ID does not exist" containerID="f1d6e78d604b19c3c73ffca20ce5d39b81caaad8b8de82a52958dbc1dd05e10b" Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.405416 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1d6e78d604b19c3c73ffca20ce5d39b81caaad8b8de82a52958dbc1dd05e10b"} err="failed to get container status \"f1d6e78d604b19c3c73ffca20ce5d39b81caaad8b8de82a52958dbc1dd05e10b\": rpc error: code = NotFound desc = could not find container \"f1d6e78d604b19c3c73ffca20ce5d39b81caaad8b8de82a52958dbc1dd05e10b\": container with ID starting with f1d6e78d604b19c3c73ffca20ce5d39b81caaad8b8de82a52958dbc1dd05e10b not found: ID does not exist" Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.405442 4958 scope.go:117] "RemoveContainer" containerID="e24d6961b784fdf8979da5ef459eefc1b7bdaa34d1d82476837d2ab9a2468edb" Dec 01 12:24:15 crc kubenswrapper[4958]: E1201 12:24:15.405795 4958 log.go:32] "ContainerStatus from runtime service 
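Note: the NotFound errors right after each "RemoveContainer" are benign races: the container was already removed, so a follow-up status lookup fails and the deletion is effectively complete. Cleanup code typically treats NotFound as success. A Go sketch of that idempotent pattern; the runtime interface and isNotFound check are illustrative assumptions, not the kubelet's CRI client:

// Sketch: treating NotFound as "already deleted" when removing a container,
// matching the DeleteContainer errors above. Illustrative types only.
package sketch

import "strings"

type runtime interface {
	RemoveContainer(id string) error
}

func removeIfPresent(r runtime, id string) error {
	err := r.RemoveContainer(id)
	if err != nil && strings.Contains(err.Error(), "NotFound") {
		return nil // already gone: nothing left to do
	}
	return err
}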
failed" err="rpc error: code = NotFound desc = could not find container \"e24d6961b784fdf8979da5ef459eefc1b7bdaa34d1d82476837d2ab9a2468edb\": container with ID starting with e24d6961b784fdf8979da5ef459eefc1b7bdaa34d1d82476837d2ab9a2468edb not found: ID does not exist" containerID="e24d6961b784fdf8979da5ef459eefc1b7bdaa34d1d82476837d2ab9a2468edb" Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.406016 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e24d6961b784fdf8979da5ef459eefc1b7bdaa34d1d82476837d2ab9a2468edb"} err="failed to get container status \"e24d6961b784fdf8979da5ef459eefc1b7bdaa34d1d82476837d2ab9a2468edb\": rpc error: code = NotFound desc = could not find container \"e24d6961b784fdf8979da5ef459eefc1b7bdaa34d1d82476837d2ab9a2468edb\": container with ID starting with e24d6961b784fdf8979da5ef459eefc1b7bdaa34d1d82476837d2ab9a2468edb not found: ID does not exist" Dec 01 12:24:15 crc kubenswrapper[4958]: I1201 12:24:15.818435 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ded94c05-9622-4410-810f-b4e673bf1366" path="/var/lib/kubelet/pods/ded94c05-9622-4410-810f-b4e673bf1366/volumes" Dec 01 12:24:17 crc kubenswrapper[4958]: I1201 12:24:17.836594 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cqt5j"] Dec 01 12:24:17 crc kubenswrapper[4958]: I1201 12:24:17.837330 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cqt5j" podUID="11e55b88-6fb1-4186-bf0d-fe0c865268ce" containerName="registry-server" containerID="cri-o://f4e7cc7605ce092576f087d0038bf7d8f83bcd6abc7509a9ea8127b2a67668f1" gracePeriod=2 Dec 01 12:24:18 crc kubenswrapper[4958]: I1201 12:24:18.331235 4958 generic.go:334] "Generic (PLEG): container finished" podID="11e55b88-6fb1-4186-bf0d-fe0c865268ce" containerID="f4e7cc7605ce092576f087d0038bf7d8f83bcd6abc7509a9ea8127b2a67668f1" exitCode=0 Dec 01 12:24:18 crc kubenswrapper[4958]: I1201 12:24:18.331329 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqt5j" event={"ID":"11e55b88-6fb1-4186-bf0d-fe0c865268ce","Type":"ContainerDied","Data":"f4e7cc7605ce092576f087d0038bf7d8f83bcd6abc7509a9ea8127b2a67668f1"} Dec 01 12:24:18 crc kubenswrapper[4958]: I1201 12:24:18.445738 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cqt5j" Dec 01 12:24:18 crc kubenswrapper[4958]: I1201 12:24:18.524962 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpp7g\" (UniqueName: \"kubernetes.io/projected/11e55b88-6fb1-4186-bf0d-fe0c865268ce-kube-api-access-zpp7g\") pod \"11e55b88-6fb1-4186-bf0d-fe0c865268ce\" (UID: \"11e55b88-6fb1-4186-bf0d-fe0c865268ce\") " Dec 01 12:24:18 crc kubenswrapper[4958]: I1201 12:24:18.526092 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11e55b88-6fb1-4186-bf0d-fe0c865268ce-catalog-content\") pod \"11e55b88-6fb1-4186-bf0d-fe0c865268ce\" (UID: \"11e55b88-6fb1-4186-bf0d-fe0c865268ce\") " Dec 01 12:24:18 crc kubenswrapper[4958]: I1201 12:24:18.526175 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11e55b88-6fb1-4186-bf0d-fe0c865268ce-utilities\") pod \"11e55b88-6fb1-4186-bf0d-fe0c865268ce\" (UID: \"11e55b88-6fb1-4186-bf0d-fe0c865268ce\") " Dec 01 12:24:18 crc kubenswrapper[4958]: I1201 12:24:18.527002 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11e55b88-6fb1-4186-bf0d-fe0c865268ce-utilities" (OuterVolumeSpecName: "utilities") pod "11e55b88-6fb1-4186-bf0d-fe0c865268ce" (UID: "11e55b88-6fb1-4186-bf0d-fe0c865268ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:24:18 crc kubenswrapper[4958]: I1201 12:24:18.532217 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11e55b88-6fb1-4186-bf0d-fe0c865268ce-kube-api-access-zpp7g" (OuterVolumeSpecName: "kube-api-access-zpp7g") pod "11e55b88-6fb1-4186-bf0d-fe0c865268ce" (UID: "11e55b88-6fb1-4186-bf0d-fe0c865268ce"). InnerVolumeSpecName "kube-api-access-zpp7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:24:18 crc kubenswrapper[4958]: I1201 12:24:18.596353 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11e55b88-6fb1-4186-bf0d-fe0c865268ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11e55b88-6fb1-4186-bf0d-fe0c865268ce" (UID: "11e55b88-6fb1-4186-bf0d-fe0c865268ce"). InnerVolumeSpecName "catalog-content". 
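Note: the reconciler lines above follow a fixed pattern: once the pod is deleted, its volumes remain in the actual state of the world but not in the desired state, so the reconciler starts UnmountVolume for each, the plugin's TearDown runs, and the volume is finally reported detached. A generic Go sketch of the desired-versus-actual reconcile pattern; the types are illustrative, not the kubelet volumemanager API:

// Sketch of the reconcile pattern: anything in the actual state that is no
// longer in the desired state gets torn down and dropped, which is when the
// "Volume detached" lines appear. Illustrative types only.
package sketch

type volume struct{ name, path string }

type unmounter interface{ TearDown(v volume) error }

func reconcile(desired, actual map[string]volume, u unmounter) {
	for name, v := range actual {
		if _, ok := desired[name]; ok {
			continue // still wanted; leave it mounted
		}
		if err := u.TearDown(v); err == nil {
			delete(actual, name) // reported as "Volume detached"
		}
	}
}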
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:24:18 crc kubenswrapper[4958]: I1201 12:24:18.629568 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11e55b88-6fb1-4186-bf0d-fe0c865268ce-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:24:18 crc kubenswrapper[4958]: I1201 12:24:18.629624 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpp7g\" (UniqueName: \"kubernetes.io/projected/11e55b88-6fb1-4186-bf0d-fe0c865268ce-kube-api-access-zpp7g\") on node \"crc\" DevicePath \"\"" Dec 01 12:24:18 crc kubenswrapper[4958]: I1201 12:24:18.629646 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11e55b88-6fb1-4186-bf0d-fe0c865268ce-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:24:19 crc kubenswrapper[4958]: I1201 12:24:19.351740 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cqt5j" event={"ID":"11e55b88-6fb1-4186-bf0d-fe0c865268ce","Type":"ContainerDied","Data":"1e73215e979f167c426768dfcd9299c7269ca6e260a3a23618ae52314c6a94c3"} Dec 01 12:24:19 crc kubenswrapper[4958]: I1201 12:24:19.351888 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cqt5j" Dec 01 12:24:19 crc kubenswrapper[4958]: I1201 12:24:19.352060 4958 scope.go:117] "RemoveContainer" containerID="f4e7cc7605ce092576f087d0038bf7d8f83bcd6abc7509a9ea8127b2a67668f1" Dec 01 12:24:19 crc kubenswrapper[4958]: I1201 12:24:19.408914 4958 scope.go:117] "RemoveContainer" containerID="3ece0f1bb3cc66403d6040aa7119a901b1639d73f12c2e68956b2196c32327a9" Dec 01 12:24:19 crc kubenswrapper[4958]: I1201 12:24:19.421799 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cqt5j"] Dec 01 12:24:19 crc kubenswrapper[4958]: I1201 12:24:19.434607 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cqt5j"] Dec 01 12:24:19 crc kubenswrapper[4958]: I1201 12:24:19.456340 4958 scope.go:117] "RemoveContainer" containerID="63e1912d577ff78ea2159980ad1cbcfa6c8d5f39dfc81d1e0b4db6e73c999a79" Dec 01 12:24:19 crc kubenswrapper[4958]: I1201 12:24:19.823008 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11e55b88-6fb1-4186-bf0d-fe0c865268ce" path="/var/lib/kubelet/pods/11e55b88-6fb1-4186-bf0d-fe0c865268ce/volumes" Dec 01 12:24:22 crc kubenswrapper[4958]: I1201 12:24:22.798059 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:24:22 crc kubenswrapper[4958]: E1201 12:24:22.799047 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:24:27 crc kubenswrapper[4958]: I1201 12:24:27.458906 4958 generic.go:334] "Generic (PLEG): container finished" podID="5411c83e-afd4-4c5b-a751-13fb79850c06" containerID="e1a0e12fac926bed38f29b07967333cf64b07a52c93c95b526e00bb5c5717bdf" exitCode=0 Dec 01 12:24:27 crc kubenswrapper[4958]: I1201 12:24:27.459031 4958 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/libvirt-openstack-openstack-cell1-lp72z" event={"ID":"5411c83e-afd4-4c5b-a751-13fb79850c06","Type":"ContainerDied","Data":"e1a0e12fac926bed38f29b07967333cf64b07a52c93c95b526e00bb5c5717bdf"} Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.045832 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.128606 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-ceph\") pod \"5411c83e-afd4-4c5b-a751-13fb79850c06\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.128687 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-libvirt-combined-ca-bundle\") pod \"5411c83e-afd4-4c5b-a751-13fb79850c06\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.128776 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-ssh-key\") pod \"5411c83e-afd4-4c5b-a751-13fb79850c06\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.128881 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cdjc\" (UniqueName: \"kubernetes.io/projected/5411c83e-afd4-4c5b-a751-13fb79850c06-kube-api-access-9cdjc\") pod \"5411c83e-afd4-4c5b-a751-13fb79850c06\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.128976 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-libvirt-secret-0\") pod \"5411c83e-afd4-4c5b-a751-13fb79850c06\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.129045 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-inventory\") pod \"5411c83e-afd4-4c5b-a751-13fb79850c06\" (UID: \"5411c83e-afd4-4c5b-a751-13fb79850c06\") " Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.135148 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5411c83e-afd4-4c5b-a751-13fb79850c06-kube-api-access-9cdjc" (OuterVolumeSpecName: "kube-api-access-9cdjc") pod "5411c83e-afd4-4c5b-a751-13fb79850c06" (UID: "5411c83e-afd4-4c5b-a751-13fb79850c06"). InnerVolumeSpecName "kube-api-access-9cdjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.138001 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5411c83e-afd4-4c5b-a751-13fb79850c06" (UID: "5411c83e-afd4-4c5b-a751-13fb79850c06"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.144121 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-ceph" (OuterVolumeSpecName: "ceph") pod "5411c83e-afd4-4c5b-a751-13fb79850c06" (UID: "5411c83e-afd4-4c5b-a751-13fb79850c06"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.166627 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "5411c83e-afd4-4c5b-a751-13fb79850c06" (UID: "5411c83e-afd4-4c5b-a751-13fb79850c06"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.189748 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-inventory" (OuterVolumeSpecName: "inventory") pod "5411c83e-afd4-4c5b-a751-13fb79850c06" (UID: "5411c83e-afd4-4c5b-a751-13fb79850c06"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.196147 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5411c83e-afd4-4c5b-a751-13fb79850c06" (UID: "5411c83e-afd4-4c5b-a751-13fb79850c06"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.269551 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.269596 4958 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.269608 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.270392 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cdjc\" (UniqueName: \"kubernetes.io/projected/5411c83e-afd4-4c5b-a751-13fb79850c06-kube-api-access-9cdjc\") on node \"crc\" DevicePath \"\"" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.275472 4958 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.275547 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5411c83e-afd4-4c5b-a751-13fb79850c06-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.491734 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-lp72z" 
event={"ID":"5411c83e-afd4-4c5b-a751-13fb79850c06","Type":"ContainerDied","Data":"8f9db40d71ef625372ddb4afc7a7e5e2e5f05c0914938b59650f8373086cb0cb"} Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.491800 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f9db40d71ef625372ddb4afc7a7e5e2e5f05c0914938b59650f8373086cb0cb" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.491882 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-lp72z" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.628234 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-vqmp4"] Dec 01 12:24:29 crc kubenswrapper[4958]: E1201 12:24:29.631209 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11e55b88-6fb1-4186-bf0d-fe0c865268ce" containerName="extract-content" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.631228 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e55b88-6fb1-4186-bf0d-fe0c865268ce" containerName="extract-content" Dec 01 12:24:29 crc kubenswrapper[4958]: E1201 12:24:29.631251 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded94c05-9622-4410-810f-b4e673bf1366" containerName="extract-utilities" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.631261 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded94c05-9622-4410-810f-b4e673bf1366" containerName="extract-utilities" Dec 01 12:24:29 crc kubenswrapper[4958]: E1201 12:24:29.631284 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5411c83e-afd4-4c5b-a751-13fb79850c06" containerName="libvirt-openstack-openstack-cell1" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.631293 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="5411c83e-afd4-4c5b-a751-13fb79850c06" containerName="libvirt-openstack-openstack-cell1" Dec 01 12:24:29 crc kubenswrapper[4958]: E1201 12:24:29.631334 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded94c05-9622-4410-810f-b4e673bf1366" containerName="extract-content" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.631343 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded94c05-9622-4410-810f-b4e673bf1366" containerName="extract-content" Dec 01 12:24:29 crc kubenswrapper[4958]: E1201 12:24:29.631354 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11e55b88-6fb1-4186-bf0d-fe0c865268ce" containerName="extract-utilities" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.631362 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e55b88-6fb1-4186-bf0d-fe0c865268ce" containerName="extract-utilities" Dec 01 12:24:29 crc kubenswrapper[4958]: E1201 12:24:29.631382 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11e55b88-6fb1-4186-bf0d-fe0c865268ce" containerName="registry-server" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.631390 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e55b88-6fb1-4186-bf0d-fe0c865268ce" containerName="registry-server" Dec 01 12:24:29 crc kubenswrapper[4958]: E1201 12:24:29.631401 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded94c05-9622-4410-810f-b4e673bf1366" containerName="registry-server" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.631408 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded94c05-9622-4410-810f-b4e673bf1366" 
containerName="registry-server" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.631691 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ded94c05-9622-4410-810f-b4e673bf1366" containerName="registry-server" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.631722 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="11e55b88-6fb1-4186-bf0d-fe0c865268ce" containerName="registry-server" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.631739 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="5411c83e-afd4-4c5b-a751-13fb79850c06" containerName="libvirt-openstack-openstack-cell1" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.632675 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.637140 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.637414 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.637501 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.637703 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.640573 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.640942 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.641277 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.647893 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-vqmp4"] Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.687739 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2svzm\" (UniqueName: \"kubernetes.io/projected/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-kube-api-access-2svzm\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.687804 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.687861 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.687892 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.688112 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.688148 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-ceph\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.688206 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.688241 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-inventory\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.688301 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.688344 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.688380 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.790225 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.790309 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2svzm\" (UniqueName: \"kubernetes.io/projected/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-kube-api-access-2svzm\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.790362 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.790393 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.790414 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.790519 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.790549 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-ceph\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.790595 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.790629 4958 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-inventory\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.790662 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.790707 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.791838 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.792062 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.795191 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.795626 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.795728 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.796729 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.797018 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-inventory\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.798902 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.799497 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.799738 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-ceph\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.813771 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2svzm\" (UniqueName: \"kubernetes.io/projected/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-kube-api-access-2svzm\") pod \"nova-cell1-openstack-openstack-cell1-vqmp4\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:29 crc kubenswrapper[4958]: I1201 12:24:29.972975 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:24:30 crc kubenswrapper[4958]: I1201 12:24:30.599602 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-vqmp4"] Dec 01 12:24:31 crc kubenswrapper[4958]: I1201 12:24:31.522163 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" event={"ID":"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb","Type":"ContainerStarted","Data":"bb148d152cbd01b8f249bac6f0dc0fe0c60db7b425efbbaf596eafaaa61b6617"} Dec 01 12:24:31 crc kubenswrapper[4958]: I1201 12:24:31.522509 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" event={"ID":"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb","Type":"ContainerStarted","Data":"f8bdf0e548bbb0582b7194e55618cb59b18a1f4e03cf809b83d8e7686c0af52b"} Dec 01 12:24:31 crc kubenswrapper[4958]: I1201 12:24:31.559464 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" podStartSLOduration=2.315017468 podStartE2EDuration="2.559446564s" podCreationTimestamp="2025-12-01 12:24:29 +0000 UTC" firstStartedPulling="2025-12-01 12:24:30.602917365 +0000 UTC m=+8718.111706442" lastFinishedPulling="2025-12-01 12:24:30.847346461 +0000 UTC m=+8718.356135538" observedRunningTime="2025-12-01 12:24:31.551423748 +0000 UTC m=+8719.060212795" watchObservedRunningTime="2025-12-01 12:24:31.559446564 +0000 UTC m=+8719.068235601" Dec 01 12:24:33 crc kubenswrapper[4958]: I1201 12:24:33.808573 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:24:33 crc kubenswrapper[4958]: E1201 12:24:33.809566 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:24:47 crc kubenswrapper[4958]: I1201 12:24:47.798325 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:24:47 crc kubenswrapper[4958]: E1201 12:24:47.801519 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:25:02 crc kubenswrapper[4958]: I1201 12:25:02.798934 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:25:02 crc kubenswrapper[4958]: E1201 12:25:02.800418 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" 
podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:25:17 crc kubenswrapper[4958]: I1201 12:25:17.803920 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:25:17 crc kubenswrapper[4958]: E1201 12:25:17.805324 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:25:29 crc kubenswrapper[4958]: I1201 12:25:29.540353 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:25:30 crc kubenswrapper[4958]: I1201 12:25:30.607884 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"7e636a16346e4120743b3f3b9c8aa4fcd6ad26002a15f35baf0c51f704182b0b"} Dec 01 12:27:58 crc kubenswrapper[4958]: I1201 12:27:58.210706 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:27:58 crc kubenswrapper[4958]: I1201 12:27:58.214082 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:28:28 crc kubenswrapper[4958]: I1201 12:28:28.210718 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:28:28 crc kubenswrapper[4958]: I1201 12:28:28.211541 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:28:31 crc kubenswrapper[4958]: I1201 12:28:31.224747 4958 generic.go:334] "Generic (PLEG): container finished" podID="d53c43c2-4919-4a0c-a4a9-477cd52cc3fb" containerID="bb148d152cbd01b8f249bac6f0dc0fe0c60db7b425efbbaf596eafaaa61b6617" exitCode=0 Dec 01 12:28:31 crc kubenswrapper[4958]: I1201 12:28:31.225060 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" event={"ID":"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb","Type":"ContainerDied","Data":"bb148d152cbd01b8f249bac6f0dc0fe0c60db7b425efbbaf596eafaaa61b6617"} Dec 01 12:28:32 crc kubenswrapper[4958]: I1201 12:28:32.840018 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:28:32 crc kubenswrapper[4958]: I1201 12:28:32.984269 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-ssh-key\") pod \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " Dec 01 12:28:32 crc kubenswrapper[4958]: I1201 12:28:32.984406 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cell1-compute-config-1\") pod \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " Dec 01 12:28:32 crc kubenswrapper[4958]: I1201 12:28:32.984515 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cells-global-config-1\") pod \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " Dec 01 12:28:32 crc kubenswrapper[4958]: I1201 12:28:32.984558 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2svzm\" (UniqueName: \"kubernetes.io/projected/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-kube-api-access-2svzm\") pod \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " Dec 01 12:28:32 crc kubenswrapper[4958]: I1201 12:28:32.984656 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cell1-compute-config-0\") pod \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " Dec 01 12:28:32 crc kubenswrapper[4958]: I1201 12:28:32.984712 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-ceph\") pod \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " Dec 01 12:28:32 crc kubenswrapper[4958]: I1201 12:28:32.984746 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-migration-ssh-key-0\") pod \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " Dec 01 12:28:32 crc kubenswrapper[4958]: I1201 12:28:32.984802 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cells-global-config-0\") pod \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " Dec 01 12:28:32 crc kubenswrapper[4958]: I1201 12:28:32.984864 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cell1-combined-ca-bundle\") pod \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " Dec 01 12:28:32 crc kubenswrapper[4958]: I1201 12:28:32.984920 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-inventory\") pod \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " Dec 01 12:28:32 crc kubenswrapper[4958]: I1201 12:28:32.984956 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-migration-ssh-key-1\") pod \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\" (UID: \"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb\") " Dec 01 12:28:32 crc kubenswrapper[4958]: I1201 12:28:32.992910 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-ceph" (OuterVolumeSpecName: "ceph") pod "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb" (UID: "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.007669 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb" (UID: "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.011056 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-kube-api-access-2svzm" (OuterVolumeSpecName: "kube-api-access-2svzm") pod "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb" (UID: "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb"). InnerVolumeSpecName "kube-api-access-2svzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.017805 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb" (UID: "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.020105 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb" (UID: "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.029442 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb" (UID: "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.032404 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb" (UID: "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.032423 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb" (UID: "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.035181 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-inventory" (OuterVolumeSpecName: "inventory") pod "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb" (UID: "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.038952 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb" (UID: "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.055503 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb" (UID: "d53c43c2-4919-4a0c-a4a9-477cd52cc3fb"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.089241 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.089278 4958 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.089291 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.089300 4958 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.089309 4958 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.089319 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2svzm\" (UniqueName: \"kubernetes.io/projected/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-kube-api-access-2svzm\") on node \"crc\" DevicePath \"\"" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.089327 4958 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.089336 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.089346 4958 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.089356 4958 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.089366 4958 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d53c43c2-4919-4a0c-a4a9-477cd52cc3fb-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.255535 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" event={"ID":"d53c43c2-4919-4a0c-a4a9-477cd52cc3fb","Type":"ContainerDied","Data":"f8bdf0e548bbb0582b7194e55618cb59b18a1f4e03cf809b83d8e7686c0af52b"} Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.255605 4958 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="f8bdf0e548bbb0582b7194e55618cb59b18a1f4e03cf809b83d8e7686c0af52b" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.255631 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-vqmp4" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.367035 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-9zz6g"] Dec 01 12:28:33 crc kubenswrapper[4958]: E1201 12:28:33.367686 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d53c43c2-4919-4a0c-a4a9-477cd52cc3fb" containerName="nova-cell1-openstack-openstack-cell1" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.367711 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d53c43c2-4919-4a0c-a4a9-477cd52cc3fb" containerName="nova-cell1-openstack-openstack-cell1" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.368067 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d53c43c2-4919-4a0c-a4a9-477cd52cc3fb" containerName="nova-cell1-openstack-openstack-cell1" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.369166 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.372510 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.372934 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.374956 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.375423 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.375796 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.400648 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-9zz6g"] Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.498507 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-inventory\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.498889 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65js2\" (UniqueName: \"kubernetes.io/projected/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-kube-api-access-65js2\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.498930 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceph\") pod 
\"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.498957 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.498986 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.499010 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ssh-key\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.499091 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.499127 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.603254 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.603383 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.603449 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ssh-key\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.603606 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.603696 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.603831 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-inventory\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.604107 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65js2\" (UniqueName: \"kubernetes.io/projected/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-kube-api-access-65js2\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.604197 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceph\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.612906 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ssh-key\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.628808 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.629410 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " 
pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.630150 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-inventory\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.636804 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceph\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.643485 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65js2\" (UniqueName: \"kubernetes.io/projected/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-kube-api-access-65js2\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.644010 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.649885 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-9zz6g\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:33 crc kubenswrapper[4958]: I1201 12:28:33.737216 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:28:34 crc kubenswrapper[4958]: I1201 12:28:34.300403 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-9zz6g"] Dec 01 12:28:34 crc kubenswrapper[4958]: W1201 12:28:34.304609 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb66c499_dbc3_406f_984b_9c0a6d8c94a5.slice/crio-d2e516a42b573a3e097c0935297898d6cf16fc24ba0482f624b53240127d136e WatchSource:0}: Error finding container d2e516a42b573a3e097c0935297898d6cf16fc24ba0482f624b53240127d136e: Status 404 returned error can't find the container with id d2e516a42b573a3e097c0935297898d6cf16fc24ba0482f624b53240127d136e Dec 01 12:28:34 crc kubenswrapper[4958]: I1201 12:28:34.308358 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 12:28:35 crc kubenswrapper[4958]: I1201 12:28:35.277600 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" event={"ID":"bb66c499-dbc3-406f-984b-9c0a6d8c94a5","Type":"ContainerStarted","Data":"1ab54e7b8953e6edd88432982239711ca6690dea278204a9918a3da5682824b8"} Dec 01 12:28:35 crc kubenswrapper[4958]: I1201 12:28:35.278158 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" event={"ID":"bb66c499-dbc3-406f-984b-9c0a6d8c94a5","Type":"ContainerStarted","Data":"d2e516a42b573a3e097c0935297898d6cf16fc24ba0482f624b53240127d136e"} Dec 01 12:28:35 crc kubenswrapper[4958]: I1201 12:28:35.306705 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" podStartSLOduration=2.107078948 podStartE2EDuration="2.306680641s" podCreationTimestamp="2025-12-01 12:28:33 +0000 UTC" firstStartedPulling="2025-12-01 12:28:34.308149119 +0000 UTC m=+8961.816938156" lastFinishedPulling="2025-12-01 12:28:34.507750812 +0000 UTC m=+8962.016539849" observedRunningTime="2025-12-01 12:28:35.30272582 +0000 UTC m=+8962.811514867" watchObservedRunningTime="2025-12-01 12:28:35.306680641 +0000 UTC m=+8962.815469678" Dec 01 12:28:58 crc kubenswrapper[4958]: I1201 12:28:58.210517 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:28:58 crc kubenswrapper[4958]: I1201 12:28:58.211239 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:28:58 crc kubenswrapper[4958]: I1201 12:28:58.211302 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 12:28:58 crc kubenswrapper[4958]: I1201 12:28:58.212565 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7e636a16346e4120743b3f3b9c8aa4fcd6ad26002a15f35baf0c51f704182b0b"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 12:28:58 crc kubenswrapper[4958]: I1201 12:28:58.212643 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://7e636a16346e4120743b3f3b9c8aa4fcd6ad26002a15f35baf0c51f704182b0b" gracePeriod=600 Dec 01 12:28:58 crc kubenswrapper[4958]: I1201 12:28:58.704233 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="7e636a16346e4120743b3f3b9c8aa4fcd6ad26002a15f35baf0c51f704182b0b" exitCode=0 Dec 01 12:28:58 crc kubenswrapper[4958]: I1201 12:28:58.704456 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"7e636a16346e4120743b3f3b9c8aa4fcd6ad26002a15f35baf0c51f704182b0b"} Dec 01 12:28:58 crc kubenswrapper[4958]: I1201 12:28:58.704746 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6"} Dec 01 12:28:58 crc kubenswrapper[4958]: I1201 12:28:58.704788 4958 scope.go:117] "RemoveContainer" containerID="f6a84cea2f6c31c2e991cfff7f79f8b27b044e8a624776ca16ff9c15f04121b3" Dec 01 12:29:58 crc kubenswrapper[4958]: I1201 12:29:58.631996 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q9lgf"] Dec 01 12:29:58 crc kubenswrapper[4958]: I1201 12:29:58.637233 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q9lgf" Dec 01 12:29:58 crc kubenswrapper[4958]: I1201 12:29:58.643864 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9lgf"] Dec 01 12:29:58 crc kubenswrapper[4958]: I1201 12:29:58.815961 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8-catalog-content\") pod \"redhat-marketplace-q9lgf\" (UID: \"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8\") " pod="openshift-marketplace/redhat-marketplace-q9lgf" Dec 01 12:29:58 crc kubenswrapper[4958]: I1201 12:29:58.816320 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8-utilities\") pod \"redhat-marketplace-q9lgf\" (UID: \"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8\") " pod="openshift-marketplace/redhat-marketplace-q9lgf" Dec 01 12:29:58 crc kubenswrapper[4958]: I1201 12:29:58.816672 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq5wt\" (UniqueName: \"kubernetes.io/projected/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8-kube-api-access-qq5wt\") pod \"redhat-marketplace-q9lgf\" (UID: \"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8\") " pod="openshift-marketplace/redhat-marketplace-q9lgf" Dec 01 12:29:58 crc kubenswrapper[4958]: I1201 12:29:58.919138 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8-utilities\") pod \"redhat-marketplace-q9lgf\" (UID: \"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8\") " pod="openshift-marketplace/redhat-marketplace-q9lgf" Dec 01 12:29:58 crc kubenswrapper[4958]: I1201 12:29:58.919372 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq5wt\" (UniqueName: \"kubernetes.io/projected/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8-kube-api-access-qq5wt\") pod \"redhat-marketplace-q9lgf\" (UID: \"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8\") " pod="openshift-marketplace/redhat-marketplace-q9lgf" Dec 01 12:29:58 crc kubenswrapper[4958]: I1201 12:29:58.919631 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8-catalog-content\") pod \"redhat-marketplace-q9lgf\" (UID: \"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8\") " pod="openshift-marketplace/redhat-marketplace-q9lgf" Dec 01 12:29:58 crc kubenswrapper[4958]: I1201 12:29:58.920002 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8-utilities\") pod \"redhat-marketplace-q9lgf\" (UID: \"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8\") " pod="openshift-marketplace/redhat-marketplace-q9lgf" Dec 01 12:29:58 crc kubenswrapper[4958]: I1201 12:29:58.920397 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8-catalog-content\") pod \"redhat-marketplace-q9lgf\" (UID: \"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8\") " pod="openshift-marketplace/redhat-marketplace-q9lgf" Dec 01 12:29:58 crc kubenswrapper[4958]: I1201 12:29:58.952505 4958 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qq5wt\" (UniqueName: \"kubernetes.io/projected/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8-kube-api-access-qq5wt\") pod \"redhat-marketplace-q9lgf\" (UID: \"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8\") " pod="openshift-marketplace/redhat-marketplace-q9lgf" Dec 01 12:29:58 crc kubenswrapper[4958]: I1201 12:29:58.964546 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q9lgf" Dec 01 12:29:59 crc kubenswrapper[4958]: I1201 12:29:59.551572 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9lgf"] Dec 01 12:30:00 crc kubenswrapper[4958]: I1201 12:30:00.193340 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw"] Dec 01 12:30:00 crc kubenswrapper[4958]: I1201 12:30:00.197449 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw" Dec 01 12:30:00 crc kubenswrapper[4958]: I1201 12:30:00.200768 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 12:30:00 crc kubenswrapper[4958]: I1201 12:30:00.201519 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 12:30:00 crc kubenswrapper[4958]: I1201 12:30:00.217617 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw"] Dec 01 12:30:00 crc kubenswrapper[4958]: I1201 12:30:00.262367 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c-config-volume\") pod \"collect-profiles-29409870-9kjgw\" (UID: \"c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw" Dec 01 12:30:00 crc kubenswrapper[4958]: I1201 12:30:00.262644 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c-secret-volume\") pod \"collect-profiles-29409870-9kjgw\" (UID: \"c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw" Dec 01 12:30:00 crc kubenswrapper[4958]: I1201 12:30:00.262788 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcz7p\" (UniqueName: \"kubernetes.io/projected/c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c-kube-api-access-kcz7p\") pod \"collect-profiles-29409870-9kjgw\" (UID: \"c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw" Dec 01 12:30:00 crc kubenswrapper[4958]: I1201 12:30:00.364669 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c-config-volume\") pod \"collect-profiles-29409870-9kjgw\" (UID: \"c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw" Dec 01 12:30:00 crc kubenswrapper[4958]: I1201 12:30:00.364720 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c-secret-volume\") pod \"collect-profiles-29409870-9kjgw\" (UID: \"c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw" Dec 01 12:30:00 crc kubenswrapper[4958]: I1201 12:30:00.364946 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcz7p\" (UniqueName: \"kubernetes.io/projected/c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c-kube-api-access-kcz7p\") pod \"collect-profiles-29409870-9kjgw\" (UID: \"c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw" Dec 01 12:30:00 crc kubenswrapper[4958]: I1201 12:30:00.365775 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c-config-volume\") pod \"collect-profiles-29409870-9kjgw\" (UID: \"c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw" Dec 01 12:30:00 crc kubenswrapper[4958]: I1201 12:30:00.376621 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c-secret-volume\") pod \"collect-profiles-29409870-9kjgw\" (UID: \"c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw" Dec 01 12:30:00 crc kubenswrapper[4958]: I1201 12:30:00.386165 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcz7p\" (UniqueName: \"kubernetes.io/projected/c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c-kube-api-access-kcz7p\") pod \"collect-profiles-29409870-9kjgw\" (UID: \"c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw" Dec 01 12:30:00 crc kubenswrapper[4958]: I1201 12:30:00.540719 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw" Dec 01 12:30:00 crc kubenswrapper[4958]: I1201 12:30:00.590484 4958 generic.go:334] "Generic (PLEG): container finished" podID="c782b85b-c0b5-48a4-a7b3-1b0b95da13a8" containerID="555c3ecb0d34bf431b6906bece3cb1a28b454e00c182f61e3116ca0322930253" exitCode=0 Dec 01 12:30:00 crc kubenswrapper[4958]: I1201 12:30:00.590575 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9lgf" event={"ID":"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8","Type":"ContainerDied","Data":"555c3ecb0d34bf431b6906bece3cb1a28b454e00c182f61e3116ca0322930253"} Dec 01 12:30:00 crc kubenswrapper[4958]: I1201 12:30:00.590622 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9lgf" event={"ID":"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8","Type":"ContainerStarted","Data":"b9a00e563786e48cadd3e0c62669c3488f75dcf720dacc57432811f0071b5768"} Dec 01 12:30:01 crc kubenswrapper[4958]: I1201 12:30:01.083305 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw"] Dec 01 12:30:01 crc kubenswrapper[4958]: I1201 12:30:01.601021 4958 generic.go:334] "Generic (PLEG): container finished" podID="c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c" containerID="1c37e24a5209fdf70437f72c91931708fc6463f4afce0066e8f39e7c716c3de7" exitCode=0 Dec 01 12:30:01 crc kubenswrapper[4958]: I1201 12:30:01.601100 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw" event={"ID":"c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c","Type":"ContainerDied","Data":"1c37e24a5209fdf70437f72c91931708fc6463f4afce0066e8f39e7c716c3de7"} Dec 01 12:30:01 crc kubenswrapper[4958]: I1201 12:30:01.601358 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw" event={"ID":"c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c","Type":"ContainerStarted","Data":"7dcdf35ae2312279679079c104bd9167750765f0afa29253037cf88dcc73cc78"} Dec 01 12:30:02 crc kubenswrapper[4958]: I1201 12:30:02.620769 4958 generic.go:334] "Generic (PLEG): container finished" podID="c782b85b-c0b5-48a4-a7b3-1b0b95da13a8" containerID="924616c90fb6b8833d9383b9c93b97506f639b3d9cebb6a2eff90eaf1d757346" exitCode=0 Dec 01 12:30:02 crc kubenswrapper[4958]: I1201 12:30:02.620888 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9lgf" event={"ID":"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8","Type":"ContainerDied","Data":"924616c90fb6b8833d9383b9c93b97506f639b3d9cebb6a2eff90eaf1d757346"} Dec 01 12:30:03 crc kubenswrapper[4958]: I1201 12:30:03.072296 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw" Dec 01 12:30:03 crc kubenswrapper[4958]: I1201 12:30:03.160082 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c-config-volume\") pod \"c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c\" (UID: \"c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c\") " Dec 01 12:30:03 crc kubenswrapper[4958]: I1201 12:30:03.160331 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcz7p\" (UniqueName: \"kubernetes.io/projected/c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c-kube-api-access-kcz7p\") pod \"c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c\" (UID: \"c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c\") " Dec 01 12:30:03 crc kubenswrapper[4958]: I1201 12:30:03.160441 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c-secret-volume\") pod \"c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c\" (UID: \"c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c\") " Dec 01 12:30:03 crc kubenswrapper[4958]: I1201 12:30:03.161005 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c-config-volume" (OuterVolumeSpecName: "config-volume") pod "c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c" (UID: "c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 12:30:03 crc kubenswrapper[4958]: I1201 12:30:03.166624 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c" (UID: "c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:30:03 crc kubenswrapper[4958]: I1201 12:30:03.170232 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c-kube-api-access-kcz7p" (OuterVolumeSpecName: "kube-api-access-kcz7p") pod "c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c" (UID: "c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c"). InnerVolumeSpecName "kube-api-access-kcz7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:30:03 crc kubenswrapper[4958]: I1201 12:30:03.263879 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcz7p\" (UniqueName: \"kubernetes.io/projected/c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c-kube-api-access-kcz7p\") on node \"crc\" DevicePath \"\"" Dec 01 12:30:03 crc kubenswrapper[4958]: I1201 12:30:03.263920 4958 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 12:30:03 crc kubenswrapper[4958]: I1201 12:30:03.263936 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 12:30:03 crc kubenswrapper[4958]: I1201 12:30:03.638975 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw" event={"ID":"c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c","Type":"ContainerDied","Data":"7dcdf35ae2312279679079c104bd9167750765f0afa29253037cf88dcc73cc78"} Dec 01 12:30:03 crc kubenswrapper[4958]: I1201 12:30:03.639317 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dcdf35ae2312279679079c104bd9167750765f0afa29253037cf88dcc73cc78" Dec 01 12:30:03 crc kubenswrapper[4958]: I1201 12:30:03.639022 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409870-9kjgw" Dec 01 12:30:04 crc kubenswrapper[4958]: I1201 12:30:04.207211 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk"] Dec 01 12:30:04 crc kubenswrapper[4958]: I1201 12:30:04.222332 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409825-v2plk"] Dec 01 12:30:04 crc kubenswrapper[4958]: I1201 12:30:04.653531 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9lgf" event={"ID":"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8","Type":"ContainerStarted","Data":"3156347fc16bb2a503ede770c6fea6548c47cd0c525591a9c47d28ce99fcae0e"} Dec 01 12:30:04 crc kubenswrapper[4958]: I1201 12:30:04.674938 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q9lgf" podStartSLOduration=3.82987693 podStartE2EDuration="6.674916305s" podCreationTimestamp="2025-12-01 12:29:58 +0000 UTC" firstStartedPulling="2025-12-01 12:30:00.609075715 +0000 UTC m=+9048.117864762" lastFinishedPulling="2025-12-01 12:30:03.45411507 +0000 UTC m=+9050.962904137" observedRunningTime="2025-12-01 12:30:04.67296158 +0000 UTC m=+9052.181750637" watchObservedRunningTime="2025-12-01 12:30:04.674916305 +0000 UTC m=+9052.183705342" Dec 01 12:30:05 crc kubenswrapper[4958]: I1201 12:30:05.818407 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27f84411-fc29-413f-bdc8-9da0ec321a21" path="/var/lib/kubelet/pods/27f84411-fc29-413f-bdc8-9da0ec321a21/volumes" Dec 01 12:30:08 crc kubenswrapper[4958]: I1201 12:30:08.964840 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q9lgf" Dec 01 12:30:08 crc kubenswrapper[4958]: I1201 12:30:08.965485 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q9lgf" Dec 01 12:30:09 crc kubenswrapper[4958]: I1201 12:30:09.035536 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q9lgf" Dec 01 12:30:09 crc kubenswrapper[4958]: I1201 12:30:09.841930 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q9lgf" Dec 01 12:30:09 crc kubenswrapper[4958]: I1201 12:30:09.912830 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9lgf"] Dec 01 12:30:11 crc kubenswrapper[4958]: I1201 12:30:11.849996 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q9lgf" podUID="c782b85b-c0b5-48a4-a7b3-1b0b95da13a8" containerName="registry-server" containerID="cri-o://3156347fc16bb2a503ede770c6fea6548c47cd0c525591a9c47d28ce99fcae0e" gracePeriod=2 Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.459717 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q9lgf" Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.562722 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8-catalog-content\") pod \"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8\" (UID: \"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8\") " Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.562880 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq5wt\" (UniqueName: \"kubernetes.io/projected/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8-kube-api-access-qq5wt\") pod \"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8\" (UID: \"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8\") " Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.563205 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8-utilities\") pod \"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8\" (UID: \"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8\") " Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.564600 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8-utilities" (OuterVolumeSpecName: "utilities") pod "c782b85b-c0b5-48a4-a7b3-1b0b95da13a8" (UID: "c782b85b-c0b5-48a4-a7b3-1b0b95da13a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.570066 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8-kube-api-access-qq5wt" (OuterVolumeSpecName: "kube-api-access-qq5wt") pod "c782b85b-c0b5-48a4-a7b3-1b0b95da13a8" (UID: "c782b85b-c0b5-48a4-a7b3-1b0b95da13a8"). InnerVolumeSpecName "kube-api-access-qq5wt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.604774 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c782b85b-c0b5-48a4-a7b3-1b0b95da13a8" (UID: "c782b85b-c0b5-48a4-a7b3-1b0b95da13a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.665885 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.665927 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.665942 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq5wt\" (UniqueName: \"kubernetes.io/projected/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8-kube-api-access-qq5wt\") on node \"crc\" DevicePath \"\"" Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.875202 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q9lgf" Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.875231 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9lgf" event={"ID":"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8","Type":"ContainerDied","Data":"3156347fc16bb2a503ede770c6fea6548c47cd0c525591a9c47d28ce99fcae0e"} Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.875333 4958 scope.go:117] "RemoveContainer" containerID="3156347fc16bb2a503ede770c6fea6548c47cd0c525591a9c47d28ce99fcae0e" Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.875093 4958 generic.go:334] "Generic (PLEG): container finished" podID="c782b85b-c0b5-48a4-a7b3-1b0b95da13a8" containerID="3156347fc16bb2a503ede770c6fea6548c47cd0c525591a9c47d28ce99fcae0e" exitCode=0 Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.875533 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q9lgf" event={"ID":"c782b85b-c0b5-48a4-a7b3-1b0b95da13a8","Type":"ContainerDied","Data":"b9a00e563786e48cadd3e0c62669c3488f75dcf720dacc57432811f0071b5768"} Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.929998 4958 scope.go:117] "RemoveContainer" containerID="924616c90fb6b8833d9383b9c93b97506f639b3d9cebb6a2eff90eaf1d757346" Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.944440 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9lgf"] Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.966137 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q9lgf"] Dec 01 12:30:12 crc kubenswrapper[4958]: I1201 12:30:12.970629 4958 scope.go:117] "RemoveContainer" containerID="555c3ecb0d34bf431b6906bece3cb1a28b454e00c182f61e3116ca0322930253" Dec 01 12:30:13 crc kubenswrapper[4958]: I1201 12:30:13.029341 4958 scope.go:117] "RemoveContainer" containerID="3156347fc16bb2a503ede770c6fea6548c47cd0c525591a9c47d28ce99fcae0e" Dec 01 12:30:13 crc kubenswrapper[4958]: E1201 12:30:13.030061 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3156347fc16bb2a503ede770c6fea6548c47cd0c525591a9c47d28ce99fcae0e\": container with ID starting with 3156347fc16bb2a503ede770c6fea6548c47cd0c525591a9c47d28ce99fcae0e not found: ID does not exist" containerID="3156347fc16bb2a503ede770c6fea6548c47cd0c525591a9c47d28ce99fcae0e" Dec 01 12:30:13 crc kubenswrapper[4958]: I1201 12:30:13.030113 4958 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3156347fc16bb2a503ede770c6fea6548c47cd0c525591a9c47d28ce99fcae0e"} err="failed to get container status \"3156347fc16bb2a503ede770c6fea6548c47cd0c525591a9c47d28ce99fcae0e\": rpc error: code = NotFound desc = could not find container \"3156347fc16bb2a503ede770c6fea6548c47cd0c525591a9c47d28ce99fcae0e\": container with ID starting with 3156347fc16bb2a503ede770c6fea6548c47cd0c525591a9c47d28ce99fcae0e not found: ID does not exist" Dec 01 12:30:13 crc kubenswrapper[4958]: I1201 12:30:13.030141 4958 scope.go:117] "RemoveContainer" containerID="924616c90fb6b8833d9383b9c93b97506f639b3d9cebb6a2eff90eaf1d757346" Dec 01 12:30:13 crc kubenswrapper[4958]: E1201 12:30:13.030611 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"924616c90fb6b8833d9383b9c93b97506f639b3d9cebb6a2eff90eaf1d757346\": container with ID starting with 924616c90fb6b8833d9383b9c93b97506f639b3d9cebb6a2eff90eaf1d757346 not found: ID does not exist" containerID="924616c90fb6b8833d9383b9c93b97506f639b3d9cebb6a2eff90eaf1d757346" Dec 01 12:30:13 crc kubenswrapper[4958]: I1201 12:30:13.030685 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"924616c90fb6b8833d9383b9c93b97506f639b3d9cebb6a2eff90eaf1d757346"} err="failed to get container status \"924616c90fb6b8833d9383b9c93b97506f639b3d9cebb6a2eff90eaf1d757346\": rpc error: code = NotFound desc = could not find container \"924616c90fb6b8833d9383b9c93b97506f639b3d9cebb6a2eff90eaf1d757346\": container with ID starting with 924616c90fb6b8833d9383b9c93b97506f639b3d9cebb6a2eff90eaf1d757346 not found: ID does not exist" Dec 01 12:30:13 crc kubenswrapper[4958]: I1201 12:30:13.030754 4958 scope.go:117] "RemoveContainer" containerID="555c3ecb0d34bf431b6906bece3cb1a28b454e00c182f61e3116ca0322930253" Dec 01 12:30:13 crc kubenswrapper[4958]: E1201 12:30:13.031273 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"555c3ecb0d34bf431b6906bece3cb1a28b454e00c182f61e3116ca0322930253\": container with ID starting with 555c3ecb0d34bf431b6906bece3cb1a28b454e00c182f61e3116ca0322930253 not found: ID does not exist" containerID="555c3ecb0d34bf431b6906bece3cb1a28b454e00c182f61e3116ca0322930253" Dec 01 12:30:13 crc kubenswrapper[4958]: I1201 12:30:13.031452 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"555c3ecb0d34bf431b6906bece3cb1a28b454e00c182f61e3116ca0322930253"} err="failed to get container status \"555c3ecb0d34bf431b6906bece3cb1a28b454e00c182f61e3116ca0322930253\": rpc error: code = NotFound desc = could not find container \"555c3ecb0d34bf431b6906bece3cb1a28b454e00c182f61e3116ca0322930253\": container with ID starting with 555c3ecb0d34bf431b6906bece3cb1a28b454e00c182f61e3116ca0322930253 not found: ID does not exist" Dec 01 12:30:13 crc kubenswrapper[4958]: I1201 12:30:13.814040 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c782b85b-c0b5-48a4-a7b3-1b0b95da13a8" path="/var/lib/kubelet/pods/c782b85b-c0b5-48a4-a7b3-1b0b95da13a8/volumes" Dec 01 12:30:32 crc kubenswrapper[4958]: I1201 12:30:32.009484 4958 scope.go:117] "RemoveContainer" containerID="70438da6be65a8197c4540898600a3cf237b161ab7fc8c041dc9adfd5ba154b7" Dec 01 12:30:58 crc kubenswrapper[4958]: I1201 12:30:58.210719 4958 patch_prober.go:28] interesting 
Dec 01 12:30:32 crc kubenswrapper[4958]: I1201 12:30:32.009484 4958 scope.go:117] "RemoveContainer" containerID="70438da6be65a8197c4540898600a3cf237b161ab7fc8c041dc9adfd5ba154b7" Dec 01 12:30:58 crc kubenswrapper[4958]: I1201 12:30:58.210719 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:30:58 crc kubenswrapper[4958]: I1201 12:30:58.211553 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:31:28 crc kubenswrapper[4958]: I1201 12:31:28.212080 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:31:28 crc kubenswrapper[4958]: I1201 12:31:28.212928 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:31:58 crc kubenswrapper[4958]: I1201 12:31:58.210986 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:31:58 crc kubenswrapper[4958]: I1201 12:31:58.211691 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:31:58 crc kubenswrapper[4958]: I1201 12:31:58.211760 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 12:31:58 crc kubenswrapper[4958]: I1201 12:31:58.212971 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 12:31:58 crc kubenswrapper[4958]: I1201 12:31:58.213051 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" gracePeriod=600 Dec 01 12:31:58 crc kubenswrapper[4958]: E1201 12:31:58.344443 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\""
pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:31:58 crc kubenswrapper[4958]: I1201 12:31:58.351071 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" exitCode=0 Dec 01 12:31:58 crc kubenswrapper[4958]: I1201 12:31:58.351151 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6"} Dec 01 12:31:58 crc kubenswrapper[4958]: I1201 12:31:58.351223 4958 scope.go:117] "RemoveContainer" containerID="7e636a16346e4120743b3f3b9c8aa4fcd6ad26002a15f35baf0c51f704182b0b" Dec 01 12:31:59 crc kubenswrapper[4958]: I1201 12:31:59.363660 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:31:59 crc kubenswrapper[4958]: E1201 12:31:59.364260 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:32:13 crc kubenswrapper[4958]: I1201 12:32:13.814677 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:32:13 crc kubenswrapper[4958]: E1201 12:32:13.816007 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:32:27 crc kubenswrapper[4958]: I1201 12:32:27.798259 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:32:27 crc kubenswrapper[4958]: E1201 12:32:27.799564 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:32:40 crc kubenswrapper[4958]: I1201 12:32:40.798035 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:32:40 crc kubenswrapper[4958]: E1201 12:32:40.798921 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" 
podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:32:52 crc kubenswrapper[4958]: I1201 12:32:52.798635 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:32:52 crc kubenswrapper[4958]: E1201 12:32:52.800458 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:32:57 crc kubenswrapper[4958]: I1201 12:32:57.190948 4958 generic.go:334] "Generic (PLEG): container finished" podID="bb66c499-dbc3-406f-984b-9c0a6d8c94a5" containerID="1ab54e7b8953e6edd88432982239711ca6690dea278204a9918a3da5682824b8" exitCode=0 Dec 01 12:32:57 crc kubenswrapper[4958]: I1201 12:32:57.191055 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" event={"ID":"bb66c499-dbc3-406f-984b-9c0a6d8c94a5","Type":"ContainerDied","Data":"1ab54e7b8953e6edd88432982239711ca6690dea278204a9918a3da5682824b8"} Dec 01 12:32:58 crc kubenswrapper[4958]: I1201 12:32:58.816598 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:32:58 crc kubenswrapper[4958]: I1201 12:32:58.911734 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65js2\" (UniqueName: \"kubernetes.io/projected/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-kube-api-access-65js2\") pod \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " Dec 01 12:32:58 crc kubenswrapper[4958]: I1201 12:32:58.912266 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-telemetry-combined-ca-bundle\") pod \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " Dec 01 12:32:58 crc kubenswrapper[4958]: I1201 12:32:58.912332 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceilometer-compute-config-data-1\") pod \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " Dec 01 12:32:58 crc kubenswrapper[4958]: I1201 12:32:58.912437 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ssh-key\") pod \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " Dec 01 12:32:58 crc kubenswrapper[4958]: I1201 12:32:58.912553 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceph\") pod \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " Dec 01 12:32:58 crc kubenswrapper[4958]: I1201 12:32:58.912598 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceilometer-compute-config-data-2\") pod \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " Dec 01 12:32:58 crc kubenswrapper[4958]: I1201 12:32:58.912685 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-inventory\") pod \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " Dec 01 12:32:58 crc kubenswrapper[4958]: I1201 12:32:58.912776 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceilometer-compute-config-data-0\") pod \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\" (UID: \"bb66c499-dbc3-406f-984b-9c0a6d8c94a5\") " Dec 01 12:32:58 crc kubenswrapper[4958]: I1201 12:32:58.919433 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceph" (OuterVolumeSpecName: "ceph") pod "bb66c499-dbc3-406f-984b-9c0a6d8c94a5" (UID: "bb66c499-dbc3-406f-984b-9c0a6d8c94a5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:32:58 crc kubenswrapper[4958]: I1201 12:32:58.920911 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "bb66c499-dbc3-406f-984b-9c0a6d8c94a5" (UID: "bb66c499-dbc3-406f-984b-9c0a6d8c94a5"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:32:58 crc kubenswrapper[4958]: I1201 12:32:58.935017 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-kube-api-access-65js2" (OuterVolumeSpecName: "kube-api-access-65js2") pod "bb66c499-dbc3-406f-984b-9c0a6d8c94a5" (UID: "bb66c499-dbc3-406f-984b-9c0a6d8c94a5"). InnerVolumeSpecName "kube-api-access-65js2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:32:58 crc kubenswrapper[4958]: I1201 12:32:58.952325 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bb66c499-dbc3-406f-984b-9c0a6d8c94a5" (UID: "bb66c499-dbc3-406f-984b-9c0a6d8c94a5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:32:58 crc kubenswrapper[4958]: I1201 12:32:58.963222 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "bb66c499-dbc3-406f-984b-9c0a6d8c94a5" (UID: "bb66c499-dbc3-406f-984b-9c0a6d8c94a5"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:32:58 crc kubenswrapper[4958]: I1201 12:32:58.963310 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "bb66c499-dbc3-406f-984b-9c0a6d8c94a5" (UID: "bb66c499-dbc3-406f-984b-9c0a6d8c94a5"). 
InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:32:58 crc kubenswrapper[4958]: I1201 12:32:58.964029 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "bb66c499-dbc3-406f-984b-9c0a6d8c94a5" (UID: "bb66c499-dbc3-406f-984b-9c0a6d8c94a5"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:32:58 crc kubenswrapper[4958]: I1201 12:32:58.978670 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-inventory" (OuterVolumeSpecName: "inventory") pod "bb66c499-dbc3-406f-984b-9c0a6d8c94a5" (UID: "bb66c499-dbc3-406f-984b-9c0a6d8c94a5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.015806 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65js2\" (UniqueName: \"kubernetes.io/projected/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-kube-api-access-65js2\") on node \"crc\" DevicePath \"\"" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.015853 4958 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.015865 4958 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.015875 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.015884 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.015893 4958 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.015902 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.015911 4958 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bb66c499-dbc3-406f-984b-9c0a6d8c94a5-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.222051 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" 
event={"ID":"bb66c499-dbc3-406f-984b-9c0a6d8c94a5","Type":"ContainerDied","Data":"d2e516a42b573a3e097c0935297898d6cf16fc24ba0482f624b53240127d136e"} Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.222114 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2e516a42b573a3e097c0935297898d6cf16fc24ba0482f624b53240127d136e" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.222201 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9zz6g" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.341720 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-d49jb"] Dec 01 12:32:59 crc kubenswrapper[4958]: E1201 12:32:59.342957 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c782b85b-c0b5-48a4-a7b3-1b0b95da13a8" containerName="extract-content" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.342978 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c782b85b-c0b5-48a4-a7b3-1b0b95da13a8" containerName="extract-content" Dec 01 12:32:59 crc kubenswrapper[4958]: E1201 12:32:59.343019 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c" containerName="collect-profiles" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.343029 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c" containerName="collect-profiles" Dec 01 12:32:59 crc kubenswrapper[4958]: E1201 12:32:59.343043 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb66c499-dbc3-406f-984b-9c0a6d8c94a5" containerName="telemetry-openstack-openstack-cell1" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.343050 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb66c499-dbc3-406f-984b-9c0a6d8c94a5" containerName="telemetry-openstack-openstack-cell1" Dec 01 12:32:59 crc kubenswrapper[4958]: E1201 12:32:59.343059 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c782b85b-c0b5-48a4-a7b3-1b0b95da13a8" containerName="registry-server" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.343064 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c782b85b-c0b5-48a4-a7b3-1b0b95da13a8" containerName="registry-server" Dec 01 12:32:59 crc kubenswrapper[4958]: E1201 12:32:59.343075 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c782b85b-c0b5-48a4-a7b3-1b0b95da13a8" containerName="extract-utilities" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.343081 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c782b85b-c0b5-48a4-a7b3-1b0b95da13a8" containerName="extract-utilities" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.343260 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9dd781a-6988-4fa5-9d4d-ee8d90cdb59c" containerName="collect-profiles" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.343293 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c782b85b-c0b5-48a4-a7b3-1b0b95da13a8" containerName="registry-server" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.343318 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb66c499-dbc3-406f-984b-9c0a6d8c94a5" containerName="telemetry-openstack-openstack-cell1" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.344098 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.346982 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.347353 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.347554 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.347719 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.347968 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.364213 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-d49jb"] Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.424335 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58m56\" (UniqueName: \"kubernetes.io/projected/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-kube-api-access-58m56\") pod \"neutron-sriov-openstack-openstack-cell1-d49jb\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.424393 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-d49jb\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.424413 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-d49jb\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.424789 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-d49jb\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.425015 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-d49jb\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.425270 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-d49jb\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.527166 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58m56\" (UniqueName: \"kubernetes.io/projected/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-kube-api-access-58m56\") pod \"neutron-sriov-openstack-openstack-cell1-d49jb\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.527235 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-d49jb\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.527265 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-d49jb\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.527385 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-d49jb\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.527455 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-d49jb\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.527521 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-d49jb\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.536679 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-d49jb\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.536722 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-d49jb\" (UID: 
\"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.538729 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-d49jb\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.538939 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-d49jb\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.539447 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-d49jb\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.645060 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58m56\" (UniqueName: \"kubernetes.io/projected/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-kube-api-access-58m56\") pod \"neutron-sriov-openstack-openstack-cell1-d49jb\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:32:59 crc kubenswrapper[4958]: I1201 12:32:59.673366 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:33:00 crc kubenswrapper[4958]: I1201 12:33:00.311180 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-d49jb"] Dec 01 12:33:01 crc kubenswrapper[4958]: I1201 12:33:01.245093 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" event={"ID":"a688b48f-67dc-4fce-a92f-f7bf69d1ad29","Type":"ContainerStarted","Data":"e25e99b8990978c8ec5e6bf72b9c1f37c78b81e4e41475584c4cf9b37360ec62"} Dec 01 12:33:01 crc kubenswrapper[4958]: I1201 12:33:01.245735 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" event={"ID":"a688b48f-67dc-4fce-a92f-f7bf69d1ad29","Type":"ContainerStarted","Data":"01dc80fc7088dcbb072cffebdebf0c442bf1a4d6f99e91b339e165cc34e4da55"} Dec 01 12:33:01 crc kubenswrapper[4958]: I1201 12:33:01.278106 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" podStartSLOduration=2.080806747 podStartE2EDuration="2.278083888s" podCreationTimestamp="2025-12-01 12:32:59 +0000 UTC" firstStartedPulling="2025-12-01 12:33:00.317511847 +0000 UTC m=+9227.826300884" lastFinishedPulling="2025-12-01 12:33:00.514788978 +0000 UTC m=+9228.023578025" observedRunningTime="2025-12-01 12:33:01.267700266 +0000 UTC m=+9228.776489323" watchObservedRunningTime="2025-12-01 12:33:01.278083888 +0000 UTC m=+9228.786872925" Dec 01 12:33:07 crc kubenswrapper[4958]: I1201 12:33:07.798918 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:33:07 crc kubenswrapper[4958]: E1201 12:33:07.800554 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:33:20 crc kubenswrapper[4958]: I1201 12:33:20.797669 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:33:20 crc kubenswrapper[4958]: E1201 12:33:20.798371 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:33:32 crc kubenswrapper[4958]: I1201 12:33:32.798472 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:33:32 crc kubenswrapper[4958]: E1201 12:33:32.799451 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" 
podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:33:43 crc kubenswrapper[4958]: I1201 12:33:43.806222 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:33:43 crc kubenswrapper[4958]: E1201 12:33:43.806918 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:33:56 crc kubenswrapper[4958]: I1201 12:33:56.798039 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:33:56 crc kubenswrapper[4958]: E1201 12:33:56.799246 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:34:07 crc kubenswrapper[4958]: I1201 12:34:07.617982 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lcsxv"] Dec 01 12:34:07 crc kubenswrapper[4958]: I1201 12:34:07.625630 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lcsxv" Dec 01 12:34:07 crc kubenswrapper[4958]: I1201 12:34:07.646242 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lcsxv"] Dec 01 12:34:07 crc kubenswrapper[4958]: I1201 12:34:07.777605 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd34755b-2dcb-4ca5-a680-0fffbf319417-utilities\") pod \"community-operators-lcsxv\" (UID: \"cd34755b-2dcb-4ca5-a680-0fffbf319417\") " pod="openshift-marketplace/community-operators-lcsxv" Dec 01 12:34:07 crc kubenswrapper[4958]: I1201 12:34:07.777700 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd34755b-2dcb-4ca5-a680-0fffbf319417-catalog-content\") pod \"community-operators-lcsxv\" (UID: \"cd34755b-2dcb-4ca5-a680-0fffbf319417\") " pod="openshift-marketplace/community-operators-lcsxv" Dec 01 12:34:07 crc kubenswrapper[4958]: I1201 12:34:07.777730 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88cm2\" (UniqueName: \"kubernetes.io/projected/cd34755b-2dcb-4ca5-a680-0fffbf319417-kube-api-access-88cm2\") pod \"community-operators-lcsxv\" (UID: \"cd34755b-2dcb-4ca5-a680-0fffbf319417\") " pod="openshift-marketplace/community-operators-lcsxv" Dec 01 12:34:07 crc kubenswrapper[4958]: I1201 12:34:07.879718 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd34755b-2dcb-4ca5-a680-0fffbf319417-utilities\") pod \"community-operators-lcsxv\" (UID: \"cd34755b-2dcb-4ca5-a680-0fffbf319417\") " 
pod="openshift-marketplace/community-operators-lcsxv" Dec 01 12:34:07 crc kubenswrapper[4958]: I1201 12:34:07.879808 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd34755b-2dcb-4ca5-a680-0fffbf319417-catalog-content\") pod \"community-operators-lcsxv\" (UID: \"cd34755b-2dcb-4ca5-a680-0fffbf319417\") " pod="openshift-marketplace/community-operators-lcsxv" Dec 01 12:34:07 crc kubenswrapper[4958]: I1201 12:34:07.880199 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88cm2\" (UniqueName: \"kubernetes.io/projected/cd34755b-2dcb-4ca5-a680-0fffbf319417-kube-api-access-88cm2\") pod \"community-operators-lcsxv\" (UID: \"cd34755b-2dcb-4ca5-a680-0fffbf319417\") " pod="openshift-marketplace/community-operators-lcsxv" Dec 01 12:34:07 crc kubenswrapper[4958]: I1201 12:34:07.880425 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd34755b-2dcb-4ca5-a680-0fffbf319417-catalog-content\") pod \"community-operators-lcsxv\" (UID: \"cd34755b-2dcb-4ca5-a680-0fffbf319417\") " pod="openshift-marketplace/community-operators-lcsxv" Dec 01 12:34:07 crc kubenswrapper[4958]: I1201 12:34:07.880662 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd34755b-2dcb-4ca5-a680-0fffbf319417-utilities\") pod \"community-operators-lcsxv\" (UID: \"cd34755b-2dcb-4ca5-a680-0fffbf319417\") " pod="openshift-marketplace/community-operators-lcsxv" Dec 01 12:34:07 crc kubenswrapper[4958]: I1201 12:34:07.903582 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88cm2\" (UniqueName: \"kubernetes.io/projected/cd34755b-2dcb-4ca5-a680-0fffbf319417-kube-api-access-88cm2\") pod \"community-operators-lcsxv\" (UID: \"cd34755b-2dcb-4ca5-a680-0fffbf319417\") " pod="openshift-marketplace/community-operators-lcsxv" Dec 01 12:34:07 crc kubenswrapper[4958]: I1201 12:34:07.967664 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lcsxv" Dec 01 12:34:08 crc kubenswrapper[4958]: I1201 12:34:08.479494 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lcsxv"] Dec 01 12:34:08 crc kubenswrapper[4958]: W1201 12:34:08.489761 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd34755b_2dcb_4ca5_a680_0fffbf319417.slice/crio-35ebb39af87bf2dd48ef73181d143752d4fe1f1e6edddf2ed296ddd500a46882 WatchSource:0}: Error finding container 35ebb39af87bf2dd48ef73181d143752d4fe1f1e6edddf2ed296ddd500a46882: Status 404 returned error can't find the container with id 35ebb39af87bf2dd48ef73181d143752d4fe1f1e6edddf2ed296ddd500a46882 Dec 01 12:34:08 crc kubenswrapper[4958]: I1201 12:34:08.799180 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:34:08 crc kubenswrapper[4958]: E1201 12:34:08.800116 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:34:09 crc kubenswrapper[4958]: I1201 12:34:09.422266 4958 generic.go:334] "Generic (PLEG): container finished" podID="cd34755b-2dcb-4ca5-a680-0fffbf319417" containerID="8f10b73ad7cd0fbd69ffdd213379ff74ff01d7ba2e3206af567ff90846a99895" exitCode=0 Dec 01 12:34:09 crc kubenswrapper[4958]: I1201 12:34:09.422366 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcsxv" event={"ID":"cd34755b-2dcb-4ca5-a680-0fffbf319417","Type":"ContainerDied","Data":"8f10b73ad7cd0fbd69ffdd213379ff74ff01d7ba2e3206af567ff90846a99895"} Dec 01 12:34:09 crc kubenswrapper[4958]: I1201 12:34:09.422716 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcsxv" event={"ID":"cd34755b-2dcb-4ca5-a680-0fffbf319417","Type":"ContainerStarted","Data":"35ebb39af87bf2dd48ef73181d143752d4fe1f1e6edddf2ed296ddd500a46882"} Dec 01 12:34:09 crc kubenswrapper[4958]: I1201 12:34:09.426378 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 12:34:15 crc kubenswrapper[4958]: I1201 12:34:15.513923 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcsxv" event={"ID":"cd34755b-2dcb-4ca5-a680-0fffbf319417","Type":"ContainerStarted","Data":"dbcca4fada9779fd78777e6f2f6bd49792dcbe60673029e8e28e6edb7ebe8d2a"} Dec 01 12:34:16 crc kubenswrapper[4958]: I1201 12:34:16.535118 4958 generic.go:334] "Generic (PLEG): container finished" podID="cd34755b-2dcb-4ca5-a680-0fffbf319417" containerID="dbcca4fada9779fd78777e6f2f6bd49792dcbe60673029e8e28e6edb7ebe8d2a" exitCode=0 Dec 01 12:34:16 crc kubenswrapper[4958]: I1201 12:34:16.535533 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lcsxv" event={"ID":"cd34755b-2dcb-4ca5-a680-0fffbf319417","Type":"ContainerDied","Data":"dbcca4fada9779fd78777e6f2f6bd49792dcbe60673029e8e28e6edb7ebe8d2a"} Dec 01 12:34:17 crc kubenswrapper[4958]: I1201 12:34:17.562583 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-lcsxv" event={"ID":"cd34755b-2dcb-4ca5-a680-0fffbf319417","Type":"ContainerStarted","Data":"2a0b0dee48dbfb2cdb82c68a5e8b9c35d8156d6d0aad9ae3aa0a1cc30fa79dad"} Dec 01 12:34:17 crc kubenswrapper[4958]: I1201 12:34:17.968183 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lcsxv" Dec 01 12:34:17 crc kubenswrapper[4958]: I1201 12:34:17.968268 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lcsxv" Dec 01 12:34:19 crc kubenswrapper[4958]: I1201 12:34:19.049162 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-lcsxv" podUID="cd34755b-2dcb-4ca5-a680-0fffbf319417" containerName="registry-server" probeResult="failure" output=< Dec 01 12:34:19 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s Dec 01 12:34:19 crc kubenswrapper[4958]: > Dec 01 12:34:22 crc kubenswrapper[4958]: I1201 12:34:22.798306 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:34:22 crc kubenswrapper[4958]: E1201 12:34:22.799384 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:34:28 crc kubenswrapper[4958]: I1201 12:34:28.057998 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lcsxv" Dec 01 12:34:28 crc kubenswrapper[4958]: I1201 12:34:28.097736 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lcsxv" podStartSLOduration=13.460475402 podStartE2EDuration="21.097679471s" podCreationTimestamp="2025-12-01 12:34:07 +0000 UTC" firstStartedPulling="2025-12-01 12:34:09.425948521 +0000 UTC m=+9296.934737598" lastFinishedPulling="2025-12-01 12:34:17.06315259 +0000 UTC m=+9304.571941667" observedRunningTime="2025-12-01 12:34:17.60829443 +0000 UTC m=+9305.117083477" watchObservedRunningTime="2025-12-01 12:34:28.097679471 +0000 UTC m=+9315.606468528" Dec 01 12:34:28 crc kubenswrapper[4958]: I1201 12:34:28.121920 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lcsxv" Dec 01 12:34:28 crc kubenswrapper[4958]: I1201 12:34:28.203527 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lcsxv"] Dec 01 12:34:28 crc kubenswrapper[4958]: I1201 12:34:28.311598 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z2r66"] Dec 01 12:34:28 crc kubenswrapper[4958]: I1201 12:34:28.312753 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-z2r66" podUID="44224579-1880-4630-8609-f8fb6ab8cb92" containerName="registry-server" containerID="cri-o://c076ffd3f5b7cb4fe0b74c0ef17a568c6d1d1f1579ae1d6975d5daaf31c1924b" gracePeriod=2 Dec 01 12:34:28 crc kubenswrapper[4958]: I1201 12:34:28.725159 4958 generic.go:334] "Generic (PLEG): container finished" 
podID="44224579-1880-4630-8609-f8fb6ab8cb92" containerID="c076ffd3f5b7cb4fe0b74c0ef17a568c6d1d1f1579ae1d6975d5daaf31c1924b" exitCode=0 Dec 01 12:34:28 crc kubenswrapper[4958]: I1201 12:34:28.725218 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2r66" event={"ID":"44224579-1880-4630-8609-f8fb6ab8cb92","Type":"ContainerDied","Data":"c076ffd3f5b7cb4fe0b74c0ef17a568c6d1d1f1579ae1d6975d5daaf31c1924b"} Dec 01 12:34:28 crc kubenswrapper[4958]: I1201 12:34:28.885197 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z2r66" Dec 01 12:34:28 crc kubenswrapper[4958]: I1201 12:34:28.976264 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44224579-1880-4630-8609-f8fb6ab8cb92-catalog-content\") pod \"44224579-1880-4630-8609-f8fb6ab8cb92\" (UID: \"44224579-1880-4630-8609-f8fb6ab8cb92\") " Dec 01 12:34:28 crc kubenswrapper[4958]: I1201 12:34:28.976392 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44224579-1880-4630-8609-f8fb6ab8cb92-utilities\") pod \"44224579-1880-4630-8609-f8fb6ab8cb92\" (UID: \"44224579-1880-4630-8609-f8fb6ab8cb92\") " Dec 01 12:34:28 crc kubenswrapper[4958]: I1201 12:34:28.981881 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4f77\" (UniqueName: \"kubernetes.io/projected/44224579-1880-4630-8609-f8fb6ab8cb92-kube-api-access-x4f77\") pod \"44224579-1880-4630-8609-f8fb6ab8cb92\" (UID: \"44224579-1880-4630-8609-f8fb6ab8cb92\") " Dec 01 12:34:28 crc kubenswrapper[4958]: I1201 12:34:28.982302 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44224579-1880-4630-8609-f8fb6ab8cb92-utilities" (OuterVolumeSpecName: "utilities") pod "44224579-1880-4630-8609-f8fb6ab8cb92" (UID: "44224579-1880-4630-8609-f8fb6ab8cb92"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:34:28 crc kubenswrapper[4958]: I1201 12:34:28.983157 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44224579-1880-4630-8609-f8fb6ab8cb92-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:34:28 crc kubenswrapper[4958]: I1201 12:34:28.992704 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44224579-1880-4630-8609-f8fb6ab8cb92-kube-api-access-x4f77" (OuterVolumeSpecName: "kube-api-access-x4f77") pod "44224579-1880-4630-8609-f8fb6ab8cb92" (UID: "44224579-1880-4630-8609-f8fb6ab8cb92"). InnerVolumeSpecName "kube-api-access-x4f77". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:34:29 crc kubenswrapper[4958]: I1201 12:34:29.033746 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44224579-1880-4630-8609-f8fb6ab8cb92-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44224579-1880-4630-8609-f8fb6ab8cb92" (UID: "44224579-1880-4630-8609-f8fb6ab8cb92"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:34:29 crc kubenswrapper[4958]: I1201 12:34:29.085306 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44224579-1880-4630-8609-f8fb6ab8cb92-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:34:29 crc kubenswrapper[4958]: I1201 12:34:29.085350 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4f77\" (UniqueName: \"kubernetes.io/projected/44224579-1880-4630-8609-f8fb6ab8cb92-kube-api-access-x4f77\") on node \"crc\" DevicePath \"\"" Dec 01 12:34:29 crc kubenswrapper[4958]: I1201 12:34:29.739599 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z2r66" Dec 01 12:34:29 crc kubenswrapper[4958]: I1201 12:34:29.739606 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z2r66" event={"ID":"44224579-1880-4630-8609-f8fb6ab8cb92","Type":"ContainerDied","Data":"f245fbf9419b1cc58b8eb84a3565046b29691a82216733adafe08f0b95abbad8"} Dec 01 12:34:29 crc kubenswrapper[4958]: I1201 12:34:29.739728 4958 scope.go:117] "RemoveContainer" containerID="c076ffd3f5b7cb4fe0b74c0ef17a568c6d1d1f1579ae1d6975d5daaf31c1924b" Dec 01 12:34:29 crc kubenswrapper[4958]: I1201 12:34:29.786678 4958 scope.go:117] "RemoveContainer" containerID="567553f8a6a139dc32820fdece06ddebb736e5085742a1421bc951d7bedbc4d3" Dec 01 12:34:29 crc kubenswrapper[4958]: I1201 12:34:29.795749 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z2r66"] Dec 01 12:34:29 crc kubenswrapper[4958]: I1201 12:34:29.825168 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-z2r66"] Dec 01 12:34:29 crc kubenswrapper[4958]: I1201 12:34:29.833733 4958 scope.go:117] "RemoveContainer" containerID="4987a0ddd8f90e91559513a2212dd3a2cba6ed86cd94952665f698e7050140f5" Dec 01 12:34:31 crc kubenswrapper[4958]: I1201 12:34:31.813306 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44224579-1880-4630-8609-f8fb6ab8cb92" path="/var/lib/kubelet/pods/44224579-1880-4630-8609-f8fb6ab8cb92/volumes" Dec 01 12:34:37 crc kubenswrapper[4958]: I1201 12:34:37.798970 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:34:37 crc kubenswrapper[4958]: E1201 12:34:37.800261 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:34:52 crc kubenswrapper[4958]: I1201 12:34:52.797982 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:34:52 crc kubenswrapper[4958]: E1201 12:34:52.799133 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:35:05 crc kubenswrapper[4958]: I1201 12:35:05.797876 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:35:05 crc kubenswrapper[4958]: E1201 12:35:05.798678 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:35:16 crc kubenswrapper[4958]: I1201 12:35:16.798223 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:35:16 crc kubenswrapper[4958]: E1201 12:35:16.799002 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:35:31 crc kubenswrapper[4958]: I1201 12:35:31.798604 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:35:31 crc kubenswrapper[4958]: E1201 12:35:31.799502 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:35:45 crc kubenswrapper[4958]: I1201 12:35:45.798279 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:35:45 crc kubenswrapper[4958]: E1201 12:35:45.798978 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:35:48 crc kubenswrapper[4958]: I1201 12:35:48.081872 4958 trace.go:236] Trace[1173944610]: "Calculate volume metrics of ovndbcluster-nb-etc-ovn for pod openstack/ovsdbserver-nb-2" (01-Dec-2025 12:35:47.003) (total time: 1078ms): Dec 01 12:35:48 crc kubenswrapper[4958]: Trace[1173944610]: [1.078634664s] [1.078634664s] END Dec 01 12:35:56 crc kubenswrapper[4958]: I1201 12:35:56.797932 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:35:56 crc kubenswrapper[4958]: E1201 12:35:56.799095 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:36:08 crc kubenswrapper[4958]: I1201 12:36:08.798291 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:36:08 crc kubenswrapper[4958]: E1201 12:36:08.799342 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:36:21 crc kubenswrapper[4958]: I1201 12:36:21.797944 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:36:21 crc kubenswrapper[4958]: E1201 12:36:21.798942 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:36:35 crc kubenswrapper[4958]: I1201 12:36:35.804125 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:36:35 crc kubenswrapper[4958]: E1201 12:36:35.805070 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:36:40 crc kubenswrapper[4958]: I1201 12:36:40.709125 4958 generic.go:334] "Generic (PLEG): container finished" podID="a688b48f-67dc-4fce-a92f-f7bf69d1ad29" containerID="e25e99b8990978c8ec5e6bf72b9c1f37c78b81e4e41475584c4cf9b37360ec62" exitCode=0 Dec 01 12:36:40 crc kubenswrapper[4958]: I1201 12:36:40.709215 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" event={"ID":"a688b48f-67dc-4fce-a92f-f7bf69d1ad29","Type":"ContainerDied","Data":"e25e99b8990978c8ec5e6bf72b9c1f37c78b81e4e41475584c4cf9b37360ec62"} Dec 01 12:36:42 crc kubenswrapper[4958]: I1201 12:36:42.866980 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.045626 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-neutron-sriov-agent-neutron-config-0\") pod \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.045868 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-inventory\") pod \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.045929 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58m56\" (UniqueName: \"kubernetes.io/projected/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-kube-api-access-58m56\") pod \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.045981 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-neutron-sriov-combined-ca-bundle\") pod \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.046065 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-ceph\") pod \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.046217 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-ssh-key\") pod \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\" (UID: \"a688b48f-67dc-4fce-a92f-f7bf69d1ad29\") " Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.055293 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-kube-api-access-58m56" (OuterVolumeSpecName: "kube-api-access-58m56") pod "a688b48f-67dc-4fce-a92f-f7bf69d1ad29" (UID: "a688b48f-67dc-4fce-a92f-f7bf69d1ad29"). InnerVolumeSpecName "kube-api-access-58m56". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.056306 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "a688b48f-67dc-4fce-a92f-f7bf69d1ad29" (UID: "a688b48f-67dc-4fce-a92f-f7bf69d1ad29"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.057142 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-ceph" (OuterVolumeSpecName: "ceph") pod "a688b48f-67dc-4fce-a92f-f7bf69d1ad29" (UID: "a688b48f-67dc-4fce-a92f-f7bf69d1ad29"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.108513 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-inventory" (OuterVolumeSpecName: "inventory") pod "a688b48f-67dc-4fce-a92f-f7bf69d1ad29" (UID: "a688b48f-67dc-4fce-a92f-f7bf69d1ad29"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.152047 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.152097 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58m56\" (UniqueName: \"kubernetes.io/projected/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-kube-api-access-58m56\") on node \"crc\" DevicePath \"\"" Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.152113 4958 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.152132 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.198120 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "a688b48f-67dc-4fce-a92f-f7bf69d1ad29" (UID: "a688b48f-67dc-4fce-a92f-f7bf69d1ad29"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.198246 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a688b48f-67dc-4fce-a92f-f7bf69d1ad29" (UID: "a688b48f-67dc-4fce-a92f-f7bf69d1ad29"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.254666 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.254693 4958 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a688b48f-67dc-4fce-a92f-f7bf69d1ad29-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.742361 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" event={"ID":"a688b48f-67dc-4fce-a92f-f7bf69d1ad29","Type":"ContainerDied","Data":"01dc80fc7088dcbb072cffebdebf0c442bf1a4d6f99e91b339e165cc34e4da55"} Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.742423 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01dc80fc7088dcbb072cffebdebf0c442bf1a4d6f99e91b339e165cc34e4da55" Dec 01 12:36:43 crc kubenswrapper[4958]: I1201 12:36:43.742424 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-d49jb" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.279836 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-d79c5"] Dec 01 12:36:44 crc kubenswrapper[4958]: E1201 12:36:44.280804 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44224579-1880-4630-8609-f8fb6ab8cb92" containerName="extract-content" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.280830 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="44224579-1880-4630-8609-f8fb6ab8cb92" containerName="extract-content" Dec 01 12:36:44 crc kubenswrapper[4958]: E1201 12:36:44.280845 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a688b48f-67dc-4fce-a92f-f7bf69d1ad29" containerName="neutron-sriov-openstack-openstack-cell1" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.280889 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a688b48f-67dc-4fce-a92f-f7bf69d1ad29" containerName="neutron-sriov-openstack-openstack-cell1" Dec 01 12:36:44 crc kubenswrapper[4958]: E1201 12:36:44.280920 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44224579-1880-4630-8609-f8fb6ab8cb92" containerName="extract-utilities" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.280928 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="44224579-1880-4630-8609-f8fb6ab8cb92" containerName="extract-utilities" Dec 01 12:36:44 crc kubenswrapper[4958]: E1201 12:36:44.280966 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44224579-1880-4630-8609-f8fb6ab8cb92" containerName="registry-server" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.280976 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="44224579-1880-4630-8609-f8fb6ab8cb92" containerName="registry-server" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.281259 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="44224579-1880-4630-8609-f8fb6ab8cb92" containerName="registry-server" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.281292 4958 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a688b48f-67dc-4fce-a92f-f7bf69d1ad29" containerName="neutron-sriov-openstack-openstack-cell1" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.282383 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.286324 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.286528 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.286671 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.287252 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.288396 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.309835 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-d79c5"] Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.434216 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-d79c5\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.434286 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-d79c5\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.434732 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-d79c5\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.434833 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-d79c5\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.434898 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5fpt\" (UniqueName: \"kubernetes.io/projected/979164fd-9f74-4647-b801-5134de28d7f4-kube-api-access-r5fpt\") pod \"neutron-dhcp-openstack-openstack-cell1-d79c5\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.434971 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-d79c5\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.537682 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-d79c5\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.537750 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-d79c5\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.537776 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-d79c5\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.537826 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-d79c5\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.537876 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5fpt\" (UniqueName: \"kubernetes.io/projected/979164fd-9f74-4647-b801-5134de28d7f4-kube-api-access-r5fpt\") pod \"neutron-dhcp-openstack-openstack-cell1-d79c5\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.537918 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-d79c5\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.965442 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-d79c5\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.969471 4958 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-d79c5\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.969576 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-d79c5\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.972538 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-d79c5\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.976397 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-d79c5\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:44 crc kubenswrapper[4958]: I1201 12:36:44.989995 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5fpt\" (UniqueName: \"kubernetes.io/projected/979164fd-9f74-4647-b801-5134de28d7f4-kube-api-access-r5fpt\") pod \"neutron-dhcp-openstack-openstack-cell1-d79c5\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:45 crc kubenswrapper[4958]: I1201 12:36:45.217482 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:36:45 crc kubenswrapper[4958]: I1201 12:36:45.811914 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-d79c5"] Dec 01 12:36:46 crc kubenswrapper[4958]: I1201 12:36:46.777888 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" event={"ID":"979164fd-9f74-4647-b801-5134de28d7f4","Type":"ContainerStarted","Data":"c7558c1d3093435d18e40523a75220d93136a52ac40aace29fa021ab2a2b7aba"} Dec 01 12:36:46 crc kubenswrapper[4958]: I1201 12:36:46.778418 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" event={"ID":"979164fd-9f74-4647-b801-5134de28d7f4","Type":"ContainerStarted","Data":"0d5759c573e08bd87a8a195f820869037bf3a1b4e1759cb335aedb0ff7d347a6"} Dec 01 12:36:46 crc kubenswrapper[4958]: I1201 12:36:46.806772 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" podStartSLOduration=2.627727806 podStartE2EDuration="2.806751574s" podCreationTimestamp="2025-12-01 12:36:44 +0000 UTC" firstStartedPulling="2025-12-01 12:36:45.834832653 +0000 UTC m=+9453.343621690" lastFinishedPulling="2025-12-01 12:36:46.013856421 +0000 UTC m=+9453.522645458" observedRunningTime="2025-12-01 12:36:46.797171295 +0000 UTC m=+9454.305960352" watchObservedRunningTime="2025-12-01 12:36:46.806751574 +0000 UTC m=+9454.315540611" Dec 01 12:36:48 crc kubenswrapper[4958]: I1201 12:36:48.797598 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:36:48 crc kubenswrapper[4958]: E1201 12:36:48.798197 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:36:59 crc kubenswrapper[4958]: I1201 12:36:59.797936 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:37:00 crc kubenswrapper[4958]: I1201 12:37:00.965039 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"5de11fd4226fa13b8a6c00a4614611530deaaa072533bcdc378d32d5b9707f26"} Dec 01 12:39:20 crc kubenswrapper[4958]: I1201 12:39:20.645553 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m86bw"] Dec 01 12:39:20 crc kubenswrapper[4958]: I1201 12:39:20.654800 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m86bw" Dec 01 12:39:20 crc kubenswrapper[4958]: I1201 12:39:20.658821 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m86bw"] Dec 01 12:39:20 crc kubenswrapper[4958]: I1201 12:39:20.687913 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f27c21af-d68d-407b-bcc1-cbee1b78a319-utilities\") pod \"certified-operators-m86bw\" (UID: \"f27c21af-d68d-407b-bcc1-cbee1b78a319\") " pod="openshift-marketplace/certified-operators-m86bw" Dec 01 12:39:20 crc kubenswrapper[4958]: I1201 12:39:20.689046 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f27c21af-d68d-407b-bcc1-cbee1b78a319-catalog-content\") pod \"certified-operators-m86bw\" (UID: \"f27c21af-d68d-407b-bcc1-cbee1b78a319\") " pod="openshift-marketplace/certified-operators-m86bw" Dec 01 12:39:20 crc kubenswrapper[4958]: I1201 12:39:20.689078 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpccz\" (UniqueName: \"kubernetes.io/projected/f27c21af-d68d-407b-bcc1-cbee1b78a319-kube-api-access-tpccz\") pod \"certified-operators-m86bw\" (UID: \"f27c21af-d68d-407b-bcc1-cbee1b78a319\") " pod="openshift-marketplace/certified-operators-m86bw" Dec 01 12:39:20 crc kubenswrapper[4958]: I1201 12:39:20.791375 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f27c21af-d68d-407b-bcc1-cbee1b78a319-utilities\") pod \"certified-operators-m86bw\" (UID: \"f27c21af-d68d-407b-bcc1-cbee1b78a319\") " pod="openshift-marketplace/certified-operators-m86bw" Dec 01 12:39:20 crc kubenswrapper[4958]: I1201 12:39:20.791610 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpccz\" (UniqueName: \"kubernetes.io/projected/f27c21af-d68d-407b-bcc1-cbee1b78a319-kube-api-access-tpccz\") pod \"certified-operators-m86bw\" (UID: \"f27c21af-d68d-407b-bcc1-cbee1b78a319\") " pod="openshift-marketplace/certified-operators-m86bw" Dec 01 12:39:20 crc kubenswrapper[4958]: I1201 12:39:20.791635 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f27c21af-d68d-407b-bcc1-cbee1b78a319-catalog-content\") pod \"certified-operators-m86bw\" (UID: \"f27c21af-d68d-407b-bcc1-cbee1b78a319\") " pod="openshift-marketplace/certified-operators-m86bw" Dec 01 12:39:20 crc kubenswrapper[4958]: I1201 12:39:20.792401 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f27c21af-d68d-407b-bcc1-cbee1b78a319-utilities\") pod \"certified-operators-m86bw\" (UID: \"f27c21af-d68d-407b-bcc1-cbee1b78a319\") " pod="openshift-marketplace/certified-operators-m86bw" Dec 01 12:39:20 crc kubenswrapper[4958]: I1201 12:39:20.792492 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f27c21af-d68d-407b-bcc1-cbee1b78a319-catalog-content\") pod \"certified-operators-m86bw\" (UID: \"f27c21af-d68d-407b-bcc1-cbee1b78a319\") " pod="openshift-marketplace/certified-operators-m86bw" Dec 01 12:39:20 crc kubenswrapper[4958]: I1201 12:39:20.812377 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tpccz\" (UniqueName: \"kubernetes.io/projected/f27c21af-d68d-407b-bcc1-cbee1b78a319-kube-api-access-tpccz\") pod \"certified-operators-m86bw\" (UID: \"f27c21af-d68d-407b-bcc1-cbee1b78a319\") " pod="openshift-marketplace/certified-operators-m86bw" Dec 01 12:39:20 crc kubenswrapper[4958]: I1201 12:39:20.991295 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m86bw" Dec 01 12:39:21 crc kubenswrapper[4958]: I1201 12:39:21.586384 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m86bw"] Dec 01 12:39:22 crc kubenswrapper[4958]: I1201 12:39:22.189785 4958 generic.go:334] "Generic (PLEG): container finished" podID="f27c21af-d68d-407b-bcc1-cbee1b78a319" containerID="3d394c6f4f3a38fd942e4cfe86e85c4ec4121bbfa4bff0f0a647e3de5eb1557f" exitCode=0 Dec 01 12:39:22 crc kubenswrapper[4958]: I1201 12:39:22.189837 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m86bw" event={"ID":"f27c21af-d68d-407b-bcc1-cbee1b78a319","Type":"ContainerDied","Data":"3d394c6f4f3a38fd942e4cfe86e85c4ec4121bbfa4bff0f0a647e3de5eb1557f"} Dec 01 12:39:22 crc kubenswrapper[4958]: I1201 12:39:22.190226 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m86bw" event={"ID":"f27c21af-d68d-407b-bcc1-cbee1b78a319","Type":"ContainerStarted","Data":"dfb72528ea2adb0d974a868a902f5cdb04f5079a32b1deb9eda9f7da51c2fd74"} Dec 01 12:39:22 crc kubenswrapper[4958]: I1201 12:39:22.194434 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 12:39:23 crc kubenswrapper[4958]: I1201 12:39:23.621883 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mxrzr"] Dec 01 12:39:23 crc kubenswrapper[4958]: I1201 12:39:23.627711 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mxrzr" Dec 01 12:39:23 crc kubenswrapper[4958]: I1201 12:39:23.646054 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mxrzr"] Dec 01 12:39:23 crc kubenswrapper[4958]: I1201 12:39:23.660455 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87258260-e29f-4b82-bb9a-aef60e371e18-utilities\") pod \"redhat-operators-mxrzr\" (UID: \"87258260-e29f-4b82-bb9a-aef60e371e18\") " pod="openshift-marketplace/redhat-operators-mxrzr" Dec 01 12:39:23 crc kubenswrapper[4958]: I1201 12:39:23.660653 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7qqn\" (UniqueName: \"kubernetes.io/projected/87258260-e29f-4b82-bb9a-aef60e371e18-kube-api-access-j7qqn\") pod \"redhat-operators-mxrzr\" (UID: \"87258260-e29f-4b82-bb9a-aef60e371e18\") " pod="openshift-marketplace/redhat-operators-mxrzr" Dec 01 12:39:23 crc kubenswrapper[4958]: I1201 12:39:23.660710 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87258260-e29f-4b82-bb9a-aef60e371e18-catalog-content\") pod \"redhat-operators-mxrzr\" (UID: \"87258260-e29f-4b82-bb9a-aef60e371e18\") " pod="openshift-marketplace/redhat-operators-mxrzr" Dec 01 12:39:23 crc kubenswrapper[4958]: I1201 12:39:23.765933 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7qqn\" (UniqueName: \"kubernetes.io/projected/87258260-e29f-4b82-bb9a-aef60e371e18-kube-api-access-j7qqn\") pod \"redhat-operators-mxrzr\" (UID: \"87258260-e29f-4b82-bb9a-aef60e371e18\") " pod="openshift-marketplace/redhat-operators-mxrzr" Dec 01 12:39:23 crc kubenswrapper[4958]: I1201 12:39:23.766093 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87258260-e29f-4b82-bb9a-aef60e371e18-catalog-content\") pod \"redhat-operators-mxrzr\" (UID: \"87258260-e29f-4b82-bb9a-aef60e371e18\") " pod="openshift-marketplace/redhat-operators-mxrzr" Dec 01 12:39:23 crc kubenswrapper[4958]: I1201 12:39:23.766388 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87258260-e29f-4b82-bb9a-aef60e371e18-utilities\") pod \"redhat-operators-mxrzr\" (UID: \"87258260-e29f-4b82-bb9a-aef60e371e18\") " pod="openshift-marketplace/redhat-operators-mxrzr" Dec 01 12:39:23 crc kubenswrapper[4958]: I1201 12:39:23.767264 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87258260-e29f-4b82-bb9a-aef60e371e18-utilities\") pod \"redhat-operators-mxrzr\" (UID: \"87258260-e29f-4b82-bb9a-aef60e371e18\") " pod="openshift-marketplace/redhat-operators-mxrzr" Dec 01 12:39:23 crc kubenswrapper[4958]: I1201 12:39:23.768059 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87258260-e29f-4b82-bb9a-aef60e371e18-catalog-content\") pod \"redhat-operators-mxrzr\" (UID: \"87258260-e29f-4b82-bb9a-aef60e371e18\") " pod="openshift-marketplace/redhat-operators-mxrzr" Dec 01 12:39:23 crc kubenswrapper[4958]: I1201 12:39:23.807874 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-j7qqn\" (UniqueName: \"kubernetes.io/projected/87258260-e29f-4b82-bb9a-aef60e371e18-kube-api-access-j7qqn\") pod \"redhat-operators-mxrzr\" (UID: \"87258260-e29f-4b82-bb9a-aef60e371e18\") " pod="openshift-marketplace/redhat-operators-mxrzr" Dec 01 12:39:23 crc kubenswrapper[4958]: I1201 12:39:23.991005 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mxrzr" Dec 01 12:39:24 crc kubenswrapper[4958]: I1201 12:39:24.215103 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m86bw" event={"ID":"f27c21af-d68d-407b-bcc1-cbee1b78a319","Type":"ContainerStarted","Data":"ac816bb6e662a725eee25283680c35c38b056d65f8b1ab6ec4a86bd06e2a3397"} Dec 01 12:39:25 crc kubenswrapper[4958]: I1201 12:39:25.254622 4958 generic.go:334] "Generic (PLEG): container finished" podID="f27c21af-d68d-407b-bcc1-cbee1b78a319" containerID="ac816bb6e662a725eee25283680c35c38b056d65f8b1ab6ec4a86bd06e2a3397" exitCode=0 Dec 01 12:39:25 crc kubenswrapper[4958]: I1201 12:39:25.254923 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m86bw" event={"ID":"f27c21af-d68d-407b-bcc1-cbee1b78a319","Type":"ContainerDied","Data":"ac816bb6e662a725eee25283680c35c38b056d65f8b1ab6ec4a86bd06e2a3397"} Dec 01 12:39:25 crc kubenswrapper[4958]: I1201 12:39:25.268317 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mxrzr"] Dec 01 12:39:25 crc kubenswrapper[4958]: W1201 12:39:25.279568 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87258260_e29f_4b82_bb9a_aef60e371e18.slice/crio-b0f71b3f8421c59dfdb29f11775b070daf54f52859ffddaa1dfc9f0af7c4c374 WatchSource:0}: Error finding container b0f71b3f8421c59dfdb29f11775b070daf54f52859ffddaa1dfc9f0af7c4c374: Status 404 returned error can't find the container with id b0f71b3f8421c59dfdb29f11775b070daf54f52859ffddaa1dfc9f0af7c4c374 Dec 01 12:39:26 crc kubenswrapper[4958]: I1201 12:39:26.272112 4958 generic.go:334] "Generic (PLEG): container finished" podID="87258260-e29f-4b82-bb9a-aef60e371e18" containerID="e5d149c432ee9a5b9db386367f0cfb345e0679ecd0e50c63e2083de0b275d60f" exitCode=0 Dec 01 12:39:26 crc kubenswrapper[4958]: I1201 12:39:26.272249 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxrzr" event={"ID":"87258260-e29f-4b82-bb9a-aef60e371e18","Type":"ContainerDied","Data":"e5d149c432ee9a5b9db386367f0cfb345e0679ecd0e50c63e2083de0b275d60f"} Dec 01 12:39:26 crc kubenswrapper[4958]: I1201 12:39:26.272493 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxrzr" event={"ID":"87258260-e29f-4b82-bb9a-aef60e371e18","Type":"ContainerStarted","Data":"b0f71b3f8421c59dfdb29f11775b070daf54f52859ffddaa1dfc9f0af7c4c374"} Dec 01 12:39:27 crc kubenswrapper[4958]: I1201 12:39:27.288314 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxrzr" event={"ID":"87258260-e29f-4b82-bb9a-aef60e371e18","Type":"ContainerStarted","Data":"17fe5a8cf835ad94dd44d11980785a8ddf02cbdee39892094c4aeaef3167b15b"} Dec 01 12:39:27 crc kubenswrapper[4958]: I1201 12:39:27.301050 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m86bw" 
event={"ID":"f27c21af-d68d-407b-bcc1-cbee1b78a319","Type":"ContainerStarted","Data":"da56f9b6385a19d0c7be81b154720bd734c401ecc16b6dcc6df5e091b73ab4aa"} Dec 01 12:39:27 crc kubenswrapper[4958]: I1201 12:39:27.347336 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m86bw" podStartSLOduration=3.258438002 podStartE2EDuration="7.347315439s" podCreationTimestamp="2025-12-01 12:39:20 +0000 UTC" firstStartedPulling="2025-12-01 12:39:22.194230513 +0000 UTC m=+9609.703019550" lastFinishedPulling="2025-12-01 12:39:26.28310795 +0000 UTC m=+9613.791896987" observedRunningTime="2025-12-01 12:39:27.344737416 +0000 UTC m=+9614.853526443" watchObservedRunningTime="2025-12-01 12:39:27.347315439 +0000 UTC m=+9614.856104476" Dec 01 12:39:28 crc kubenswrapper[4958]: I1201 12:39:28.210775 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:39:28 crc kubenswrapper[4958]: I1201 12:39:28.211360 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:39:30 crc kubenswrapper[4958]: I1201 12:39:30.991511 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m86bw" Dec 01 12:39:30 crc kubenswrapper[4958]: I1201 12:39:30.991918 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m86bw" Dec 01 12:39:31 crc kubenswrapper[4958]: I1201 12:39:31.361817 4958 generic.go:334] "Generic (PLEG): container finished" podID="87258260-e29f-4b82-bb9a-aef60e371e18" containerID="17fe5a8cf835ad94dd44d11980785a8ddf02cbdee39892094c4aeaef3167b15b" exitCode=0 Dec 01 12:39:31 crc kubenswrapper[4958]: I1201 12:39:31.361919 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxrzr" event={"ID":"87258260-e29f-4b82-bb9a-aef60e371e18","Type":"ContainerDied","Data":"17fe5a8cf835ad94dd44d11980785a8ddf02cbdee39892094c4aeaef3167b15b"} Dec 01 12:39:32 crc kubenswrapper[4958]: I1201 12:39:32.061445 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-m86bw" podUID="f27c21af-d68d-407b-bcc1-cbee1b78a319" containerName="registry-server" probeResult="failure" output=< Dec 01 12:39:32 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s Dec 01 12:39:32 crc kubenswrapper[4958]: > Dec 01 12:39:32 crc kubenswrapper[4958]: I1201 12:39:32.375424 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxrzr" event={"ID":"87258260-e29f-4b82-bb9a-aef60e371e18","Type":"ContainerStarted","Data":"0e20fec8d4e1c40a948750df8f359fa3dcd765c5d0cb7482f3306e24be69da21"} Dec 01 12:39:32 crc kubenswrapper[4958]: I1201 12:39:32.415869 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mxrzr" podStartSLOduration=3.921166655 podStartE2EDuration="9.415627502s" podCreationTimestamp="2025-12-01 12:39:23 +0000 UTC" 
firstStartedPulling="2025-12-01 12:39:26.282069521 +0000 UTC m=+9613.790858588" lastFinishedPulling="2025-12-01 12:39:31.776530368 +0000 UTC m=+9619.285319435" observedRunningTime="2025-12-01 12:39:32.397015739 +0000 UTC m=+9619.905804796" watchObservedRunningTime="2025-12-01 12:39:32.415627502 +0000 UTC m=+9619.924416539" Dec 01 12:39:33 crc kubenswrapper[4958]: I1201 12:39:33.991813 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mxrzr" Dec 01 12:39:33 crc kubenswrapper[4958]: I1201 12:39:33.992221 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mxrzr" Dec 01 12:39:35 crc kubenswrapper[4958]: I1201 12:39:35.040776 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mxrzr" podUID="87258260-e29f-4b82-bb9a-aef60e371e18" containerName="registry-server" probeResult="failure" output=< Dec 01 12:39:35 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s Dec 01 12:39:35 crc kubenswrapper[4958]: > Dec 01 12:39:41 crc kubenswrapper[4958]: I1201 12:39:41.080264 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m86bw" Dec 01 12:39:41 crc kubenswrapper[4958]: I1201 12:39:41.165168 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m86bw" Dec 01 12:39:42 crc kubenswrapper[4958]: I1201 12:39:42.630916 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m86bw"] Dec 01 12:39:42 crc kubenswrapper[4958]: I1201 12:39:42.631908 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m86bw" podUID="f27c21af-d68d-407b-bcc1-cbee1b78a319" containerName="registry-server" containerID="cri-o://da56f9b6385a19d0c7be81b154720bd734c401ecc16b6dcc6df5e091b73ab4aa" gracePeriod=2 Dec 01 12:39:42 crc kubenswrapper[4958]: E1201 12:39:42.904441 4958 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf27c21af_d68d_407b_bcc1_cbee1b78a319.slice/crio-conmon-da56f9b6385a19d0c7be81b154720bd734c401ecc16b6dcc6df5e091b73ab4aa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf27c21af_d68d_407b_bcc1_cbee1b78a319.slice/crio-da56f9b6385a19d0c7be81b154720bd734c401ecc16b6dcc6df5e091b73ab4aa.scope\": RecentStats: unable to find data in memory cache]" Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.222608 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m86bw" Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.251343 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpccz\" (UniqueName: \"kubernetes.io/projected/f27c21af-d68d-407b-bcc1-cbee1b78a319-kube-api-access-tpccz\") pod \"f27c21af-d68d-407b-bcc1-cbee1b78a319\" (UID: \"f27c21af-d68d-407b-bcc1-cbee1b78a319\") " Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.251402 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f27c21af-d68d-407b-bcc1-cbee1b78a319-utilities\") pod \"f27c21af-d68d-407b-bcc1-cbee1b78a319\" (UID: \"f27c21af-d68d-407b-bcc1-cbee1b78a319\") " Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.251542 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f27c21af-d68d-407b-bcc1-cbee1b78a319-catalog-content\") pod \"f27c21af-d68d-407b-bcc1-cbee1b78a319\" (UID: \"f27c21af-d68d-407b-bcc1-cbee1b78a319\") " Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.254732 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f27c21af-d68d-407b-bcc1-cbee1b78a319-utilities" (OuterVolumeSpecName: "utilities") pod "f27c21af-d68d-407b-bcc1-cbee1b78a319" (UID: "f27c21af-d68d-407b-bcc1-cbee1b78a319"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.268061 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27c21af-d68d-407b-bcc1-cbee1b78a319-kube-api-access-tpccz" (OuterVolumeSpecName: "kube-api-access-tpccz") pod "f27c21af-d68d-407b-bcc1-cbee1b78a319" (UID: "f27c21af-d68d-407b-bcc1-cbee1b78a319"). InnerVolumeSpecName "kube-api-access-tpccz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.324676 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f27c21af-d68d-407b-bcc1-cbee1b78a319-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f27c21af-d68d-407b-bcc1-cbee1b78a319" (UID: "f27c21af-d68d-407b-bcc1-cbee1b78a319"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.353165 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f27c21af-d68d-407b-bcc1-cbee1b78a319-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.353194 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpccz\" (UniqueName: \"kubernetes.io/projected/f27c21af-d68d-407b-bcc1-cbee1b78a319-kube-api-access-tpccz\") on node \"crc\" DevicePath \"\"" Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.353204 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f27c21af-d68d-407b-bcc1-cbee1b78a319-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.535391 4958 generic.go:334] "Generic (PLEG): container finished" podID="f27c21af-d68d-407b-bcc1-cbee1b78a319" containerID="da56f9b6385a19d0c7be81b154720bd734c401ecc16b6dcc6df5e091b73ab4aa" exitCode=0 Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.535457 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m86bw" event={"ID":"f27c21af-d68d-407b-bcc1-cbee1b78a319","Type":"ContainerDied","Data":"da56f9b6385a19d0c7be81b154720bd734c401ecc16b6dcc6df5e091b73ab4aa"} Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.535500 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m86bw" event={"ID":"f27c21af-d68d-407b-bcc1-cbee1b78a319","Type":"ContainerDied","Data":"dfb72528ea2adb0d974a868a902f5cdb04f5079a32b1deb9eda9f7da51c2fd74"} Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.535501 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m86bw" Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.535525 4958 scope.go:117] "RemoveContainer" containerID="da56f9b6385a19d0c7be81b154720bd734c401ecc16b6dcc6df5e091b73ab4aa" Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.586211 4958 scope.go:117] "RemoveContainer" containerID="ac816bb6e662a725eee25283680c35c38b056d65f8b1ab6ec4a86bd06e2a3397" Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.589478 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m86bw"] Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.603117 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m86bw"] Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.615561 4958 scope.go:117] "RemoveContainer" containerID="3d394c6f4f3a38fd942e4cfe86e85c4ec4121bbfa4bff0f0a647e3de5eb1557f" Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.684082 4958 scope.go:117] "RemoveContainer" containerID="da56f9b6385a19d0c7be81b154720bd734c401ecc16b6dcc6df5e091b73ab4aa" Dec 01 12:39:43 crc kubenswrapper[4958]: E1201 12:39:43.685238 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da56f9b6385a19d0c7be81b154720bd734c401ecc16b6dcc6df5e091b73ab4aa\": container with ID starting with da56f9b6385a19d0c7be81b154720bd734c401ecc16b6dcc6df5e091b73ab4aa not found: ID does not exist" containerID="da56f9b6385a19d0c7be81b154720bd734c401ecc16b6dcc6df5e091b73ab4aa" Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.685307 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da56f9b6385a19d0c7be81b154720bd734c401ecc16b6dcc6df5e091b73ab4aa"} err="failed to get container status \"da56f9b6385a19d0c7be81b154720bd734c401ecc16b6dcc6df5e091b73ab4aa\": rpc error: code = NotFound desc = could not find container \"da56f9b6385a19d0c7be81b154720bd734c401ecc16b6dcc6df5e091b73ab4aa\": container with ID starting with da56f9b6385a19d0c7be81b154720bd734c401ecc16b6dcc6df5e091b73ab4aa not found: ID does not exist" Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.685341 4958 scope.go:117] "RemoveContainer" containerID="ac816bb6e662a725eee25283680c35c38b056d65f8b1ab6ec4a86bd06e2a3397" Dec 01 12:39:43 crc kubenswrapper[4958]: E1201 12:39:43.685739 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac816bb6e662a725eee25283680c35c38b056d65f8b1ab6ec4a86bd06e2a3397\": container with ID starting with ac816bb6e662a725eee25283680c35c38b056d65f8b1ab6ec4a86bd06e2a3397 not found: ID does not exist" containerID="ac816bb6e662a725eee25283680c35c38b056d65f8b1ab6ec4a86bd06e2a3397" Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.685771 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac816bb6e662a725eee25283680c35c38b056d65f8b1ab6ec4a86bd06e2a3397"} err="failed to get container status \"ac816bb6e662a725eee25283680c35c38b056d65f8b1ab6ec4a86bd06e2a3397\": rpc error: code = NotFound desc = could not find container \"ac816bb6e662a725eee25283680c35c38b056d65f8b1ab6ec4a86bd06e2a3397\": container with ID starting with ac816bb6e662a725eee25283680c35c38b056d65f8b1ab6ec4a86bd06e2a3397 not found: ID does not exist" Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.685789 4958 scope.go:117] "RemoveContainer" 
containerID="3d394c6f4f3a38fd942e4cfe86e85c4ec4121bbfa4bff0f0a647e3de5eb1557f" Dec 01 12:39:43 crc kubenswrapper[4958]: E1201 12:39:43.686167 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d394c6f4f3a38fd942e4cfe86e85c4ec4121bbfa4bff0f0a647e3de5eb1557f\": container with ID starting with 3d394c6f4f3a38fd942e4cfe86e85c4ec4121bbfa4bff0f0a647e3de5eb1557f not found: ID does not exist" containerID="3d394c6f4f3a38fd942e4cfe86e85c4ec4121bbfa4bff0f0a647e3de5eb1557f" Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.686212 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d394c6f4f3a38fd942e4cfe86e85c4ec4121bbfa4bff0f0a647e3de5eb1557f"} err="failed to get container status \"3d394c6f4f3a38fd942e4cfe86e85c4ec4121bbfa4bff0f0a647e3de5eb1557f\": rpc error: code = NotFound desc = could not find container \"3d394c6f4f3a38fd942e4cfe86e85c4ec4121bbfa4bff0f0a647e3de5eb1557f\": container with ID starting with 3d394c6f4f3a38fd942e4cfe86e85c4ec4121bbfa4bff0f0a647e3de5eb1557f not found: ID does not exist" Dec 01 12:39:43 crc kubenswrapper[4958]: I1201 12:39:43.818165 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f27c21af-d68d-407b-bcc1-cbee1b78a319" path="/var/lib/kubelet/pods/f27c21af-d68d-407b-bcc1-cbee1b78a319/volumes" Dec 01 12:39:44 crc kubenswrapper[4958]: I1201 12:39:44.086898 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mxrzr" Dec 01 12:39:44 crc kubenswrapper[4958]: I1201 12:39:44.187426 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mxrzr" Dec 01 12:39:45 crc kubenswrapper[4958]: I1201 12:39:45.824073 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mxrzr"] Dec 01 12:39:45 crc kubenswrapper[4958]: I1201 12:39:45.824614 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mxrzr" podUID="87258260-e29f-4b82-bb9a-aef60e371e18" containerName="registry-server" containerID="cri-o://0e20fec8d4e1c40a948750df8f359fa3dcd765c5d0cb7482f3306e24be69da21" gracePeriod=2 Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.395905 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mxrzr" Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.447326 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87258260-e29f-4b82-bb9a-aef60e371e18-catalog-content\") pod \"87258260-e29f-4b82-bb9a-aef60e371e18\" (UID: \"87258260-e29f-4b82-bb9a-aef60e371e18\") " Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.447668 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7qqn\" (UniqueName: \"kubernetes.io/projected/87258260-e29f-4b82-bb9a-aef60e371e18-kube-api-access-j7qqn\") pod \"87258260-e29f-4b82-bb9a-aef60e371e18\" (UID: \"87258260-e29f-4b82-bb9a-aef60e371e18\") " Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.448363 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87258260-e29f-4b82-bb9a-aef60e371e18-utilities\") pod \"87258260-e29f-4b82-bb9a-aef60e371e18\" (UID: \"87258260-e29f-4b82-bb9a-aef60e371e18\") " Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.450512 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87258260-e29f-4b82-bb9a-aef60e371e18-utilities" (OuterVolumeSpecName: "utilities") pod "87258260-e29f-4b82-bb9a-aef60e371e18" (UID: "87258260-e29f-4b82-bb9a-aef60e371e18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.461541 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87258260-e29f-4b82-bb9a-aef60e371e18-kube-api-access-j7qqn" (OuterVolumeSpecName: "kube-api-access-j7qqn") pod "87258260-e29f-4b82-bb9a-aef60e371e18" (UID: "87258260-e29f-4b82-bb9a-aef60e371e18"). InnerVolumeSpecName "kube-api-access-j7qqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.552373 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87258260-e29f-4b82-bb9a-aef60e371e18-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.552433 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7qqn\" (UniqueName: \"kubernetes.io/projected/87258260-e29f-4b82-bb9a-aef60e371e18-kube-api-access-j7qqn\") on node \"crc\" DevicePath \"\"" Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.586514 4958 generic.go:334] "Generic (PLEG): container finished" podID="87258260-e29f-4b82-bb9a-aef60e371e18" containerID="0e20fec8d4e1c40a948750df8f359fa3dcd765c5d0cb7482f3306e24be69da21" exitCode=0 Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.586571 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxrzr" event={"ID":"87258260-e29f-4b82-bb9a-aef60e371e18","Type":"ContainerDied","Data":"0e20fec8d4e1c40a948750df8f359fa3dcd765c5d0cb7482f3306e24be69da21"} Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.586612 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxrzr" event={"ID":"87258260-e29f-4b82-bb9a-aef60e371e18","Type":"ContainerDied","Data":"b0f71b3f8421c59dfdb29f11775b070daf54f52859ffddaa1dfc9f0af7c4c374"} Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.586634 4958 scope.go:117] "RemoveContainer" containerID="0e20fec8d4e1c40a948750df8f359fa3dcd765c5d0cb7482f3306e24be69da21" Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.586788 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mxrzr" Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.598004 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87258260-e29f-4b82-bb9a-aef60e371e18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "87258260-e29f-4b82-bb9a-aef60e371e18" (UID: "87258260-e29f-4b82-bb9a-aef60e371e18"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.618800 4958 scope.go:117] "RemoveContainer" containerID="17fe5a8cf835ad94dd44d11980785a8ddf02cbdee39892094c4aeaef3167b15b" Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.649863 4958 scope.go:117] "RemoveContainer" containerID="e5d149c432ee9a5b9db386367f0cfb345e0679ecd0e50c63e2083de0b275d60f" Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.653288 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87258260-e29f-4b82-bb9a-aef60e371e18-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.694963 4958 scope.go:117] "RemoveContainer" containerID="0e20fec8d4e1c40a948750df8f359fa3dcd765c5d0cb7482f3306e24be69da21" Dec 01 12:39:46 crc kubenswrapper[4958]: E1201 12:39:46.695619 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e20fec8d4e1c40a948750df8f359fa3dcd765c5d0cb7482f3306e24be69da21\": container with ID starting with 0e20fec8d4e1c40a948750df8f359fa3dcd765c5d0cb7482f3306e24be69da21 not found: ID does not exist" containerID="0e20fec8d4e1c40a948750df8f359fa3dcd765c5d0cb7482f3306e24be69da21" Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.695675 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e20fec8d4e1c40a948750df8f359fa3dcd765c5d0cb7482f3306e24be69da21"} err="failed to get container status \"0e20fec8d4e1c40a948750df8f359fa3dcd765c5d0cb7482f3306e24be69da21\": rpc error: code = NotFound desc = could not find container \"0e20fec8d4e1c40a948750df8f359fa3dcd765c5d0cb7482f3306e24be69da21\": container with ID starting with 0e20fec8d4e1c40a948750df8f359fa3dcd765c5d0cb7482f3306e24be69da21 not found: ID does not exist" Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.695712 4958 scope.go:117] "RemoveContainer" containerID="17fe5a8cf835ad94dd44d11980785a8ddf02cbdee39892094c4aeaef3167b15b" Dec 01 12:39:46 crc kubenswrapper[4958]: E1201 12:39:46.696368 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17fe5a8cf835ad94dd44d11980785a8ddf02cbdee39892094c4aeaef3167b15b\": container with ID starting with 17fe5a8cf835ad94dd44d11980785a8ddf02cbdee39892094c4aeaef3167b15b not found: ID does not exist" containerID="17fe5a8cf835ad94dd44d11980785a8ddf02cbdee39892094c4aeaef3167b15b" Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.696436 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17fe5a8cf835ad94dd44d11980785a8ddf02cbdee39892094c4aeaef3167b15b"} err="failed to get container status \"17fe5a8cf835ad94dd44d11980785a8ddf02cbdee39892094c4aeaef3167b15b\": rpc error: code = NotFound desc = could not find container \"17fe5a8cf835ad94dd44d11980785a8ddf02cbdee39892094c4aeaef3167b15b\": container with ID starting with 17fe5a8cf835ad94dd44d11980785a8ddf02cbdee39892094c4aeaef3167b15b not found: ID does not exist" Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.696460 4958 scope.go:117] "RemoveContainer" containerID="e5d149c432ee9a5b9db386367f0cfb345e0679ecd0e50c63e2083de0b275d60f" Dec 01 12:39:46 crc kubenswrapper[4958]: E1201 12:39:46.696932 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e5d149c432ee9a5b9db386367f0cfb345e0679ecd0e50c63e2083de0b275d60f\": container with ID starting with e5d149c432ee9a5b9db386367f0cfb345e0679ecd0e50c63e2083de0b275d60f not found: ID does not exist" containerID="e5d149c432ee9a5b9db386367f0cfb345e0679ecd0e50c63e2083de0b275d60f" Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.697305 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5d149c432ee9a5b9db386367f0cfb345e0679ecd0e50c63e2083de0b275d60f"} err="failed to get container status \"e5d149c432ee9a5b9db386367f0cfb345e0679ecd0e50c63e2083de0b275d60f\": rpc error: code = NotFound desc = could not find container \"e5d149c432ee9a5b9db386367f0cfb345e0679ecd0e50c63e2083de0b275d60f\": container with ID starting with e5d149c432ee9a5b9db386367f0cfb345e0679ecd0e50c63e2083de0b275d60f not found: ID does not exist" Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.932508 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mxrzr"] Dec 01 12:39:46 crc kubenswrapper[4958]: I1201 12:39:46.943551 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mxrzr"] Dec 01 12:39:47 crc kubenswrapper[4958]: I1201 12:39:47.825181 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87258260-e29f-4b82-bb9a-aef60e371e18" path="/var/lib/kubelet/pods/87258260-e29f-4b82-bb9a-aef60e371e18/volumes" Dec 01 12:39:58 crc kubenswrapper[4958]: I1201 12:39:58.210281 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:39:58 crc kubenswrapper[4958]: I1201 12:39:58.211088 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:40:24 crc kubenswrapper[4958]: I1201 12:40:24.196056 4958 generic.go:334] "Generic (PLEG): container finished" podID="979164fd-9f74-4647-b801-5134de28d7f4" containerID="c7558c1d3093435d18e40523a75220d93136a52ac40aace29fa021ab2a2b7aba" exitCode=0 Dec 01 12:40:24 crc kubenswrapper[4958]: I1201 12:40:24.196131 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" event={"ID":"979164fd-9f74-4647-b801-5134de28d7f4","Type":"ContainerDied","Data":"c7558c1d3093435d18e40523a75220d93136a52ac40aace29fa021ab2a2b7aba"} Dec 01 12:40:26 crc kubenswrapper[4958]: I1201 12:40:26.823525 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:40:26 crc kubenswrapper[4958]: I1201 12:40:26.914729 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-neutron-dhcp-agent-neutron-config-0\") pod \"979164fd-9f74-4647-b801-5134de28d7f4\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " Dec 01 12:40:26 crc kubenswrapper[4958]: I1201 12:40:26.914811 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-ssh-key\") pod \"979164fd-9f74-4647-b801-5134de28d7f4\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " Dec 01 12:40:26 crc kubenswrapper[4958]: I1201 12:40:26.914952 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-inventory\") pod \"979164fd-9f74-4647-b801-5134de28d7f4\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " Dec 01 12:40:26 crc kubenswrapper[4958]: I1201 12:40:26.914979 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-ceph\") pod \"979164fd-9f74-4647-b801-5134de28d7f4\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " Dec 01 12:40:26 crc kubenswrapper[4958]: I1201 12:40:26.915041 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5fpt\" (UniqueName: \"kubernetes.io/projected/979164fd-9f74-4647-b801-5134de28d7f4-kube-api-access-r5fpt\") pod \"979164fd-9f74-4647-b801-5134de28d7f4\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " Dec 01 12:40:26 crc kubenswrapper[4958]: I1201 12:40:26.915133 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-neutron-dhcp-combined-ca-bundle\") pod \"979164fd-9f74-4647-b801-5134de28d7f4\" (UID: \"979164fd-9f74-4647-b801-5134de28d7f4\") " Dec 01 12:40:26 crc kubenswrapper[4958]: I1201 12:40:26.924110 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-ceph" (OuterVolumeSpecName: "ceph") pod "979164fd-9f74-4647-b801-5134de28d7f4" (UID: "979164fd-9f74-4647-b801-5134de28d7f4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:40:26 crc kubenswrapper[4958]: I1201 12:40:26.924238 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "979164fd-9f74-4647-b801-5134de28d7f4" (UID: "979164fd-9f74-4647-b801-5134de28d7f4"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:40:26 crc kubenswrapper[4958]: I1201 12:40:26.925458 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/979164fd-9f74-4647-b801-5134de28d7f4-kube-api-access-r5fpt" (OuterVolumeSpecName: "kube-api-access-r5fpt") pod "979164fd-9f74-4647-b801-5134de28d7f4" (UID: "979164fd-9f74-4647-b801-5134de28d7f4"). InnerVolumeSpecName "kube-api-access-r5fpt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:40:27 crc kubenswrapper[4958]: I1201 12:40:27.019937 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:27 crc kubenswrapper[4958]: I1201 12:40:27.019984 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5fpt\" (UniqueName: \"kubernetes.io/projected/979164fd-9f74-4647-b801-5134de28d7f4-kube-api-access-r5fpt\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:27 crc kubenswrapper[4958]: I1201 12:40:27.019997 4958 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:27 crc kubenswrapper[4958]: I1201 12:40:27.068276 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "979164fd-9f74-4647-b801-5134de28d7f4" (UID: "979164fd-9f74-4647-b801-5134de28d7f4"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:40:27 crc kubenswrapper[4958]: I1201 12:40:27.068314 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "979164fd-9f74-4647-b801-5134de28d7f4" (UID: "979164fd-9f74-4647-b801-5134de28d7f4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:40:27 crc kubenswrapper[4958]: I1201 12:40:27.069043 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-inventory" (OuterVolumeSpecName: "inventory") pod "979164fd-9f74-4647-b801-5134de28d7f4" (UID: "979164fd-9f74-4647-b801-5134de28d7f4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:40:27 crc kubenswrapper[4958]: I1201 12:40:27.122313 4958 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:27 crc kubenswrapper[4958]: I1201 12:40:27.122365 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:27 crc kubenswrapper[4958]: I1201 12:40:27.122376 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/979164fd-9f74-4647-b801-5134de28d7f4-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:27 crc kubenswrapper[4958]: I1201 12:40:27.247786 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" event={"ID":"979164fd-9f74-4647-b801-5134de28d7f4","Type":"ContainerDied","Data":"0d5759c573e08bd87a8a195f820869037bf3a1b4e1759cb335aedb0ff7d347a6"} Dec 01 12:40:27 crc kubenswrapper[4958]: I1201 12:40:27.247836 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d5759c573e08bd87a8a195f820869037bf3a1b4e1759cb335aedb0ff7d347a6" Dec 01 12:40:27 crc kubenswrapper[4958]: I1201 12:40:27.247881 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-d79c5" Dec 01 12:40:28 crc kubenswrapper[4958]: I1201 12:40:28.211086 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:40:28 crc kubenswrapper[4958]: I1201 12:40:28.211525 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:40:28 crc kubenswrapper[4958]: I1201 12:40:28.211599 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 12:40:28 crc kubenswrapper[4958]: I1201 12:40:28.213028 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5de11fd4226fa13b8a6c00a4614611530deaaa072533bcdc378d32d5b9707f26"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 12:40:28 crc kubenswrapper[4958]: I1201 12:40:28.213182 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://5de11fd4226fa13b8a6c00a4614611530deaaa072533bcdc378d32d5b9707f26" gracePeriod=600 Dec 01 12:40:29 crc kubenswrapper[4958]: I1201 12:40:29.363370 4958 generic.go:334] "Generic (PLEG): container finished" 
podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="5de11fd4226fa13b8a6c00a4614611530deaaa072533bcdc378d32d5b9707f26" exitCode=0 Dec 01 12:40:29 crc kubenswrapper[4958]: I1201 12:40:29.363505 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"5de11fd4226fa13b8a6c00a4614611530deaaa072533bcdc378d32d5b9707f26"} Dec 01 12:40:29 crc kubenswrapper[4958]: I1201 12:40:29.364134 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d"} Dec 01 12:40:29 crc kubenswrapper[4958]: I1201 12:40:29.364164 4958 scope.go:117] "RemoveContainer" containerID="f5f1efa15428fa16d9d25d7e7f2e77c6609690be249cc277fbfbe4242b24e1d6" Dec 01 12:40:37 crc kubenswrapper[4958]: I1201 12:40:37.169795 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 12:40:37 crc kubenswrapper[4958]: I1201 12:40:37.171124 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="fcfdd8c8-b0e9-44ef-92b7-4968f37f1596" containerName="nova-cell0-conductor-conductor" containerID="cri-o://3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc" gracePeriod=30 Dec 01 12:40:37 crc kubenswrapper[4958]: I1201 12:40:37.229697 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 12:40:37 crc kubenswrapper[4958]: I1201 12:40:37.229960 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="01d20bd7-d045-43e1-a954-a7ef8d11b3d7" containerName="nova-cell1-conductor-conductor" containerID="cri-o://45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b" gracePeriod=30 Dec 01 12:40:38 crc kubenswrapper[4958]: E1201 12:40:38.218020 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b is running failed: container process not found" containerID="45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 12:40:38 crc kubenswrapper[4958]: E1201 12:40:38.227985 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b is running failed: container process not found" containerID="45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 12:40:38 crc kubenswrapper[4958]: E1201 12:40:38.229755 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b is running failed: container process not found" containerID="45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 12:40:38 crc kubenswrapper[4958]: E1201 12:40:38.229888 4958 prober.go:104] "Probe errored" err="rpc error: code = 
NotFound desc = container is not created or running: checking if PID of 45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="01d20bd7-d045-43e1-a954-a7ef8d11b3d7" containerName="nova-cell1-conductor-conductor" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.259331 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.259918 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9c1fa21c-37dc-47a2-82f3-b68e89651d04" containerName="nova-api-log" containerID="cri-o://731e0c8b065268e582ff1a8837a47e21d2a077c794197ba25fd8eec684d2123a" gracePeriod=30 Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.260882 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9c1fa21c-37dc-47a2-82f3-b68e89651d04" containerName="nova-api-api" containerID="cri-o://4bf99aa149eda710cb7021082862fddf9f2cf8d9b17be7348e3ff673d78da1c3" gracePeriod=30 Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.349969 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.375338 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.375584 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ac026d41-87e6-44c4-8597-c6fa860ba9e5" containerName="nova-metadata-log" containerID="cri-o://45f06c383e3a65dd81ce4d39ebb9144c078e834b1aa7f964cf0ccc36a9db9cc0" gracePeriod=30 Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.375716 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ac026d41-87e6-44c4-8597-c6fa860ba9e5" containerName="nova-metadata-metadata" containerID="cri-o://efeada4674b238b839f1fde3677f6586324a55faf7a12bd837940b174a83292e" gracePeriod=30 Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.384104 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/nova-metadata-0" podUID="ac026d41-87e6-44c4-8597-c6fa860ba9e5" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": EOF" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.475710 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.486961 4958 generic.go:334] "Generic (PLEG): container finished" podID="9c1fa21c-37dc-47a2-82f3-b68e89651d04" containerID="731e0c8b065268e582ff1a8837a47e21d2a077c794197ba25fd8eec684d2123a" exitCode=143 Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.487045 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9c1fa21c-37dc-47a2-82f3-b68e89651d04","Type":"ContainerDied","Data":"731e0c8b065268e582ff1a8837a47e21d2a077c794197ba25fd8eec684d2123a"} Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.490344 4958 generic.go:334] "Generic (PLEG): container finished" podID="01d20bd7-d045-43e1-a954-a7ef8d11b3d7" containerID="45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b" exitCode=0 Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.490539 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9db9b6d2-f515-4f0e-83b6-d56ee091734f" containerName="nova-scheduler-scheduler" containerID="cri-o://79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed" gracePeriod=30 Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.490715 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"01d20bd7-d045-43e1-a954-a7ef8d11b3d7","Type":"ContainerDied","Data":"45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b"} Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.490780 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"01d20bd7-d045-43e1-a954-a7ef8d11b3d7","Type":"ContainerDied","Data":"a572204080969969f9ea29bbd103e34181aeded0137f718627735ccc51093619"} Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.490802 4958 scope.go:117] "RemoveContainer" containerID="45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.490881 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.505001 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d20bd7-d045-43e1-a954-a7ef8d11b3d7-combined-ca-bundle\") pod \"01d20bd7-d045-43e1-a954-a7ef8d11b3d7\" (UID: \"01d20bd7-d045-43e1-a954-a7ef8d11b3d7\") " Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.505275 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxzc4\" (UniqueName: \"kubernetes.io/projected/01d20bd7-d045-43e1-a954-a7ef8d11b3d7-kube-api-access-lxzc4\") pod \"01d20bd7-d045-43e1-a954-a7ef8d11b3d7\" (UID: \"01d20bd7-d045-43e1-a954-a7ef8d11b3d7\") " Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.505337 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d20bd7-d045-43e1-a954-a7ef8d11b3d7-config-data\") pod \"01d20bd7-d045-43e1-a954-a7ef8d11b3d7\" (UID: \"01d20bd7-d045-43e1-a954-a7ef8d11b3d7\") " Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.519200 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01d20bd7-d045-43e1-a954-a7ef8d11b3d7-kube-api-access-lxzc4" (OuterVolumeSpecName: "kube-api-access-lxzc4") pod "01d20bd7-d045-43e1-a954-a7ef8d11b3d7" (UID: "01d20bd7-d045-43e1-a954-a7ef8d11b3d7"). InnerVolumeSpecName "kube-api-access-lxzc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.545733 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d20bd7-d045-43e1-a954-a7ef8d11b3d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01d20bd7-d045-43e1-a954-a7ef8d11b3d7" (UID: "01d20bd7-d045-43e1-a954-a7ef8d11b3d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.599148 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d20bd7-d045-43e1-a954-a7ef8d11b3d7-config-data" (OuterVolumeSpecName: "config-data") pod "01d20bd7-d045-43e1-a954-a7ef8d11b3d7" (UID: "01d20bd7-d045-43e1-a954-a7ef8d11b3d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.609978 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxzc4\" (UniqueName: \"kubernetes.io/projected/01d20bd7-d045-43e1-a954-a7ef8d11b3d7-kube-api-access-lxzc4\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.610027 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d20bd7-d045-43e1-a954-a7ef8d11b3d7-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.610040 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d20bd7-d045-43e1-a954-a7ef8d11b3d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.631678 4958 scope.go:117] "RemoveContainer" containerID="45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b" Dec 01 12:40:38 crc kubenswrapper[4958]: E1201 12:40:38.632285 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b\": container with ID starting with 45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b not found: ID does not exist" containerID="45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.632324 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b"} err="failed to get container status \"45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b\": rpc error: code = NotFound desc = could not find container \"45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b\": container with ID starting with 45304b4433bfdac4889084d0ebe44f20f6fdeb441e95e866651a456e8262476b not found: ID does not exist" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.840930 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.852667 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.949396 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 12:40:38 crc kubenswrapper[4958]: E1201 12:40:38.950241 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87258260-e29f-4b82-bb9a-aef60e371e18" containerName="extract-utilities" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.950261 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="87258260-e29f-4b82-bb9a-aef60e371e18" containerName="extract-utilities" Dec 01 12:40:38 crc kubenswrapper[4958]: E1201 12:40:38.950309 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27c21af-d68d-407b-bcc1-cbee1b78a319" containerName="registry-server" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.950317 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27c21af-d68d-407b-bcc1-cbee1b78a319" containerName="registry-server" Dec 01 12:40:38 crc kubenswrapper[4958]: E1201 12:40:38.950326 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87258260-e29f-4b82-bb9a-aef60e371e18" 
containerName="extract-content" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.950332 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="87258260-e29f-4b82-bb9a-aef60e371e18" containerName="extract-content" Dec 01 12:40:38 crc kubenswrapper[4958]: E1201 12:40:38.950347 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d20bd7-d045-43e1-a954-a7ef8d11b3d7" containerName="nova-cell1-conductor-conductor" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.950353 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d20bd7-d045-43e1-a954-a7ef8d11b3d7" containerName="nova-cell1-conductor-conductor" Dec 01 12:40:38 crc kubenswrapper[4958]: E1201 12:40:38.950430 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87258260-e29f-4b82-bb9a-aef60e371e18" containerName="registry-server" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.950437 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="87258260-e29f-4b82-bb9a-aef60e371e18" containerName="registry-server" Dec 01 12:40:38 crc kubenswrapper[4958]: E1201 12:40:38.950542 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27c21af-d68d-407b-bcc1-cbee1b78a319" containerName="extract-content" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.950549 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27c21af-d68d-407b-bcc1-cbee1b78a319" containerName="extract-content" Dec 01 12:40:38 crc kubenswrapper[4958]: E1201 12:40:38.950571 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="979164fd-9f74-4647-b801-5134de28d7f4" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.950614 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="979164fd-9f74-4647-b801-5134de28d7f4" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 01 12:40:38 crc kubenswrapper[4958]: E1201 12:40:38.950638 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27c21af-d68d-407b-bcc1-cbee1b78a319" containerName="extract-utilities" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.950644 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27c21af-d68d-407b-bcc1-cbee1b78a319" containerName="extract-utilities" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.951035 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="87258260-e29f-4b82-bb9a-aef60e371e18" containerName="registry-server" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.951052 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="979164fd-9f74-4647-b801-5134de28d7f4" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.951062 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f27c21af-d68d-407b-bcc1-cbee1b78a319" containerName="registry-server" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.951108 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d20bd7-d045-43e1-a954-a7ef8d11b3d7" containerName="nova-cell1-conductor-conductor" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.952326 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.956861 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 12:40:38 crc kubenswrapper[4958]: I1201 12:40:38.957903 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.049326 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv9w7\" (UniqueName: \"kubernetes.io/projected/038077d4-978f-40be-8216-70603ae81f5d-kube-api-access-xv9w7\") pod \"nova-cell1-conductor-0\" (UID: \"038077d4-978f-40be-8216-70603ae81f5d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.049499 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/038077d4-978f-40be-8216-70603ae81f5d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"038077d4-978f-40be-8216-70603ae81f5d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.049656 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/038077d4-978f-40be-8216-70603ae81f5d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"038077d4-978f-40be-8216-70603ae81f5d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 12:40:39 crc kubenswrapper[4958]: E1201 12:40:39.077664 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc is running failed: container process not found" containerID="3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 12:40:39 crc kubenswrapper[4958]: E1201 12:40:39.081330 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc is running failed: container process not found" containerID="3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 12:40:39 crc kubenswrapper[4958]: E1201 12:40:39.083089 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc is running failed: container process not found" containerID="3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 01 12:40:39 crc kubenswrapper[4958]: E1201 12:40:39.083180 4958 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="fcfdd8c8-b0e9-44ef-92b7-4968f37f1596" containerName="nova-cell0-conductor-conductor" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.154164 4958 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xv9w7\" (UniqueName: \"kubernetes.io/projected/038077d4-978f-40be-8216-70603ae81f5d-kube-api-access-xv9w7\") pod \"nova-cell1-conductor-0\" (UID: \"038077d4-978f-40be-8216-70603ae81f5d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.154263 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/038077d4-978f-40be-8216-70603ae81f5d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"038077d4-978f-40be-8216-70603ae81f5d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.154312 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/038077d4-978f-40be-8216-70603ae81f5d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"038077d4-978f-40be-8216-70603ae81f5d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.163029 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/038077d4-978f-40be-8216-70603ae81f5d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"038077d4-978f-40be-8216-70603ae81f5d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.164598 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/038077d4-978f-40be-8216-70603ae81f5d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"038077d4-978f-40be-8216-70603ae81f5d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.174246 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv9w7\" (UniqueName: \"kubernetes.io/projected/038077d4-978f-40be-8216-70603ae81f5d-kube-api-access-xv9w7\") pod \"nova-cell1-conductor-0\" (UID: \"038077d4-978f-40be-8216-70603ae81f5d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.262222 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.320481 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.460151 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-combined-ca-bundle\") pod \"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596\" (UID: \"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596\") " Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.460269 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjk29\" (UniqueName: \"kubernetes.io/projected/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-kube-api-access-pjk29\") pod \"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596\" (UID: \"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596\") " Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.460346 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-config-data\") pod \"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596\" (UID: \"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596\") " Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.465377 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-kube-api-access-pjk29" (OuterVolumeSpecName: "kube-api-access-pjk29") pod "fcfdd8c8-b0e9-44ef-92b7-4968f37f1596" (UID: "fcfdd8c8-b0e9-44ef-92b7-4968f37f1596"). InnerVolumeSpecName "kube-api-access-pjk29". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:40:39 crc kubenswrapper[4958]: E1201 12:40:39.507518 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-combined-ca-bundle podName:fcfdd8c8-b0e9-44ef-92b7-4968f37f1596 nodeName:}" failed. No retries permitted until 2025-12-01 12:40:40.007434768 +0000 UTC m=+9687.516223825 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-combined-ca-bundle") pod "fcfdd8c8-b0e9-44ef-92b7-4968f37f1596" (UID: "fcfdd8c8-b0e9-44ef-92b7-4968f37f1596") : error deleting /var/lib/kubelet/pods/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596/volume-subpaths: remove /var/lib/kubelet/pods/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596/volume-subpaths: no such file or directory Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.512256 4958 generic.go:334] "Generic (PLEG): container finished" podID="fcfdd8c8-b0e9-44ef-92b7-4968f37f1596" containerID="3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc" exitCode=0 Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.512304 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596","Type":"ContainerDied","Data":"3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc"} Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.512328 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596","Type":"ContainerDied","Data":"17e9c98cdbee80cebdac9656339b9544e746246b1fc759748f53184e00f1201c"} Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.512344 4958 scope.go:117] "RemoveContainer" containerID="3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.512450 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.513333 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-config-data" (OuterVolumeSpecName: "config-data") pod "fcfdd8c8-b0e9-44ef-92b7-4968f37f1596" (UID: "fcfdd8c8-b0e9-44ef-92b7-4968f37f1596"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.519053 4958 generic.go:334] "Generic (PLEG): container finished" podID="ac026d41-87e6-44c4-8597-c6fa860ba9e5" containerID="45f06c383e3a65dd81ce4d39ebb9144c078e834b1aa7f964cf0ccc36a9db9cc0" exitCode=143 Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.519095 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac026d41-87e6-44c4-8597-c6fa860ba9e5","Type":"ContainerDied","Data":"45f06c383e3a65dd81ce4d39ebb9144c078e834b1aa7f964cf0ccc36a9db9cc0"} Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.549637 4958 scope.go:117] "RemoveContainer" containerID="3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc" Dec 01 12:40:39 crc kubenswrapper[4958]: E1201 12:40:39.550074 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc\": container with ID starting with 3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc not found: ID does not exist" containerID="3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.550115 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc"} err="failed to get container status \"3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc\": rpc error: code = NotFound desc = could not find container \"3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc\": container with ID starting with 3a020ded2dc6df015c0d068a12a90b7a3839ded0b9cb50c7ce64150f48b6a8fc not found: ID does not exist" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.565383 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.566061 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjk29\" (UniqueName: \"kubernetes.io/projected/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-kube-api-access-pjk29\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.819874 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01d20bd7-d045-43e1-a954-a7ef8d11b3d7" path="/var/lib/kubelet/pods/01d20bd7-d045-43e1-a954-a7ef8d11b3d7/volumes" Dec 01 12:40:39 crc kubenswrapper[4958]: I1201 12:40:39.937281 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.076391 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-combined-ca-bundle\") pod \"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596\" (UID: \"fcfdd8c8-b0e9-44ef-92b7-4968f37f1596\") " Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.080834 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcfdd8c8-b0e9-44ef-92b7-4968f37f1596" (UID: "fcfdd8c8-b0e9-44ef-92b7-4968f37f1596"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.182873 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.190227 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.222704 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.232650 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 12:40:40 crc kubenswrapper[4958]: E1201 12:40:40.233278 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcfdd8c8-b0e9-44ef-92b7-4968f37f1596" containerName="nova-cell0-conductor-conductor" Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.233298 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcfdd8c8-b0e9-44ef-92b7-4968f37f1596" containerName="nova-cell0-conductor-conductor" Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.233556 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcfdd8c8-b0e9-44ef-92b7-4968f37f1596" containerName="nova-cell0-conductor-conductor" Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.234553 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.239516 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.242750 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.285082 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb1633c-4acc-4286-8bdb-465623aea592-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"acb1633c-4acc-4286-8bdb-465623aea592\") " pod="openstack/nova-cell0-conductor-0" Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.285251 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb1633c-4acc-4286-8bdb-465623aea592-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"acb1633c-4acc-4286-8bdb-465623aea592\") " pod="openstack/nova-cell0-conductor-0" Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.285331 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fntls\" (UniqueName: \"kubernetes.io/projected/acb1633c-4acc-4286-8bdb-465623aea592-kube-api-access-fntls\") pod \"nova-cell0-conductor-0\" (UID: \"acb1633c-4acc-4286-8bdb-465623aea592\") " pod="openstack/nova-cell0-conductor-0" Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.387293 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb1633c-4acc-4286-8bdb-465623aea592-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"acb1633c-4acc-4286-8bdb-465623aea592\") " pod="openstack/nova-cell0-conductor-0" Dec 01 12:40:40 crc 
kubenswrapper[4958]: I1201 12:40:40.387430 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb1633c-4acc-4286-8bdb-465623aea592-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"acb1633c-4acc-4286-8bdb-465623aea592\") " pod="openstack/nova-cell0-conductor-0" Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.387496 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fntls\" (UniqueName: \"kubernetes.io/projected/acb1633c-4acc-4286-8bdb-465623aea592-kube-api-access-fntls\") pod \"nova-cell0-conductor-0\" (UID: \"acb1633c-4acc-4286-8bdb-465623aea592\") " pod="openstack/nova-cell0-conductor-0" Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.393273 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb1633c-4acc-4286-8bdb-465623aea592-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"acb1633c-4acc-4286-8bdb-465623aea592\") " pod="openstack/nova-cell0-conductor-0" Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.401336 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb1633c-4acc-4286-8bdb-465623aea592-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"acb1633c-4acc-4286-8bdb-465623aea592\") " pod="openstack/nova-cell0-conductor-0" Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.412615 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fntls\" (UniqueName: \"kubernetes.io/projected/acb1633c-4acc-4286-8bdb-465623aea592-kube-api-access-fntls\") pod \"nova-cell0-conductor-0\" (UID: \"acb1633c-4acc-4286-8bdb-465623aea592\") " pod="openstack/nova-cell0-conductor-0" Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.529994 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"038077d4-978f-40be-8216-70603ae81f5d","Type":"ContainerStarted","Data":"955d2b0c0e751ae4dc612b11400fffd4b6f5901c474abca1095045cb0fc88b3f"} Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.530050 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"038077d4-978f-40be-8216-70603ae81f5d","Type":"ContainerStarted","Data":"72fb66d196c715df80375f887d3bc1d59c49d628467be431447425585913f895"} Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.530130 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.552387 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.552350212 podStartE2EDuration="2.552350212s" podCreationTimestamp="2025-12-01 12:40:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 12:40:40.545088548 +0000 UTC m=+9688.053877595" watchObservedRunningTime="2025-12-01 12:40:40.552350212 +0000 UTC m=+9688.061139259" Dec 01 12:40:40 crc kubenswrapper[4958]: I1201 12:40:40.563697 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 12:40:41 crc kubenswrapper[4958]: I1201 12:40:41.078676 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 12:40:41 crc kubenswrapper[4958]: I1201 12:40:41.842583 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcfdd8c8-b0e9-44ef-92b7-4968f37f1596" path="/var/lib/kubelet/pods/fcfdd8c8-b0e9-44ef-92b7-4968f37f1596/volumes" Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.170546 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.244285 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac026d41-87e6-44c4-8597-c6fa860ba9e5-combined-ca-bundle\") pod \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\" (UID: \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\") " Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.244344 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac026d41-87e6-44c4-8597-c6fa860ba9e5-config-data\") pod \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\" (UID: \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\") " Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.244415 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac026d41-87e6-44c4-8597-c6fa860ba9e5-logs\") pod \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\" (UID: \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\") " Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.245557 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac026d41-87e6-44c4-8597-c6fa860ba9e5-logs" (OuterVolumeSpecName: "logs") pod "ac026d41-87e6-44c4-8597-c6fa860ba9e5" (UID: "ac026d41-87e6-44c4-8597-c6fa860ba9e5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.245629 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmdmk\" (UniqueName: \"kubernetes.io/projected/ac026d41-87e6-44c4-8597-c6fa860ba9e5-kube-api-access-jmdmk\") pod \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\" (UID: \"ac026d41-87e6-44c4-8597-c6fa860ba9e5\") " Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.266080 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac026d41-87e6-44c4-8597-c6fa860ba9e5-kube-api-access-jmdmk" (OuterVolumeSpecName: "kube-api-access-jmdmk") pod "ac026d41-87e6-44c4-8597-c6fa860ba9e5" (UID: "ac026d41-87e6-44c4-8597-c6fa860ba9e5"). InnerVolumeSpecName "kube-api-access-jmdmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.291243 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac026d41-87e6-44c4-8597-c6fa860ba9e5-config-data" (OuterVolumeSpecName: "config-data") pod "ac026d41-87e6-44c4-8597-c6fa860ba9e5" (UID: "ac026d41-87e6-44c4-8597-c6fa860ba9e5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.313179 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac026d41-87e6-44c4-8597-c6fa860ba9e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac026d41-87e6-44c4-8597-c6fa860ba9e5" (UID: "ac026d41-87e6-44c4-8597-c6fa860ba9e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.347605 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac026d41-87e6-44c4-8597-c6fa860ba9e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.347651 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac026d41-87e6-44c4-8597-c6fa860ba9e5-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.347666 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac026d41-87e6-44c4-8597-c6fa860ba9e5-logs\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.347679 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmdmk\" (UniqueName: \"kubernetes.io/projected/ac026d41-87e6-44c4-8597-c6fa860ba9e5-kube-api-access-jmdmk\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.407344 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.552598 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1fa21c-37dc-47a2-82f3-b68e89651d04-combined-ca-bundle\") pod \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\" (UID: \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\") " Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.553166 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c1fa21c-37dc-47a2-82f3-b68e89651d04-logs\") pod \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\" (UID: \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\") " Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.553231 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1fa21c-37dc-47a2-82f3-b68e89651d04-config-data\") pod \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\" (UID: \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\") " Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.553281 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqnmc\" (UniqueName: \"kubernetes.io/projected/9c1fa21c-37dc-47a2-82f3-b68e89651d04-kube-api-access-nqnmc\") pod \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\" (UID: \"9c1fa21c-37dc-47a2-82f3-b68e89651d04\") " Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.553364 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"acb1633c-4acc-4286-8bdb-465623aea592","Type":"ContainerStarted","Data":"a340f0e62bfc31a7c01d30a27754d1efdf3c3a16c04614552d20e093fdb9ec09"} Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.553408 4958 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"acb1633c-4acc-4286-8bdb-465623aea592","Type":"ContainerStarted","Data":"91a9f4d226c4574b9aa9cf67e2b68598833038076a407f1377460dc64d7748d5"} Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.553785 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c1fa21c-37dc-47a2-82f3-b68e89651d04-logs" (OuterVolumeSpecName: "logs") pod "9c1fa21c-37dc-47a2-82f3-b68e89651d04" (UID: "9c1fa21c-37dc-47a2-82f3-b68e89651d04"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.554093 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.555130 4958 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c1fa21c-37dc-47a2-82f3-b68e89651d04-logs\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.558107 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c1fa21c-37dc-47a2-82f3-b68e89651d04-kube-api-access-nqnmc" (OuterVolumeSpecName: "kube-api-access-nqnmc") pod "9c1fa21c-37dc-47a2-82f3-b68e89651d04" (UID: "9c1fa21c-37dc-47a2-82f3-b68e89651d04"). InnerVolumeSpecName "kube-api-access-nqnmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.559879 4958 generic.go:334] "Generic (PLEG): container finished" podID="ac026d41-87e6-44c4-8597-c6fa860ba9e5" containerID="efeada4674b238b839f1fde3677f6586324a55faf7a12bd837940b174a83292e" exitCode=0 Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.559944 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac026d41-87e6-44c4-8597-c6fa860ba9e5","Type":"ContainerDied","Data":"efeada4674b238b839f1fde3677f6586324a55faf7a12bd837940b174a83292e"} Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.559971 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac026d41-87e6-44c4-8597-c6fa860ba9e5","Type":"ContainerDied","Data":"fa6cd2dcbc00c73af19ca2beb1abea78fa9f747a9091126a81988dfca31631ab"} Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.559988 4958 scope.go:117] "RemoveContainer" containerID="efeada4674b238b839f1fde3677f6586324a55faf7a12bd837940b174a83292e" Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.560103 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.584627 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.584606852 podStartE2EDuration="2.584606852s" podCreationTimestamp="2025-12-01 12:40:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 12:40:42.573617043 +0000 UTC m=+9690.082406080" watchObservedRunningTime="2025-12-01 12:40:42.584606852 +0000 UTC m=+9690.093395889"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.593370 4958 generic.go:334] "Generic (PLEG): container finished" podID="9c1fa21c-37dc-47a2-82f3-b68e89651d04" containerID="4bf99aa149eda710cb7021082862fddf9f2cf8d9b17be7348e3ff673d78da1c3" exitCode=0
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.593428 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9c1fa21c-37dc-47a2-82f3-b68e89651d04","Type":"ContainerDied","Data":"4bf99aa149eda710cb7021082862fddf9f2cf8d9b17be7348e3ff673d78da1c3"}
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.593472 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9c1fa21c-37dc-47a2-82f3-b68e89651d04","Type":"ContainerDied","Data":"abfe2b9a78e77af371aefa1e69c1ffd1d964acd44d51154e4eb9102554c4557f"}
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.593611 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1fa21c-37dc-47a2-82f3-b68e89651d04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c1fa21c-37dc-47a2-82f3-b68e89651d04" (UID: "9c1fa21c-37dc-47a2-82f3-b68e89651d04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.593972 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.616119 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1fa21c-37dc-47a2-82f3-b68e89651d04-config-data" (OuterVolumeSpecName: "config-data") pod "9c1fa21c-37dc-47a2-82f3-b68e89651d04" (UID: "9c1fa21c-37dc-47a2-82f3-b68e89651d04"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.616308 4958 scope.go:117] "RemoveContainer" containerID="45f06c383e3a65dd81ce4d39ebb9144c078e834b1aa7f964cf0ccc36a9db9cc0"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.634597 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.646204 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.651822 4958 scope.go:117] "RemoveContainer" containerID="efeada4674b238b839f1fde3677f6586324a55faf7a12bd837940b174a83292e"
Dec 01 12:40:42 crc kubenswrapper[4958]: E1201 12:40:42.652254 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efeada4674b238b839f1fde3677f6586324a55faf7a12bd837940b174a83292e\": container with ID starting with efeada4674b238b839f1fde3677f6586324a55faf7a12bd837940b174a83292e not found: ID does not exist" containerID="efeada4674b238b839f1fde3677f6586324a55faf7a12bd837940b174a83292e"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.652286 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efeada4674b238b839f1fde3677f6586324a55faf7a12bd837940b174a83292e"} err="failed to get container status \"efeada4674b238b839f1fde3677f6586324a55faf7a12bd837940b174a83292e\": rpc error: code = NotFound desc = could not find container \"efeada4674b238b839f1fde3677f6586324a55faf7a12bd837940b174a83292e\": container with ID starting with efeada4674b238b839f1fde3677f6586324a55faf7a12bd837940b174a83292e not found: ID does not exist"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.652307 4958 scope.go:117] "RemoveContainer" containerID="45f06c383e3a65dd81ce4d39ebb9144c078e834b1aa7f964cf0ccc36a9db9cc0"
Dec 01 12:40:42 crc kubenswrapper[4958]: E1201 12:40:42.652548 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45f06c383e3a65dd81ce4d39ebb9144c078e834b1aa7f964cf0ccc36a9db9cc0\": container with ID starting with 45f06c383e3a65dd81ce4d39ebb9144c078e834b1aa7f964cf0ccc36a9db9cc0 not found: ID does not exist" containerID="45f06c383e3a65dd81ce4d39ebb9144c078e834b1aa7f964cf0ccc36a9db9cc0"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.652572 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f06c383e3a65dd81ce4d39ebb9144c078e834b1aa7f964cf0ccc36a9db9cc0"} err="failed to get container status \"45f06c383e3a65dd81ce4d39ebb9144c078e834b1aa7f964cf0ccc36a9db9cc0\": rpc error: code = NotFound desc = could not find container \"45f06c383e3a65dd81ce4d39ebb9144c078e834b1aa7f964cf0ccc36a9db9cc0\": container with ID starting with 45f06c383e3a65dd81ce4d39ebb9144c078e834b1aa7f964cf0ccc36a9db9cc0 not found: ID does not exist"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.652590 4958 scope.go:117] "RemoveContainer" containerID="4bf99aa149eda710cb7021082862fddf9f2cf8d9b17be7348e3ff673d78da1c3"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.659049 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c1fa21c-37dc-47a2-82f3-b68e89651d04-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.659080 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqnmc\" (UniqueName: \"kubernetes.io/projected/9c1fa21c-37dc-47a2-82f3-b68e89651d04-kube-api-access-nqnmc\") on node \"crc\" DevicePath \"\""
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.659090 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c1fa21c-37dc-47a2-82f3-b68e89651d04-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.661286 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 01 12:40:42 crc kubenswrapper[4958]: E1201 12:40:42.661961 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c1fa21c-37dc-47a2-82f3-b68e89651d04" containerName="nova-api-log"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.661976 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c1fa21c-37dc-47a2-82f3-b68e89651d04" containerName="nova-api-log"
Dec 01 12:40:42 crc kubenswrapper[4958]: E1201 12:40:42.661995 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac026d41-87e6-44c4-8597-c6fa860ba9e5" containerName="nova-metadata-log"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.662001 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac026d41-87e6-44c4-8597-c6fa860ba9e5" containerName="nova-metadata-log"
Dec 01 12:40:42 crc kubenswrapper[4958]: E1201 12:40:42.662024 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c1fa21c-37dc-47a2-82f3-b68e89651d04" containerName="nova-api-api"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.662032 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c1fa21c-37dc-47a2-82f3-b68e89651d04" containerName="nova-api-api"
Dec 01 12:40:42 crc kubenswrapper[4958]: E1201 12:40:42.662069 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac026d41-87e6-44c4-8597-c6fa860ba9e5" containerName="nova-metadata-metadata"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.662077 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac026d41-87e6-44c4-8597-c6fa860ba9e5" containerName="nova-metadata-metadata"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.662285 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac026d41-87e6-44c4-8597-c6fa860ba9e5" containerName="nova-metadata-log"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.662296 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac026d41-87e6-44c4-8597-c6fa860ba9e5" containerName="nova-metadata-metadata"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.662303 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c1fa21c-37dc-47a2-82f3-b68e89651d04" containerName="nova-api-log"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.662317 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c1fa21c-37dc-47a2-82f3-b68e89651d04" containerName="nova-api-api"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.663584 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.668962 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.686294 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.687497 4958 scope.go:117] "RemoveContainer" containerID="731e0c8b065268e582ff1a8837a47e21d2a077c794197ba25fd8eec684d2123a"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.707067 4958 scope.go:117] "RemoveContainer" containerID="4bf99aa149eda710cb7021082862fddf9f2cf8d9b17be7348e3ff673d78da1c3"
Dec 01 12:40:42 crc kubenswrapper[4958]: E1201 12:40:42.707402 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bf99aa149eda710cb7021082862fddf9f2cf8d9b17be7348e3ff673d78da1c3\": container with ID starting with 4bf99aa149eda710cb7021082862fddf9f2cf8d9b17be7348e3ff673d78da1c3 not found: ID does not exist" containerID="4bf99aa149eda710cb7021082862fddf9f2cf8d9b17be7348e3ff673d78da1c3"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.707433 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bf99aa149eda710cb7021082862fddf9f2cf8d9b17be7348e3ff673d78da1c3"} err="failed to get container status \"4bf99aa149eda710cb7021082862fddf9f2cf8d9b17be7348e3ff673d78da1c3\": rpc error: code = NotFound desc = could not find container \"4bf99aa149eda710cb7021082862fddf9f2cf8d9b17be7348e3ff673d78da1c3\": container with ID starting with 4bf99aa149eda710cb7021082862fddf9f2cf8d9b17be7348e3ff673d78da1c3 not found: ID does not exist"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.707455 4958 scope.go:117] "RemoveContainer" containerID="731e0c8b065268e582ff1a8837a47e21d2a077c794197ba25fd8eec684d2123a"
Dec 01 12:40:42 crc kubenswrapper[4958]: E1201 12:40:42.707705 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"731e0c8b065268e582ff1a8837a47e21d2a077c794197ba25fd8eec684d2123a\": container with ID starting with 731e0c8b065268e582ff1a8837a47e21d2a077c794197ba25fd8eec684d2123a not found: ID does not exist" containerID="731e0c8b065268e582ff1a8837a47e21d2a077c794197ba25fd8eec684d2123a"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.707726 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"731e0c8b065268e582ff1a8837a47e21d2a077c794197ba25fd8eec684d2123a"} err="failed to get container status \"731e0c8b065268e582ff1a8837a47e21d2a077c794197ba25fd8eec684d2123a\": rpc error: code = NotFound desc = could not find container \"731e0c8b065268e582ff1a8837a47e21d2a077c794197ba25fd8eec684d2123a\": container with ID starting with 731e0c8b065268e582ff1a8837a47e21d2a077c794197ba25fd8eec684d2123a not found: ID does not exist"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.767289 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d41cd6-c1b4-4b68-93a8-03fc7f446adb-config-data\") pod \"nova-metadata-0\" (UID: \"e9d41cd6-c1b4-4b68-93a8-03fc7f446adb\") " pod="openstack/nova-metadata-0"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.767412 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28v2h\" (UniqueName: \"kubernetes.io/projected/e9d41cd6-c1b4-4b68-93a8-03fc7f446adb-kube-api-access-28v2h\") pod \"nova-metadata-0\" (UID: \"e9d41cd6-c1b4-4b68-93a8-03fc7f446adb\") " pod="openstack/nova-metadata-0"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.767485 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9d41cd6-c1b4-4b68-93a8-03fc7f446adb-logs\") pod \"nova-metadata-0\" (UID: \"e9d41cd6-c1b4-4b68-93a8-03fc7f446adb\") " pod="openstack/nova-metadata-0"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.767688 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d41cd6-c1b4-4b68-93a8-03fc7f446adb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e9d41cd6-c1b4-4b68-93a8-03fc7f446adb\") " pod="openstack/nova-metadata-0"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.855566 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"]
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.858381 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.861034 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-zqbfz"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.864632 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.864728 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.865007 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.865601 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.865885 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.865940 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.869595 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d41cd6-c1b4-4b68-93a8-03fc7f446adb-config-data\") pod \"nova-metadata-0\" (UID: \"e9d41cd6-c1b4-4b68-93a8-03fc7f446adb\") " pod="openstack/nova-metadata-0"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.869824 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28v2h\" (UniqueName: \"kubernetes.io/projected/e9d41cd6-c1b4-4b68-93a8-03fc7f446adb-kube-api-access-28v2h\") pod \"nova-metadata-0\" (UID: \"e9d41cd6-c1b4-4b68-93a8-03fc7f446adb\") " pod="openstack/nova-metadata-0"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.869927 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9d41cd6-c1b4-4b68-93a8-03fc7f446adb-logs\") pod \"nova-metadata-0\" (UID: \"e9d41cd6-c1b4-4b68-93a8-03fc7f446adb\") " pod="openstack/nova-metadata-0"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.870085 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d41cd6-c1b4-4b68-93a8-03fc7f446adb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e9d41cd6-c1b4-4b68-93a8-03fc7f446adb\") " pod="openstack/nova-metadata-0"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.872174 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9d41cd6-c1b4-4b68-93a8-03fc7f446adb-logs\") pod \"nova-metadata-0\" (UID: \"e9d41cd6-c1b4-4b68-93a8-03fc7f446adb\") " pod="openstack/nova-metadata-0"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.875937 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d41cd6-c1b4-4b68-93a8-03fc7f446adb-config-data\") pod \"nova-metadata-0\" (UID: \"e9d41cd6-c1b4-4b68-93a8-03fc7f446adb\") " pod="openstack/nova-metadata-0"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.877044 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d41cd6-c1b4-4b68-93a8-03fc7f446adb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e9d41cd6-c1b4-4b68-93a8-03fc7f446adb\") " pod="openstack/nova-metadata-0"
Dec 01 12:40:42 crc kubenswrapper[4958]: I1201 12:40:42.989488 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"]
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.013418 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28v2h\" (UniqueName: \"kubernetes.io/projected/e9d41cd6-c1b4-4b68-93a8-03fc7f446adb-kube-api-access-28v2h\") pod \"nova-metadata-0\" (UID: \"e9d41cd6-c1b4-4b68-93a8-03fc7f446adb\") " pod="openstack/nova-metadata-0"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.040115 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.068780 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.078215 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.080370 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.083245 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.087912 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.097202 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.097630 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-579r6\" (UniqueName: \"kubernetes.io/projected/dd221c10-e588-4b0e-9b5e-94735a063bac-kube-api-access-579r6\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.097821 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.097984 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.098133 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.098320 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.098493 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.098678 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.098829 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.098983 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.099109 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.207596 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/586e9523-0e51-4514-8e52-5a9877ed0c21-logs\") pod \"nova-api-0\" (UID: \"586e9523-0e51-4514-8e52-5a9877ed0c21\") " pod="openstack/nova-api-0"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.207677 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-579r6\" (UniqueName: \"kubernetes.io/projected/dd221c10-e588-4b0e-9b5e-94735a063bac-kube-api-access-579r6\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.207774 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.207808 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.207887 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586e9523-0e51-4514-8e52-5a9877ed0c21-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"586e9523-0e51-4514-8e52-5a9877ed0c21\") " pod="openstack/nova-api-0"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.207976 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.208096 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.208124 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45d84\" (UniqueName: \"kubernetes.io/projected/586e9523-0e51-4514-8e52-5a9877ed0c21-kube-api-access-45d84\") pod \"nova-api-0\" (UID: \"586e9523-0e51-4514-8e52-5a9877ed0c21\") " pod="openstack/nova-api-0"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.208329 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.208442 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.208492 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.208559 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.208605 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.208691 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/586e9523-0e51-4514-8e52-5a9877ed0c21-config-data\") pod \"nova-api-0\" (UID: \"586e9523-0e51-4514-8e52-5a9877ed0c21\") " pod="openstack/nova-api-0"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.208801 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.209856 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.210199 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.295112 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.310182 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45d84\" (UniqueName: \"kubernetes.io/projected/586e9523-0e51-4514-8e52-5a9877ed0c21-kube-api-access-45d84\") pod \"nova-api-0\" (UID: \"586e9523-0e51-4514-8e52-5a9877ed0c21\") " pod="openstack/nova-api-0"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.310319 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/586e9523-0e51-4514-8e52-5a9877ed0c21-config-data\") pod \"nova-api-0\" (UID: \"586e9523-0e51-4514-8e52-5a9877ed0c21\") " pod="openstack/nova-api-0"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.310407 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/586e9523-0e51-4514-8e52-5a9877ed0c21-logs\") pod \"nova-api-0\" (UID: \"586e9523-0e51-4514-8e52-5a9877ed0c21\") " pod="openstack/nova-api-0"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.310452 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586e9523-0e51-4514-8e52-5a9877ed0c21-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"586e9523-0e51-4514-8e52-5a9877ed0c21\") " pod="openstack/nova-api-0"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.311078 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/586e9523-0e51-4514-8e52-5a9877ed0c21-logs\") pod \"nova-api-0\" (UID: \"586e9523-0e51-4514-8e52-5a9877ed0c21\") " pod="openstack/nova-api-0"
Dec 01 12:40:43 crc kubenswrapper[4958]: E1201 12:40:43.353770 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 01 12:40:43 crc kubenswrapper[4958]: E1201 12:40:43.362501 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 01 12:40:43 crc kubenswrapper[4958]: E1201 12:40:43.365583 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 01 12:40:43 crc kubenswrapper[4958]: E1201 12:40:43.365659 4958 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="9db9b6d2-f515-4f0e-83b6-d56ee091734f" containerName="nova-scheduler-scheduler"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.673478 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.673777 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.673998 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.678290 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.685801 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.686280 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.687181 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.687353 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/586e9523-0e51-4514-8e52-5a9877ed0c21-config-data\") pod \"nova-api-0\" (UID: \"586e9523-0e51-4514-8e52-5a9877ed0c21\") " pod="openstack/nova-api-0"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.695582 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586e9523-0e51-4514-8e52-5a9877ed0c21-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"586e9523-0e51-4514-8e52-5a9877ed0c21\") " pod="openstack/nova-api-0"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.701703 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.702000 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-579r6\" (UniqueName: \"kubernetes.io/projected/dd221c10-e588-4b0e-9b5e-94735a063bac-kube-api-access-579r6\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.707017 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45d84\" (UniqueName: \"kubernetes.io/projected/586e9523-0e51-4514-8e52-5a9877ed0c21-kube-api-access-45d84\") pod \"nova-api-0\" (UID: \"586e9523-0e51-4514-8e52-5a9877ed0c21\") " pod="openstack/nova-api-0"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.817328 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.817482 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c1fa21c-37dc-47a2-82f3-b68e89651d04" path="/var/lib/kubelet/pods/9c1fa21c-37dc-47a2-82f3-b68e89651d04/volumes"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.818802 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac026d41-87e6-44c4-8597-c6fa860ba9e5" path="/var/lib/kubelet/pods/ac026d41-87e6-44c4-8597-c6fa860ba9e5/volumes"
Dec 01 12:40:43 crc kubenswrapper[4958]: I1201 12:40:43.887579 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:40:44 crc kubenswrapper[4958]: I1201 12:40:44.308807 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 01 12:40:44 crc kubenswrapper[4958]: I1201 12:40:44.406584 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 01 12:40:44 crc kubenswrapper[4958]: I1201 12:40:44.589286 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"]
Dec 01 12:40:44 crc kubenswrapper[4958]: W1201 12:40:44.597280 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd221c10_e588_4b0e_9b5e_94735a063bac.slice/crio-51a181114a8d6e882633fe99914440841aa7988f1ec6178c8f1b70350d3288ff WatchSource:0}: Error finding container 51a181114a8d6e882633fe99914440841aa7988f1ec6178c8f1b70350d3288ff: Status 404 returned error can't find the container with id 51a181114a8d6e882633fe99914440841aa7988f1ec6178c8f1b70350d3288ff
Dec 01 12:40:44 crc kubenswrapper[4958]: I1201 12:40:44.629646 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e9d41cd6-c1b4-4b68-93a8-03fc7f446adb","Type":"ContainerStarted","Data":"a4800bfbb0e0bd14ff952318ba1dadb20b321b739e38fb65387eebe9d766ff91"}
Dec 01 12:40:44 crc kubenswrapper[4958]: I1201 12:40:44.630863 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"586e9523-0e51-4514-8e52-5a9877ed0c21","Type":"ContainerStarted","Data":"08d9361d76498e81ef96a2b9b271c6c7ad7636070b7f943b575073875c1cee4c"}
Dec 01 12:40:44 crc kubenswrapper[4958]: I1201 12:40:44.631988 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth" event={"ID":"dd221c10-e588-4b0e-9b5e-94735a063bac","Type":"ContainerStarted","Data":"51a181114a8d6e882633fe99914440841aa7988f1ec6178c8f1b70350d3288ff"}
Dec 01 12:40:45 crc kubenswrapper[4958]: I1201 12:40:45.680461 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"586e9523-0e51-4514-8e52-5a9877ed0c21","Type":"ContainerStarted","Data":"b594bfedb7b1e3f3c6646998684cf222234165eacf5cbf94525b9b098a1a2630"}
Dec 01 12:40:45 crc kubenswrapper[4958]: I1201 12:40:45.681518 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"586e9523-0e51-4514-8e52-5a9877ed0c21","Type":"ContainerStarted","Data":"1a8e98b881092c6e17c43dd7b775d3812eb8699424a66660cc11f4934af6ffa0"}
Dec 01 12:40:45 crc kubenswrapper[4958]: I1201 12:40:45.690422 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth" event={"ID":"dd221c10-e588-4b0e-9b5e-94735a063bac","Type":"ContainerStarted","Data":"aa74fca35db7fb4b1fec0f381e4bfc69cf6fa978c3c59e409123a29963bca06d"}
Dec 01 12:40:45 crc kubenswrapper[4958]: I1201 12:40:45.692929 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e9d41cd6-c1b4-4b68-93a8-03fc7f446adb","Type":"ContainerStarted","Data":"c798299ec76c6068ef0d4b46657898ffed0ee57bcb8f795bf9fb0dd57a9b99cc"}
Dec 01 12:40:45 crc kubenswrapper[4958]: I1201 12:40:45.692973 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e9d41cd6-c1b4-4b68-93a8-03fc7f446adb","Type":"ContainerStarted","Data":"283fa262d03a57a34a2a07f41dd62279d87d546f0a54885575af32a184f0d107"}
Dec 01 12:40:45 crc kubenswrapper[4958]: I1201 12:40:45.705024 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.705001323 podStartE2EDuration="2.705001323s" podCreationTimestamp="2025-12-01 12:40:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 12:40:45.696026341 +0000 UTC m=+9693.204815378" watchObservedRunningTime="2025-12-01 12:40:45.705001323 +0000 UTC m=+9693.213790360"
Dec 01 12:40:45 crc kubenswrapper[4958]: I1201 12:40:45.732505 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.732481417 podStartE2EDuration="3.732481417s" podCreationTimestamp="2025-12-01 12:40:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 12:40:45.726411076 +0000 UTC m=+9693.235200113" watchObservedRunningTime="2025-12-01 12:40:45.732481417 +0000 UTC m=+9693.241270454"
Dec 01 12:40:45 crc kubenswrapper[4958]: I1201 12:40:45.774951 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth" podStartSLOduration=3.5761013459999997 podStartE2EDuration="3.774925011s" podCreationTimestamp="2025-12-01 12:40:42 +0000 UTC" firstStartedPulling="2025-12-01 12:40:44.599653707 +0000 UTC m=+9692.108442744" lastFinishedPulling="2025-12-01 12:40:44.798477372 +0000 UTC m=+9692.307266409" observedRunningTime="2025-12-01 12:40:45.770424554 +0000 UTC m=+9693.279213591" watchObservedRunningTime="2025-12-01 12:40:45.774925011 +0000 UTC m=+9693.283714048"
Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.295375 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.296114 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 01 12:40:48 crc kubenswrapper[4958]: E1201 12:40:48.352595 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed is running failed: container process not found" containerID="79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 01 12:40:48 crc kubenswrapper[4958]: E1201 12:40:48.353713 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed is running failed: container process not found" containerID="79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 01 12:40:48 crc kubenswrapper[4958]: E1201 12:40:48.355273 4958 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed is running failed: container process not found" containerID="79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 01 12:40:48 crc kubenswrapper[4958]: E1201 12:40:48.355333 4958 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="9db9b6d2-f515-4f0e-83b6-d56ee091734f" containerName="nova-scheduler-scheduler"
Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.358832 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.547051 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq77f\" (UniqueName: \"kubernetes.io/projected/9db9b6d2-f515-4f0e-83b6-d56ee091734f-kube-api-access-bq77f\") pod \"9db9b6d2-f515-4f0e-83b6-d56ee091734f\" (UID: \"9db9b6d2-f515-4f0e-83b6-d56ee091734f\") "
Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.550634 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db9b6d2-f515-4f0e-83b6-d56ee091734f-config-data\") pod \"9db9b6d2-f515-4f0e-83b6-d56ee091734f\" (UID: \"9db9b6d2-f515-4f0e-83b6-d56ee091734f\") "
Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.550773 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db9b6d2-f515-4f0e-83b6-d56ee091734f-combined-ca-bundle\") pod \"9db9b6d2-f515-4f0e-83b6-d56ee091734f\" (UID: \"9db9b6d2-f515-4f0e-83b6-d56ee091734f\") "
Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.562311 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9db9b6d2-f515-4f0e-83b6-d56ee091734f-kube-api-access-bq77f" (OuterVolumeSpecName: "kube-api-access-bq77f") pod "9db9b6d2-f515-4f0e-83b6-d56ee091734f" (UID: "9db9b6d2-f515-4f0e-83b6-d56ee091734f"). InnerVolumeSpecName "kube-api-access-bq77f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.579520 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db9b6d2-f515-4f0e-83b6-d56ee091734f-config-data" (OuterVolumeSpecName: "config-data") pod "9db9b6d2-f515-4f0e-83b6-d56ee091734f" (UID: "9db9b6d2-f515-4f0e-83b6-d56ee091734f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.592592 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db9b6d2-f515-4f0e-83b6-d56ee091734f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9db9b6d2-f515-4f0e-83b6-d56ee091734f" (UID: "9db9b6d2-f515-4f0e-83b6-d56ee091734f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.653340 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db9b6d2-f515-4f0e-83b6-d56ee091734f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.653587 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq77f\" (UniqueName: \"kubernetes.io/projected/9db9b6d2-f515-4f0e-83b6-d56ee091734f-kube-api-access-bq77f\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.653683 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db9b6d2-f515-4f0e-83b6-d56ee091734f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.740740 4958 generic.go:334] "Generic (PLEG): container finished" podID="9db9b6d2-f515-4f0e-83b6-d56ee091734f" containerID="79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed" exitCode=0 Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.740812 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9db9b6d2-f515-4f0e-83b6-d56ee091734f","Type":"ContainerDied","Data":"79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed"} Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.740880 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9db9b6d2-f515-4f0e-83b6-d56ee091734f","Type":"ContainerDied","Data":"1d7a9095126bbcc0f1a74be4cb6c78d610983a19f7e319dd4ee83432842ba2f3"} Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.740910 4958 scope.go:117] "RemoveContainer" containerID="79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.740921 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.802874 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.808159 4958 scope.go:117] "RemoveContainer" containerID="79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed" Dec 01 12:40:48 crc kubenswrapper[4958]: E1201 12:40:48.811155 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed\": container with ID starting with 79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed not found: ID does not exist" containerID="79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.811218 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed"} err="failed to get container status \"79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed\": rpc error: code = NotFound desc = could not find container \"79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed\": container with ID starting with 79c67a42febf5599e34ce68c2c0fb6daaf2ed764b2ccc05f3cc83e1133d831ed not found: ID does not exist" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.814803 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.843753 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 12:40:48 crc kubenswrapper[4958]: E1201 12:40:48.844941 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9db9b6d2-f515-4f0e-83b6-d56ee091734f" containerName="nova-scheduler-scheduler" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.845111 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db9b6d2-f515-4f0e-83b6-d56ee091734f" containerName="nova-scheduler-scheduler" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.845711 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="9db9b6d2-f515-4f0e-83b6-d56ee091734f" containerName="nova-scheduler-scheduler" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.847288 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.855347 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.857100 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f94gh\" (UniqueName: \"kubernetes.io/projected/333c9290-821d-464e-b587-8c4253eeb732-kube-api-access-f94gh\") pod \"nova-scheduler-0\" (UID: \"333c9290-821d-464e-b587-8c4253eeb732\") " pod="openstack/nova-scheduler-0" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.857270 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333c9290-821d-464e-b587-8c4253eeb732-config-data\") pod \"nova-scheduler-0\" (UID: \"333c9290-821d-464e-b587-8c4253eeb732\") " pod="openstack/nova-scheduler-0" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.857392 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333c9290-821d-464e-b587-8c4253eeb732-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"333c9290-821d-464e-b587-8c4253eeb732\") " pod="openstack/nova-scheduler-0" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.863913 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.959446 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333c9290-821d-464e-b587-8c4253eeb732-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"333c9290-821d-464e-b587-8c4253eeb732\") " pod="openstack/nova-scheduler-0" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.959515 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f94gh\" (UniqueName: \"kubernetes.io/projected/333c9290-821d-464e-b587-8c4253eeb732-kube-api-access-f94gh\") pod \"nova-scheduler-0\" (UID: \"333c9290-821d-464e-b587-8c4253eeb732\") " pod="openstack/nova-scheduler-0" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.959618 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333c9290-821d-464e-b587-8c4253eeb732-config-data\") pod \"nova-scheduler-0\" (UID: \"333c9290-821d-464e-b587-8c4253eeb732\") " pod="openstack/nova-scheduler-0" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.962992 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/333c9290-821d-464e-b587-8c4253eeb732-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"333c9290-821d-464e-b587-8c4253eeb732\") " pod="openstack/nova-scheduler-0" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.963054 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/333c9290-821d-464e-b587-8c4253eeb732-config-data\") pod \"nova-scheduler-0\" (UID: \"333c9290-821d-464e-b587-8c4253eeb732\") " pod="openstack/nova-scheduler-0" Dec 01 12:40:48 crc kubenswrapper[4958]: I1201 12:40:48.974346 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f94gh\" (UniqueName: 
\"kubernetes.io/projected/333c9290-821d-464e-b587-8c4253eeb732-kube-api-access-f94gh\") pod \"nova-scheduler-0\" (UID: \"333c9290-821d-464e-b587-8c4253eeb732\") " pod="openstack/nova-scheduler-0" Dec 01 12:40:49 crc kubenswrapper[4958]: I1201 12:40:49.176739 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 12:40:49 crc kubenswrapper[4958]: I1201 12:40:49.371028 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 01 12:40:49 crc kubenswrapper[4958]: W1201 12:40:49.735609 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod333c9290_821d_464e_b587_8c4253eeb732.slice/crio-97bffd7f31281cdd42d6b74c0c84759f7db0cc877c84fbcf4dee0633f0714615 WatchSource:0}: Error finding container 97bffd7f31281cdd42d6b74c0c84759f7db0cc877c84fbcf4dee0633f0714615: Status 404 returned error can't find the container with id 97bffd7f31281cdd42d6b74c0c84759f7db0cc877c84fbcf4dee0633f0714615 Dec 01 12:40:49 crc kubenswrapper[4958]: I1201 12:40:49.738203 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 12:40:49 crc kubenswrapper[4958]: I1201 12:40:49.760091 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"333c9290-821d-464e-b587-8c4253eeb732","Type":"ContainerStarted","Data":"97bffd7f31281cdd42d6b74c0c84759f7db0cc877c84fbcf4dee0633f0714615"} Dec 01 12:40:49 crc kubenswrapper[4958]: I1201 12:40:49.817907 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9db9b6d2-f515-4f0e-83b6-d56ee091734f" path="/var/lib/kubelet/pods/9db9b6d2-f515-4f0e-83b6-d56ee091734f/volumes" Dec 01 12:40:50 crc kubenswrapper[4958]: I1201 12:40:50.626447 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 01 12:40:50 crc kubenswrapper[4958]: I1201 12:40:50.786920 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"333c9290-821d-464e-b587-8c4253eeb732","Type":"ContainerStarted","Data":"e54b250929268eea2d79a6475f03d597bfeb44cf2f989bf1c19d313d1bdd5905"} Dec 01 12:40:50 crc kubenswrapper[4958]: I1201 12:40:50.806943 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.806917855 podStartE2EDuration="2.806917855s" podCreationTimestamp="2025-12-01 12:40:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 12:40:50.800508545 +0000 UTC m=+9698.309297572" watchObservedRunningTime="2025-12-01 12:40:50.806917855 +0000 UTC m=+9698.315706902" Dec 01 12:40:53 crc kubenswrapper[4958]: I1201 12:40:53.296074 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 12:40:53 crc kubenswrapper[4958]: I1201 12:40:53.298121 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 12:40:53 crc kubenswrapper[4958]: I1201 12:40:53.837596 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 12:40:53 crc kubenswrapper[4958]: I1201 12:40:53.837999 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 12:40:54 crc kubenswrapper[4958]: I1201 
12:40:54.177257 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 12:40:54 crc kubenswrapper[4958]: I1201 12:40:54.379041 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e9d41cd6-c1b4-4b68-93a8-03fc7f446adb" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.192:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 12:40:54 crc kubenswrapper[4958]: I1201 12:40:54.379113 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e9d41cd6-c1b4-4b68-93a8-03fc7f446adb" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.192:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 12:40:54 crc kubenswrapper[4958]: I1201 12:40:54.909325 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="586e9523-0e51-4514-8e52-5a9877ed0c21" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 12:40:54 crc kubenswrapper[4958]: I1201 12:40:54.909373 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="586e9523-0e51-4514-8e52-5a9877ed0c21" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 12:40:59 crc kubenswrapper[4958]: I1201 12:40:59.178814 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 12:40:59 crc kubenswrapper[4958]: I1201 12:40:59.244706 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 12:40:59 crc kubenswrapper[4958]: I1201 12:40:59.987418 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 12:41:03 crc kubenswrapper[4958]: I1201 12:41:03.300483 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 12:41:03 crc kubenswrapper[4958]: I1201 12:41:03.301135 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 12:41:03 crc kubenswrapper[4958]: I1201 12:41:03.304777 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 12:41:03 crc kubenswrapper[4958]: I1201 12:41:03.305838 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 12:41:03 crc kubenswrapper[4958]: I1201 12:41:03.823998 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 12:41:03 crc kubenswrapper[4958]: I1201 12:41:03.824754 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 12:41:03 crc kubenswrapper[4958]: I1201 12:41:03.825222 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 12:41:03 crc kubenswrapper[4958]: I1201 12:41:03.827494 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 12:41:03 crc kubenswrapper[4958]: I1201 12:41:03.978677 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-api-0" Dec 01 12:41:03 crc kubenswrapper[4958]: I1201 12:41:03.985624 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 12:41:22 crc kubenswrapper[4958]: I1201 12:41:22.259204 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qv489"] Dec 01 12:41:22 crc kubenswrapper[4958]: I1201 12:41:22.262596 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qv489" Dec 01 12:41:22 crc kubenswrapper[4958]: I1201 12:41:22.294923 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1799dd4-e619-4436-aa52-7ab2ba3c4623-catalog-content\") pod \"redhat-marketplace-qv489\" (UID: \"b1799dd4-e619-4436-aa52-7ab2ba3c4623\") " pod="openshift-marketplace/redhat-marketplace-qv489" Dec 01 12:41:22 crc kubenswrapper[4958]: I1201 12:41:22.295297 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1799dd4-e619-4436-aa52-7ab2ba3c4623-utilities\") pod \"redhat-marketplace-qv489\" (UID: \"b1799dd4-e619-4436-aa52-7ab2ba3c4623\") " pod="openshift-marketplace/redhat-marketplace-qv489" Dec 01 12:41:22 crc kubenswrapper[4958]: I1201 12:41:22.295476 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjhdd\" (UniqueName: \"kubernetes.io/projected/b1799dd4-e619-4436-aa52-7ab2ba3c4623-kube-api-access-xjhdd\") pod \"redhat-marketplace-qv489\" (UID: \"b1799dd4-e619-4436-aa52-7ab2ba3c4623\") " pod="openshift-marketplace/redhat-marketplace-qv489" Dec 01 12:41:22 crc kubenswrapper[4958]: I1201 12:41:22.301932 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qv489"] Dec 01 12:41:22 crc kubenswrapper[4958]: I1201 12:41:22.398388 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjhdd\" (UniqueName: \"kubernetes.io/projected/b1799dd4-e619-4436-aa52-7ab2ba3c4623-kube-api-access-xjhdd\") pod \"redhat-marketplace-qv489\" (UID: \"b1799dd4-e619-4436-aa52-7ab2ba3c4623\") " pod="openshift-marketplace/redhat-marketplace-qv489" Dec 01 12:41:22 crc kubenswrapper[4958]: I1201 12:41:22.398704 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1799dd4-e619-4436-aa52-7ab2ba3c4623-catalog-content\") pod \"redhat-marketplace-qv489\" (UID: \"b1799dd4-e619-4436-aa52-7ab2ba3c4623\") " pod="openshift-marketplace/redhat-marketplace-qv489" Dec 01 12:41:22 crc kubenswrapper[4958]: I1201 12:41:22.398738 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1799dd4-e619-4436-aa52-7ab2ba3c4623-utilities\") pod \"redhat-marketplace-qv489\" (UID: \"b1799dd4-e619-4436-aa52-7ab2ba3c4623\") " pod="openshift-marketplace/redhat-marketplace-qv489" Dec 01 12:41:22 crc kubenswrapper[4958]: I1201 12:41:22.399767 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1799dd4-e619-4436-aa52-7ab2ba3c4623-utilities\") pod \"redhat-marketplace-qv489\" (UID: \"b1799dd4-e619-4436-aa52-7ab2ba3c4623\") " pod="openshift-marketplace/redhat-marketplace-qv489" Dec 01 12:41:22 
crc kubenswrapper[4958]: I1201 12:41:22.401216 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1799dd4-e619-4436-aa52-7ab2ba3c4623-catalog-content\") pod \"redhat-marketplace-qv489\" (UID: \"b1799dd4-e619-4436-aa52-7ab2ba3c4623\") " pod="openshift-marketplace/redhat-marketplace-qv489" Dec 01 12:41:22 crc kubenswrapper[4958]: I1201 12:41:22.438830 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjhdd\" (UniqueName: \"kubernetes.io/projected/b1799dd4-e619-4436-aa52-7ab2ba3c4623-kube-api-access-xjhdd\") pod \"redhat-marketplace-qv489\" (UID: \"b1799dd4-e619-4436-aa52-7ab2ba3c4623\") " pod="openshift-marketplace/redhat-marketplace-qv489" Dec 01 12:41:22 crc kubenswrapper[4958]: I1201 12:41:22.603055 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qv489" Dec 01 12:41:23 crc kubenswrapper[4958]: I1201 12:41:23.167456 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qv489"] Dec 01 12:41:23 crc kubenswrapper[4958]: W1201 12:41:23.192403 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1799dd4_e619_4436_aa52_7ab2ba3c4623.slice/crio-71e325490362e6a377b042c3ca08bca5c26cfd150093e9bc3c6255ff9f1fda89 WatchSource:0}: Error finding container 71e325490362e6a377b042c3ca08bca5c26cfd150093e9bc3c6255ff9f1fda89: Status 404 returned error can't find the container with id 71e325490362e6a377b042c3ca08bca5c26cfd150093e9bc3c6255ff9f1fda89 Dec 01 12:41:23 crc kubenswrapper[4958]: I1201 12:41:23.276328 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qv489" event={"ID":"b1799dd4-e619-4436-aa52-7ab2ba3c4623","Type":"ContainerStarted","Data":"71e325490362e6a377b042c3ca08bca5c26cfd150093e9bc3c6255ff9f1fda89"} Dec 01 12:41:24 crc kubenswrapper[4958]: I1201 12:41:24.294086 4958 generic.go:334] "Generic (PLEG): container finished" podID="b1799dd4-e619-4436-aa52-7ab2ba3c4623" containerID="eafe76e337b0b0662dbb198eae4c6cd8a8d1b5dd45a931faed82dd262b06650c" exitCode=0 Dec 01 12:41:24 crc kubenswrapper[4958]: I1201 12:41:24.294260 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qv489" event={"ID":"b1799dd4-e619-4436-aa52-7ab2ba3c4623","Type":"ContainerDied","Data":"eafe76e337b0b0662dbb198eae4c6cd8a8d1b5dd45a931faed82dd262b06650c"} Dec 01 12:41:26 crc kubenswrapper[4958]: I1201 12:41:26.363794 4958 generic.go:334] "Generic (PLEG): container finished" podID="b1799dd4-e619-4436-aa52-7ab2ba3c4623" containerID="7269ce2faa09ad6b08b7cc9f53d0a76a24e1596137a4c8b4fd125755398aa08d" exitCode=0 Dec 01 12:41:26 crc kubenswrapper[4958]: I1201 12:41:26.363915 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qv489" event={"ID":"b1799dd4-e619-4436-aa52-7ab2ba3c4623","Type":"ContainerDied","Data":"7269ce2faa09ad6b08b7cc9f53d0a76a24e1596137a4c8b4fd125755398aa08d"} Dec 01 12:41:28 crc kubenswrapper[4958]: I1201 12:41:28.442488 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qv489" event={"ID":"b1799dd4-e619-4436-aa52-7ab2ba3c4623","Type":"ContainerStarted","Data":"0250a2f91a5366b2b73374acbcf497903672ce62b3fdbbd3418856340092b72a"} Dec 01 12:41:32 crc kubenswrapper[4958]: I1201 12:41:32.603266 4958 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qv489" Dec 01 12:41:32 crc kubenswrapper[4958]: I1201 12:41:32.604224 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qv489" Dec 01 12:41:33 crc kubenswrapper[4958]: I1201 12:41:33.166493 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qv489" Dec 01 12:41:33 crc kubenswrapper[4958]: I1201 12:41:33.192680 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qv489" podStartSLOduration=8.024143556 podStartE2EDuration="11.192660221s" podCreationTimestamp="2025-12-01 12:41:22 +0000 UTC" firstStartedPulling="2025-12-01 12:41:24.298262324 +0000 UTC m=+9731.807051371" lastFinishedPulling="2025-12-01 12:41:27.466778989 +0000 UTC m=+9734.975568036" observedRunningTime="2025-12-01 12:41:28.47280856 +0000 UTC m=+9735.981597627" watchObservedRunningTime="2025-12-01 12:41:33.192660221 +0000 UTC m=+9740.701449268" Dec 01 12:41:33 crc kubenswrapper[4958]: I1201 12:41:33.593547 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qv489" Dec 01 12:41:33 crc kubenswrapper[4958]: I1201 12:41:33.670327 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qv489"] Dec 01 12:41:35 crc kubenswrapper[4958]: I1201 12:41:35.542028 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qv489" podUID="b1799dd4-e619-4436-aa52-7ab2ba3c4623" containerName="registry-server" containerID="cri-o://0250a2f91a5366b2b73374acbcf497903672ce62b3fdbbd3418856340092b72a" gracePeriod=2 Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.322572 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qv489" Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.389975 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1799dd4-e619-4436-aa52-7ab2ba3c4623-catalog-content\") pod \"b1799dd4-e619-4436-aa52-7ab2ba3c4623\" (UID: \"b1799dd4-e619-4436-aa52-7ab2ba3c4623\") " Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.402484 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjhdd\" (UniqueName: \"kubernetes.io/projected/b1799dd4-e619-4436-aa52-7ab2ba3c4623-kube-api-access-xjhdd\") pod \"b1799dd4-e619-4436-aa52-7ab2ba3c4623\" (UID: \"b1799dd4-e619-4436-aa52-7ab2ba3c4623\") " Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.403227 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1799dd4-e619-4436-aa52-7ab2ba3c4623-utilities\") pod \"b1799dd4-e619-4436-aa52-7ab2ba3c4623\" (UID: \"b1799dd4-e619-4436-aa52-7ab2ba3c4623\") " Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.405902 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1799dd4-e619-4436-aa52-7ab2ba3c4623-utilities" (OuterVolumeSpecName: "utilities") pod "b1799dd4-e619-4436-aa52-7ab2ba3c4623" (UID: "b1799dd4-e619-4436-aa52-7ab2ba3c4623"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.408327 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1799dd4-e619-4436-aa52-7ab2ba3c4623-kube-api-access-xjhdd" (OuterVolumeSpecName: "kube-api-access-xjhdd") pod "b1799dd4-e619-4436-aa52-7ab2ba3c4623" (UID: "b1799dd4-e619-4436-aa52-7ab2ba3c4623"). InnerVolumeSpecName "kube-api-access-xjhdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.415582 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1799dd4-e619-4436-aa52-7ab2ba3c4623-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1799dd4-e619-4436-aa52-7ab2ba3c4623" (UID: "b1799dd4-e619-4436-aa52-7ab2ba3c4623"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.508311 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjhdd\" (UniqueName: \"kubernetes.io/projected/b1799dd4-e619-4436-aa52-7ab2ba3c4623-kube-api-access-xjhdd\") on node \"crc\" DevicePath \"\"" Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.508776 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1799dd4-e619-4436-aa52-7ab2ba3c4623-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.508800 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1799dd4-e619-4436-aa52-7ab2ba3c4623-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.554577 4958 generic.go:334] "Generic (PLEG): container finished" podID="b1799dd4-e619-4436-aa52-7ab2ba3c4623" containerID="0250a2f91a5366b2b73374acbcf497903672ce62b3fdbbd3418856340092b72a" exitCode=0 Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.554622 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qv489" event={"ID":"b1799dd4-e619-4436-aa52-7ab2ba3c4623","Type":"ContainerDied","Data":"0250a2f91a5366b2b73374acbcf497903672ce62b3fdbbd3418856340092b72a"} Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.554655 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qv489" Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.554688 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qv489" event={"ID":"b1799dd4-e619-4436-aa52-7ab2ba3c4623","Type":"ContainerDied","Data":"71e325490362e6a377b042c3ca08bca5c26cfd150093e9bc3c6255ff9f1fda89"} Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.554709 4958 scope.go:117] "RemoveContainer" containerID="0250a2f91a5366b2b73374acbcf497903672ce62b3fdbbd3418856340092b72a" Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.578661 4958 scope.go:117] "RemoveContainer" containerID="7269ce2faa09ad6b08b7cc9f53d0a76a24e1596137a4c8b4fd125755398aa08d" Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.609302 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qv489"] Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.616913 4958 scope.go:117] "RemoveContainer" containerID="eafe76e337b0b0662dbb198eae4c6cd8a8d1b5dd45a931faed82dd262b06650c" Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.623248 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qv489"] Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.674902 4958 scope.go:117] "RemoveContainer" containerID="0250a2f91a5366b2b73374acbcf497903672ce62b3fdbbd3418856340092b72a" Dec 01 12:41:36 crc kubenswrapper[4958]: E1201 12:41:36.675487 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0250a2f91a5366b2b73374acbcf497903672ce62b3fdbbd3418856340092b72a\": container with ID starting with 0250a2f91a5366b2b73374acbcf497903672ce62b3fdbbd3418856340092b72a not found: ID does not exist" containerID="0250a2f91a5366b2b73374acbcf497903672ce62b3fdbbd3418856340092b72a" Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.675561 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0250a2f91a5366b2b73374acbcf497903672ce62b3fdbbd3418856340092b72a"} err="failed to get container status \"0250a2f91a5366b2b73374acbcf497903672ce62b3fdbbd3418856340092b72a\": rpc error: code = NotFound desc = could not find container \"0250a2f91a5366b2b73374acbcf497903672ce62b3fdbbd3418856340092b72a\": container with ID starting with 0250a2f91a5366b2b73374acbcf497903672ce62b3fdbbd3418856340092b72a not found: ID does not exist" Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.675606 4958 scope.go:117] "RemoveContainer" containerID="7269ce2faa09ad6b08b7cc9f53d0a76a24e1596137a4c8b4fd125755398aa08d" Dec 01 12:41:36 crc kubenswrapper[4958]: E1201 12:41:36.676034 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7269ce2faa09ad6b08b7cc9f53d0a76a24e1596137a4c8b4fd125755398aa08d\": container with ID starting with 7269ce2faa09ad6b08b7cc9f53d0a76a24e1596137a4c8b4fd125755398aa08d not found: ID does not exist" containerID="7269ce2faa09ad6b08b7cc9f53d0a76a24e1596137a4c8b4fd125755398aa08d" Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.676077 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7269ce2faa09ad6b08b7cc9f53d0a76a24e1596137a4c8b4fd125755398aa08d"} err="failed to get container status \"7269ce2faa09ad6b08b7cc9f53d0a76a24e1596137a4c8b4fd125755398aa08d\": rpc error: code = NotFound desc = could not find 
container \"7269ce2faa09ad6b08b7cc9f53d0a76a24e1596137a4c8b4fd125755398aa08d\": container with ID starting with 7269ce2faa09ad6b08b7cc9f53d0a76a24e1596137a4c8b4fd125755398aa08d not found: ID does not exist" Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.676108 4958 scope.go:117] "RemoveContainer" containerID="eafe76e337b0b0662dbb198eae4c6cd8a8d1b5dd45a931faed82dd262b06650c" Dec 01 12:41:36 crc kubenswrapper[4958]: E1201 12:41:36.676444 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eafe76e337b0b0662dbb198eae4c6cd8a8d1b5dd45a931faed82dd262b06650c\": container with ID starting with eafe76e337b0b0662dbb198eae4c6cd8a8d1b5dd45a931faed82dd262b06650c not found: ID does not exist" containerID="eafe76e337b0b0662dbb198eae4c6cd8a8d1b5dd45a931faed82dd262b06650c" Dec 01 12:41:36 crc kubenswrapper[4958]: I1201 12:41:36.676492 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eafe76e337b0b0662dbb198eae4c6cd8a8d1b5dd45a931faed82dd262b06650c"} err="failed to get container status \"eafe76e337b0b0662dbb198eae4c6cd8a8d1b5dd45a931faed82dd262b06650c\": rpc error: code = NotFound desc = could not find container \"eafe76e337b0b0662dbb198eae4c6cd8a8d1b5dd45a931faed82dd262b06650c\": container with ID starting with eafe76e337b0b0662dbb198eae4c6cd8a8d1b5dd45a931faed82dd262b06650c not found: ID does not exist" Dec 01 12:41:37 crc kubenswrapper[4958]: I1201 12:41:37.825655 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1799dd4-e619-4436-aa52-7ab2ba3c4623" path="/var/lib/kubelet/pods/b1799dd4-e619-4436-aa52-7ab2ba3c4623/volumes" Dec 01 12:42:28 crc kubenswrapper[4958]: I1201 12:42:28.210377 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:42:28 crc kubenswrapper[4958]: I1201 12:42:28.211013 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:42:58 crc kubenswrapper[4958]: I1201 12:42:58.211002 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 12:42:58 crc kubenswrapper[4958]: I1201 12:42:58.211728 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 12:43:28 crc kubenswrapper[4958]: I1201 12:43:28.210437 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 
Dec 01 12:43:28 crc kubenswrapper[4958]: I1201 12:43:28.211146 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7"
Dec 01 12:43:28 crc kubenswrapper[4958]: I1201 12:43:28.212263 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 12:43:28 crc kubenswrapper[4958]: I1201 12:43:28.212352 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" gracePeriod=600
Dec 01 12:43:28 crc kubenswrapper[4958]: E1201 12:43:28.343940 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 12:43:28 crc kubenswrapper[4958]: I1201 12:43:28.469209 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" exitCode=0
Dec 01 12:43:28 crc kubenswrapper[4958]: I1201 12:43:28.469339 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d"}
Dec 01 12:43:28 crc kubenswrapper[4958]: I1201 12:43:28.470536 4958 scope.go:117] "RemoveContainer" containerID="5de11fd4226fa13b8a6c00a4614611530deaaa072533bcdc378d32d5b9707f26"
Dec 01 12:43:28 crc kubenswrapper[4958]: I1201 12:43:28.471769 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d"
Dec 01 12:43:28 crc kubenswrapper[4958]: E1201 12:43:28.472346 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 12:43:40 crc kubenswrapper[4958]: I1201 12:43:40.798518 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d"
Dec 01 12:43:40 crc kubenswrapper[4958]: E1201 12:43:40.799640 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 12:43:51 crc kubenswrapper[4958]: I1201 12:43:51.797643 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d"
Dec 01 12:43:51 crc kubenswrapper[4958]: E1201 12:43:51.798570 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 12:44:04 crc kubenswrapper[4958]: I1201 12:44:04.798148 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d"
Dec 01 12:44:04 crc kubenswrapper[4958]: E1201 12:44:04.799286 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 12:44:15 crc kubenswrapper[4958]: I1201 12:44:15.798407 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d"
Dec 01 12:44:15 crc kubenswrapper[4958]: E1201 12:44:15.799510 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 12:44:30 crc kubenswrapper[4958]: I1201 12:44:30.797966 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d"
Dec 01 12:44:30 crc kubenswrapper[4958]: E1201 12:44:30.799106 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 12:44:36 crc kubenswrapper[4958]: I1201 12:44:36.564155 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="05601e59-50a7-45b6-a41d-d872a023490b" containerName="galera" probeResult="failure" output="command timed out"
Dec 01 12:44:36 crc kubenswrapper[4958]: I1201 12:44:36.564248 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="05601e59-50a7-45b6-a41d-d872a023490b" containerName="galera" probeResult="failure" output="command timed out"
Dec 01 12:44:38 crc kubenswrapper[4958]: I1201 12:44:38.945442 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wgkcp"]
Dec 01 12:44:38 crc kubenswrapper[4958]: E1201 12:44:38.946825 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1799dd4-e619-4436-aa52-7ab2ba3c4623" containerName="extract-utilities"
Dec 01 12:44:38 crc kubenswrapper[4958]: I1201 12:44:38.946881 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1799dd4-e619-4436-aa52-7ab2ba3c4623" containerName="extract-utilities"
Dec 01 12:44:38 crc kubenswrapper[4958]: E1201 12:44:38.946930 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1799dd4-e619-4436-aa52-7ab2ba3c4623" containerName="extract-content"
Dec 01 12:44:38 crc kubenswrapper[4958]: I1201 12:44:38.946944 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1799dd4-e619-4436-aa52-7ab2ba3c4623" containerName="extract-content"
Dec 01 12:44:38 crc kubenswrapper[4958]: E1201 12:44:38.946968 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1799dd4-e619-4436-aa52-7ab2ba3c4623" containerName="registry-server"
Dec 01 12:44:38 crc kubenswrapper[4958]: I1201 12:44:38.946980 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1799dd4-e619-4436-aa52-7ab2ba3c4623" containerName="registry-server"
Dec 01 12:44:38 crc kubenswrapper[4958]: I1201 12:44:38.947385 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1799dd4-e619-4436-aa52-7ab2ba3c4623" containerName="registry-server"
Dec 01 12:44:38 crc kubenswrapper[4958]: I1201 12:44:38.950059 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wgkcp"
Need to start a new one" pod="openshift-marketplace/community-operators-wgkcp" Dec 01 12:44:38 crc kubenswrapper[4958]: I1201 12:44:38.981292 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wgkcp"] Dec 01 12:44:39 crc kubenswrapper[4958]: I1201 12:44:39.007406 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2875394-d190-4c98-816b-05c8c139d9dc-utilities\") pod \"community-operators-wgkcp\" (UID: \"e2875394-d190-4c98-816b-05c8c139d9dc\") " pod="openshift-marketplace/community-operators-wgkcp" Dec 01 12:44:39 crc kubenswrapper[4958]: I1201 12:44:39.007504 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2875394-d190-4c98-816b-05c8c139d9dc-catalog-content\") pod \"community-operators-wgkcp\" (UID: \"e2875394-d190-4c98-816b-05c8c139d9dc\") " pod="openshift-marketplace/community-operators-wgkcp" Dec 01 12:44:39 crc kubenswrapper[4958]: I1201 12:44:39.007719 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5pd7\" (UniqueName: \"kubernetes.io/projected/e2875394-d190-4c98-816b-05c8c139d9dc-kube-api-access-r5pd7\") pod \"community-operators-wgkcp\" (UID: \"e2875394-d190-4c98-816b-05c8c139d9dc\") " pod="openshift-marketplace/community-operators-wgkcp" Dec 01 12:44:39 crc kubenswrapper[4958]: I1201 12:44:39.109438 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2875394-d190-4c98-816b-05c8c139d9dc-utilities\") pod \"community-operators-wgkcp\" (UID: \"e2875394-d190-4c98-816b-05c8c139d9dc\") " pod="openshift-marketplace/community-operators-wgkcp" Dec 01 12:44:39 crc kubenswrapper[4958]: I1201 12:44:39.109560 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2875394-d190-4c98-816b-05c8c139d9dc-catalog-content\") pod \"community-operators-wgkcp\" (UID: \"e2875394-d190-4c98-816b-05c8c139d9dc\") " pod="openshift-marketplace/community-operators-wgkcp" Dec 01 12:44:39 crc kubenswrapper[4958]: I1201 12:44:39.109762 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5pd7\" (UniqueName: \"kubernetes.io/projected/e2875394-d190-4c98-816b-05c8c139d9dc-kube-api-access-r5pd7\") pod \"community-operators-wgkcp\" (UID: \"e2875394-d190-4c98-816b-05c8c139d9dc\") " pod="openshift-marketplace/community-operators-wgkcp" Dec 01 12:44:39 crc kubenswrapper[4958]: I1201 12:44:39.110202 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2875394-d190-4c98-816b-05c8c139d9dc-catalog-content\") pod \"community-operators-wgkcp\" (UID: \"e2875394-d190-4c98-816b-05c8c139d9dc\") " pod="openshift-marketplace/community-operators-wgkcp" Dec 01 12:44:39 crc kubenswrapper[4958]: I1201 12:44:39.110463 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2875394-d190-4c98-816b-05c8c139d9dc-utilities\") pod \"community-operators-wgkcp\" (UID: \"e2875394-d190-4c98-816b-05c8c139d9dc\") " pod="openshift-marketplace/community-operators-wgkcp" Dec 01 12:44:39 crc kubenswrapper[4958]: I1201 12:44:39.165226 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-r5pd7\" (UniqueName: \"kubernetes.io/projected/e2875394-d190-4c98-816b-05c8c139d9dc-kube-api-access-r5pd7\") pod \"community-operators-wgkcp\" (UID: \"e2875394-d190-4c98-816b-05c8c139d9dc\") " pod="openshift-marketplace/community-operators-wgkcp" Dec 01 12:44:39 crc kubenswrapper[4958]: I1201 12:44:39.281876 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wgkcp" Dec 01 12:44:39 crc kubenswrapper[4958]: I1201 12:44:39.872834 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wgkcp"] Dec 01 12:44:39 crc kubenswrapper[4958]: W1201 12:44:39.882175 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2875394_d190_4c98_816b_05c8c139d9dc.slice/crio-b6e2fe3a100813b3c3ef3fe8d58882196703353313f63a947cb417f78de55384 WatchSource:0}: Error finding container b6e2fe3a100813b3c3ef3fe8d58882196703353313f63a947cb417f78de55384: Status 404 returned error can't find the container with id b6e2fe3a100813b3c3ef3fe8d58882196703353313f63a947cb417f78de55384 Dec 01 12:44:40 crc kubenswrapper[4958]: I1201 12:44:40.455456 4958 generic.go:334] "Generic (PLEG): container finished" podID="e2875394-d190-4c98-816b-05c8c139d9dc" containerID="34ac1bf7270458f82b5e10fbecd8d0ff43511b302a70ca157c108e26d4d23916" exitCode=0 Dec 01 12:44:40 crc kubenswrapper[4958]: I1201 12:44:40.455511 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgkcp" event={"ID":"e2875394-d190-4c98-816b-05c8c139d9dc","Type":"ContainerDied","Data":"34ac1bf7270458f82b5e10fbecd8d0ff43511b302a70ca157c108e26d4d23916"} Dec 01 12:44:40 crc kubenswrapper[4958]: I1201 12:44:40.455694 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgkcp" event={"ID":"e2875394-d190-4c98-816b-05c8c139d9dc","Type":"ContainerStarted","Data":"b6e2fe3a100813b3c3ef3fe8d58882196703353313f63a947cb417f78de55384"} Dec 01 12:44:40 crc kubenswrapper[4958]: I1201 12:44:40.457625 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 12:44:41 crc kubenswrapper[4958]: I1201 12:44:41.797984 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:44:41 crc kubenswrapper[4958]: E1201 12:44:41.798729 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:44:42 crc kubenswrapper[4958]: I1201 12:44:42.490582 4958 generic.go:334] "Generic (PLEG): container finished" podID="e2875394-d190-4c98-816b-05c8c139d9dc" containerID="9d025b44c25815ea52a6a5cbef2f1d25026d5a0fad60e330483d7721088773ac" exitCode=0 Dec 01 12:44:42 crc kubenswrapper[4958]: I1201 12:44:42.490633 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgkcp" event={"ID":"e2875394-d190-4c98-816b-05c8c139d9dc","Type":"ContainerDied","Data":"9d025b44c25815ea52a6a5cbef2f1d25026d5a0fad60e330483d7721088773ac"} Dec 01 12:44:44 crc 
kubenswrapper[4958]: I1201 12:44:44.542782 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgkcp" event={"ID":"e2875394-d190-4c98-816b-05c8c139d9dc","Type":"ContainerStarted","Data":"bd8cb412a9fc3514350d7e3007e1b13ae0eaa7576c202cd61f58d85b1fc9671e"} Dec 01 12:44:44 crc kubenswrapper[4958]: I1201 12:44:44.569851 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wgkcp" podStartSLOduration=3.727026382 podStartE2EDuration="6.569826092s" podCreationTimestamp="2025-12-01 12:44:38 +0000 UTC" firstStartedPulling="2025-12-01 12:44:40.457351073 +0000 UTC m=+9927.966140110" lastFinishedPulling="2025-12-01 12:44:43.300150743 +0000 UTC m=+9930.808939820" observedRunningTime="2025-12-01 12:44:44.564961155 +0000 UTC m=+9932.073750192" watchObservedRunningTime="2025-12-01 12:44:44.569826092 +0000 UTC m=+9932.078615129" Dec 01 12:44:49 crc kubenswrapper[4958]: I1201 12:44:49.282483 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wgkcp" Dec 01 12:44:49 crc kubenswrapper[4958]: I1201 12:44:49.283962 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wgkcp" Dec 01 12:44:49 crc kubenswrapper[4958]: I1201 12:44:49.422574 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wgkcp" Dec 01 12:44:49 crc kubenswrapper[4958]: I1201 12:44:49.655612 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wgkcp" Dec 01 12:44:49 crc kubenswrapper[4958]: I1201 12:44:49.719602 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wgkcp"] Dec 01 12:44:51 crc kubenswrapper[4958]: I1201 12:44:51.619688 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wgkcp" podUID="e2875394-d190-4c98-816b-05c8c139d9dc" containerName="registry-server" containerID="cri-o://bd8cb412a9fc3514350d7e3007e1b13ae0eaa7576c202cd61f58d85b1fc9671e" gracePeriod=2 Dec 01 12:44:52 crc kubenswrapper[4958]: I1201 12:44:52.108775 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wgkcp" Dec 01 12:44:52 crc kubenswrapper[4958]: I1201 12:44:52.289754 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2875394-d190-4c98-816b-05c8c139d9dc-catalog-content\") pod \"e2875394-d190-4c98-816b-05c8c139d9dc\" (UID: \"e2875394-d190-4c98-816b-05c8c139d9dc\") " Dec 01 12:44:52 crc kubenswrapper[4958]: I1201 12:44:52.289992 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2875394-d190-4c98-816b-05c8c139d9dc-utilities\") pod \"e2875394-d190-4c98-816b-05c8c139d9dc\" (UID: \"e2875394-d190-4c98-816b-05c8c139d9dc\") " Dec 01 12:44:52 crc kubenswrapper[4958]: I1201 12:44:52.290143 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5pd7\" (UniqueName: \"kubernetes.io/projected/e2875394-d190-4c98-816b-05c8c139d9dc-kube-api-access-r5pd7\") pod \"e2875394-d190-4c98-816b-05c8c139d9dc\" (UID: \"e2875394-d190-4c98-816b-05c8c139d9dc\") " Dec 01 12:44:52 crc kubenswrapper[4958]: I1201 12:44:52.291455 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2875394-d190-4c98-816b-05c8c139d9dc-utilities" (OuterVolumeSpecName: "utilities") pod "e2875394-d190-4c98-816b-05c8c139d9dc" (UID: "e2875394-d190-4c98-816b-05c8c139d9dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:44:52 crc kubenswrapper[4958]: I1201 12:44:52.392947 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2875394-d190-4c98-816b-05c8c139d9dc-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:44:52 crc kubenswrapper[4958]: I1201 12:44:52.473033 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2875394-d190-4c98-816b-05c8c139d9dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2875394-d190-4c98-816b-05c8c139d9dc" (UID: "e2875394-d190-4c98-816b-05c8c139d9dc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:44:52 crc kubenswrapper[4958]: I1201 12:44:52.494555 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2875394-d190-4c98-816b-05c8c139d9dc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:44:52 crc kubenswrapper[4958]: I1201 12:44:52.634431 4958 generic.go:334] "Generic (PLEG): container finished" podID="e2875394-d190-4c98-816b-05c8c139d9dc" containerID="bd8cb412a9fc3514350d7e3007e1b13ae0eaa7576c202cd61f58d85b1fc9671e" exitCode=0 Dec 01 12:44:52 crc kubenswrapper[4958]: I1201 12:44:52.634481 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgkcp" event={"ID":"e2875394-d190-4c98-816b-05c8c139d9dc","Type":"ContainerDied","Data":"bd8cb412a9fc3514350d7e3007e1b13ae0eaa7576c202cd61f58d85b1fc9671e"} Dec 01 12:44:52 crc kubenswrapper[4958]: I1201 12:44:52.634518 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgkcp" event={"ID":"e2875394-d190-4c98-816b-05c8c139d9dc","Type":"ContainerDied","Data":"b6e2fe3a100813b3c3ef3fe8d58882196703353313f63a947cb417f78de55384"} Dec 01 12:44:52 crc kubenswrapper[4958]: I1201 12:44:52.634543 4958 scope.go:117] "RemoveContainer" containerID="bd8cb412a9fc3514350d7e3007e1b13ae0eaa7576c202cd61f58d85b1fc9671e" Dec 01 12:44:52 crc kubenswrapper[4958]: I1201 12:44:52.634714 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wgkcp" Dec 01 12:44:52 crc kubenswrapper[4958]: I1201 12:44:52.663052 4958 scope.go:117] "RemoveContainer" containerID="9d025b44c25815ea52a6a5cbef2f1d25026d5a0fad60e330483d7721088773ac" Dec 01 12:44:52 crc kubenswrapper[4958]: I1201 12:44:52.863821 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2875394-d190-4c98-816b-05c8c139d9dc-kube-api-access-r5pd7" (OuterVolumeSpecName: "kube-api-access-r5pd7") pod "e2875394-d190-4c98-816b-05c8c139d9dc" (UID: "e2875394-d190-4c98-816b-05c8c139d9dc"). InnerVolumeSpecName "kube-api-access-r5pd7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:44:52 crc kubenswrapper[4958]: I1201 12:44:52.886252 4958 scope.go:117] "RemoveContainer" containerID="34ac1bf7270458f82b5e10fbecd8d0ff43511b302a70ca157c108e26d4d23916" Dec 01 12:44:52 crc kubenswrapper[4958]: I1201 12:44:52.904296 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5pd7\" (UniqueName: \"kubernetes.io/projected/e2875394-d190-4c98-816b-05c8c139d9dc-kube-api-access-r5pd7\") on node \"crc\" DevicePath \"\"" Dec 01 12:44:53 crc kubenswrapper[4958]: I1201 12:44:53.006540 4958 scope.go:117] "RemoveContainer" containerID="bd8cb412a9fc3514350d7e3007e1b13ae0eaa7576c202cd61f58d85b1fc9671e" Dec 01 12:44:53 crc kubenswrapper[4958]: E1201 12:44:53.007646 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd8cb412a9fc3514350d7e3007e1b13ae0eaa7576c202cd61f58d85b1fc9671e\": container with ID starting with bd8cb412a9fc3514350d7e3007e1b13ae0eaa7576c202cd61f58d85b1fc9671e not found: ID does not exist" containerID="bd8cb412a9fc3514350d7e3007e1b13ae0eaa7576c202cd61f58d85b1fc9671e" Dec 01 12:44:53 crc kubenswrapper[4958]: I1201 12:44:53.007777 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8cb412a9fc3514350d7e3007e1b13ae0eaa7576c202cd61f58d85b1fc9671e"} err="failed to get container status \"bd8cb412a9fc3514350d7e3007e1b13ae0eaa7576c202cd61f58d85b1fc9671e\": rpc error: code = NotFound desc = could not find container \"bd8cb412a9fc3514350d7e3007e1b13ae0eaa7576c202cd61f58d85b1fc9671e\": container with ID starting with bd8cb412a9fc3514350d7e3007e1b13ae0eaa7576c202cd61f58d85b1fc9671e not found: ID does not exist" Dec 01 12:44:53 crc kubenswrapper[4958]: I1201 12:44:53.007816 4958 scope.go:117] "RemoveContainer" containerID="9d025b44c25815ea52a6a5cbef2f1d25026d5a0fad60e330483d7721088773ac" Dec 01 12:44:53 crc kubenswrapper[4958]: E1201 12:44:53.008292 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d025b44c25815ea52a6a5cbef2f1d25026d5a0fad60e330483d7721088773ac\": container with ID starting with 9d025b44c25815ea52a6a5cbef2f1d25026d5a0fad60e330483d7721088773ac not found: ID does not exist" containerID="9d025b44c25815ea52a6a5cbef2f1d25026d5a0fad60e330483d7721088773ac" Dec 01 12:44:53 crc kubenswrapper[4958]: I1201 12:44:53.008334 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d025b44c25815ea52a6a5cbef2f1d25026d5a0fad60e330483d7721088773ac"} err="failed to get container status \"9d025b44c25815ea52a6a5cbef2f1d25026d5a0fad60e330483d7721088773ac\": rpc error: code = NotFound desc = could not find container \"9d025b44c25815ea52a6a5cbef2f1d25026d5a0fad60e330483d7721088773ac\": container with ID starting with 9d025b44c25815ea52a6a5cbef2f1d25026d5a0fad60e330483d7721088773ac not found: ID does not exist" Dec 01 12:44:53 crc kubenswrapper[4958]: I1201 12:44:53.008363 4958 scope.go:117] "RemoveContainer" containerID="34ac1bf7270458f82b5e10fbecd8d0ff43511b302a70ca157c108e26d4d23916" Dec 01 12:44:53 crc kubenswrapper[4958]: E1201 12:44:53.008678 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34ac1bf7270458f82b5e10fbecd8d0ff43511b302a70ca157c108e26d4d23916\": container with ID starting with 34ac1bf7270458f82b5e10fbecd8d0ff43511b302a70ca157c108e26d4d23916 not found: ID does not 
exist" containerID="34ac1bf7270458f82b5e10fbecd8d0ff43511b302a70ca157c108e26d4d23916" Dec 01 12:44:53 crc kubenswrapper[4958]: I1201 12:44:53.008718 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34ac1bf7270458f82b5e10fbecd8d0ff43511b302a70ca157c108e26d4d23916"} err="failed to get container status \"34ac1bf7270458f82b5e10fbecd8d0ff43511b302a70ca157c108e26d4d23916\": rpc error: code = NotFound desc = could not find container \"34ac1bf7270458f82b5e10fbecd8d0ff43511b302a70ca157c108e26d4d23916\": container with ID starting with 34ac1bf7270458f82b5e10fbecd8d0ff43511b302a70ca157c108e26d4d23916 not found: ID does not exist" Dec 01 12:44:53 crc kubenswrapper[4958]: I1201 12:44:53.056345 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wgkcp"] Dec 01 12:44:53 crc kubenswrapper[4958]: I1201 12:44:53.071202 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wgkcp"] Dec 01 12:44:53 crc kubenswrapper[4958]: I1201 12:44:53.835540 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2875394-d190-4c98-816b-05c8c139d9dc" path="/var/lib/kubelet/pods/e2875394-d190-4c98-816b-05c8c139d9dc/volumes" Dec 01 12:44:55 crc kubenswrapper[4958]: I1201 12:44:55.797973 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:44:55 crc kubenswrapper[4958]: E1201 12:44:55.798800 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:45:00 crc kubenswrapper[4958]: I1201 12:45:00.153537 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409885-w2cl8"] Dec 01 12:45:00 crc kubenswrapper[4958]: E1201 12:45:00.155762 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2875394-d190-4c98-816b-05c8c139d9dc" containerName="registry-server" Dec 01 12:45:00 crc kubenswrapper[4958]: I1201 12:45:00.155803 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2875394-d190-4c98-816b-05c8c139d9dc" containerName="registry-server" Dec 01 12:45:00 crc kubenswrapper[4958]: E1201 12:45:00.155814 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2875394-d190-4c98-816b-05c8c139d9dc" containerName="extract-utilities" Dec 01 12:45:00 crc kubenswrapper[4958]: I1201 12:45:00.155823 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2875394-d190-4c98-816b-05c8c139d9dc" containerName="extract-utilities" Dec 01 12:45:00 crc kubenswrapper[4958]: E1201 12:45:00.155864 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2875394-d190-4c98-816b-05c8c139d9dc" containerName="extract-content" Dec 01 12:45:00 crc kubenswrapper[4958]: I1201 12:45:00.155875 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2875394-d190-4c98-816b-05c8c139d9dc" containerName="extract-content" Dec 01 12:45:00 crc kubenswrapper[4958]: I1201 12:45:00.156170 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2875394-d190-4c98-816b-05c8c139d9dc" containerName="registry-server" Dec 01 12:45:00 
Dec 01 12:45:00 crc kubenswrapper[4958]: I1201 12:45:00.159256 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 01 12:45:00 crc kubenswrapper[4958]: I1201 12:45:00.159272 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 01 12:45:00 crc kubenswrapper[4958]: I1201 12:45:00.182399 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409885-w2cl8"]
Dec 01 12:45:00 crc kubenswrapper[4958]: I1201 12:45:00.217454 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a10faf47-bc5f-4cd5-ae0a-3536fc2c4858-secret-volume\") pod \"collect-profiles-29409885-w2cl8\" (UID: \"a10faf47-bc5f-4cd5-ae0a-3536fc2c4858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409885-w2cl8"
Dec 01 12:45:00 crc kubenswrapper[4958]: I1201 12:45:00.217555 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8lj5\" (UniqueName: \"kubernetes.io/projected/a10faf47-bc5f-4cd5-ae0a-3536fc2c4858-kube-api-access-b8lj5\") pod \"collect-profiles-29409885-w2cl8\" (UID: \"a10faf47-bc5f-4cd5-ae0a-3536fc2c4858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409885-w2cl8"
Dec 01 12:45:00 crc kubenswrapper[4958]: I1201 12:45:00.217802 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a10faf47-bc5f-4cd5-ae0a-3536fc2c4858-config-volume\") pod \"collect-profiles-29409885-w2cl8\" (UID: \"a10faf47-bc5f-4cd5-ae0a-3536fc2c4858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409885-w2cl8"
Dec 01 12:45:00 crc kubenswrapper[4958]: I1201 12:45:00.320636 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8lj5\" (UniqueName: \"kubernetes.io/projected/a10faf47-bc5f-4cd5-ae0a-3536fc2c4858-kube-api-access-b8lj5\") pod \"collect-profiles-29409885-w2cl8\" (UID: \"a10faf47-bc5f-4cd5-ae0a-3536fc2c4858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409885-w2cl8"
Dec 01 12:45:00 crc kubenswrapper[4958]: I1201 12:45:00.320808 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a10faf47-bc5f-4cd5-ae0a-3536fc2c4858-config-volume\") pod \"collect-profiles-29409885-w2cl8\" (UID: \"a10faf47-bc5f-4cd5-ae0a-3536fc2c4858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409885-w2cl8"
Dec 01 12:45:00 crc kubenswrapper[4958]: I1201 12:45:00.320954 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a10faf47-bc5f-4cd5-ae0a-3536fc2c4858-secret-volume\") pod \"collect-profiles-29409885-w2cl8\" (UID: \"a10faf47-bc5f-4cd5-ae0a-3536fc2c4858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409885-w2cl8"
Dec 01 12:45:00 crc kubenswrapper[4958]: I1201 12:45:00.321964 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a10faf47-bc5f-4cd5-ae0a-3536fc2c4858-config-volume\") pod \"collect-profiles-29409885-w2cl8\" (UID: \"a10faf47-bc5f-4cd5-ae0a-3536fc2c4858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409885-w2cl8"
Dec 01 12:45:00 crc kubenswrapper[4958]: I1201 12:45:00.330716 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a10faf47-bc5f-4cd5-ae0a-3536fc2c4858-secret-volume\") pod \"collect-profiles-29409885-w2cl8\" (UID: \"a10faf47-bc5f-4cd5-ae0a-3536fc2c4858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409885-w2cl8"
Dec 01 12:45:00 crc kubenswrapper[4958]: I1201 12:45:00.338480 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8lj5\" (UniqueName: \"kubernetes.io/projected/a10faf47-bc5f-4cd5-ae0a-3536fc2c4858-kube-api-access-b8lj5\") pod \"collect-profiles-29409885-w2cl8\" (UID: \"a10faf47-bc5f-4cd5-ae0a-3536fc2c4858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409885-w2cl8"
Dec 01 12:45:00 crc kubenswrapper[4958]: I1201 12:45:00.489738 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409885-w2cl8"
Dec 01 12:45:01 crc kubenswrapper[4958]: I1201 12:45:01.048855 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409885-w2cl8"]
Dec 01 12:45:01 crc kubenswrapper[4958]: I1201 12:45:01.785657 4958 generic.go:334] "Generic (PLEG): container finished" podID="a10faf47-bc5f-4cd5-ae0a-3536fc2c4858" containerID="a54bde6efca0b82833503342e63a0258b1647bc06b7885f24678e299d1317354" exitCode=0
Dec 01 12:45:01 crc kubenswrapper[4958]: I1201 12:45:01.785747 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409885-w2cl8" event={"ID":"a10faf47-bc5f-4cd5-ae0a-3536fc2c4858","Type":"ContainerDied","Data":"a54bde6efca0b82833503342e63a0258b1647bc06b7885f24678e299d1317354"}
Dec 01 12:45:01 crc kubenswrapper[4958]: I1201 12:45:01.786071 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409885-w2cl8" event={"ID":"a10faf47-bc5f-4cd5-ae0a-3536fc2c4858","Type":"ContainerStarted","Data":"d766b193283d4c323af7e3265b163fd17fd6b22042647d78debc0098bad9d9c5"}
Dec 01 12:45:03 crc kubenswrapper[4958]: I1201 12:45:03.345659 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409885-w2cl8"
Dec 01 12:45:03 crc kubenswrapper[4958]: I1201 12:45:03.449969 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8lj5\" (UniqueName: \"kubernetes.io/projected/a10faf47-bc5f-4cd5-ae0a-3536fc2c4858-kube-api-access-b8lj5\") pod \"a10faf47-bc5f-4cd5-ae0a-3536fc2c4858\" (UID: \"a10faf47-bc5f-4cd5-ae0a-3536fc2c4858\") "
Dec 01 12:45:03 crc kubenswrapper[4958]: I1201 12:45:03.450418 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a10faf47-bc5f-4cd5-ae0a-3536fc2c4858-secret-volume\") pod \"a10faf47-bc5f-4cd5-ae0a-3536fc2c4858\" (UID: \"a10faf47-bc5f-4cd5-ae0a-3536fc2c4858\") "
Dec 01 12:45:03 crc kubenswrapper[4958]: I1201 12:45:03.450567 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a10faf47-bc5f-4cd5-ae0a-3536fc2c4858-config-volume\") pod \"a10faf47-bc5f-4cd5-ae0a-3536fc2c4858\" (UID: \"a10faf47-bc5f-4cd5-ae0a-3536fc2c4858\") "
Dec 01 12:45:03 crc kubenswrapper[4958]: I1201 12:45:03.452926 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a10faf47-bc5f-4cd5-ae0a-3536fc2c4858-config-volume" (OuterVolumeSpecName: "config-volume") pod "a10faf47-bc5f-4cd5-ae0a-3536fc2c4858" (UID: "a10faf47-bc5f-4cd5-ae0a-3536fc2c4858"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 12:45:03 crc kubenswrapper[4958]: I1201 12:45:03.453153 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a10faf47-bc5f-4cd5-ae0a-3536fc2c4858-config-volume\") on node \"crc\" DevicePath \"\""
Dec 01 12:45:03 crc kubenswrapper[4958]: I1201 12:45:03.458509 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a10faf47-bc5f-4cd5-ae0a-3536fc2c4858-kube-api-access-b8lj5" (OuterVolumeSpecName: "kube-api-access-b8lj5") pod "a10faf47-bc5f-4cd5-ae0a-3536fc2c4858" (UID: "a10faf47-bc5f-4cd5-ae0a-3536fc2c4858"). InnerVolumeSpecName "kube-api-access-b8lj5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 12:45:03 crc kubenswrapper[4958]: I1201 12:45:03.459764 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a10faf47-bc5f-4cd5-ae0a-3536fc2c4858-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a10faf47-bc5f-4cd5-ae0a-3536fc2c4858" (UID: "a10faf47-bc5f-4cd5-ae0a-3536fc2c4858"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 12:45:03 crc kubenswrapper[4958]: I1201 12:45:03.556378 4958 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a10faf47-bc5f-4cd5-ae0a-3536fc2c4858-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 01 12:45:03 crc kubenswrapper[4958]: I1201 12:45:03.556439 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8lj5\" (UniqueName: \"kubernetes.io/projected/a10faf47-bc5f-4cd5-ae0a-3536fc2c4858-kube-api-access-b8lj5\") on node \"crc\" DevicePath \"\""
Dec 01 12:45:03 crc kubenswrapper[4958]: I1201 12:45:03.816454 4958 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409885-w2cl8" Dec 01 12:45:03 crc kubenswrapper[4958]: I1201 12:45:03.832403 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409885-w2cl8" event={"ID":"a10faf47-bc5f-4cd5-ae0a-3536fc2c4858","Type":"ContainerDied","Data":"d766b193283d4c323af7e3265b163fd17fd6b22042647d78debc0098bad9d9c5"} Dec 01 12:45:03 crc kubenswrapper[4958]: I1201 12:45:03.832485 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d766b193283d4c323af7e3265b163fd17fd6b22042647d78debc0098bad9d9c5" Dec 01 12:45:04 crc kubenswrapper[4958]: I1201 12:45:04.431282 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j"] Dec 01 12:45:04 crc kubenswrapper[4958]: I1201 12:45:04.440590 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409840-rd54j"] Dec 01 12:45:05 crc kubenswrapper[4958]: I1201 12:45:05.815737 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cf906b4-61cb-48da-b22b-20f5c88b66a5" path="/var/lib/kubelet/pods/9cf906b4-61cb-48da-b22b-20f5c88b66a5/volumes" Dec 01 12:45:06 crc kubenswrapper[4958]: I1201 12:45:06.798364 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:45:06 crc kubenswrapper[4958]: E1201 12:45:06.799254 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:45:17 crc kubenswrapper[4958]: I1201 12:45:17.798167 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:45:17 crc kubenswrapper[4958]: E1201 12:45:17.800255 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:45:30 crc kubenswrapper[4958]: I1201 12:45:30.797741 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:45:30 crc kubenswrapper[4958]: E1201 12:45:30.798474 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:45:33 crc kubenswrapper[4958]: I1201 12:45:33.345913 4958 scope.go:117] "RemoveContainer" containerID="f57e4206952138b8cfda8beb9a17ef367c600e721c393af5242c3bca011330fb" Dec 01 12:45:43 crc kubenswrapper[4958]: I1201 
12:45:43.807712 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:45:43 crc kubenswrapper[4958]: E1201 12:45:43.808733 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:45:56 crc kubenswrapper[4958]: I1201 12:45:56.797450 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:45:56 crc kubenswrapper[4958]: E1201 12:45:56.798431 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:46:11 crc kubenswrapper[4958]: I1201 12:46:11.798325 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:46:11 crc kubenswrapper[4958]: E1201 12:46:11.799215 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:46:25 crc kubenswrapper[4958]: I1201 12:46:25.800468 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:46:25 crc kubenswrapper[4958]: E1201 12:46:25.801277 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:46:40 crc kubenswrapper[4958]: I1201 12:46:40.798084 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:46:40 crc kubenswrapper[4958]: E1201 12:46:40.799012 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:46:52 crc kubenswrapper[4958]: I1201 12:46:52.798320 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:46:52 crc kubenswrapper[4958]: E1201 12:46:52.799164 
4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:47:07 crc kubenswrapper[4958]: I1201 12:47:07.797805 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:47:07 crc kubenswrapper[4958]: E1201 12:47:07.798767 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:47:18 crc kubenswrapper[4958]: I1201 12:47:18.798035 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:47:18 crc kubenswrapper[4958]: E1201 12:47:18.799190 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:47:30 crc kubenswrapper[4958]: I1201 12:47:30.798023 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:47:30 crc kubenswrapper[4958]: E1201 12:47:30.798703 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:47:43 crc kubenswrapper[4958]: I1201 12:47:43.812814 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:47:43 crc kubenswrapper[4958]: E1201 12:47:43.813954 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:47:54 crc kubenswrapper[4958]: I1201 12:47:54.799244 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:47:54 crc kubenswrapper[4958]: E1201 12:47:54.800655 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:48:05 crc kubenswrapper[4958]: I1201 12:48:05.799117 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:48:05 crc kubenswrapper[4958]: E1201 12:48:05.803475 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:48:16 crc kubenswrapper[4958]: I1201 12:48:16.797484 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:48:16 crc kubenswrapper[4958]: E1201 12:48:16.798715 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:48:29 crc kubenswrapper[4958]: I1201 12:48:29.798998 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d" Dec 01 12:48:30 crc kubenswrapper[4958]: I1201 12:48:30.689582 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"8fdf0d21f9aae2e77dfd83f361949c9385cb5d79b387e817f1a93b26366cb3e7"} Dec 01 12:49:06 crc kubenswrapper[4958]: I1201 12:49:06.563293 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="05601e59-50a7-45b6-a41d-d872a023490b" containerName="galera" probeResult="failure" output="command timed out" Dec 01 12:49:06 crc kubenswrapper[4958]: I1201 12:49:06.778787 4958 trace.go:236] Trace[2025208374]: "Calculate volume metrics of mariadb-data for pod openstack/mariadb-copy-data" (01-Dec-2025 12:49:05.647) (total time: 1131ms): Dec 01 12:49:06 crc kubenswrapper[4958]: Trace[2025208374]: [1.131050009s] [1.131050009s] END Dec 01 12:49:40 crc kubenswrapper[4958]: I1201 12:49:40.188957 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vxbzk"] Dec 01 12:49:40 crc kubenswrapper[4958]: E1201 12:49:40.190053 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a10faf47-bc5f-4cd5-ae0a-3536fc2c4858" containerName="collect-profiles" Dec 01 12:49:40 crc kubenswrapper[4958]: I1201 12:49:40.190071 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a10faf47-bc5f-4cd5-ae0a-3536fc2c4858" containerName="collect-profiles" Dec 01 12:49:40 crc kubenswrapper[4958]: I1201 12:49:40.190340 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a10faf47-bc5f-4cd5-ae0a-3536fc2c4858" containerName="collect-profiles" Dec 01 12:49:40 crc kubenswrapper[4958]: I1201 12:49:40.192166 4958 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vxbzk" Dec 01 12:49:40 crc kubenswrapper[4958]: I1201 12:49:40.229391 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vxbzk"] Dec 01 12:49:40 crc kubenswrapper[4958]: I1201 12:49:40.306454 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxtfm\" (UniqueName: \"kubernetes.io/projected/3df4c07d-8e68-4c9d-a78f-330fa3e6b114-kube-api-access-pxtfm\") pod \"certified-operators-vxbzk\" (UID: \"3df4c07d-8e68-4c9d-a78f-330fa3e6b114\") " pod="openshift-marketplace/certified-operators-vxbzk" Dec 01 12:49:40 crc kubenswrapper[4958]: I1201 12:49:40.306563 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df4c07d-8e68-4c9d-a78f-330fa3e6b114-utilities\") pod \"certified-operators-vxbzk\" (UID: \"3df4c07d-8e68-4c9d-a78f-330fa3e6b114\") " pod="openshift-marketplace/certified-operators-vxbzk" Dec 01 12:49:40 crc kubenswrapper[4958]: I1201 12:49:40.306656 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df4c07d-8e68-4c9d-a78f-330fa3e6b114-catalog-content\") pod \"certified-operators-vxbzk\" (UID: \"3df4c07d-8e68-4c9d-a78f-330fa3e6b114\") " pod="openshift-marketplace/certified-operators-vxbzk" Dec 01 12:49:40 crc kubenswrapper[4958]: I1201 12:49:40.409097 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxtfm\" (UniqueName: \"kubernetes.io/projected/3df4c07d-8e68-4c9d-a78f-330fa3e6b114-kube-api-access-pxtfm\") pod \"certified-operators-vxbzk\" (UID: \"3df4c07d-8e68-4c9d-a78f-330fa3e6b114\") " pod="openshift-marketplace/certified-operators-vxbzk" Dec 01 12:49:40 crc kubenswrapper[4958]: I1201 12:49:40.409173 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df4c07d-8e68-4c9d-a78f-330fa3e6b114-utilities\") pod \"certified-operators-vxbzk\" (UID: \"3df4c07d-8e68-4c9d-a78f-330fa3e6b114\") " pod="openshift-marketplace/certified-operators-vxbzk" Dec 01 12:49:40 crc kubenswrapper[4958]: I1201 12:49:40.409234 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df4c07d-8e68-4c9d-a78f-330fa3e6b114-catalog-content\") pod \"certified-operators-vxbzk\" (UID: \"3df4c07d-8e68-4c9d-a78f-330fa3e6b114\") " pod="openshift-marketplace/certified-operators-vxbzk" Dec 01 12:49:40 crc kubenswrapper[4958]: I1201 12:49:40.410010 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df4c07d-8e68-4c9d-a78f-330fa3e6b114-catalog-content\") pod \"certified-operators-vxbzk\" (UID: \"3df4c07d-8e68-4c9d-a78f-330fa3e6b114\") " pod="openshift-marketplace/certified-operators-vxbzk" Dec 01 12:49:40 crc kubenswrapper[4958]: I1201 12:49:40.410011 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df4c07d-8e68-4c9d-a78f-330fa3e6b114-utilities\") pod \"certified-operators-vxbzk\" (UID: \"3df4c07d-8e68-4c9d-a78f-330fa3e6b114\") " pod="openshift-marketplace/certified-operators-vxbzk" Dec 01 12:49:40 crc kubenswrapper[4958]: I1201 
12:49:40.428899 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxtfm\" (UniqueName: \"kubernetes.io/projected/3df4c07d-8e68-4c9d-a78f-330fa3e6b114-kube-api-access-pxtfm\") pod \"certified-operators-vxbzk\" (UID: \"3df4c07d-8e68-4c9d-a78f-330fa3e6b114\") " pod="openshift-marketplace/certified-operators-vxbzk" Dec 01 12:49:40 crc kubenswrapper[4958]: I1201 12:49:40.536530 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vxbzk" Dec 01 12:49:41 crc kubenswrapper[4958]: I1201 12:49:41.134252 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vxbzk"] Dec 01 12:49:41 crc kubenswrapper[4958]: I1201 12:49:41.991395 4958 generic.go:334] "Generic (PLEG): container finished" podID="3df4c07d-8e68-4c9d-a78f-330fa3e6b114" containerID="c3b7cd7c0911449a19f572dfe71fc511f18da4c5dafa07b13a8ee1c1ccedebf3" exitCode=0 Dec 01 12:49:41 crc kubenswrapper[4958]: I1201 12:49:41.991485 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vxbzk" event={"ID":"3df4c07d-8e68-4c9d-a78f-330fa3e6b114","Type":"ContainerDied","Data":"c3b7cd7c0911449a19f572dfe71fc511f18da4c5dafa07b13a8ee1c1ccedebf3"} Dec 01 12:49:41 crc kubenswrapper[4958]: I1201 12:49:41.991923 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vxbzk" event={"ID":"3df4c07d-8e68-4c9d-a78f-330fa3e6b114","Type":"ContainerStarted","Data":"d0758faa4940c6523302efa824b10fadda5bd54da4ced60418e734e225e1f41a"} Dec 01 12:49:41 crc kubenswrapper[4958]: I1201 12:49:41.996007 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 12:49:43 crc kubenswrapper[4958]: I1201 12:49:43.006379 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vxbzk" event={"ID":"3df4c07d-8e68-4c9d-a78f-330fa3e6b114","Type":"ContainerStarted","Data":"e5e9abbd72be346dfba7d529995da0b80506f0f0e9c0413872d8bb516d76e707"} Dec 01 12:49:44 crc kubenswrapper[4958]: I1201 12:49:44.027961 4958 generic.go:334] "Generic (PLEG): container finished" podID="3df4c07d-8e68-4c9d-a78f-330fa3e6b114" containerID="e5e9abbd72be346dfba7d529995da0b80506f0f0e9c0413872d8bb516d76e707" exitCode=0 Dec 01 12:49:44 crc kubenswrapper[4958]: I1201 12:49:44.028014 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vxbzk" event={"ID":"3df4c07d-8e68-4c9d-a78f-330fa3e6b114","Type":"ContainerDied","Data":"e5e9abbd72be346dfba7d529995da0b80506f0f0e9c0413872d8bb516d76e707"} Dec 01 12:49:45 crc kubenswrapper[4958]: I1201 12:49:45.041969 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vxbzk" event={"ID":"3df4c07d-8e68-4c9d-a78f-330fa3e6b114","Type":"ContainerStarted","Data":"35e884839a094f9ffb015bf8ed072fff0ca20a94558b7aeb1a02326993006b73"} Dec 01 12:49:45 crc kubenswrapper[4958]: I1201 12:49:45.083602 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vxbzk" podStartSLOduration=2.596529026 podStartE2EDuration="5.083567459s" podCreationTimestamp="2025-12-01 12:49:40 +0000 UTC" firstStartedPulling="2025-12-01 12:49:41.995649847 +0000 UTC m=+10229.504438884" lastFinishedPulling="2025-12-01 12:49:44.48268828 +0000 UTC m=+10231.991477317" observedRunningTime="2025-12-01 12:49:45.064430521 +0000 UTC 
m=+10232.573219558" watchObservedRunningTime="2025-12-01 12:49:45.083567459 +0000 UTC m=+10232.592356546" Dec 01 12:49:50 crc kubenswrapper[4958]: I1201 12:49:50.537810 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vxbzk" Dec 01 12:49:50 crc kubenswrapper[4958]: I1201 12:49:50.538524 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vxbzk" Dec 01 12:49:50 crc kubenswrapper[4958]: I1201 12:49:50.883311 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vxbzk" Dec 01 12:49:51 crc kubenswrapper[4958]: I1201 12:49:51.233171 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vxbzk" Dec 01 12:49:51 crc kubenswrapper[4958]: I1201 12:49:51.301620 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vxbzk"] Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.184333 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vxbzk" podUID="3df4c07d-8e68-4c9d-a78f-330fa3e6b114" containerName="registry-server" containerID="cri-o://35e884839a094f9ffb015bf8ed072fff0ca20a94558b7aeb1a02326993006b73" gracePeriod=2 Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.556449 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n2g7v"] Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.561689 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n2g7v" Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.588117 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n2g7v"] Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.607720 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqbqn\" (UniqueName: \"kubernetes.io/projected/9305c71e-fda0-4b0e-976b-426044923681-kube-api-access-gqbqn\") pod \"redhat-operators-n2g7v\" (UID: \"9305c71e-fda0-4b0e-976b-426044923681\") " pod="openshift-marketplace/redhat-operators-n2g7v" Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.608540 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9305c71e-fda0-4b0e-976b-426044923681-catalog-content\") pod \"redhat-operators-n2g7v\" (UID: \"9305c71e-fda0-4b0e-976b-426044923681\") " pod="openshift-marketplace/redhat-operators-n2g7v" Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.614571 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9305c71e-fda0-4b0e-976b-426044923681-utilities\") pod \"redhat-operators-n2g7v\" (UID: \"9305c71e-fda0-4b0e-976b-426044923681\") " pod="openshift-marketplace/redhat-operators-n2g7v" Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.719320 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqbqn\" (UniqueName: \"kubernetes.io/projected/9305c71e-fda0-4b0e-976b-426044923681-kube-api-access-gqbqn\") pod \"redhat-operators-n2g7v\" (UID: \"9305c71e-fda0-4b0e-976b-426044923681\") " 
pod="openshift-marketplace/redhat-operators-n2g7v" Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.719486 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9305c71e-fda0-4b0e-976b-426044923681-catalog-content\") pod \"redhat-operators-n2g7v\" (UID: \"9305c71e-fda0-4b0e-976b-426044923681\") " pod="openshift-marketplace/redhat-operators-n2g7v" Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.719508 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9305c71e-fda0-4b0e-976b-426044923681-utilities\") pod \"redhat-operators-n2g7v\" (UID: \"9305c71e-fda0-4b0e-976b-426044923681\") " pod="openshift-marketplace/redhat-operators-n2g7v" Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.720309 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9305c71e-fda0-4b0e-976b-426044923681-catalog-content\") pod \"redhat-operators-n2g7v\" (UID: \"9305c71e-fda0-4b0e-976b-426044923681\") " pod="openshift-marketplace/redhat-operators-n2g7v" Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.720556 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9305c71e-fda0-4b0e-976b-426044923681-utilities\") pod \"redhat-operators-n2g7v\" (UID: \"9305c71e-fda0-4b0e-976b-426044923681\") " pod="openshift-marketplace/redhat-operators-n2g7v" Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.739936 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqbqn\" (UniqueName: \"kubernetes.io/projected/9305c71e-fda0-4b0e-976b-426044923681-kube-api-access-gqbqn\") pod \"redhat-operators-n2g7v\" (UID: \"9305c71e-fda0-4b0e-976b-426044923681\") " pod="openshift-marketplace/redhat-operators-n2g7v" Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.798165 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vxbzk" Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.821133 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxtfm\" (UniqueName: \"kubernetes.io/projected/3df4c07d-8e68-4c9d-a78f-330fa3e6b114-kube-api-access-pxtfm\") pod \"3df4c07d-8e68-4c9d-a78f-330fa3e6b114\" (UID: \"3df4c07d-8e68-4c9d-a78f-330fa3e6b114\") " Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.821282 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df4c07d-8e68-4c9d-a78f-330fa3e6b114-catalog-content\") pod \"3df4c07d-8e68-4c9d-a78f-330fa3e6b114\" (UID: \"3df4c07d-8e68-4c9d-a78f-330fa3e6b114\") " Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.821334 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df4c07d-8e68-4c9d-a78f-330fa3e6b114-utilities\") pod \"3df4c07d-8e68-4c9d-a78f-330fa3e6b114\" (UID: \"3df4c07d-8e68-4c9d-a78f-330fa3e6b114\") " Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.822340 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3df4c07d-8e68-4c9d-a78f-330fa3e6b114-utilities" (OuterVolumeSpecName: "utilities") pod "3df4c07d-8e68-4c9d-a78f-330fa3e6b114" (UID: "3df4c07d-8e68-4c9d-a78f-330fa3e6b114"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.829939 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3df4c07d-8e68-4c9d-a78f-330fa3e6b114-kube-api-access-pxtfm" (OuterVolumeSpecName: "kube-api-access-pxtfm") pod "3df4c07d-8e68-4c9d-a78f-330fa3e6b114" (UID: "3df4c07d-8e68-4c9d-a78f-330fa3e6b114"). InnerVolumeSpecName "kube-api-access-pxtfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.832144 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3df4c07d-8e68-4c9d-a78f-330fa3e6b114-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.832174 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxtfm\" (UniqueName: \"kubernetes.io/projected/3df4c07d-8e68-4c9d-a78f-330fa3e6b114-kube-api-access-pxtfm\") on node \"crc\" DevicePath \"\"" Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.881452 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3df4c07d-8e68-4c9d-a78f-330fa3e6b114-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3df4c07d-8e68-4c9d-a78f-330fa3e6b114" (UID: "3df4c07d-8e68-4c9d-a78f-330fa3e6b114"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.900688 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n2g7v" Dec 01 12:49:53 crc kubenswrapper[4958]: I1201 12:49:53.934216 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3df4c07d-8e68-4c9d-a78f-330fa3e6b114-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:49:54 crc kubenswrapper[4958]: I1201 12:49:54.216396 4958 generic.go:334] "Generic (PLEG): container finished" podID="3df4c07d-8e68-4c9d-a78f-330fa3e6b114" containerID="35e884839a094f9ffb015bf8ed072fff0ca20a94558b7aeb1a02326993006b73" exitCode=0 Dec 01 12:49:54 crc kubenswrapper[4958]: I1201 12:49:54.216657 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vxbzk" event={"ID":"3df4c07d-8e68-4c9d-a78f-330fa3e6b114","Type":"ContainerDied","Data":"35e884839a094f9ffb015bf8ed072fff0ca20a94558b7aeb1a02326993006b73"} Dec 01 12:49:54 crc kubenswrapper[4958]: I1201 12:49:54.216839 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vxbzk" event={"ID":"3df4c07d-8e68-4c9d-a78f-330fa3e6b114","Type":"ContainerDied","Data":"d0758faa4940c6523302efa824b10fadda5bd54da4ced60418e734e225e1f41a"} Dec 01 12:49:54 crc kubenswrapper[4958]: I1201 12:49:54.216892 4958 scope.go:117] "RemoveContainer" containerID="35e884839a094f9ffb015bf8ed072fff0ca20a94558b7aeb1a02326993006b73" Dec 01 12:49:54 crc kubenswrapper[4958]: I1201 12:49:54.216697 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vxbzk" Dec 01 12:49:54 crc kubenswrapper[4958]: I1201 12:49:54.270460 4958 scope.go:117] "RemoveContainer" containerID="e5e9abbd72be346dfba7d529995da0b80506f0f0e9c0413872d8bb516d76e707" Dec 01 12:49:54 crc kubenswrapper[4958]: I1201 12:49:54.282671 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vxbzk"] Dec 01 12:49:54 crc kubenswrapper[4958]: I1201 12:49:54.293875 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vxbzk"] Dec 01 12:49:54 crc kubenswrapper[4958]: I1201 12:49:54.300650 4958 scope.go:117] "RemoveContainer" containerID="c3b7cd7c0911449a19f572dfe71fc511f18da4c5dafa07b13a8ee1c1ccedebf3" Dec 01 12:49:54 crc kubenswrapper[4958]: I1201 12:49:54.371106 4958 scope.go:117] "RemoveContainer" containerID="35e884839a094f9ffb015bf8ed072fff0ca20a94558b7aeb1a02326993006b73" Dec 01 12:49:54 crc kubenswrapper[4958]: E1201 12:49:54.372161 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35e884839a094f9ffb015bf8ed072fff0ca20a94558b7aeb1a02326993006b73\": container with ID starting with 35e884839a094f9ffb015bf8ed072fff0ca20a94558b7aeb1a02326993006b73 not found: ID does not exist" containerID="35e884839a094f9ffb015bf8ed072fff0ca20a94558b7aeb1a02326993006b73" Dec 01 12:49:54 crc kubenswrapper[4958]: I1201 12:49:54.372203 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35e884839a094f9ffb015bf8ed072fff0ca20a94558b7aeb1a02326993006b73"} err="failed to get container status \"35e884839a094f9ffb015bf8ed072fff0ca20a94558b7aeb1a02326993006b73\": rpc error: code = NotFound desc = could not find container \"35e884839a094f9ffb015bf8ed072fff0ca20a94558b7aeb1a02326993006b73\": container with ID starting with 35e884839a094f9ffb015bf8ed072fff0ca20a94558b7aeb1a02326993006b73 not found: ID does 
not exist" Dec 01 12:49:54 crc kubenswrapper[4958]: I1201 12:49:54.372231 4958 scope.go:117] "RemoveContainer" containerID="e5e9abbd72be346dfba7d529995da0b80506f0f0e9c0413872d8bb516d76e707" Dec 01 12:49:54 crc kubenswrapper[4958]: E1201 12:49:54.372515 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5e9abbd72be346dfba7d529995da0b80506f0f0e9c0413872d8bb516d76e707\": container with ID starting with e5e9abbd72be346dfba7d529995da0b80506f0f0e9c0413872d8bb516d76e707 not found: ID does not exist" containerID="e5e9abbd72be346dfba7d529995da0b80506f0f0e9c0413872d8bb516d76e707" Dec 01 12:49:54 crc kubenswrapper[4958]: I1201 12:49:54.372544 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5e9abbd72be346dfba7d529995da0b80506f0f0e9c0413872d8bb516d76e707"} err="failed to get container status \"e5e9abbd72be346dfba7d529995da0b80506f0f0e9c0413872d8bb516d76e707\": rpc error: code = NotFound desc = could not find container \"e5e9abbd72be346dfba7d529995da0b80506f0f0e9c0413872d8bb516d76e707\": container with ID starting with e5e9abbd72be346dfba7d529995da0b80506f0f0e9c0413872d8bb516d76e707 not found: ID does not exist" Dec 01 12:49:54 crc kubenswrapper[4958]: I1201 12:49:54.372562 4958 scope.go:117] "RemoveContainer" containerID="c3b7cd7c0911449a19f572dfe71fc511f18da4c5dafa07b13a8ee1c1ccedebf3" Dec 01 12:49:54 crc kubenswrapper[4958]: E1201 12:49:54.372815 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3b7cd7c0911449a19f572dfe71fc511f18da4c5dafa07b13a8ee1c1ccedebf3\": container with ID starting with c3b7cd7c0911449a19f572dfe71fc511f18da4c5dafa07b13a8ee1c1ccedebf3 not found: ID does not exist" containerID="c3b7cd7c0911449a19f572dfe71fc511f18da4c5dafa07b13a8ee1c1ccedebf3" Dec 01 12:49:54 crc kubenswrapper[4958]: I1201 12:49:54.372858 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3b7cd7c0911449a19f572dfe71fc511f18da4c5dafa07b13a8ee1c1ccedebf3"} err="failed to get container status \"c3b7cd7c0911449a19f572dfe71fc511f18da4c5dafa07b13a8ee1c1ccedebf3\": rpc error: code = NotFound desc = could not find container \"c3b7cd7c0911449a19f572dfe71fc511f18da4c5dafa07b13a8ee1c1ccedebf3\": container with ID starting with c3b7cd7c0911449a19f572dfe71fc511f18da4c5dafa07b13a8ee1c1ccedebf3 not found: ID does not exist" Dec 01 12:49:54 crc kubenswrapper[4958]: I1201 12:49:54.450487 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n2g7v"] Dec 01 12:49:55 crc kubenswrapper[4958]: I1201 12:49:55.231567 4958 generic.go:334] "Generic (PLEG): container finished" podID="9305c71e-fda0-4b0e-976b-426044923681" containerID="0c59e4718d32740b349cfe636186ee712f0dd8ae31d12f93443596c99724adc3" exitCode=0 Dec 01 12:49:55 crc kubenswrapper[4958]: I1201 12:49:55.231765 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2g7v" event={"ID":"9305c71e-fda0-4b0e-976b-426044923681","Type":"ContainerDied","Data":"0c59e4718d32740b349cfe636186ee712f0dd8ae31d12f93443596c99724adc3"} Dec 01 12:49:55 crc kubenswrapper[4958]: I1201 12:49:55.231861 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2g7v" event={"ID":"9305c71e-fda0-4b0e-976b-426044923681","Type":"ContainerStarted","Data":"0772a66f1ef6dab8de768a3ee1f58a4c99419dc7247b8b67a5dda293b67ff052"} Dec 
01 12:49:55 crc kubenswrapper[4958]: I1201 12:49:55.809884 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3df4c07d-8e68-4c9d-a78f-330fa3e6b114" path="/var/lib/kubelet/pods/3df4c07d-8e68-4c9d-a78f-330fa3e6b114/volumes" Dec 01 12:49:57 crc kubenswrapper[4958]: I1201 12:49:57.269292 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2g7v" event={"ID":"9305c71e-fda0-4b0e-976b-426044923681","Type":"ContainerStarted","Data":"7b0c8901ee1f77cd3005f66ef8ac274cbdc9c952e7eaddbba6b62709596c8a2c"} Dec 01 12:49:59 crc kubenswrapper[4958]: I1201 12:49:59.293709 4958 generic.go:334] "Generic (PLEG): container finished" podID="9305c71e-fda0-4b0e-976b-426044923681" containerID="7b0c8901ee1f77cd3005f66ef8ac274cbdc9c952e7eaddbba6b62709596c8a2c" exitCode=0 Dec 01 12:49:59 crc kubenswrapper[4958]: I1201 12:49:59.293802 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2g7v" event={"ID":"9305c71e-fda0-4b0e-976b-426044923681","Type":"ContainerDied","Data":"7b0c8901ee1f77cd3005f66ef8ac274cbdc9c952e7eaddbba6b62709596c8a2c"} Dec 01 12:50:01 crc kubenswrapper[4958]: I1201 12:50:01.370950 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2g7v" event={"ID":"9305c71e-fda0-4b0e-976b-426044923681","Type":"ContainerStarted","Data":"cbd5ddd884ac9a9642ac20417a03222cb4970429c57bc71ef9979a27681664ad"} Dec 01 12:50:01 crc kubenswrapper[4958]: I1201 12:50:01.392604 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n2g7v" podStartSLOduration=3.547417299 podStartE2EDuration="8.392581321s" podCreationTimestamp="2025-12-01 12:49:53 +0000 UTC" firstStartedPulling="2025-12-01 12:49:55.23535871 +0000 UTC m=+10242.744147747" lastFinishedPulling="2025-12-01 12:50:00.080522712 +0000 UTC m=+10247.589311769" observedRunningTime="2025-12-01 12:50:01.388749023 +0000 UTC m=+10248.897538070" watchObservedRunningTime="2025-12-01 12:50:01.392581321 +0000 UTC m=+10248.901370368" Dec 01 12:50:03 crc kubenswrapper[4958]: I1201 12:50:03.902194 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n2g7v" Dec 01 12:50:03 crc kubenswrapper[4958]: I1201 12:50:03.902674 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n2g7v" Dec 01 12:50:05 crc kubenswrapper[4958]: I1201 12:50:05.339791 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n2g7v" podUID="9305c71e-fda0-4b0e-976b-426044923681" containerName="registry-server" probeResult="failure" output=< Dec 01 12:50:05 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s Dec 01 12:50:05 crc kubenswrapper[4958]: > Dec 01 12:50:13 crc kubenswrapper[4958]: I1201 12:50:13.998402 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n2g7v" Dec 01 12:50:14 crc kubenswrapper[4958]: I1201 12:50:14.090317 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n2g7v" Dec 01 12:50:14 crc kubenswrapper[4958]: I1201 12:50:14.253813 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n2g7v"] Dec 01 12:50:15 crc kubenswrapper[4958]: I1201 12:50:15.544169 4958 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openshift-marketplace/redhat-operators-n2g7v" podUID="9305c71e-fda0-4b0e-976b-426044923681" containerName="registry-server" containerID="cri-o://cbd5ddd884ac9a9642ac20417a03222cb4970429c57bc71ef9979a27681664ad" gracePeriod=2 Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.057561 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n2g7v" Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.175168 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9305c71e-fda0-4b0e-976b-426044923681-utilities\") pod \"9305c71e-fda0-4b0e-976b-426044923681\" (UID: \"9305c71e-fda0-4b0e-976b-426044923681\") " Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.175236 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqbqn\" (UniqueName: \"kubernetes.io/projected/9305c71e-fda0-4b0e-976b-426044923681-kube-api-access-gqbqn\") pod \"9305c71e-fda0-4b0e-976b-426044923681\" (UID: \"9305c71e-fda0-4b0e-976b-426044923681\") " Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.175323 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9305c71e-fda0-4b0e-976b-426044923681-catalog-content\") pod \"9305c71e-fda0-4b0e-976b-426044923681\" (UID: \"9305c71e-fda0-4b0e-976b-426044923681\") " Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.176285 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9305c71e-fda0-4b0e-976b-426044923681-utilities" (OuterVolumeSpecName: "utilities") pod "9305c71e-fda0-4b0e-976b-426044923681" (UID: "9305c71e-fda0-4b0e-976b-426044923681"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.278491 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9305c71e-fda0-4b0e-976b-426044923681-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.318183 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9305c71e-fda0-4b0e-976b-426044923681-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9305c71e-fda0-4b0e-976b-426044923681" (UID: "9305c71e-fda0-4b0e-976b-426044923681"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.381357 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9305c71e-fda0-4b0e-976b-426044923681-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.562347 4958 generic.go:334] "Generic (PLEG): container finished" podID="9305c71e-fda0-4b0e-976b-426044923681" containerID="cbd5ddd884ac9a9642ac20417a03222cb4970429c57bc71ef9979a27681664ad" exitCode=0 Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.562441 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2g7v" event={"ID":"9305c71e-fda0-4b0e-976b-426044923681","Type":"ContainerDied","Data":"cbd5ddd884ac9a9642ac20417a03222cb4970429c57bc71ef9979a27681664ad"} Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.562469 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n2g7v" Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.562494 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2g7v" event={"ID":"9305c71e-fda0-4b0e-976b-426044923681","Type":"ContainerDied","Data":"0772a66f1ef6dab8de768a3ee1f58a4c99419dc7247b8b67a5dda293b67ff052"} Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.562590 4958 scope.go:117] "RemoveContainer" containerID="cbd5ddd884ac9a9642ac20417a03222cb4970429c57bc71ef9979a27681664ad" Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.607801 4958 scope.go:117] "RemoveContainer" containerID="7b0c8901ee1f77cd3005f66ef8ac274cbdc9c952e7eaddbba6b62709596c8a2c" Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.767196 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9305c71e-fda0-4b0e-976b-426044923681-kube-api-access-gqbqn" (OuterVolumeSpecName: "kube-api-access-gqbqn") pod "9305c71e-fda0-4b0e-976b-426044923681" (UID: "9305c71e-fda0-4b0e-976b-426044923681"). InnerVolumeSpecName "kube-api-access-gqbqn". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.791208 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqbqn\" (UniqueName: \"kubernetes.io/projected/9305c71e-fda0-4b0e-976b-426044923681-kube-api-access-gqbqn\") on node \"crc\" DevicePath \"\""
Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.809209 4958 scope.go:117] "RemoveContainer" containerID="0c59e4718d32740b349cfe636186ee712f0dd8ae31d12f93443596c99724adc3"
Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.874342 4958 scope.go:117] "RemoveContainer" containerID="cbd5ddd884ac9a9642ac20417a03222cb4970429c57bc71ef9979a27681664ad"
Dec 01 12:50:16 crc kubenswrapper[4958]: E1201 12:50:16.875336 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbd5ddd884ac9a9642ac20417a03222cb4970429c57bc71ef9979a27681664ad\": container with ID starting with cbd5ddd884ac9a9642ac20417a03222cb4970429c57bc71ef9979a27681664ad not found: ID does not exist" containerID="cbd5ddd884ac9a9642ac20417a03222cb4970429c57bc71ef9979a27681664ad"
Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.875376 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbd5ddd884ac9a9642ac20417a03222cb4970429c57bc71ef9979a27681664ad"} err="failed to get container status \"cbd5ddd884ac9a9642ac20417a03222cb4970429c57bc71ef9979a27681664ad\": rpc error: code = NotFound desc = could not find container \"cbd5ddd884ac9a9642ac20417a03222cb4970429c57bc71ef9979a27681664ad\": container with ID starting with cbd5ddd884ac9a9642ac20417a03222cb4970429c57bc71ef9979a27681664ad not found: ID does not exist"
Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.875398 4958 scope.go:117] "RemoveContainer" containerID="7b0c8901ee1f77cd3005f66ef8ac274cbdc9c952e7eaddbba6b62709596c8a2c"
Dec 01 12:50:16 crc kubenswrapper[4958]: E1201 12:50:16.875792 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b0c8901ee1f77cd3005f66ef8ac274cbdc9c952e7eaddbba6b62709596c8a2c\": container with ID starting with 7b0c8901ee1f77cd3005f66ef8ac274cbdc9c952e7eaddbba6b62709596c8a2c not found: ID does not exist" containerID="7b0c8901ee1f77cd3005f66ef8ac274cbdc9c952e7eaddbba6b62709596c8a2c"
Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.875821 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b0c8901ee1f77cd3005f66ef8ac274cbdc9c952e7eaddbba6b62709596c8a2c"} err="failed to get container status \"7b0c8901ee1f77cd3005f66ef8ac274cbdc9c952e7eaddbba6b62709596c8a2c\": rpc error: code = NotFound desc = could not find container \"7b0c8901ee1f77cd3005f66ef8ac274cbdc9c952e7eaddbba6b62709596c8a2c\": container with ID starting with 7b0c8901ee1f77cd3005f66ef8ac274cbdc9c952e7eaddbba6b62709596c8a2c not found: ID does not exist"
Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.875863 4958 scope.go:117] "RemoveContainer" containerID="0c59e4718d32740b349cfe636186ee712f0dd8ae31d12f93443596c99724adc3"
Dec 01 12:50:16 crc kubenswrapper[4958]: E1201 12:50:16.876332 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c59e4718d32740b349cfe636186ee712f0dd8ae31d12f93443596c99724adc3\": container with ID starting with 0c59e4718d32740b349cfe636186ee712f0dd8ae31d12f93443596c99724adc3 not found: ID does not exist" containerID="0c59e4718d32740b349cfe636186ee712f0dd8ae31d12f93443596c99724adc3"
Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.876353 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c59e4718d32740b349cfe636186ee712f0dd8ae31d12f93443596c99724adc3"} err="failed to get container status \"0c59e4718d32740b349cfe636186ee712f0dd8ae31d12f93443596c99724adc3\": rpc error: code = NotFound desc = could not find container \"0c59e4718d32740b349cfe636186ee712f0dd8ae31d12f93443596c99724adc3\": container with ID starting with 0c59e4718d32740b349cfe636186ee712f0dd8ae31d12f93443596c99724adc3 not found: ID does not exist"
Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.952070 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n2g7v"]
Dec 01 12:50:16 crc kubenswrapper[4958]: I1201 12:50:16.968730 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n2g7v"]
Dec 01 12:50:17 crc kubenswrapper[4958]: I1201 12:50:17.852674 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9305c71e-fda0-4b0e-976b-426044923681" path="/var/lib/kubelet/pods/9305c71e-fda0-4b0e-976b-426044923681/volumes"
Dec 01 12:50:58 crc kubenswrapper[4958]: I1201 12:50:58.211209 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 12:50:58 crc kubenswrapper[4958]: I1201 12:50:58.211985 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.645552 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rwwd6"]
Dec 01 12:51:26 crc kubenswrapper[4958]: E1201 12:51:26.647006 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df4c07d-8e68-4c9d-a78f-330fa3e6b114" containerName="extract-utilities"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.647030 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df4c07d-8e68-4c9d-a78f-330fa3e6b114" containerName="extract-utilities"
Dec 01 12:51:26 crc kubenswrapper[4958]: E1201 12:51:26.647059 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9305c71e-fda0-4b0e-976b-426044923681" containerName="extract-utilities"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.647072 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9305c71e-fda0-4b0e-976b-426044923681" containerName="extract-utilities"
Dec 01 12:51:26 crc kubenswrapper[4958]: E1201 12:51:26.647105 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9305c71e-fda0-4b0e-976b-426044923681" containerName="registry-server"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.647117 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9305c71e-fda0-4b0e-976b-426044923681" containerName="registry-server"
Dec 01 12:51:26 crc kubenswrapper[4958]: E1201 12:51:26.647165 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9305c71e-fda0-4b0e-976b-426044923681" containerName="extract-content"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.647178 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9305c71e-fda0-4b0e-976b-426044923681" containerName="extract-content"
Dec 01 12:51:26 crc kubenswrapper[4958]: E1201 12:51:26.647215 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df4c07d-8e68-4c9d-a78f-330fa3e6b114" containerName="extract-content"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.647228 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df4c07d-8e68-4c9d-a78f-330fa3e6b114" containerName="extract-content"
Dec 01 12:51:26 crc kubenswrapper[4958]: E1201 12:51:26.647256 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df4c07d-8e68-4c9d-a78f-330fa3e6b114" containerName="registry-server"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.647268 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df4c07d-8e68-4c9d-a78f-330fa3e6b114" containerName="registry-server"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.647889 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="9305c71e-fda0-4b0e-976b-426044923681" containerName="registry-server"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.647920 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="3df4c07d-8e68-4c9d-a78f-330fa3e6b114" containerName="registry-server"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.652638 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwwd6"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.665169 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwwd6"]
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.832001 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bc641f-7270-472f-8158-7d09bd42e254-utilities\") pod \"redhat-marketplace-rwwd6\" (UID: \"24bc641f-7270-472f-8158-7d09bd42e254\") " pod="openshift-marketplace/redhat-marketplace-rwwd6"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.832086 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqlkq\" (UniqueName: \"kubernetes.io/projected/24bc641f-7270-472f-8158-7d09bd42e254-kube-api-access-kqlkq\") pod \"redhat-marketplace-rwwd6\" (UID: \"24bc641f-7270-472f-8158-7d09bd42e254\") " pod="openshift-marketplace/redhat-marketplace-rwwd6"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.832144 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bc641f-7270-472f-8158-7d09bd42e254-catalog-content\") pod \"redhat-marketplace-rwwd6\" (UID: \"24bc641f-7270-472f-8158-7d09bd42e254\") " pod="openshift-marketplace/redhat-marketplace-rwwd6"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.933993 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqlkq\" (UniqueName: \"kubernetes.io/projected/24bc641f-7270-472f-8158-7d09bd42e254-kube-api-access-kqlkq\") pod \"redhat-marketplace-rwwd6\" (UID: \"24bc641f-7270-472f-8158-7d09bd42e254\") " pod="openshift-marketplace/redhat-marketplace-rwwd6"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.934437 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bc641f-7270-472f-8158-7d09bd42e254-catalog-content\") pod \"redhat-marketplace-rwwd6\" (UID: \"24bc641f-7270-472f-8158-7d09bd42e254\") " pod="openshift-marketplace/redhat-marketplace-rwwd6"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.935369 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bc641f-7270-472f-8158-7d09bd42e254-utilities\") pod \"redhat-marketplace-rwwd6\" (UID: \"24bc641f-7270-472f-8158-7d09bd42e254\") " pod="openshift-marketplace/redhat-marketplace-rwwd6"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.935701 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bc641f-7270-472f-8158-7d09bd42e254-utilities\") pod \"redhat-marketplace-rwwd6\" (UID: \"24bc641f-7270-472f-8158-7d09bd42e254\") " pod="openshift-marketplace/redhat-marketplace-rwwd6"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.936118 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bc641f-7270-472f-8158-7d09bd42e254-catalog-content\") pod \"redhat-marketplace-rwwd6\" (UID: \"24bc641f-7270-472f-8158-7d09bd42e254\") " pod="openshift-marketplace/redhat-marketplace-rwwd6"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.958897 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqlkq\" (UniqueName: \"kubernetes.io/projected/24bc641f-7270-472f-8158-7d09bd42e254-kube-api-access-kqlkq\") pod \"redhat-marketplace-rwwd6\" (UID: \"24bc641f-7270-472f-8158-7d09bd42e254\") " pod="openshift-marketplace/redhat-marketplace-rwwd6"
Dec 01 12:51:26 crc kubenswrapper[4958]: I1201 12:51:26.996736 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwwd6"
Dec 01 12:51:27 crc kubenswrapper[4958]: I1201 12:51:27.569802 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwwd6"]
Dec 01 12:51:27 crc kubenswrapper[4958]: I1201 12:51:27.910951 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwwd6" event={"ID":"24bc641f-7270-472f-8158-7d09bd42e254","Type":"ContainerStarted","Data":"075ddfc854c077fbd075ef735a22ada797fe6d15130b6d2d335f49bb47f67a26"}
Dec 01 12:51:27 crc kubenswrapper[4958]: I1201 12:51:27.911321 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwwd6" event={"ID":"24bc641f-7270-472f-8158-7d09bd42e254","Type":"ContainerStarted","Data":"68b7ffb4fe725414781bad2357f75dd75b5f159b13b7e58d3d7e4b9a37b06f06"}
Dec 01 12:51:28 crc kubenswrapper[4958]: I1201 12:51:28.218096 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 12:51:28 crc kubenswrapper[4958]: I1201 12:51:28.218151 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 12:51:28 crc kubenswrapper[4958]: I1201 12:51:28.932301 4958 generic.go:334] "Generic (PLEG): container finished" podID="24bc641f-7270-472f-8158-7d09bd42e254" containerID="075ddfc854c077fbd075ef735a22ada797fe6d15130b6d2d335f49bb47f67a26" exitCode=0
Dec 01 12:51:28 crc kubenswrapper[4958]: I1201 12:51:28.932472 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwwd6" event={"ID":"24bc641f-7270-472f-8158-7d09bd42e254","Type":"ContainerDied","Data":"075ddfc854c077fbd075ef735a22ada797fe6d15130b6d2d335f49bb47f67a26"}
Dec 01 12:51:29 crc kubenswrapper[4958]: I1201 12:51:29.947679 4958 generic.go:334] "Generic (PLEG): container finished" podID="24bc641f-7270-472f-8158-7d09bd42e254" containerID="33dcb8b2e0604b0ecf45d5ac47bc9e457ac7616e80ea9a2a41d0bdbc5e580fb9" exitCode=0
Dec 01 12:51:29 crc kubenswrapper[4958]: I1201 12:51:29.947774 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwwd6" event={"ID":"24bc641f-7270-472f-8158-7d09bd42e254","Type":"ContainerDied","Data":"33dcb8b2e0604b0ecf45d5ac47bc9e457ac7616e80ea9a2a41d0bdbc5e580fb9"}
Dec 01 12:51:30 crc kubenswrapper[4958]: I1201 12:51:30.964408 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwwd6" event={"ID":"24bc641f-7270-472f-8158-7d09bd42e254","Type":"ContainerStarted","Data":"954a066a090bbcb8b7299b8fb151803688f85e02080de1e7f4335c39d7f3bac9"}
Dec 01 12:51:30 crc kubenswrapper[4958]: I1201 12:51:30.993067 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rwwd6" podStartSLOduration=2.433425079 podStartE2EDuration="4.993013232s" podCreationTimestamp="2025-12-01 12:51:26 +0000 UTC" firstStartedPulling="2025-12-01 12:51:27.916661864 +0000 UTC m=+10335.425450911" lastFinishedPulling="2025-12-01 12:51:30.476250017 +0000 UTC m=+10337.985039064" observedRunningTime="2025-12-01 12:51:30.984120382 +0000 UTC m=+10338.492909429" watchObservedRunningTime="2025-12-01 12:51:30.993013232 +0000 UTC m=+10338.501802309"
Dec 01 12:51:32 crc kubenswrapper[4958]: I1201 12:51:32.668122 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-q928n" podUID="7513d038-3642-4f27-a2df-c932dc6c9eaa" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.90:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 01 12:51:36 crc kubenswrapper[4958]: I1201 12:51:36.997534 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rwwd6"
Dec 01 12:51:36 crc kubenswrapper[4958]: I1201 12:51:36.998313 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rwwd6"
Dec 01 12:51:37 crc kubenswrapper[4958]: I1201 12:51:37.092491 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rwwd6"
Dec 01 12:51:37 crc kubenswrapper[4958]: I1201 12:51:37.206872 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rwwd6"
Dec 01 12:51:37 crc kubenswrapper[4958]: I1201 12:51:37.352083 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwwd6"]
Dec 01 12:51:39 crc kubenswrapper[4958]: I1201 12:51:39.165044 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rwwd6" podUID="24bc641f-7270-472f-8158-7d09bd42e254" containerName="registry-server" containerID="cri-o://954a066a090bbcb8b7299b8fb151803688f85e02080de1e7f4335c39d7f3bac9" gracePeriod=2
Dec 01 12:51:39 crc kubenswrapper[4958]: I1201 12:51:39.995587 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwwd6"
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.158182 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqlkq\" (UniqueName: \"kubernetes.io/projected/24bc641f-7270-472f-8158-7d09bd42e254-kube-api-access-kqlkq\") pod \"24bc641f-7270-472f-8158-7d09bd42e254\" (UID: \"24bc641f-7270-472f-8158-7d09bd42e254\") "
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.158281 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bc641f-7270-472f-8158-7d09bd42e254-utilities\") pod \"24bc641f-7270-472f-8158-7d09bd42e254\" (UID: \"24bc641f-7270-472f-8158-7d09bd42e254\") "
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.158398 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bc641f-7270-472f-8158-7d09bd42e254-catalog-content\") pod \"24bc641f-7270-472f-8158-7d09bd42e254\" (UID: \"24bc641f-7270-472f-8158-7d09bd42e254\") "
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.160105 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24bc641f-7270-472f-8158-7d09bd42e254-utilities" (OuterVolumeSpecName: "utilities") pod "24bc641f-7270-472f-8158-7d09bd42e254" (UID: "24bc641f-7270-472f-8158-7d09bd42e254"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.165134 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24bc641f-7270-472f-8158-7d09bd42e254-kube-api-access-kqlkq" (OuterVolumeSpecName: "kube-api-access-kqlkq") pod "24bc641f-7270-472f-8158-7d09bd42e254" (UID: "24bc641f-7270-472f-8158-7d09bd42e254"). InnerVolumeSpecName "kube-api-access-kqlkq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.181062 4958 generic.go:334] "Generic (PLEG): container finished" podID="24bc641f-7270-472f-8158-7d09bd42e254" containerID="954a066a090bbcb8b7299b8fb151803688f85e02080de1e7f4335c39d7f3bac9" exitCode=0
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.181111 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwwd6" event={"ID":"24bc641f-7270-472f-8158-7d09bd42e254","Type":"ContainerDied","Data":"954a066a090bbcb8b7299b8fb151803688f85e02080de1e7f4335c39d7f3bac9"}
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.181143 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwwd6" event={"ID":"24bc641f-7270-472f-8158-7d09bd42e254","Type":"ContainerDied","Data":"68b7ffb4fe725414781bad2357f75dd75b5f159b13b7e58d3d7e4b9a37b06f06"}
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.181164 4958 scope.go:117] "RemoveContainer" containerID="954a066a090bbcb8b7299b8fb151803688f85e02080de1e7f4335c39d7f3bac9"
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.181315 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwwd6"
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.189160 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24bc641f-7270-472f-8158-7d09bd42e254-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24bc641f-7270-472f-8158-7d09bd42e254" (UID: "24bc641f-7270-472f-8158-7d09bd42e254"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.256221 4958 scope.go:117] "RemoveContainer" containerID="33dcb8b2e0604b0ecf45d5ac47bc9e457ac7616e80ea9a2a41d0bdbc5e580fb9"
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.261067 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqlkq\" (UniqueName: \"kubernetes.io/projected/24bc641f-7270-472f-8158-7d09bd42e254-kube-api-access-kqlkq\") on node \"crc\" DevicePath \"\""
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.261118 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bc641f-7270-472f-8158-7d09bd42e254-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.261130 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bc641f-7270-472f-8158-7d09bd42e254-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.288605 4958 scope.go:117] "RemoveContainer" containerID="075ddfc854c077fbd075ef735a22ada797fe6d15130b6d2d335f49bb47f67a26"
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.341354 4958 scope.go:117] "RemoveContainer" containerID="954a066a090bbcb8b7299b8fb151803688f85e02080de1e7f4335c39d7f3bac9"
Dec 01 12:51:40 crc kubenswrapper[4958]: E1201 12:51:40.341970 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"954a066a090bbcb8b7299b8fb151803688f85e02080de1e7f4335c39d7f3bac9\": container with ID starting with 954a066a090bbcb8b7299b8fb151803688f85e02080de1e7f4335c39d7f3bac9 not found: ID does not exist" containerID="954a066a090bbcb8b7299b8fb151803688f85e02080de1e7f4335c39d7f3bac9"
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.342057 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"954a066a090bbcb8b7299b8fb151803688f85e02080de1e7f4335c39d7f3bac9"} err="failed to get container status \"954a066a090bbcb8b7299b8fb151803688f85e02080de1e7f4335c39d7f3bac9\": rpc error: code = NotFound desc = could not find container \"954a066a090bbcb8b7299b8fb151803688f85e02080de1e7f4335c39d7f3bac9\": container with ID starting with 954a066a090bbcb8b7299b8fb151803688f85e02080de1e7f4335c39d7f3bac9 not found: ID does not exist"
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.342106 4958 scope.go:117] "RemoveContainer" containerID="33dcb8b2e0604b0ecf45d5ac47bc9e457ac7616e80ea9a2a41d0bdbc5e580fb9"
Dec 01 12:51:40 crc kubenswrapper[4958]: E1201 12:51:40.342588 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33dcb8b2e0604b0ecf45d5ac47bc9e457ac7616e80ea9a2a41d0bdbc5e580fb9\": container with ID starting with 33dcb8b2e0604b0ecf45d5ac47bc9e457ac7616e80ea9a2a41d0bdbc5e580fb9 not found: ID does not exist" containerID="33dcb8b2e0604b0ecf45d5ac47bc9e457ac7616e80ea9a2a41d0bdbc5e580fb9"
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.342718 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33dcb8b2e0604b0ecf45d5ac47bc9e457ac7616e80ea9a2a41d0bdbc5e580fb9"} err="failed to get container status \"33dcb8b2e0604b0ecf45d5ac47bc9e457ac7616e80ea9a2a41d0bdbc5e580fb9\": rpc error: code = NotFound desc = could not find container \"33dcb8b2e0604b0ecf45d5ac47bc9e457ac7616e80ea9a2a41d0bdbc5e580fb9\": container with ID starting with 33dcb8b2e0604b0ecf45d5ac47bc9e457ac7616e80ea9a2a41d0bdbc5e580fb9 not found: ID does not exist"
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.342824 4958 scope.go:117] "RemoveContainer" containerID="075ddfc854c077fbd075ef735a22ada797fe6d15130b6d2d335f49bb47f67a26"
Dec 01 12:51:40 crc kubenswrapper[4958]: E1201 12:51:40.343515 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"075ddfc854c077fbd075ef735a22ada797fe6d15130b6d2d335f49bb47f67a26\": container with ID starting with 075ddfc854c077fbd075ef735a22ada797fe6d15130b6d2d335f49bb47f67a26 not found: ID does not exist" containerID="075ddfc854c077fbd075ef735a22ada797fe6d15130b6d2d335f49bb47f67a26"
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.343561 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"075ddfc854c077fbd075ef735a22ada797fe6d15130b6d2d335f49bb47f67a26"} err="failed to get container status \"075ddfc854c077fbd075ef735a22ada797fe6d15130b6d2d335f49bb47f67a26\": rpc error: code = NotFound desc = could not find container \"075ddfc854c077fbd075ef735a22ada797fe6d15130b6d2d335f49bb47f67a26\": container with ID starting with 075ddfc854c077fbd075ef735a22ada797fe6d15130b6d2d335f49bb47f67a26 not found: ID does not exist"
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.551781 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwwd6"]
Dec 01 12:51:40 crc kubenswrapper[4958]: I1201 12:51:40.571489 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwwd6"]
Dec 01 12:51:41 crc kubenswrapper[4958]: I1201 12:51:41.817009 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24bc641f-7270-472f-8158-7d09bd42e254" path="/var/lib/kubelet/pods/24bc641f-7270-472f-8158-7d09bd42e254/volumes"
Dec 01 12:51:58 crc kubenswrapper[4958]: I1201 12:51:58.210762 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 12:51:58 crc kubenswrapper[4958]: I1201 12:51:58.211407 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 12:51:58 crc kubenswrapper[4958]: I1201 12:51:58.211478 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7"
Dec 01 12:51:58 crc kubenswrapper[4958]: I1201 12:51:58.212545 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8fdf0d21f9aae2e77dfd83f361949c9385cb5d79b387e817f1a93b26366cb3e7"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 12:51:58 crc kubenswrapper[4958]: I1201 12:51:58.212629 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://8fdf0d21f9aae2e77dfd83f361949c9385cb5d79b387e817f1a93b26366cb3e7" gracePeriod=600
Dec 01 12:51:58 crc kubenswrapper[4958]: I1201 12:51:58.450215 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="8fdf0d21f9aae2e77dfd83f361949c9385cb5d79b387e817f1a93b26366cb3e7" exitCode=0
Dec 01 12:51:58 crc kubenswrapper[4958]: I1201 12:51:58.450263 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"8fdf0d21f9aae2e77dfd83f361949c9385cb5d79b387e817f1a93b26366cb3e7"}
Dec 01 12:51:58 crc kubenswrapper[4958]: I1201 12:51:58.450300 4958 scope.go:117] "RemoveContainer" containerID="5c5315329490e51f750dc8742e4e9f345c0959ef3da794e3edbaefada2f3743d"
Dec 01 12:51:59 crc kubenswrapper[4958]: I1201 12:51:59.464185 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0"}
Dec 01 12:52:36 crc kubenswrapper[4958]: I1201 12:52:36.025490 4958 trace.go:236] Trace[2111309417]: "Calculate volume metrics of mysql-db for pod openstack/openstack-galera-0" (01-Dec-2025 12:52:24.814) (total time: 11211ms):
Dec 01 12:52:36 crc kubenswrapper[4958]: Trace[2111309417]: [11.211118811s] [11.211118811s] END
Dec 01 12:52:36 crc kubenswrapper[4958]: I1201 12:52:36.025910 4958 trace.go:236] Trace[866579779]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-cell1-server-0" (01-Dec-2025 12:52:24.408) (total time: 11617ms):
Dec 01 12:52:36 crc kubenswrapper[4958]: Trace[866579779]: [11.617308434s] [11.617308434s] END
Dec 01 12:52:36 crc kubenswrapper[4958]: I1201 12:52:36.026126 4958 trace.go:236] Trace[903500679]: "Calculate volume metrics of ovndbcluster-sb-etc-ovn for pod openstack/ovsdbserver-sb-0" (01-Dec-2025 12:52:26.147) (total time: 9878ms):
Dec 01 12:52:36 crc kubenswrapper[4958]: Trace[903500679]: [9.878389241s] [9.878389241s] END
Dec 01 12:52:36 crc kubenswrapper[4958]: I1201 12:52:36.025779 4958 trace.go:236] Trace[1013355173]: "Calculate volume metrics of ovndbcluster-sb-etc-ovn for pod openstack/ovsdbserver-sb-2" (01-Dec-2025 12:52:17.901) (total time: 18124ms):
Dec 01 12:52:36 crc kubenswrapper[4958]: Trace[1013355173]: [18.124646251s] [18.124646251s] END
Dec 01 12:53:58 crc kubenswrapper[4958]: I1201 12:53:58.210773 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 12:53:58 crc kubenswrapper[4958]: I1201 12:53:58.211592 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 12:53:58 crc kubenswrapper[4958]: I1201 12:53:58.236815 4958 generic.go:334] "Generic (PLEG): container finished" podID="dd221c10-e588-4b0e-9b5e-94735a063bac" containerID="aa74fca35db7fb4b1fec0f381e4bfc69cf6fa978c3c59e409123a29963bca06d" exitCode=0
Dec 01 12:53:58 crc kubenswrapper[4958]: I1201 12:53:58.236942 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth" event={"ID":"dd221c10-e588-4b0e-9b5e-94735a063bac","Type":"ContainerDied","Data":"aa74fca35db7fb4b1fec0f381e4bfc69cf6fa978c3c59e409123a29963bca06d"}
Dec 01 12:53:59 crc kubenswrapper[4958]: I1201 12:53:59.857172 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:53:59 crc kubenswrapper[4958]: I1201 12:53:59.976098 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-ssh-key\") pod \"dd221c10-e588-4b0e-9b5e-94735a063bac\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") "
Dec 01 12:53:59 crc kubenswrapper[4958]: I1201 12:53:59.976195 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-migration-ssh-key-1\") pod \"dd221c10-e588-4b0e-9b5e-94735a063bac\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") "
Dec 01 12:53:59 crc kubenswrapper[4958]: I1201 12:53:59.976293 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-579r6\" (UniqueName: \"kubernetes.io/projected/dd221c10-e588-4b0e-9b5e-94735a063bac-kube-api-access-579r6\") pod \"dd221c10-e588-4b0e-9b5e-94735a063bac\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") "
Dec 01 12:53:59 crc kubenswrapper[4958]: I1201 12:53:59.976325 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-migration-ssh-key-0\") pod \"dd221c10-e588-4b0e-9b5e-94735a063bac\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") "
Dec 01 12:53:59 crc kubenswrapper[4958]: I1201 12:53:59.976355 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-ceph\") pod \"dd221c10-e588-4b0e-9b5e-94735a063bac\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") "
Dec 01 12:53:59 crc kubenswrapper[4958]: I1201 12:53:59.976386 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cells-global-config-0\") pod \"dd221c10-e588-4b0e-9b5e-94735a063bac\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") "
Dec 01 12:53:59 crc kubenswrapper[4958]: I1201 12:53:59.976428 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cell1-compute-config-1\") pod \"dd221c10-e588-4b0e-9b5e-94735a063bac\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") "
Dec 01 12:53:59 crc kubenswrapper[4958]: I1201 12:53:59.976475 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cells-global-config-1\") pod \"dd221c10-e588-4b0e-9b5e-94735a063bac\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") "
Dec 01 12:53:59 crc kubenswrapper[4958]: I1201 12:53:59.976532 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-inventory\") pod \"dd221c10-e588-4b0e-9b5e-94735a063bac\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") "
Dec 01 12:53:59 crc kubenswrapper[4958]: I1201 12:53:59.976579 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cell1-combined-ca-bundle\") pod \"dd221c10-e588-4b0e-9b5e-94735a063bac\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") "
Dec 01 12:53:59 crc kubenswrapper[4958]: I1201 12:53:59.976677 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cell1-compute-config-0\") pod \"dd221c10-e588-4b0e-9b5e-94735a063bac\" (UID: \"dd221c10-e588-4b0e-9b5e-94735a063bac\") "
Dec 01 12:53:59 crc kubenswrapper[4958]: I1201 12:53:59.983014 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-ceph" (OuterVolumeSpecName: "ceph") pod "dd221c10-e588-4b0e-9b5e-94735a063bac" (UID: "dd221c10-e588-4b0e-9b5e-94735a063bac"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 12:53:59 crc kubenswrapper[4958]: I1201 12:53:59.983795 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd221c10-e588-4b0e-9b5e-94735a063bac-kube-api-access-579r6" (OuterVolumeSpecName: "kube-api-access-579r6") pod "dd221c10-e588-4b0e-9b5e-94735a063bac" (UID: "dd221c10-e588-4b0e-9b5e-94735a063bac"). InnerVolumeSpecName "kube-api-access-579r6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 12:53:59 crc kubenswrapper[4958]: I1201 12:53:59.996478 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "dd221c10-e588-4b0e-9b5e-94735a063bac" (UID: "dd221c10-e588-4b0e-9b5e-94735a063bac"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.014932 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "dd221c10-e588-4b0e-9b5e-94735a063bac" (UID: "dd221c10-e588-4b0e-9b5e-94735a063bac"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.015489 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "dd221c10-e588-4b0e-9b5e-94735a063bac" (UID: "dd221c10-e588-4b0e-9b5e-94735a063bac"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.018836 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "dd221c10-e588-4b0e-9b5e-94735a063bac" (UID: "dd221c10-e588-4b0e-9b5e-94735a063bac"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.020225 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "dd221c10-e588-4b0e-9b5e-94735a063bac" (UID: "dd221c10-e588-4b0e-9b5e-94735a063bac"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.022801 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "dd221c10-e588-4b0e-9b5e-94735a063bac" (UID: "dd221c10-e588-4b0e-9b5e-94735a063bac"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.025512 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dd221c10-e588-4b0e-9b5e-94735a063bac" (UID: "dd221c10-e588-4b0e-9b5e-94735a063bac"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.026944 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "dd221c10-e588-4b0e-9b5e-94735a063bac" (UID: "dd221c10-e588-4b0e-9b5e-94735a063bac"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.034610 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-inventory" (OuterVolumeSpecName: "inventory") pod "dd221c10-e588-4b0e-9b5e-94735a063bac" (UID: "dd221c10-e588-4b0e-9b5e-94735a063bac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.080509 4958 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.080552 4958 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.080562 4958 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.080573 4958 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-inventory\") on node \"crc\" DevicePath \"\""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.080582 4958 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.080592 4958 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.080604 4958 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.080617 4958 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.080625 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-579r6\" (UniqueName: \"kubernetes.io/projected/dd221c10-e588-4b0e-9b5e-94735a063bac-kube-api-access-579r6\") on node \"crc\" DevicePath \"\""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.080633 4958 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.080642 4958 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd221c10-e588-4b0e-9b5e-94735a063bac-ceph\") on node \"crc\" DevicePath \"\""
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.265799 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth" event={"ID":"dd221c10-e588-4b0e-9b5e-94735a063bac","Type":"ContainerDied","Data":"51a181114a8d6e882633fe99914440841aa7988f1ec6178c8f1b70350d3288ff"}
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.265889 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51a181114a8d6e882633fe99914440841aa7988f1ec6178c8f1b70350d3288ff"
Dec 01 12:54:00 crc kubenswrapper[4958]: I1201 12:54:00.265913 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth"
Dec 01 12:54:27 crc kubenswrapper[4958]: E1201 12:54:27.350535 4958 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.216:52986->38.129.56.216:34693: write tcp 38.129.56.216:52986->38.129.56.216:34693: write: broken pipe
Dec 01 12:54:28 crc kubenswrapper[4958]: I1201 12:54:28.211280 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 12:54:28 crc kubenswrapper[4958]: I1201 12:54:28.211787 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.183154 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9xs96"]
Dec 01 12:54:40 crc kubenswrapper[4958]: E1201 12:54:40.187467 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24bc641f-7270-472f-8158-7d09bd42e254" containerName="registry-server"
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.187503 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="24bc641f-7270-472f-8158-7d09bd42e254" containerName="registry-server"
Dec 01 12:54:40 crc kubenswrapper[4958]: E1201 12:54:40.187547 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24bc641f-7270-472f-8158-7d09bd42e254" containerName="extract-utilities"
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.187554 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="24bc641f-7270-472f-8158-7d09bd42e254" containerName="extract-utilities"
Dec 01 12:54:40 crc kubenswrapper[4958]: E1201 12:54:40.187571 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24bc641f-7270-472f-8158-7d09bd42e254" containerName="extract-content"
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.187576 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="24bc641f-7270-472f-8158-7d09bd42e254" containerName="extract-content"
Dec 01 12:54:40 crc kubenswrapper[4958]: E1201 12:54:40.187604 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd221c10-e588-4b0e-9b5e-94735a063bac" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.187614 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd221c10-e588-4b0e-9b5e-94735a063bac" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.187942 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd221c10-e588-4b0e-9b5e-94735a063bac" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.187966 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="24bc641f-7270-472f-8158-7d09bd42e254" containerName="registry-server"
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.193553 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9xs96"
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.205465 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9xs96"]
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.293558 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164fd33a-a75c-4343-a362-2853c914b9fe-catalog-content\") pod \"community-operators-9xs96\" (UID: \"164fd33a-a75c-4343-a362-2853c914b9fe\") " pod="openshift-marketplace/community-operators-9xs96"
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.293619 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9r4d\" (UniqueName: \"kubernetes.io/projected/164fd33a-a75c-4343-a362-2853c914b9fe-kube-api-access-w9r4d\") pod \"community-operators-9xs96\" (UID: \"164fd33a-a75c-4343-a362-2853c914b9fe\") " pod="openshift-marketplace/community-operators-9xs96"
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.293699 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164fd33a-a75c-4343-a362-2853c914b9fe-utilities\") pod \"community-operators-9xs96\" (UID: \"164fd33a-a75c-4343-a362-2853c914b9fe\") " pod="openshift-marketplace/community-operators-9xs96"
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.395463 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164fd33a-a75c-4343-a362-2853c914b9fe-utilities\") pod \"community-operators-9xs96\" (UID: \"164fd33a-a75c-4343-a362-2853c914b9fe\") " pod="openshift-marketplace/community-operators-9xs96"
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.395643 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164fd33a-a75c-4343-a362-2853c914b9fe-catalog-content\") pod \"community-operators-9xs96\" (UID: \"164fd33a-a75c-4343-a362-2853c914b9fe\") " pod="openshift-marketplace/community-operators-9xs96"
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.395686 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9r4d\" (UniqueName: \"kubernetes.io/projected/164fd33a-a75c-4343-a362-2853c914b9fe-kube-api-access-w9r4d\") pod \"community-operators-9xs96\" (UID: \"164fd33a-a75c-4343-a362-2853c914b9fe\") " pod="openshift-marketplace/community-operators-9xs96"
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.396449 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164fd33a-a75c-4343-a362-2853c914b9fe-utilities\") pod \"community-operators-9xs96\" (UID: \"164fd33a-a75c-4343-a362-2853c914b9fe\") " pod="openshift-marketplace/community-operators-9xs96"
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.396710 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164fd33a-a75c-4343-a362-2853c914b9fe-catalog-content\") pod \"community-operators-9xs96\" (UID: \"164fd33a-a75c-4343-a362-2853c914b9fe\") " pod="openshift-marketplace/community-operators-9xs96"
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.419543 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9r4d\" (UniqueName: \"kubernetes.io/projected/164fd33a-a75c-4343-a362-2853c914b9fe-kube-api-access-w9r4d\") pod \"community-operators-9xs96\" (UID: \"164fd33a-a75c-4343-a362-2853c914b9fe\") " pod="openshift-marketplace/community-operators-9xs96"
Dec 01 12:54:40 crc kubenswrapper[4958]: I1201 12:54:40.516257 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9xs96"
Dec 01 12:54:41 crc kubenswrapper[4958]: I1201 12:54:41.133612 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9xs96"]
Dec 01 12:54:41 crc kubenswrapper[4958]: W1201 12:54:41.135883 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod164fd33a_a75c_4343_a362_2853c914b9fe.slice/crio-05ea4b8006980b8d9632de1d4dbd2b490c67fc1aa22b938fbc8779fe89038f20 WatchSource:0}: Error finding container 05ea4b8006980b8d9632de1d4dbd2b490c67fc1aa22b938fbc8779fe89038f20: Status 404 returned error can't find the container with id 05ea4b8006980b8d9632de1d4dbd2b490c67fc1aa22b938fbc8779fe89038f20
Dec 01 12:54:41 crc kubenswrapper[4958]: I1201 12:54:41.967207 4958 generic.go:334] "Generic (PLEG): container finished" podID="164fd33a-a75c-4343-a362-2853c914b9fe" containerID="f4cf078fcfff885a1c87c5788b91497cf931c4601529532510fac34209162f3d" exitCode=0
Dec 01 12:54:41 crc kubenswrapper[4958]: I1201 12:54:41.967298 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xs96" event={"ID":"164fd33a-a75c-4343-a362-2853c914b9fe","Type":"ContainerDied","Data":"f4cf078fcfff885a1c87c5788b91497cf931c4601529532510fac34209162f3d"}
Dec 01 12:54:41 crc kubenswrapper[4958]: I1201 12:54:41.967673 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xs96" event={"ID":"164fd33a-a75c-4343-a362-2853c914b9fe","Type":"ContainerStarted","Data":"05ea4b8006980b8d9632de1d4dbd2b490c67fc1aa22b938fbc8779fe89038f20"}
Dec 01 12:54:43 crc kubenswrapper[4958]: I1201 12:54:43.994255 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xs96" event={"ID":"164fd33a-a75c-4343-a362-2853c914b9fe","Type":"ContainerStarted","Data":"aa1fc14c6f35fee85fd5b097c008ff515298a62ffbaaba1ec96d6d309df4c458"}
Dec 01 12:54:45 crc kubenswrapper[4958]: I1201 12:54:45.005546 4958 generic.go:334] "Generic (PLEG): container finished" podID="164fd33a-a75c-4343-a362-2853c914b9fe" containerID="aa1fc14c6f35fee85fd5b097c008ff515298a62ffbaaba1ec96d6d309df4c458" exitCode=0
Dec 01 12:54:45 crc kubenswrapper[4958]: I1201 12:54:45.005617 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xs96" event={"ID":"164fd33a-a75c-4343-a362-2853c914b9fe","Type":"ContainerDied","Data":"aa1fc14c6f35fee85fd5b097c008ff515298a62ffbaaba1ec96d6d309df4c458"}
Dec 01 12:54:45 crc kubenswrapper[4958]: I1201 12:54:45.008697 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 12:54:46 crc kubenswrapper[4958]: I1201 12:54:46.016693 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xs96" event={"ID":"164fd33a-a75c-4343-a362-2853c914b9fe","Type":"ContainerStarted","Data":"7f8e18c1bd25119556efd515c8023001d7e0c82357c5057526c415cd88783920"}
Dec 01 12:54:46 crc kubenswrapper[4958]: I1201 12:54:46.037029 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9xs96" podStartSLOduration=2.455890518 podStartE2EDuration="6.037012721s" podCreationTimestamp="2025-12-01 12:54:40 +0000 UTC" firstStartedPulling="2025-12-01 12:54:41.969790147 +0000 UTC m=+10529.478579194" lastFinishedPulling="2025-12-01 12:54:45.55091232 +0000 UTC m=+10533.059701397" observedRunningTime="2025-12-01 12:54:46.033827581 +0000 UTC m=+10533.542616608" watchObservedRunningTime="2025-12-01 12:54:46.037012721 +0000 UTC m=+10533.545801758"
Dec 01 12:54:50 crc kubenswrapper[4958]: I1201 12:54:50.517266 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9xs96"
Dec 01 12:54:50 crc kubenswrapper[4958]: I1201 12:54:50.517769 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9xs96"
Dec 01 12:54:50 crc kubenswrapper[4958]: I1201 12:54:50.646681 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9xs96"
Dec 01 12:54:51 crc kubenswrapper[4958]: I1201 12:54:51.173016 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9xs96"
Dec 01 12:54:54 crc kubenswrapper[4958]: I1201 12:54:54.179186 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9xs96"]
Dec 01 12:54:54 crc kubenswrapper[4958]: I1201 12:54:54.180204 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9xs96" podUID="164fd33a-a75c-4343-a362-2853c914b9fe" containerName="registry-server" containerID="cri-o://7f8e18c1bd25119556efd515c8023001d7e0c82357c5057526c415cd88783920" gracePeriod=2
Dec 01 12:54:55 crc kubenswrapper[4958]: I1201 12:54:55.169224 4958 generic.go:334] "Generic (PLEG): container finished" podID="164fd33a-a75c-4343-a362-2853c914b9fe" containerID="7f8e18c1bd25119556efd515c8023001d7e0c82357c5057526c415cd88783920" exitCode=0
Dec 01 12:54:55 crc kubenswrapper[4958]: I1201 12:54:55.169741 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xs96" event={"ID":"164fd33a-a75c-4343-a362-2853c914b9fe","Type":"ContainerDied","Data":"7f8e18c1bd25119556efd515c8023001d7e0c82357c5057526c415cd88783920"}
Dec 01 12:54:55 crc kubenswrapper[4958]: I1201 12:54:55.330198 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9xs96"
Dec 01 12:54:55 crc kubenswrapper[4958]: I1201 12:54:55.449053 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164fd33a-a75c-4343-a362-2853c914b9fe-utilities\") pod \"164fd33a-a75c-4343-a362-2853c914b9fe\" (UID: \"164fd33a-a75c-4343-a362-2853c914b9fe\") "
Dec 01 12:54:55 crc kubenswrapper[4958]: I1201 12:54:55.449147 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164fd33a-a75c-4343-a362-2853c914b9fe-catalog-content\") pod \"164fd33a-a75c-4343-a362-2853c914b9fe\" (UID: \"164fd33a-a75c-4343-a362-2853c914b9fe\") "
Dec 01 12:54:55 crc kubenswrapper[4958]: I1201 12:54:55.449259 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9r4d\" (UniqueName: \"kubernetes.io/projected/164fd33a-a75c-4343-a362-2853c914b9fe-kube-api-access-w9r4d\") pod \"164fd33a-a75c-4343-a362-2853c914b9fe\" (UID: \"164fd33a-a75c-4343-a362-2853c914b9fe\") "
Dec 01 12:54:55 crc kubenswrapper[4958]: I1201 12:54:55.451150 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/164fd33a-a75c-4343-a362-2853c914b9fe-utilities" (OuterVolumeSpecName: "utilities") pod "164fd33a-a75c-4343-a362-2853c914b9fe" (UID: "164fd33a-a75c-4343-a362-2853c914b9fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 12:54:55 crc kubenswrapper[4958]: I1201 12:54:55.465770 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/164fd33a-a75c-4343-a362-2853c914b9fe-kube-api-access-w9r4d" (OuterVolumeSpecName: "kube-api-access-w9r4d") pod "164fd33a-a75c-4343-a362-2853c914b9fe" (UID: "164fd33a-a75c-4343-a362-2853c914b9fe"). InnerVolumeSpecName "kube-api-access-w9r4d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 12:54:55 crc kubenswrapper[4958]: I1201 12:54:55.505529 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/164fd33a-a75c-4343-a362-2853c914b9fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "164fd33a-a75c-4343-a362-2853c914b9fe" (UID: "164fd33a-a75c-4343-a362-2853c914b9fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 12:54:55 crc kubenswrapper[4958]: I1201 12:54:55.552206 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164fd33a-a75c-4343-a362-2853c914b9fe-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 12:54:55 crc kubenswrapper[4958]: I1201 12:54:55.552261 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9r4d\" (UniqueName: \"kubernetes.io/projected/164fd33a-a75c-4343-a362-2853c914b9fe-kube-api-access-w9r4d\") on node \"crc\" DevicePath \"\""
Dec 01 12:54:55 crc kubenswrapper[4958]: I1201 12:54:55.552273 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164fd33a-a75c-4343-a362-2853c914b9fe-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 12:54:56 crc kubenswrapper[4958]: I1201 12:54:56.186879 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xs96" event={"ID":"164fd33a-a75c-4343-a362-2853c914b9fe","Type":"ContainerDied","Data":"05ea4b8006980b8d9632de1d4dbd2b490c67fc1aa22b938fbc8779fe89038f20"}
Dec 01 12:54:56 crc kubenswrapper[4958]: I1201 12:54:56.186932 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9xs96"
Dec 01 12:54:56 crc kubenswrapper[4958]: I1201 12:54:56.187266 4958 scope.go:117] "RemoveContainer" containerID="7f8e18c1bd25119556efd515c8023001d7e0c82357c5057526c415cd88783920"
Dec 01 12:54:56 crc kubenswrapper[4958]: I1201 12:54:56.227156 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9xs96"]
Dec 01 12:54:56 crc kubenswrapper[4958]: I1201 12:54:56.227647 4958 scope.go:117] "RemoveContainer" containerID="aa1fc14c6f35fee85fd5b097c008ff515298a62ffbaaba1ec96d6d309df4c458"
Dec 01 12:54:56 crc kubenswrapper[4958]: I1201 12:54:56.259811 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9xs96"]
Dec 01 12:54:56 crc kubenswrapper[4958]: I1201 12:54:56.265605 4958 scope.go:117] "RemoveContainer" containerID="f4cf078fcfff885a1c87c5788b91497cf931c4601529532510fac34209162f3d"
Dec 01 12:54:57 crc kubenswrapper[4958]: I1201 12:54:57.820636 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="164fd33a-a75c-4343-a362-2853c914b9fe" path="/var/lib/kubelet/pods/164fd33a-a75c-4343-a362-2853c914b9fe/volumes"
Dec 01 12:54:58 crc kubenswrapper[4958]: I1201 12:54:58.210220 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 12:54:58 crc kubenswrapper[4958]: I1201 12:54:58.210294 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 12:54:58 crc kubenswrapper[4958]: I1201 12:54:58.210348 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7"
Dec 01 12:54:58 crc kubenswrapper[4958]: I1201 12:54:58.211339 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 12:54:58 crc kubenswrapper[4958]: I1201 12:54:58.211420 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" gracePeriod=600
Dec 01 12:54:58 crc kubenswrapper[4958]: E1201 12:54:58.349142 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 12:54:59 crc kubenswrapper[4958]: I1201 12:54:59.236014 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" exitCode=0
Dec 01 12:54:59 crc kubenswrapper[4958]: I1201 12:54:59.236080 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0"}
Dec 01 12:54:59 crc kubenswrapper[4958]: I1201 12:54:59.236133 4958 scope.go:117] "RemoveContainer" containerID="8fdf0d21f9aae2e77dfd83f361949c9385cb5d79b387e817f1a93b26366cb3e7"
Dec 01 12:54:59 crc kubenswrapper[4958]: I1201 12:54:59.236965 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0"
Dec 01 12:54:59 crc kubenswrapper[4958]: E1201 12:54:59.237557 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 12:55:13 crc kubenswrapper[4958]: I1201 12:55:13.813482 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0"
Dec 01 12:55:13 crc kubenswrapper[4958]: E1201 12:55:13.814739 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 12:55:26 crc kubenswrapper[4958]: I1201 12:55:26.797272 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0"
Dec 01 12:55:26 crc kubenswrapper[4958]: E1201 12:55:26.798265 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 12:55:37 crc kubenswrapper[4958]: I1201 12:55:37.798223 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0"
Dec 01 12:55:37 crc kubenswrapper[4958]: E1201 12:55:37.799174 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 12:55:50 crc kubenswrapper[4958]: I1201 12:55:50.797866 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0"
Dec 01 12:55:50 crc kubenswrapper[4958]: E1201 12:55:50.798602 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 12:56:02 crc kubenswrapper[4958]: I1201 12:56:02.798009 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0"
Dec 01 12:56:02 crc kubenswrapper[4958]: E1201 12:56:02.799094 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 12:56:14 crc kubenswrapper[4958]: I1201 12:56:14.797101 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0"
Dec 01 12:56:14 crc kubenswrapper[4958]: E1201 12:56:14.797902 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"
Dec 01 12:56:20 crc kubenswrapper[4958]: I1201 12:56:20.702880 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"]
Dec 01 12:56:20 crc kubenswrapper[4958]: I1201 12:56:20.703776 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="ef1fedf1-c218-452e-83b7-f8eb03827765"
containerName="adoption" containerID="cri-o://b5ab2cf0b330215648c8e5778f6b7788e2ed8c440232a7d836b821c697f5bf11" gracePeriod=30 Dec 01 12:56:26 crc kubenswrapper[4958]: I1201 12:56:26.798089 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 12:56:26 crc kubenswrapper[4958]: E1201 12:56:26.798939 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:56:37 crc kubenswrapper[4958]: I1201 12:56:37.797667 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 12:56:37 crc kubenswrapper[4958]: E1201 12:56:37.798792 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:56:50 crc kubenswrapper[4958]: I1201 12:56:50.817550 4958 generic.go:334] "Generic (PLEG): container finished" podID="ef1fedf1-c218-452e-83b7-f8eb03827765" containerID="b5ab2cf0b330215648c8e5778f6b7788e2ed8c440232a7d836b821c697f5bf11" exitCode=137 Dec 01 12:56:50 crc kubenswrapper[4958]: I1201 12:56:50.817753 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"ef1fedf1-c218-452e-83b7-f8eb03827765","Type":"ContainerDied","Data":"b5ab2cf0b330215648c8e5778f6b7788e2ed8c440232a7d836b821c697f5bf11"} Dec 01 12:56:51 crc kubenswrapper[4958]: I1201 12:56:51.346473 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Dec 01 12:56:51 crc kubenswrapper[4958]: I1201 12:56:51.526502 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2676a438-0674-4780-a7f7-e4cbe158a3ed\") pod \"ef1fedf1-c218-452e-83b7-f8eb03827765\" (UID: \"ef1fedf1-c218-452e-83b7-f8eb03827765\") " Dec 01 12:56:51 crc kubenswrapper[4958]: I1201 12:56:51.527040 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2cwz\" (UniqueName: \"kubernetes.io/projected/ef1fedf1-c218-452e-83b7-f8eb03827765-kube-api-access-c2cwz\") pod \"ef1fedf1-c218-452e-83b7-f8eb03827765\" (UID: \"ef1fedf1-c218-452e-83b7-f8eb03827765\") " Dec 01 12:56:51 crc kubenswrapper[4958]: I1201 12:56:51.535952 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1fedf1-c218-452e-83b7-f8eb03827765-kube-api-access-c2cwz" (OuterVolumeSpecName: "kube-api-access-c2cwz") pod "ef1fedf1-c218-452e-83b7-f8eb03827765" (UID: "ef1fedf1-c218-452e-83b7-f8eb03827765"). InnerVolumeSpecName "kube-api-access-c2cwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:56:51 crc kubenswrapper[4958]: I1201 12:56:51.544978 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2676a438-0674-4780-a7f7-e4cbe158a3ed" (OuterVolumeSpecName: "mariadb-data") pod "ef1fedf1-c218-452e-83b7-f8eb03827765" (UID: "ef1fedf1-c218-452e-83b7-f8eb03827765"). InnerVolumeSpecName "pvc-2676a438-0674-4780-a7f7-e4cbe158a3ed". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 12:56:51 crc kubenswrapper[4958]: I1201 12:56:51.630888 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2cwz\" (UniqueName: \"kubernetes.io/projected/ef1fedf1-c218-452e-83b7-f8eb03827765-kube-api-access-c2cwz\") on node \"crc\" DevicePath \"\"" Dec 01 12:56:51 crc kubenswrapper[4958]: I1201 12:56:51.631026 4958 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2676a438-0674-4780-a7f7-e4cbe158a3ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2676a438-0674-4780-a7f7-e4cbe158a3ed\") on node \"crc\" " Dec 01 12:56:51 crc kubenswrapper[4958]: I1201 12:56:51.683436 4958 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 01 12:56:51 crc kubenswrapper[4958]: I1201 12:56:51.683650 4958 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2676a438-0674-4780-a7f7-e4cbe158a3ed" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2676a438-0674-4780-a7f7-e4cbe158a3ed") on node "crc" Dec 01 12:56:51 crc kubenswrapper[4958]: I1201 12:56:51.733642 4958 reconciler_common.go:293] "Volume detached for volume \"pvc-2676a438-0674-4780-a7f7-e4cbe158a3ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2676a438-0674-4780-a7f7-e4cbe158a3ed\") on node \"crc\" DevicePath \"\"" Dec 01 12:56:51 crc kubenswrapper[4958]: I1201 12:56:51.798310 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 12:56:51 crc kubenswrapper[4958]: E1201 12:56:51.799072 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:56:51 crc kubenswrapper[4958]: I1201 12:56:51.829917 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"ef1fedf1-c218-452e-83b7-f8eb03827765","Type":"ContainerDied","Data":"f824534decc60ffb8b1d04b9169f5ff66be47afe425a1fae2e1c92dc671c66d5"} Dec 01 12:56:51 crc kubenswrapper[4958]: I1201 12:56:51.829976 4958 scope.go:117] "RemoveContainer" containerID="b5ab2cf0b330215648c8e5778f6b7788e2ed8c440232a7d836b821c697f5bf11" Dec 01 12:56:51 crc kubenswrapper[4958]: I1201 12:56:51.830107 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 01 12:56:51 crc kubenswrapper[4958]: I1201 12:56:51.864627 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Dec 01 12:56:51 crc kubenswrapper[4958]: I1201 12:56:51.876223 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Dec 01 12:56:52 crc kubenswrapper[4958]: I1201 12:56:52.670786 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Dec 01 12:56:52 crc kubenswrapper[4958]: I1201 12:56:52.671266 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="56957e12-7edf-40ac-accd-bb5f1997e0ab" containerName="adoption" containerID="cri-o://52d7b0bd5baa41e9b5ec3be62efa178bcd91afe6898bc2ba32e70d97a6ec92eb" gracePeriod=30 Dec 01 12:56:53 crc kubenswrapper[4958]: I1201 12:56:53.815740 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef1fedf1-c218-452e-83b7-f8eb03827765" path="/var/lib/kubelet/pods/ef1fedf1-c218-452e-83b7-f8eb03827765/volumes" Dec 01 12:57:04 crc kubenswrapper[4958]: I1201 12:57:04.797988 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 12:57:04 crc kubenswrapper[4958]: E1201 12:57:04.798661 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:57:19 crc kubenswrapper[4958]: I1201 12:57:19.799013 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 12:57:19 crc kubenswrapper[4958]: E1201 12:57:19.800168 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.322343 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.361059 4958 generic.go:334] "Generic (PLEG): container finished" podID="56957e12-7edf-40ac-accd-bb5f1997e0ab" containerID="52d7b0bd5baa41e9b5ec3be62efa178bcd91afe6898bc2ba32e70d97a6ec92eb" exitCode=137 Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.361092 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.361117 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"56957e12-7edf-40ac-accd-bb5f1997e0ab","Type":"ContainerDied","Data":"52d7b0bd5baa41e9b5ec3be62efa178bcd91afe6898bc2ba32e70d97a6ec92eb"} Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.361182 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"56957e12-7edf-40ac-accd-bb5f1997e0ab","Type":"ContainerDied","Data":"7aa8b780f15d705b52ea62050fd99bfc8e342daaed08b90dcc7792661aaec969"} Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.361216 4958 scope.go:117] "RemoveContainer" containerID="52d7b0bd5baa41e9b5ec3be62efa178bcd91afe6898bc2ba32e70d97a6ec92eb" Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.387835 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5369ab31-22fb-4d2a-9544-e615f94c7a14\") pod \"56957e12-7edf-40ac-accd-bb5f1997e0ab\" (UID: \"56957e12-7edf-40ac-accd-bb5f1997e0ab\") " Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.387914 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpz22\" (UniqueName: \"kubernetes.io/projected/56957e12-7edf-40ac-accd-bb5f1997e0ab-kube-api-access-lpz22\") pod \"56957e12-7edf-40ac-accd-bb5f1997e0ab\" (UID: \"56957e12-7edf-40ac-accd-bb5f1997e0ab\") " Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.395570 4958 scope.go:117] "RemoveContainer" containerID="52d7b0bd5baa41e9b5ec3be62efa178bcd91afe6898bc2ba32e70d97a6ec92eb" Dec 01 12:57:23 crc kubenswrapper[4958]: E1201 12:57:23.396174 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52d7b0bd5baa41e9b5ec3be62efa178bcd91afe6898bc2ba32e70d97a6ec92eb\": container with ID starting with 52d7b0bd5baa41e9b5ec3be62efa178bcd91afe6898bc2ba32e70d97a6ec92eb not found: ID does not exist" containerID="52d7b0bd5baa41e9b5ec3be62efa178bcd91afe6898bc2ba32e70d97a6ec92eb" Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.396216 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52d7b0bd5baa41e9b5ec3be62efa178bcd91afe6898bc2ba32e70d97a6ec92eb"} err="failed to get container status \"52d7b0bd5baa41e9b5ec3be62efa178bcd91afe6898bc2ba32e70d97a6ec92eb\": rpc error: code = NotFound desc = could not find container \"52d7b0bd5baa41e9b5ec3be62efa178bcd91afe6898bc2ba32e70d97a6ec92eb\": container with ID starting with 52d7b0bd5baa41e9b5ec3be62efa178bcd91afe6898bc2ba32e70d97a6ec92eb not found: ID does not exist" Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.398330 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56957e12-7edf-40ac-accd-bb5f1997e0ab-kube-api-access-lpz22" (OuterVolumeSpecName: "kube-api-access-lpz22") pod "56957e12-7edf-40ac-accd-bb5f1997e0ab" (UID: "56957e12-7edf-40ac-accd-bb5f1997e0ab"). InnerVolumeSpecName "kube-api-access-lpz22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.411940 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5369ab31-22fb-4d2a-9544-e615f94c7a14" (OuterVolumeSpecName: "ovn-data") pod "56957e12-7edf-40ac-accd-bb5f1997e0ab" (UID: "56957e12-7edf-40ac-accd-bb5f1997e0ab"). InnerVolumeSpecName "pvc-5369ab31-22fb-4d2a-9544-e615f94c7a14". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.491759 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/56957e12-7edf-40ac-accd-bb5f1997e0ab-ovn-data-cert\") pod \"56957e12-7edf-40ac-accd-bb5f1997e0ab\" (UID: \"56957e12-7edf-40ac-accd-bb5f1997e0ab\") " Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.493047 4958 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-5369ab31-22fb-4d2a-9544-e615f94c7a14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5369ab31-22fb-4d2a-9544-e615f94c7a14\") on node \"crc\" " Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.493146 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpz22\" (UniqueName: \"kubernetes.io/projected/56957e12-7edf-40ac-accd-bb5f1997e0ab-kube-api-access-lpz22\") on node \"crc\" DevicePath \"\"" Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.500487 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56957e12-7edf-40ac-accd-bb5f1997e0ab-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "56957e12-7edf-40ac-accd-bb5f1997e0ab" (UID: "56957e12-7edf-40ac-accd-bb5f1997e0ab"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.524821 4958 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.525035 4958 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-5369ab31-22fb-4d2a-9544-e615f94c7a14" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5369ab31-22fb-4d2a-9544-e615f94c7a14") on node "crc" Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.595649 4958 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/56957e12-7edf-40ac-accd-bb5f1997e0ab-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.595684 4958 reconciler_common.go:293] "Volume detached for volume \"pvc-5369ab31-22fb-4d2a-9544-e615f94c7a14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5369ab31-22fb-4d2a-9544-e615f94c7a14\") on node \"crc\" DevicePath \"\"" Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.722229 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.736245 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Dec 01 12:57:23 crc kubenswrapper[4958]: I1201 12:57:23.813759 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56957e12-7edf-40ac-accd-bb5f1997e0ab" path="/var/lib/kubelet/pods/56957e12-7edf-40ac-accd-bb5f1997e0ab/volumes" Dec 01 12:57:30 crc kubenswrapper[4958]: I1201 12:57:30.798409 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 12:57:30 crc kubenswrapper[4958]: E1201 12:57:30.799636 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:57:43 crc kubenswrapper[4958]: I1201 12:57:43.825963 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 12:57:43 crc kubenswrapper[4958]: E1201 12:57:43.827204 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:57:56 crc kubenswrapper[4958]: I1201 12:57:56.797755 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 12:57:56 crc kubenswrapper[4958]: E1201 12:57:56.798550 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:58:08 crc kubenswrapper[4958]: I1201 12:58:08.797333 4958 scope.go:117] "RemoveContainer" 
containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 12:58:08 crc kubenswrapper[4958]: E1201 12:58:08.798132 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:58:20 crc kubenswrapper[4958]: I1201 12:58:20.799042 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 12:58:20 crc kubenswrapper[4958]: E1201 12:58:20.800359 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.680750 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-st52w/must-gather-99vcj"] Dec 01 12:58:30 crc kubenswrapper[4958]: E1201 12:58:30.682122 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="164fd33a-a75c-4343-a362-2853c914b9fe" containerName="extract-content" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.682146 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="164fd33a-a75c-4343-a362-2853c914b9fe" containerName="extract-content" Dec 01 12:58:30 crc kubenswrapper[4958]: E1201 12:58:30.682206 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="164fd33a-a75c-4343-a362-2853c914b9fe" containerName="extract-utilities" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.682216 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="164fd33a-a75c-4343-a362-2853c914b9fe" containerName="extract-utilities" Dec 01 12:58:30 crc kubenswrapper[4958]: E1201 12:58:30.682240 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="164fd33a-a75c-4343-a362-2853c914b9fe" containerName="registry-server" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.682249 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="164fd33a-a75c-4343-a362-2853c914b9fe" containerName="registry-server" Dec 01 12:58:30 crc kubenswrapper[4958]: E1201 12:58:30.682271 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56957e12-7edf-40ac-accd-bb5f1997e0ab" containerName="adoption" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.682278 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="56957e12-7edf-40ac-accd-bb5f1997e0ab" containerName="adoption" Dec 01 12:58:30 crc kubenswrapper[4958]: E1201 12:58:30.682293 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef1fedf1-c218-452e-83b7-f8eb03827765" containerName="adoption" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.682301 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1fedf1-c218-452e-83b7-f8eb03827765" containerName="adoption" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.682596 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef1fedf1-c218-452e-83b7-f8eb03827765" 
containerName="adoption" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.682618 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="164fd33a-a75c-4343-a362-2853c914b9fe" containerName="registry-server" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.682643 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="56957e12-7edf-40ac-accd-bb5f1997e0ab" containerName="adoption" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.684634 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-st52w/must-gather-99vcj" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.687569 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-st52w"/"openshift-service-ca.crt" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.687818 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-st52w"/"default-dockercfg-5rtqs" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.688120 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-st52w"/"kube-root-ca.crt" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.710910 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-st52w/must-gather-99vcj"] Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.847716 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/952cd42a-be46-45c1-93d6-f5dc8469cf34-must-gather-output\") pod \"must-gather-99vcj\" (UID: \"952cd42a-be46-45c1-93d6-f5dc8469cf34\") " pod="openshift-must-gather-st52w/must-gather-99vcj" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.848670 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrgn2\" (UniqueName: \"kubernetes.io/projected/952cd42a-be46-45c1-93d6-f5dc8469cf34-kube-api-access-xrgn2\") pod \"must-gather-99vcj\" (UID: \"952cd42a-be46-45c1-93d6-f5dc8469cf34\") " pod="openshift-must-gather-st52w/must-gather-99vcj" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.950301 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrgn2\" (UniqueName: \"kubernetes.io/projected/952cd42a-be46-45c1-93d6-f5dc8469cf34-kube-api-access-xrgn2\") pod \"must-gather-99vcj\" (UID: \"952cd42a-be46-45c1-93d6-f5dc8469cf34\") " pod="openshift-must-gather-st52w/must-gather-99vcj" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.950398 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/952cd42a-be46-45c1-93d6-f5dc8469cf34-must-gather-output\") pod \"must-gather-99vcj\" (UID: \"952cd42a-be46-45c1-93d6-f5dc8469cf34\") " pod="openshift-must-gather-st52w/must-gather-99vcj" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.950865 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/952cd42a-be46-45c1-93d6-f5dc8469cf34-must-gather-output\") pod \"must-gather-99vcj\" (UID: \"952cd42a-be46-45c1-93d6-f5dc8469cf34\") " pod="openshift-must-gather-st52w/must-gather-99vcj" Dec 01 12:58:30 crc kubenswrapper[4958]: I1201 12:58:30.974243 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrgn2\" (UniqueName: 
\"kubernetes.io/projected/952cd42a-be46-45c1-93d6-f5dc8469cf34-kube-api-access-xrgn2\") pod \"must-gather-99vcj\" (UID: \"952cd42a-be46-45c1-93d6-f5dc8469cf34\") " pod="openshift-must-gather-st52w/must-gather-99vcj" Dec 01 12:58:31 crc kubenswrapper[4958]: I1201 12:58:31.034911 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-st52w/must-gather-99vcj" Dec 01 12:58:31 crc kubenswrapper[4958]: I1201 12:58:31.534146 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-st52w/must-gather-99vcj"] Dec 01 12:58:32 crc kubenswrapper[4958]: I1201 12:58:32.350167 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-st52w/must-gather-99vcj" event={"ID":"952cd42a-be46-45c1-93d6-f5dc8469cf34","Type":"ContainerStarted","Data":"dabd862e11ed7f0b3ba0141abe0330ed2dd510aeced3bba207abb9e02bbdb946"} Dec 01 12:58:32 crc kubenswrapper[4958]: I1201 12:58:32.797764 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 12:58:32 crc kubenswrapper[4958]: E1201 12:58:32.798398 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:58:37 crc kubenswrapper[4958]: I1201 12:58:37.411253 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-st52w/must-gather-99vcj" event={"ID":"952cd42a-be46-45c1-93d6-f5dc8469cf34","Type":"ContainerStarted","Data":"719a64633842a588c867d8e512ad32bb91fb23f058cb79134ff949b8780f0ebf"} Dec 01 12:58:37 crc kubenswrapper[4958]: I1201 12:58:37.411872 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-st52w/must-gather-99vcj" event={"ID":"952cd42a-be46-45c1-93d6-f5dc8469cf34","Type":"ContainerStarted","Data":"586e5f3ecba9851f074851b86405cc99c828f89689625426fc5d707449707117"} Dec 01 12:58:37 crc kubenswrapper[4958]: I1201 12:58:37.439439 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-st52w/must-gather-99vcj" podStartSLOduration=2.806362776 podStartE2EDuration="7.439410092s" podCreationTimestamp="2025-12-01 12:58:30 +0000 UTC" firstStartedPulling="2025-12-01 12:58:31.534000533 +0000 UTC m=+10759.042789570" lastFinishedPulling="2025-12-01 12:58:36.167047839 +0000 UTC m=+10763.675836886" observedRunningTime="2025-12-01 12:58:37.42548273 +0000 UTC m=+10764.934271797" watchObservedRunningTime="2025-12-01 12:58:37.439410092 +0000 UTC m=+10764.948199149" Dec 01 12:58:39 crc kubenswrapper[4958]: E1201 12:58:39.505198 4958 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.216:46482->38.129.56.216:34693: write tcp 38.129.56.216:46482->38.129.56.216:34693: write: broken pipe Dec 01 12:58:41 crc kubenswrapper[4958]: I1201 12:58:41.130457 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-st52w/crc-debug-rrdw2"] Dec 01 12:58:41 crc kubenswrapper[4958]: I1201 12:58:41.133217 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-st52w/crc-debug-rrdw2" Dec 01 12:58:41 crc kubenswrapper[4958]: I1201 12:58:41.335084 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lvnp\" (UniqueName: \"kubernetes.io/projected/a3974759-ecbe-4833-ac5e-2d417f74cfd9-kube-api-access-8lvnp\") pod \"crc-debug-rrdw2\" (UID: \"a3974759-ecbe-4833-ac5e-2d417f74cfd9\") " pod="openshift-must-gather-st52w/crc-debug-rrdw2" Dec 01 12:58:41 crc kubenswrapper[4958]: I1201 12:58:41.335412 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3974759-ecbe-4833-ac5e-2d417f74cfd9-host\") pod \"crc-debug-rrdw2\" (UID: \"a3974759-ecbe-4833-ac5e-2d417f74cfd9\") " pod="openshift-must-gather-st52w/crc-debug-rrdw2" Dec 01 12:58:41 crc kubenswrapper[4958]: I1201 12:58:41.437334 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3974759-ecbe-4833-ac5e-2d417f74cfd9-host\") pod \"crc-debug-rrdw2\" (UID: \"a3974759-ecbe-4833-ac5e-2d417f74cfd9\") " pod="openshift-must-gather-st52w/crc-debug-rrdw2" Dec 01 12:58:41 crc kubenswrapper[4958]: I1201 12:58:41.437535 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lvnp\" (UniqueName: \"kubernetes.io/projected/a3974759-ecbe-4833-ac5e-2d417f74cfd9-kube-api-access-8lvnp\") pod \"crc-debug-rrdw2\" (UID: \"a3974759-ecbe-4833-ac5e-2d417f74cfd9\") " pod="openshift-must-gather-st52w/crc-debug-rrdw2" Dec 01 12:58:41 crc kubenswrapper[4958]: I1201 12:58:41.437707 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3974759-ecbe-4833-ac5e-2d417f74cfd9-host\") pod \"crc-debug-rrdw2\" (UID: \"a3974759-ecbe-4833-ac5e-2d417f74cfd9\") " pod="openshift-must-gather-st52w/crc-debug-rrdw2" Dec 01 12:58:41 crc kubenswrapper[4958]: I1201 12:58:41.459529 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lvnp\" (UniqueName: \"kubernetes.io/projected/a3974759-ecbe-4833-ac5e-2d417f74cfd9-kube-api-access-8lvnp\") pod \"crc-debug-rrdw2\" (UID: \"a3974759-ecbe-4833-ac5e-2d417f74cfd9\") " pod="openshift-must-gather-st52w/crc-debug-rrdw2" Dec 01 12:58:41 crc kubenswrapper[4958]: I1201 12:58:41.757293 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-st52w/crc-debug-rrdw2" Dec 01 12:58:42 crc kubenswrapper[4958]: I1201 12:58:42.492874 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-st52w/crc-debug-rrdw2" event={"ID":"a3974759-ecbe-4833-ac5e-2d417f74cfd9","Type":"ContainerStarted","Data":"5b29e099b0299117aeb14ae16432ae0bd291fd7f60ad1dbd061ab4f8ea516428"} Dec 01 12:58:47 crc kubenswrapper[4958]: I1201 12:58:47.798811 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 12:58:47 crc kubenswrapper[4958]: E1201 12:58:47.800338 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:58:58 crc kubenswrapper[4958]: I1201 12:58:58.315299 4958 trace.go:236] Trace[1690253078]: "Calculate volume metrics of ovndbcluster-nb-etc-ovn for pod openstack/ovsdbserver-nb-1" (01-Dec-2025 12:58:49.204) (total time: 9110ms): Dec 01 12:58:58 crc kubenswrapper[4958]: Trace[1690253078]: [9.110467126s] [9.110467126s] END Dec 01 12:58:58 crc kubenswrapper[4958]: I1201 12:58:58.320811 4958 trace.go:236] Trace[1534549314]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-server-0" (01-Dec-2025 12:58:55.470) (total time: 2850ms): Dec 01 12:58:58 crc kubenswrapper[4958]: Trace[1534549314]: [2.850246719s] [2.850246719s] END Dec 01 12:58:58 crc kubenswrapper[4958]: I1201 12:58:58.797653 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 12:58:58 crc kubenswrapper[4958]: E1201 12:58:58.798401 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:59:04 crc kubenswrapper[4958]: E1201 12:59:04.338197 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Dec 01 12:59:04 crc kubenswrapper[4958]: E1201 12:59:04.339429 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls 
/var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8lvnp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-rrdw2_openshift-must-gather-st52w(a3974759-ecbe-4833-ac5e-2d417f74cfd9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 12:59:04 crc kubenswrapper[4958]: E1201 12:59:04.340638 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-st52w/crc-debug-rrdw2" podUID="a3974759-ecbe-4833-ac5e-2d417f74cfd9" Dec 01 12:59:04 crc kubenswrapper[4958]: E1201 12:59:04.924103 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-st52w/crc-debug-rrdw2" podUID="a3974759-ecbe-4833-ac5e-2d417f74cfd9" Dec 01 12:59:10 crc kubenswrapper[4958]: I1201 12:59:10.797772 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 12:59:10 crc kubenswrapper[4958]: E1201 12:59:10.798730 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:59:18 crc kubenswrapper[4958]: I1201 12:59:18.073724 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-st52w/crc-debug-rrdw2" event={"ID":"a3974759-ecbe-4833-ac5e-2d417f74cfd9","Type":"ContainerStarted","Data":"b936085f60b23773d8dca94b49ebfdbaa902ba5e5ecfc96dfc8960a529ad97b8"} Dec 01 12:59:18 crc kubenswrapper[4958]: I1201 12:59:18.098439 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-st52w/crc-debug-rrdw2" podStartSLOduration=1.564005462 podStartE2EDuration="37.098411702s" podCreationTimestamp="2025-12-01 12:58:41 +0000 UTC" firstStartedPulling="2025-12-01 12:58:41.800013906 +0000 UTC m=+10769.308802943" lastFinishedPulling="2025-12-01 12:59:17.334420146 +0000 UTC m=+10804.843209183" observedRunningTime="2025-12-01 12:59:18.098184706 +0000 UTC m=+10805.606973753" watchObservedRunningTime="2025-12-01 12:59:18.098411702 +0000 UTC m=+10805.607200729" Dec 01 12:59:23 crc kubenswrapper[4958]: I1201 12:59:23.807002 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 12:59:23 crc kubenswrapper[4958]: E1201 12:59:23.808019 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:59:35 crc kubenswrapper[4958]: I1201 12:59:35.591256 4958 generic.go:334] "Generic (PLEG): container finished" podID="a3974759-ecbe-4833-ac5e-2d417f74cfd9" containerID="b936085f60b23773d8dca94b49ebfdbaa902ba5e5ecfc96dfc8960a529ad97b8" exitCode=0 Dec 01 12:59:35 crc kubenswrapper[4958]: I1201 12:59:35.591522 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-st52w/crc-debug-rrdw2" event={"ID":"a3974759-ecbe-4833-ac5e-2d417f74cfd9","Type":"ContainerDied","Data":"b936085f60b23773d8dca94b49ebfdbaa902ba5e5ecfc96dfc8960a529ad97b8"} Dec 01 12:59:36 crc kubenswrapper[4958]: I1201 12:59:36.740177 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-st52w/crc-debug-rrdw2" Dec 01 12:59:36 crc kubenswrapper[4958]: I1201 12:59:36.787990 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-st52w/crc-debug-rrdw2"] Dec 01 12:59:36 crc kubenswrapper[4958]: I1201 12:59:36.806803 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-st52w/crc-debug-rrdw2"] Dec 01 12:59:36 crc kubenswrapper[4958]: I1201 12:59:36.862886 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lvnp\" (UniqueName: \"kubernetes.io/projected/a3974759-ecbe-4833-ac5e-2d417f74cfd9-kube-api-access-8lvnp\") pod \"a3974759-ecbe-4833-ac5e-2d417f74cfd9\" (UID: \"a3974759-ecbe-4833-ac5e-2d417f74cfd9\") " Dec 01 12:59:36 crc kubenswrapper[4958]: I1201 12:59:36.863214 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3974759-ecbe-4833-ac5e-2d417f74cfd9-host\") pod \"a3974759-ecbe-4833-ac5e-2d417f74cfd9\" (UID: \"a3974759-ecbe-4833-ac5e-2d417f74cfd9\") " Dec 01 12:59:36 crc kubenswrapper[4958]: I1201 12:59:36.863287 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3974759-ecbe-4833-ac5e-2d417f74cfd9-host" (OuterVolumeSpecName: "host") pod "a3974759-ecbe-4833-ac5e-2d417f74cfd9" (UID: "a3974759-ecbe-4833-ac5e-2d417f74cfd9"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 12:59:36 crc kubenswrapper[4958]: I1201 12:59:36.863654 4958 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3974759-ecbe-4833-ac5e-2d417f74cfd9-host\") on node \"crc\" DevicePath \"\"" Dec 01 12:59:36 crc kubenswrapper[4958]: I1201 12:59:36.870165 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3974759-ecbe-4833-ac5e-2d417f74cfd9-kube-api-access-8lvnp" (OuterVolumeSpecName: "kube-api-access-8lvnp") pod "a3974759-ecbe-4833-ac5e-2d417f74cfd9" (UID: "a3974759-ecbe-4833-ac5e-2d417f74cfd9"). InnerVolumeSpecName "kube-api-access-8lvnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:59:36 crc kubenswrapper[4958]: I1201 12:59:36.966076 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lvnp\" (UniqueName: \"kubernetes.io/projected/a3974759-ecbe-4833-ac5e-2d417f74cfd9-kube-api-access-8lvnp\") on node \"crc\" DevicePath \"\"" Dec 01 12:59:37 crc kubenswrapper[4958]: I1201 12:59:37.613097 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b29e099b0299117aeb14ae16432ae0bd291fd7f60ad1dbd061ab4f8ea516428" Dec 01 12:59:37 crc kubenswrapper[4958]: I1201 12:59:37.613169 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-st52w/crc-debug-rrdw2" Dec 01 12:59:37 crc kubenswrapper[4958]: I1201 12:59:37.799095 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 12:59:37 crc kubenswrapper[4958]: E1201 12:59:37.799649 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 12:59:37 crc kubenswrapper[4958]: I1201 12:59:37.818527 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3974759-ecbe-4833-ac5e-2d417f74cfd9" path="/var/lib/kubelet/pods/a3974759-ecbe-4833-ac5e-2d417f74cfd9/volumes" Dec 01 12:59:38 crc kubenswrapper[4958]: I1201 12:59:38.038795 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-st52w/crc-debug-xw4lf"] Dec 01 12:59:38 crc kubenswrapper[4958]: E1201 12:59:38.039279 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3974759-ecbe-4833-ac5e-2d417f74cfd9" containerName="container-00" Dec 01 12:59:38 crc kubenswrapper[4958]: I1201 12:59:38.039296 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3974759-ecbe-4833-ac5e-2d417f74cfd9" containerName="container-00" Dec 01 12:59:38 crc kubenswrapper[4958]: I1201 12:59:38.039540 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3974759-ecbe-4833-ac5e-2d417f74cfd9" containerName="container-00" Dec 01 12:59:38 crc kubenswrapper[4958]: I1201 12:59:38.040343 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-st52w/crc-debug-xw4lf" Dec 01 12:59:38 crc kubenswrapper[4958]: I1201 12:59:38.240759 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5fdh\" (UniqueName: \"kubernetes.io/projected/f5106adc-b6cb-4d7a-a8af-732fbcc9eb70-kube-api-access-j5fdh\") pod \"crc-debug-xw4lf\" (UID: \"f5106adc-b6cb-4d7a-a8af-732fbcc9eb70\") " pod="openshift-must-gather-st52w/crc-debug-xw4lf" Dec 01 12:59:38 crc kubenswrapper[4958]: I1201 12:59:38.241285 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5106adc-b6cb-4d7a-a8af-732fbcc9eb70-host\") pod \"crc-debug-xw4lf\" (UID: \"f5106adc-b6cb-4d7a-a8af-732fbcc9eb70\") " pod="openshift-must-gather-st52w/crc-debug-xw4lf" Dec 01 12:59:38 crc kubenswrapper[4958]: I1201 12:59:38.343794 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5fdh\" (UniqueName: \"kubernetes.io/projected/f5106adc-b6cb-4d7a-a8af-732fbcc9eb70-kube-api-access-j5fdh\") pod \"crc-debug-xw4lf\" (UID: \"f5106adc-b6cb-4d7a-a8af-732fbcc9eb70\") " pod="openshift-must-gather-st52w/crc-debug-xw4lf" Dec 01 12:59:38 crc kubenswrapper[4958]: I1201 12:59:38.343977 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5106adc-b6cb-4d7a-a8af-732fbcc9eb70-host\") pod \"crc-debug-xw4lf\" (UID: \"f5106adc-b6cb-4d7a-a8af-732fbcc9eb70\") " pod="openshift-must-gather-st52w/crc-debug-xw4lf" Dec 01 12:59:38 crc kubenswrapper[4958]: I1201 12:59:38.344205 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5106adc-b6cb-4d7a-a8af-732fbcc9eb70-host\") pod \"crc-debug-xw4lf\" (UID: \"f5106adc-b6cb-4d7a-a8af-732fbcc9eb70\") " pod="openshift-must-gather-st52w/crc-debug-xw4lf" Dec 01 12:59:38 crc kubenswrapper[4958]: I1201 12:59:38.378612 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5fdh\" (UniqueName: \"kubernetes.io/projected/f5106adc-b6cb-4d7a-a8af-732fbcc9eb70-kube-api-access-j5fdh\") pod \"crc-debug-xw4lf\" (UID: \"f5106adc-b6cb-4d7a-a8af-732fbcc9eb70\") " pod="openshift-must-gather-st52w/crc-debug-xw4lf" Dec 01 12:59:38 crc kubenswrapper[4958]: I1201 12:59:38.666530 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-st52w/crc-debug-xw4lf" Dec 01 12:59:38 crc kubenswrapper[4958]: W1201 12:59:38.710443 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5106adc_b6cb_4d7a_a8af_732fbcc9eb70.slice/crio-bf1938285a9bd346b9af7d157038c01c73e6dadfa3eaa740747ec58740710aa6 WatchSource:0}: Error finding container bf1938285a9bd346b9af7d157038c01c73e6dadfa3eaa740747ec58740710aa6: Status 404 returned error can't find the container with id bf1938285a9bd346b9af7d157038c01c73e6dadfa3eaa740747ec58740710aa6 Dec 01 12:59:39 crc kubenswrapper[4958]: I1201 12:59:39.643331 4958 generic.go:334] "Generic (PLEG): container finished" podID="f5106adc-b6cb-4d7a-a8af-732fbcc9eb70" containerID="65e3bf89e1bf75a6e489d57b253269949291320626b8da5a69bd4cf9aa704bea" exitCode=1 Dec 01 12:59:39 crc kubenswrapper[4958]: I1201 12:59:39.643564 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-st52w/crc-debug-xw4lf" event={"ID":"f5106adc-b6cb-4d7a-a8af-732fbcc9eb70","Type":"ContainerDied","Data":"65e3bf89e1bf75a6e489d57b253269949291320626b8da5a69bd4cf9aa704bea"} Dec 01 12:59:39 crc kubenswrapper[4958]: I1201 12:59:39.643661 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-st52w/crc-debug-xw4lf" event={"ID":"f5106adc-b6cb-4d7a-a8af-732fbcc9eb70","Type":"ContainerStarted","Data":"bf1938285a9bd346b9af7d157038c01c73e6dadfa3eaa740747ec58740710aa6"} Dec 01 12:59:39 crc kubenswrapper[4958]: I1201 12:59:39.701170 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-st52w/crc-debug-xw4lf"] Dec 01 12:59:39 crc kubenswrapper[4958]: I1201 12:59:39.714000 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-st52w/crc-debug-xw4lf"] Dec 01 12:59:40 crc kubenswrapper[4958]: I1201 12:59:40.777619 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-st52w/crc-debug-xw4lf" Dec 01 12:59:40 crc kubenswrapper[4958]: I1201 12:59:40.816926 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5106adc-b6cb-4d7a-a8af-732fbcc9eb70-host\") pod \"f5106adc-b6cb-4d7a-a8af-732fbcc9eb70\" (UID: \"f5106adc-b6cb-4d7a-a8af-732fbcc9eb70\") " Dec 01 12:59:40 crc kubenswrapper[4958]: I1201 12:59:40.816995 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5fdh\" (UniqueName: \"kubernetes.io/projected/f5106adc-b6cb-4d7a-a8af-732fbcc9eb70-kube-api-access-j5fdh\") pod \"f5106adc-b6cb-4d7a-a8af-732fbcc9eb70\" (UID: \"f5106adc-b6cb-4d7a-a8af-732fbcc9eb70\") " Dec 01 12:59:40 crc kubenswrapper[4958]: I1201 12:59:40.817048 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5106adc-b6cb-4d7a-a8af-732fbcc9eb70-host" (OuterVolumeSpecName: "host") pod "f5106adc-b6cb-4d7a-a8af-732fbcc9eb70" (UID: "f5106adc-b6cb-4d7a-a8af-732fbcc9eb70"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 12:59:40 crc kubenswrapper[4958]: I1201 12:59:40.817438 4958 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5106adc-b6cb-4d7a-a8af-732fbcc9eb70-host\") on node \"crc\" DevicePath \"\"" Dec 01 12:59:40 crc kubenswrapper[4958]: I1201 12:59:40.824069 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5106adc-b6cb-4d7a-a8af-732fbcc9eb70-kube-api-access-j5fdh" (OuterVolumeSpecName: "kube-api-access-j5fdh") pod "f5106adc-b6cb-4d7a-a8af-732fbcc9eb70" (UID: "f5106adc-b6cb-4d7a-a8af-732fbcc9eb70"). InnerVolumeSpecName "kube-api-access-j5fdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 12:59:40 crc kubenswrapper[4958]: I1201 12:59:40.919124 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5fdh\" (UniqueName: \"kubernetes.io/projected/f5106adc-b6cb-4d7a-a8af-732fbcc9eb70-kube-api-access-j5fdh\") on node \"crc\" DevicePath \"\"" Dec 01 12:59:41 crc kubenswrapper[4958]: I1201 12:59:41.670778 4958 scope.go:117] "RemoveContainer" containerID="65e3bf89e1bf75a6e489d57b253269949291320626b8da5a69bd4cf9aa704bea" Dec 01 12:59:41 crc kubenswrapper[4958]: I1201 12:59:41.671525 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-st52w/crc-debug-xw4lf" Dec 01 12:59:41 crc kubenswrapper[4958]: I1201 12:59:41.812531 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5106adc-b6cb-4d7a-a8af-732fbcc9eb70" path="/var/lib/kubelet/pods/f5106adc-b6cb-4d7a-a8af-732fbcc9eb70/volumes" Dec 01 12:59:51 crc kubenswrapper[4958]: I1201 12:59:51.798423 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 12:59:51 crc kubenswrapper[4958]: E1201 12:59:51.799675 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:00:00 crc kubenswrapper[4958]: I1201 13:00:00.168611 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57"] Dec 01 13:00:00 crc kubenswrapper[4958]: E1201 13:00:00.169805 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5106adc-b6cb-4d7a-a8af-732fbcc9eb70" containerName="container-00" Dec 01 13:00:00 crc kubenswrapper[4958]: I1201 13:00:00.169824 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5106adc-b6cb-4d7a-a8af-732fbcc9eb70" containerName="container-00" Dec 01 13:00:00 crc kubenswrapper[4958]: I1201 13:00:00.170117 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5106adc-b6cb-4d7a-a8af-732fbcc9eb70" containerName="container-00" Dec 01 13:00:00 crc kubenswrapper[4958]: I1201 13:00:00.171093 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57" Dec 01 13:00:00 crc kubenswrapper[4958]: I1201 13:00:00.173970 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 13:00:00 crc kubenswrapper[4958]: I1201 13:00:00.174078 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 13:00:00 crc kubenswrapper[4958]: I1201 13:00:00.186315 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57"] Dec 01 13:00:00 crc kubenswrapper[4958]: I1201 13:00:00.230723 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4772565-60fa-4417-9397-1d99fd767026-config-volume\") pod \"collect-profiles-29409900-zgq57\" (UID: \"c4772565-60fa-4417-9397-1d99fd767026\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57" Dec 01 13:00:00 crc kubenswrapper[4958]: I1201 13:00:00.230776 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7hfn\" (UniqueName: \"kubernetes.io/projected/c4772565-60fa-4417-9397-1d99fd767026-kube-api-access-f7hfn\") pod \"collect-profiles-29409900-zgq57\" (UID: \"c4772565-60fa-4417-9397-1d99fd767026\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57" Dec 01 13:00:00 crc kubenswrapper[4958]: I1201 13:00:00.230943 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4772565-60fa-4417-9397-1d99fd767026-secret-volume\") pod \"collect-profiles-29409900-zgq57\" (UID: \"c4772565-60fa-4417-9397-1d99fd767026\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57" Dec 01 13:00:00 crc kubenswrapper[4958]: I1201 13:00:00.333702 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4772565-60fa-4417-9397-1d99fd767026-secret-volume\") pod \"collect-profiles-29409900-zgq57\" (UID: \"c4772565-60fa-4417-9397-1d99fd767026\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57" Dec 01 13:00:00 crc kubenswrapper[4958]: I1201 13:00:00.333900 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4772565-60fa-4417-9397-1d99fd767026-config-volume\") pod \"collect-profiles-29409900-zgq57\" (UID: \"c4772565-60fa-4417-9397-1d99fd767026\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57" Dec 01 13:00:00 crc kubenswrapper[4958]: I1201 13:00:00.333932 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7hfn\" (UniqueName: \"kubernetes.io/projected/c4772565-60fa-4417-9397-1d99fd767026-kube-api-access-f7hfn\") pod \"collect-profiles-29409900-zgq57\" (UID: \"c4772565-60fa-4417-9397-1d99fd767026\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57" Dec 01 13:00:00 crc kubenswrapper[4958]: I1201 13:00:00.335184 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4772565-60fa-4417-9397-1d99fd767026-config-volume\") pod 
\"collect-profiles-29409900-zgq57\" (UID: \"c4772565-60fa-4417-9397-1d99fd767026\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57" Dec 01 13:00:00 crc kubenswrapper[4958]: I1201 13:00:00.346718 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4772565-60fa-4417-9397-1d99fd767026-secret-volume\") pod \"collect-profiles-29409900-zgq57\" (UID: \"c4772565-60fa-4417-9397-1d99fd767026\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57" Dec 01 13:00:00 crc kubenswrapper[4958]: I1201 13:00:00.355655 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7hfn\" (UniqueName: \"kubernetes.io/projected/c4772565-60fa-4417-9397-1d99fd767026-kube-api-access-f7hfn\") pod \"collect-profiles-29409900-zgq57\" (UID: \"c4772565-60fa-4417-9397-1d99fd767026\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57" Dec 01 13:00:00 crc kubenswrapper[4958]: I1201 13:00:00.502866 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57" Dec 01 13:00:01 crc kubenswrapper[4958]: I1201 13:00:01.087009 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57"] Dec 01 13:00:01 crc kubenswrapper[4958]: E1201 13:00:01.755740 4958 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4772565_60fa_4417_9397_1d99fd767026.slice/crio-conmon-2c171eb68dafe086596ffc96ab32f2a4223c8379c87764298c8032d4f639b2f9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4772565_60fa_4417_9397_1d99fd767026.slice/crio-2c171eb68dafe086596ffc96ab32f2a4223c8379c87764298c8032d4f639b2f9.scope\": RecentStats: unable to find data in memory cache]" Dec 01 13:00:01 crc kubenswrapper[4958]: I1201 13:00:01.923175 4958 generic.go:334] "Generic (PLEG): container finished" podID="c4772565-60fa-4417-9397-1d99fd767026" containerID="2c171eb68dafe086596ffc96ab32f2a4223c8379c87764298c8032d4f639b2f9" exitCode=0 Dec 01 13:00:01 crc kubenswrapper[4958]: I1201 13:00:01.923277 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57" event={"ID":"c4772565-60fa-4417-9397-1d99fd767026","Type":"ContainerDied","Data":"2c171eb68dafe086596ffc96ab32f2a4223c8379c87764298c8032d4f639b2f9"} Dec 01 13:00:01 crc kubenswrapper[4958]: I1201 13:00:01.923600 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57" event={"ID":"c4772565-60fa-4417-9397-1d99fd767026","Type":"ContainerStarted","Data":"609195a159935faf6967d1aea8568b769e999bfb76914c443040295331b018d5"} Dec 01 13:00:03 crc kubenswrapper[4958]: I1201 13:00:03.370869 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57" Dec 01 13:00:03 crc kubenswrapper[4958]: I1201 13:00:03.560174 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4772565-60fa-4417-9397-1d99fd767026-config-volume\") pod \"c4772565-60fa-4417-9397-1d99fd767026\" (UID: \"c4772565-60fa-4417-9397-1d99fd767026\") " Dec 01 13:00:03 crc kubenswrapper[4958]: I1201 13:00:03.560243 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4772565-60fa-4417-9397-1d99fd767026-secret-volume\") pod \"c4772565-60fa-4417-9397-1d99fd767026\" (UID: \"c4772565-60fa-4417-9397-1d99fd767026\") " Dec 01 13:00:03 crc kubenswrapper[4958]: I1201 13:00:03.560354 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7hfn\" (UniqueName: \"kubernetes.io/projected/c4772565-60fa-4417-9397-1d99fd767026-kube-api-access-f7hfn\") pod \"c4772565-60fa-4417-9397-1d99fd767026\" (UID: \"c4772565-60fa-4417-9397-1d99fd767026\") " Dec 01 13:00:03 crc kubenswrapper[4958]: I1201 13:00:03.561733 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4772565-60fa-4417-9397-1d99fd767026-config-volume" (OuterVolumeSpecName: "config-volume") pod "c4772565-60fa-4417-9397-1d99fd767026" (UID: "c4772565-60fa-4417-9397-1d99fd767026"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:00:03 crc kubenswrapper[4958]: I1201 13:00:03.568080 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4772565-60fa-4417-9397-1d99fd767026-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c4772565-60fa-4417-9397-1d99fd767026" (UID: "c4772565-60fa-4417-9397-1d99fd767026"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:00:03 crc kubenswrapper[4958]: I1201 13:00:03.569077 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4772565-60fa-4417-9397-1d99fd767026-kube-api-access-f7hfn" (OuterVolumeSpecName: "kube-api-access-f7hfn") pod "c4772565-60fa-4417-9397-1d99fd767026" (UID: "c4772565-60fa-4417-9397-1d99fd767026"). InnerVolumeSpecName "kube-api-access-f7hfn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:00:03 crc kubenswrapper[4958]: I1201 13:00:03.663953 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7hfn\" (UniqueName: \"kubernetes.io/projected/c4772565-60fa-4417-9397-1d99fd767026-kube-api-access-f7hfn\") on node \"crc\" DevicePath \"\"" Dec 01 13:00:03 crc kubenswrapper[4958]: I1201 13:00:03.664029 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4772565-60fa-4417-9397-1d99fd767026-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 13:00:03 crc kubenswrapper[4958]: I1201 13:00:03.664044 4958 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4772565-60fa-4417-9397-1d99fd767026-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 13:00:03 crc kubenswrapper[4958]: I1201 13:00:03.805915 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 13:00:03 crc kubenswrapper[4958]: I1201 13:00:03.951546 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57" event={"ID":"c4772565-60fa-4417-9397-1d99fd767026","Type":"ContainerDied","Data":"609195a159935faf6967d1aea8568b769e999bfb76914c443040295331b018d5"} Dec 01 13:00:03 crc kubenswrapper[4958]: I1201 13:00:03.951751 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="609195a159935faf6967d1aea8568b769e999bfb76914c443040295331b018d5" Dec 01 13:00:03 crc kubenswrapper[4958]: I1201 13:00:03.951579 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409900-zgq57" Dec 01 13:00:04 crc kubenswrapper[4958]: I1201 13:00:04.478075 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g"] Dec 01 13:00:04 crc kubenswrapper[4958]: I1201 13:00:04.487405 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409855-mhv9g"] Dec 01 13:00:04 crc kubenswrapper[4958]: I1201 13:00:04.966624 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"2cbdb8c5e4aa7ce2d4300154efb15fbdf8680bee6c2d6407e3cd25b114c4e0dd"} Dec 01 13:00:05 crc kubenswrapper[4958]: I1201 13:00:05.809475 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0b4f776-f89f-4782-a998-e271d7f657f0" path="/var/lib/kubelet/pods/d0b4f776-f89f-4782-a998-e271d7f657f0/volumes" Dec 01 13:00:05 crc kubenswrapper[4958]: I1201 13:00:05.971923 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cpxr8"] Dec 01 13:00:05 crc kubenswrapper[4958]: E1201 13:00:05.977008 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4772565-60fa-4417-9397-1d99fd767026" containerName="collect-profiles" Dec 01 13:00:05 crc kubenswrapper[4958]: I1201 13:00:05.977046 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4772565-60fa-4417-9397-1d99fd767026" containerName="collect-profiles" Dec 01 13:00:05 crc kubenswrapper[4958]: I1201 13:00:05.977441 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4772565-60fa-4417-9397-1d99fd767026" 
containerName="collect-profiles" Dec 01 13:00:05 crc kubenswrapper[4958]: I1201 13:00:05.979083 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cpxr8" Dec 01 13:00:05 crc kubenswrapper[4958]: I1201 13:00:05.998502 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cpxr8"] Dec 01 13:00:06 crc kubenswrapper[4958]: I1201 13:00:06.050482 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmzpt\" (UniqueName: \"kubernetes.io/projected/6f9ae7d6-a02f-4862-a482-aa353f88fa9d-kube-api-access-gmzpt\") pod \"redhat-operators-cpxr8\" (UID: \"6f9ae7d6-a02f-4862-a482-aa353f88fa9d\") " pod="openshift-marketplace/redhat-operators-cpxr8" Dec 01 13:00:06 crc kubenswrapper[4958]: I1201 13:00:06.050642 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9ae7d6-a02f-4862-a482-aa353f88fa9d-utilities\") pod \"redhat-operators-cpxr8\" (UID: \"6f9ae7d6-a02f-4862-a482-aa353f88fa9d\") " pod="openshift-marketplace/redhat-operators-cpxr8" Dec 01 13:00:06 crc kubenswrapper[4958]: I1201 13:00:06.050673 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9ae7d6-a02f-4862-a482-aa353f88fa9d-catalog-content\") pod \"redhat-operators-cpxr8\" (UID: \"6f9ae7d6-a02f-4862-a482-aa353f88fa9d\") " pod="openshift-marketplace/redhat-operators-cpxr8" Dec 01 13:00:06 crc kubenswrapper[4958]: I1201 13:00:06.153529 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmzpt\" (UniqueName: \"kubernetes.io/projected/6f9ae7d6-a02f-4862-a482-aa353f88fa9d-kube-api-access-gmzpt\") pod \"redhat-operators-cpxr8\" (UID: \"6f9ae7d6-a02f-4862-a482-aa353f88fa9d\") " pod="openshift-marketplace/redhat-operators-cpxr8" Dec 01 13:00:06 crc kubenswrapper[4958]: I1201 13:00:06.153953 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9ae7d6-a02f-4862-a482-aa353f88fa9d-utilities\") pod \"redhat-operators-cpxr8\" (UID: \"6f9ae7d6-a02f-4862-a482-aa353f88fa9d\") " pod="openshift-marketplace/redhat-operators-cpxr8" Dec 01 13:00:06 crc kubenswrapper[4958]: I1201 13:00:06.153989 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9ae7d6-a02f-4862-a482-aa353f88fa9d-catalog-content\") pod \"redhat-operators-cpxr8\" (UID: \"6f9ae7d6-a02f-4862-a482-aa353f88fa9d\") " pod="openshift-marketplace/redhat-operators-cpxr8" Dec 01 13:00:06 crc kubenswrapper[4958]: I1201 13:00:06.154467 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9ae7d6-a02f-4862-a482-aa353f88fa9d-utilities\") pod \"redhat-operators-cpxr8\" (UID: \"6f9ae7d6-a02f-4862-a482-aa353f88fa9d\") " pod="openshift-marketplace/redhat-operators-cpxr8" Dec 01 13:00:06 crc kubenswrapper[4958]: I1201 13:00:06.154523 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9ae7d6-a02f-4862-a482-aa353f88fa9d-catalog-content\") pod \"redhat-operators-cpxr8\" (UID: \"6f9ae7d6-a02f-4862-a482-aa353f88fa9d\") " pod="openshift-marketplace/redhat-operators-cpxr8" Dec 
01 13:00:06 crc kubenswrapper[4958]: I1201 13:00:06.177685 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmzpt\" (UniqueName: \"kubernetes.io/projected/6f9ae7d6-a02f-4862-a482-aa353f88fa9d-kube-api-access-gmzpt\") pod \"redhat-operators-cpxr8\" (UID: \"6f9ae7d6-a02f-4862-a482-aa353f88fa9d\") " pod="openshift-marketplace/redhat-operators-cpxr8" Dec 01 13:00:06 crc kubenswrapper[4958]: I1201 13:00:06.304817 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cpxr8" Dec 01 13:00:06 crc kubenswrapper[4958]: I1201 13:00:06.874316 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cpxr8"] Dec 01 13:00:06 crc kubenswrapper[4958]: I1201 13:00:06.987735 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpxr8" event={"ID":"6f9ae7d6-a02f-4862-a482-aa353f88fa9d","Type":"ContainerStarted","Data":"4e9ebf1e0f47bcef5b9e7677811ae531054c25f0ca7184f6e0cb03216f6504f1"} Dec 01 13:00:08 crc kubenswrapper[4958]: I1201 13:00:08.005090 4958 generic.go:334] "Generic (PLEG): container finished" podID="6f9ae7d6-a02f-4862-a482-aa353f88fa9d" containerID="920792ad7473d94676b06bb7a505c229e2bb3bd750bac1d80a8619330eb2d22e" exitCode=0 Dec 01 13:00:08 crc kubenswrapper[4958]: I1201 13:00:08.005181 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpxr8" event={"ID":"6f9ae7d6-a02f-4862-a482-aa353f88fa9d","Type":"ContainerDied","Data":"920792ad7473d94676b06bb7a505c229e2bb3bd750bac1d80a8619330eb2d22e"} Dec 01 13:00:08 crc kubenswrapper[4958]: I1201 13:00:08.008431 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 13:00:10 crc kubenswrapper[4958]: I1201 13:00:10.031709 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpxr8" event={"ID":"6f9ae7d6-a02f-4862-a482-aa353f88fa9d","Type":"ContainerStarted","Data":"66bead030f7ce6fe70ccd5796800f4a50191aca47175c02ed7d3b705ad8df950"} Dec 01 13:00:12 crc kubenswrapper[4958]: I1201 13:00:12.054023 4958 generic.go:334] "Generic (PLEG): container finished" podID="6f9ae7d6-a02f-4862-a482-aa353f88fa9d" containerID="66bead030f7ce6fe70ccd5796800f4a50191aca47175c02ed7d3b705ad8df950" exitCode=0 Dec 01 13:00:12 crc kubenswrapper[4958]: I1201 13:00:12.054103 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpxr8" event={"ID":"6f9ae7d6-a02f-4862-a482-aa353f88fa9d","Type":"ContainerDied","Data":"66bead030f7ce6fe70ccd5796800f4a50191aca47175c02ed7d3b705ad8df950"} Dec 01 13:00:14 crc kubenswrapper[4958]: I1201 13:00:14.081682 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpxr8" event={"ID":"6f9ae7d6-a02f-4862-a482-aa353f88fa9d","Type":"ContainerStarted","Data":"9cb12451bbe00e794be2b4c3750ce307064e6252228171b6dbb0ad358c4daec4"} Dec 01 13:00:14 crc kubenswrapper[4958]: I1201 13:00:14.115334 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cpxr8" podStartSLOduration=4.596361319 podStartE2EDuration="9.115301406s" podCreationTimestamp="2025-12-01 13:00:05 +0000 UTC" firstStartedPulling="2025-12-01 13:00:08.007934818 +0000 UTC m=+10855.516723885" lastFinishedPulling="2025-12-01 13:00:12.526874895 +0000 UTC m=+10860.035663972" observedRunningTime="2025-12-01 13:00:14.102355742 +0000 
UTC m=+10861.611144819" watchObservedRunningTime="2025-12-01 13:00:14.115301406 +0000 UTC m=+10861.624090493" Dec 01 13:00:16 crc kubenswrapper[4958]: I1201 13:00:16.305832 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cpxr8" Dec 01 13:00:16 crc kubenswrapper[4958]: I1201 13:00:16.306584 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cpxr8" Dec 01 13:00:17 crc kubenswrapper[4958]: I1201 13:00:17.366973 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cpxr8" podUID="6f9ae7d6-a02f-4862-a482-aa353f88fa9d" containerName="registry-server" probeResult="failure" output=< Dec 01 13:00:17 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s Dec 01 13:00:17 crc kubenswrapper[4958]: > Dec 01 13:00:26 crc kubenswrapper[4958]: I1201 13:00:26.750020 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cpxr8" Dec 01 13:00:26 crc kubenswrapper[4958]: I1201 13:00:26.855188 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cpxr8" Dec 01 13:00:27 crc kubenswrapper[4958]: I1201 13:00:27.009653 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cpxr8"] Dec 01 13:00:28 crc kubenswrapper[4958]: I1201 13:00:28.403340 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cpxr8" podUID="6f9ae7d6-a02f-4862-a482-aa353f88fa9d" containerName="registry-server" containerID="cri-o://9cb12451bbe00e794be2b4c3750ce307064e6252228171b6dbb0ad358c4daec4" gracePeriod=2 Dec 01 13:00:29 crc kubenswrapper[4958]: I1201 13:00:29.432562 4958 generic.go:334] "Generic (PLEG): container finished" podID="6f9ae7d6-a02f-4862-a482-aa353f88fa9d" containerID="9cb12451bbe00e794be2b4c3750ce307064e6252228171b6dbb0ad358c4daec4" exitCode=0 Dec 01 13:00:29 crc kubenswrapper[4958]: I1201 13:00:29.432792 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpxr8" event={"ID":"6f9ae7d6-a02f-4862-a482-aa353f88fa9d","Type":"ContainerDied","Data":"9cb12451bbe00e794be2b4c3750ce307064e6252228171b6dbb0ad358c4daec4"} Dec 01 13:00:29 crc kubenswrapper[4958]: I1201 13:00:29.620509 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cpxr8" Dec 01 13:00:29 crc kubenswrapper[4958]: I1201 13:00:29.672593 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9ae7d6-a02f-4862-a482-aa353f88fa9d-catalog-content\") pod \"6f9ae7d6-a02f-4862-a482-aa353f88fa9d\" (UID: \"6f9ae7d6-a02f-4862-a482-aa353f88fa9d\") " Dec 01 13:00:29 crc kubenswrapper[4958]: I1201 13:00:29.672758 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmzpt\" (UniqueName: \"kubernetes.io/projected/6f9ae7d6-a02f-4862-a482-aa353f88fa9d-kube-api-access-gmzpt\") pod \"6f9ae7d6-a02f-4862-a482-aa353f88fa9d\" (UID: \"6f9ae7d6-a02f-4862-a482-aa353f88fa9d\") " Dec 01 13:00:29 crc kubenswrapper[4958]: I1201 13:00:29.672833 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9ae7d6-a02f-4862-a482-aa353f88fa9d-utilities\") pod \"6f9ae7d6-a02f-4862-a482-aa353f88fa9d\" (UID: \"6f9ae7d6-a02f-4862-a482-aa353f88fa9d\") " Dec 01 13:00:29 crc kubenswrapper[4958]: I1201 13:00:29.673918 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9ae7d6-a02f-4862-a482-aa353f88fa9d-utilities" (OuterVolumeSpecName: "utilities") pod "6f9ae7d6-a02f-4862-a482-aa353f88fa9d" (UID: "6f9ae7d6-a02f-4862-a482-aa353f88fa9d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:00:29 crc kubenswrapper[4958]: I1201 13:00:29.695091 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9ae7d6-a02f-4862-a482-aa353f88fa9d-kube-api-access-gmzpt" (OuterVolumeSpecName: "kube-api-access-gmzpt") pod "6f9ae7d6-a02f-4862-a482-aa353f88fa9d" (UID: "6f9ae7d6-a02f-4862-a482-aa353f88fa9d"). InnerVolumeSpecName "kube-api-access-gmzpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:00:29 crc kubenswrapper[4958]: I1201 13:00:29.777391 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9ae7d6-a02f-4862-a482-aa353f88fa9d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 13:00:29 crc kubenswrapper[4958]: I1201 13:00:29.777433 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmzpt\" (UniqueName: \"kubernetes.io/projected/6f9ae7d6-a02f-4862-a482-aa353f88fa9d-kube-api-access-gmzpt\") on node \"crc\" DevicePath \"\"" Dec 01 13:00:29 crc kubenswrapper[4958]: I1201 13:00:29.811605 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9ae7d6-a02f-4862-a482-aa353f88fa9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f9ae7d6-a02f-4862-a482-aa353f88fa9d" (UID: "6f9ae7d6-a02f-4862-a482-aa353f88fa9d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:00:29 crc kubenswrapper[4958]: I1201 13:00:29.879990 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9ae7d6-a02f-4862-a482-aa353f88fa9d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 13:00:30 crc kubenswrapper[4958]: I1201 13:00:30.448365 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpxr8" event={"ID":"6f9ae7d6-a02f-4862-a482-aa353f88fa9d","Type":"ContainerDied","Data":"4e9ebf1e0f47bcef5b9e7677811ae531054c25f0ca7184f6e0cb03216f6504f1"} Dec 01 13:00:30 crc kubenswrapper[4958]: I1201 13:00:30.448479 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cpxr8" Dec 01 13:00:30 crc kubenswrapper[4958]: I1201 13:00:30.448831 4958 scope.go:117] "RemoveContainer" containerID="9cb12451bbe00e794be2b4c3750ce307064e6252228171b6dbb0ad358c4daec4" Dec 01 13:00:30 crc kubenswrapper[4958]: I1201 13:00:30.493713 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cpxr8"] Dec 01 13:00:30 crc kubenswrapper[4958]: I1201 13:00:30.506903 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cpxr8"] Dec 01 13:00:30 crc kubenswrapper[4958]: I1201 13:00:30.509606 4958 scope.go:117] "RemoveContainer" containerID="66bead030f7ce6fe70ccd5796800f4a50191aca47175c02ed7d3b705ad8df950" Dec 01 13:00:30 crc kubenswrapper[4958]: I1201 13:00:30.538261 4958 scope.go:117] "RemoveContainer" containerID="920792ad7473d94676b06bb7a505c229e2bb3bd750bac1d80a8619330eb2d22e" Dec 01 13:00:31 crc kubenswrapper[4958]: I1201 13:00:31.821183 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9ae7d6-a02f-4862-a482-aa353f88fa9d" path="/var/lib/kubelet/pods/6f9ae7d6-a02f-4862-a482-aa353f88fa9d/volumes" Dec 01 13:00:33 crc kubenswrapper[4958]: I1201 13:00:33.954436 4958 scope.go:117] "RemoveContainer" containerID="b0e999efb6d32e294a6209c3f968abcd7e946a7e95a3fefac68948978dba083d" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.178931 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29409901-rf82w"] Dec 01 13:01:00 crc kubenswrapper[4958]: E1201 13:01:00.180183 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9ae7d6-a02f-4862-a482-aa353f88fa9d" containerName="extract-utilities" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.180216 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9ae7d6-a02f-4862-a482-aa353f88fa9d" containerName="extract-utilities" Dec 01 13:01:00 crc kubenswrapper[4958]: E1201 13:01:00.180242 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9ae7d6-a02f-4862-a482-aa353f88fa9d" containerName="registry-server" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.180250 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9ae7d6-a02f-4862-a482-aa353f88fa9d" containerName="registry-server" Dec 01 13:01:00 crc kubenswrapper[4958]: E1201 13:01:00.181217 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9ae7d6-a02f-4862-a482-aa353f88fa9d" containerName="extract-content" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.181242 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9ae7d6-a02f-4862-a482-aa353f88fa9d" containerName="extract-content" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 
13:01:00.181757 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9ae7d6-a02f-4862-a482-aa353f88fa9d" containerName="registry-server" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.183906 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29409901-rf82w" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.210461 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29409901-rf82w"] Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.289027 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptbcp\" (UniqueName: \"kubernetes.io/projected/2f2a8511-37e1-4f3a-9eba-b950147efe75-kube-api-access-ptbcp\") pod \"keystone-cron-29409901-rf82w\" (UID: \"2f2a8511-37e1-4f3a-9eba-b950147efe75\") " pod="openstack/keystone-cron-29409901-rf82w" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.289309 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f2a8511-37e1-4f3a-9eba-b950147efe75-config-data\") pod \"keystone-cron-29409901-rf82w\" (UID: \"2f2a8511-37e1-4f3a-9eba-b950147efe75\") " pod="openstack/keystone-cron-29409901-rf82w" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.289348 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2a8511-37e1-4f3a-9eba-b950147efe75-combined-ca-bundle\") pod \"keystone-cron-29409901-rf82w\" (UID: \"2f2a8511-37e1-4f3a-9eba-b950147efe75\") " pod="openstack/keystone-cron-29409901-rf82w" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.289382 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2f2a8511-37e1-4f3a-9eba-b950147efe75-fernet-keys\") pod \"keystone-cron-29409901-rf82w\" (UID: \"2f2a8511-37e1-4f3a-9eba-b950147efe75\") " pod="openstack/keystone-cron-29409901-rf82w" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.391255 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f2a8511-37e1-4f3a-9eba-b950147efe75-config-data\") pod \"keystone-cron-29409901-rf82w\" (UID: \"2f2a8511-37e1-4f3a-9eba-b950147efe75\") " pod="openstack/keystone-cron-29409901-rf82w" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.391302 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2a8511-37e1-4f3a-9eba-b950147efe75-combined-ca-bundle\") pod \"keystone-cron-29409901-rf82w\" (UID: \"2f2a8511-37e1-4f3a-9eba-b950147efe75\") " pod="openstack/keystone-cron-29409901-rf82w" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.391325 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2f2a8511-37e1-4f3a-9eba-b950147efe75-fernet-keys\") pod \"keystone-cron-29409901-rf82w\" (UID: \"2f2a8511-37e1-4f3a-9eba-b950147efe75\") " pod="openstack/keystone-cron-29409901-rf82w" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.391377 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptbcp\" (UniqueName: 
\"kubernetes.io/projected/2f2a8511-37e1-4f3a-9eba-b950147efe75-kube-api-access-ptbcp\") pod \"keystone-cron-29409901-rf82w\" (UID: \"2f2a8511-37e1-4f3a-9eba-b950147efe75\") " pod="openstack/keystone-cron-29409901-rf82w" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.398641 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2f2a8511-37e1-4f3a-9eba-b950147efe75-fernet-keys\") pod \"keystone-cron-29409901-rf82w\" (UID: \"2f2a8511-37e1-4f3a-9eba-b950147efe75\") " pod="openstack/keystone-cron-29409901-rf82w" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.399616 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2a8511-37e1-4f3a-9eba-b950147efe75-combined-ca-bundle\") pod \"keystone-cron-29409901-rf82w\" (UID: \"2f2a8511-37e1-4f3a-9eba-b950147efe75\") " pod="openstack/keystone-cron-29409901-rf82w" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.403283 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f2a8511-37e1-4f3a-9eba-b950147efe75-config-data\") pod \"keystone-cron-29409901-rf82w\" (UID: \"2f2a8511-37e1-4f3a-9eba-b950147efe75\") " pod="openstack/keystone-cron-29409901-rf82w" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.418719 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptbcp\" (UniqueName: \"kubernetes.io/projected/2f2a8511-37e1-4f3a-9eba-b950147efe75-kube-api-access-ptbcp\") pod \"keystone-cron-29409901-rf82w\" (UID: \"2f2a8511-37e1-4f3a-9eba-b950147efe75\") " pod="openstack/keystone-cron-29409901-rf82w" Dec 01 13:01:00 crc kubenswrapper[4958]: I1201 13:01:00.508028 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29409901-rf82w" Dec 01 13:01:01 crc kubenswrapper[4958]: I1201 13:01:01.090374 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29409901-rf82w"] Dec 01 13:01:02 crc kubenswrapper[4958]: I1201 13:01:02.002118 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409901-rf82w" event={"ID":"2f2a8511-37e1-4f3a-9eba-b950147efe75","Type":"ContainerStarted","Data":"a2678cd880fce4d87c21ff5731a078d806fb73b1418a8a7b717896fc9ec85f33"} Dec 01 13:01:02 crc kubenswrapper[4958]: I1201 13:01:02.002569 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409901-rf82w" event={"ID":"2f2a8511-37e1-4f3a-9eba-b950147efe75","Type":"ContainerStarted","Data":"945d8cdb1924d273af9a92ae552e16cf5875f7d4f869812789a08e99e18f8632"} Dec 01 13:01:02 crc kubenswrapper[4958]: I1201 13:01:02.033340 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29409901-rf82w" podStartSLOduration=2.033320593 podStartE2EDuration="2.033320593s" podCreationTimestamp="2025-12-01 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 13:01:02.023661211 +0000 UTC m=+10909.532450248" watchObservedRunningTime="2025-12-01 13:01:02.033320593 +0000 UTC m=+10909.542109630" Dec 01 13:01:04 crc kubenswrapper[4958]: I1201 13:01:04.027281 4958 generic.go:334] "Generic (PLEG): container finished" podID="2f2a8511-37e1-4f3a-9eba-b950147efe75" containerID="a2678cd880fce4d87c21ff5731a078d806fb73b1418a8a7b717896fc9ec85f33" exitCode=0 Dec 01 13:01:04 crc kubenswrapper[4958]: I1201 13:01:04.027414 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409901-rf82w" event={"ID":"2f2a8511-37e1-4f3a-9eba-b950147efe75","Type":"ContainerDied","Data":"a2678cd880fce4d87c21ff5731a078d806fb73b1418a8a7b717896fc9ec85f33"} Dec 01 13:01:05 crc kubenswrapper[4958]: I1201 13:01:05.510569 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29409901-rf82w" Dec 01 13:01:05 crc kubenswrapper[4958]: I1201 13:01:05.677667 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptbcp\" (UniqueName: \"kubernetes.io/projected/2f2a8511-37e1-4f3a-9eba-b950147efe75-kube-api-access-ptbcp\") pod \"2f2a8511-37e1-4f3a-9eba-b950147efe75\" (UID: \"2f2a8511-37e1-4f3a-9eba-b950147efe75\") " Dec 01 13:01:05 crc kubenswrapper[4958]: I1201 13:01:05.677900 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2f2a8511-37e1-4f3a-9eba-b950147efe75-fernet-keys\") pod \"2f2a8511-37e1-4f3a-9eba-b950147efe75\" (UID: \"2f2a8511-37e1-4f3a-9eba-b950147efe75\") " Dec 01 13:01:05 crc kubenswrapper[4958]: I1201 13:01:05.677919 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2a8511-37e1-4f3a-9eba-b950147efe75-combined-ca-bundle\") pod \"2f2a8511-37e1-4f3a-9eba-b950147efe75\" (UID: \"2f2a8511-37e1-4f3a-9eba-b950147efe75\") " Dec 01 13:01:05 crc kubenswrapper[4958]: I1201 13:01:05.677946 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f2a8511-37e1-4f3a-9eba-b950147efe75-config-data\") pod \"2f2a8511-37e1-4f3a-9eba-b950147efe75\" (UID: \"2f2a8511-37e1-4f3a-9eba-b950147efe75\") " Dec 01 13:01:05 crc kubenswrapper[4958]: I1201 13:01:05.692337 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2a8511-37e1-4f3a-9eba-b950147efe75-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2f2a8511-37e1-4f3a-9eba-b950147efe75" (UID: "2f2a8511-37e1-4f3a-9eba-b950147efe75"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:01:05 crc kubenswrapper[4958]: I1201 13:01:05.704786 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f2a8511-37e1-4f3a-9eba-b950147efe75-kube-api-access-ptbcp" (OuterVolumeSpecName: "kube-api-access-ptbcp") pod "2f2a8511-37e1-4f3a-9eba-b950147efe75" (UID: "2f2a8511-37e1-4f3a-9eba-b950147efe75"). InnerVolumeSpecName "kube-api-access-ptbcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:01:05 crc kubenswrapper[4958]: I1201 13:01:05.753000 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2a8511-37e1-4f3a-9eba-b950147efe75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f2a8511-37e1-4f3a-9eba-b950147efe75" (UID: "2f2a8511-37e1-4f3a-9eba-b950147efe75"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:01:05 crc kubenswrapper[4958]: I1201 13:01:05.781457 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptbcp\" (UniqueName: \"kubernetes.io/projected/2f2a8511-37e1-4f3a-9eba-b950147efe75-kube-api-access-ptbcp\") on node \"crc\" DevicePath \"\"" Dec 01 13:01:05 crc kubenswrapper[4958]: I1201 13:01:05.781502 4958 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2f2a8511-37e1-4f3a-9eba-b950147efe75-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 13:01:05 crc kubenswrapper[4958]: I1201 13:01:05.781513 4958 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2a8511-37e1-4f3a-9eba-b950147efe75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 13:01:05 crc kubenswrapper[4958]: I1201 13:01:05.876026 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2a8511-37e1-4f3a-9eba-b950147efe75-config-data" (OuterVolumeSpecName: "config-data") pod "2f2a8511-37e1-4f3a-9eba-b950147efe75" (UID: "2f2a8511-37e1-4f3a-9eba-b950147efe75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:01:05 crc kubenswrapper[4958]: I1201 13:01:05.882995 4958 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f2a8511-37e1-4f3a-9eba-b950147efe75-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 13:01:06 crc kubenswrapper[4958]: I1201 13:01:06.047159 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409901-rf82w" event={"ID":"2f2a8511-37e1-4f3a-9eba-b950147efe75","Type":"ContainerDied","Data":"945d8cdb1924d273af9a92ae552e16cf5875f7d4f869812789a08e99e18f8632"} Dec 01 13:01:06 crc kubenswrapper[4958]: I1201 13:01:06.047208 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29409901-rf82w" Dec 01 13:01:06 crc kubenswrapper[4958]: I1201 13:01:06.047217 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="945d8cdb1924d273af9a92ae552e16cf5875f7d4f869812789a08e99e18f8632" Dec 01 13:01:10 crc kubenswrapper[4958]: I1201 13:01:10.357824 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dc8zs"] Dec 01 13:01:10 crc kubenswrapper[4958]: E1201 13:01:10.359838 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f2a8511-37e1-4f3a-9eba-b950147efe75" containerName="keystone-cron" Dec 01 13:01:10 crc kubenswrapper[4958]: I1201 13:01:10.359960 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f2a8511-37e1-4f3a-9eba-b950147efe75" containerName="keystone-cron" Dec 01 13:01:10 crc kubenswrapper[4958]: I1201 13:01:10.360236 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f2a8511-37e1-4f3a-9eba-b950147efe75" containerName="keystone-cron" Dec 01 13:01:10 crc kubenswrapper[4958]: I1201 13:01:10.365301 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dc8zs" Dec 01 13:01:10 crc kubenswrapper[4958]: I1201 13:01:10.523291 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dc8zs"] Dec 01 13:01:10 crc kubenswrapper[4958]: I1201 13:01:10.630094 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f6772b6-fcad-42e2-8473-416359809312-catalog-content\") pod \"certified-operators-dc8zs\" (UID: \"9f6772b6-fcad-42e2-8473-416359809312\") " pod="openshift-marketplace/certified-operators-dc8zs" Dec 01 13:01:10 crc kubenswrapper[4958]: I1201 13:01:10.630447 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f6772b6-fcad-42e2-8473-416359809312-utilities\") pod \"certified-operators-dc8zs\" (UID: \"9f6772b6-fcad-42e2-8473-416359809312\") " pod="openshift-marketplace/certified-operators-dc8zs" Dec 01 13:01:10 crc kubenswrapper[4958]: I1201 13:01:10.630484 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzlm9\" (UniqueName: \"kubernetes.io/projected/9f6772b6-fcad-42e2-8473-416359809312-kube-api-access-mzlm9\") pod \"certified-operators-dc8zs\" (UID: \"9f6772b6-fcad-42e2-8473-416359809312\") " pod="openshift-marketplace/certified-operators-dc8zs" Dec 01 13:01:10 crc kubenswrapper[4958]: I1201 13:01:10.732093 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f6772b6-fcad-42e2-8473-416359809312-catalog-content\") pod \"certified-operators-dc8zs\" (UID: \"9f6772b6-fcad-42e2-8473-416359809312\") " pod="openshift-marketplace/certified-operators-dc8zs" Dec 01 13:01:10 crc kubenswrapper[4958]: I1201 13:01:10.732165 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f6772b6-fcad-42e2-8473-416359809312-utilities\") pod \"certified-operators-dc8zs\" (UID: \"9f6772b6-fcad-42e2-8473-416359809312\") " pod="openshift-marketplace/certified-operators-dc8zs" Dec 01 13:01:10 crc kubenswrapper[4958]: I1201 13:01:10.732203 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzlm9\" (UniqueName: \"kubernetes.io/projected/9f6772b6-fcad-42e2-8473-416359809312-kube-api-access-mzlm9\") pod \"certified-operators-dc8zs\" (UID: \"9f6772b6-fcad-42e2-8473-416359809312\") " pod="openshift-marketplace/certified-operators-dc8zs" Dec 01 13:01:10 crc kubenswrapper[4958]: I1201 13:01:10.732697 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f6772b6-fcad-42e2-8473-416359809312-utilities\") pod \"certified-operators-dc8zs\" (UID: \"9f6772b6-fcad-42e2-8473-416359809312\") " pod="openshift-marketplace/certified-operators-dc8zs" Dec 01 13:01:10 crc kubenswrapper[4958]: I1201 13:01:10.732764 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f6772b6-fcad-42e2-8473-416359809312-catalog-content\") pod \"certified-operators-dc8zs\" (UID: \"9f6772b6-fcad-42e2-8473-416359809312\") " pod="openshift-marketplace/certified-operators-dc8zs" Dec 01 13:01:10 crc kubenswrapper[4958]: I1201 13:01:10.757950 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mzlm9\" (UniqueName: \"kubernetes.io/projected/9f6772b6-fcad-42e2-8473-416359809312-kube-api-access-mzlm9\") pod \"certified-operators-dc8zs\" (UID: \"9f6772b6-fcad-42e2-8473-416359809312\") " pod="openshift-marketplace/certified-operators-dc8zs" Dec 01 13:01:10 crc kubenswrapper[4958]: I1201 13:01:10.890672 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dc8zs" Dec 01 13:01:11 crc kubenswrapper[4958]: I1201 13:01:11.516371 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dc8zs"] Dec 01 13:01:12 crc kubenswrapper[4958]: I1201 13:01:12.132710 4958 generic.go:334] "Generic (PLEG): container finished" podID="9f6772b6-fcad-42e2-8473-416359809312" containerID="a73fc14ca9485272d28ff560c5ce00cad9e7bee05ab2e6bb57e5f83e2970ebc9" exitCode=0 Dec 01 13:01:12 crc kubenswrapper[4958]: I1201 13:01:12.132776 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc8zs" event={"ID":"9f6772b6-fcad-42e2-8473-416359809312","Type":"ContainerDied","Data":"a73fc14ca9485272d28ff560c5ce00cad9e7bee05ab2e6bb57e5f83e2970ebc9"} Dec 01 13:01:12 crc kubenswrapper[4958]: I1201 13:01:12.132966 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc8zs" event={"ID":"9f6772b6-fcad-42e2-8473-416359809312","Type":"ContainerStarted","Data":"1baf49babca88319135c37170f5c0f42902a49271417406e2e95c93c02484452"} Dec 01 13:01:14 crc kubenswrapper[4958]: I1201 13:01:14.350166 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc8zs" event={"ID":"9f6772b6-fcad-42e2-8473-416359809312","Type":"ContainerStarted","Data":"5ae1e293967b2bbaec5f40976b4102b2ed76c5bb94b93605d844f572c77e40d4"} Dec 01 13:01:15 crc kubenswrapper[4958]: I1201 13:01:15.363621 4958 generic.go:334] "Generic (PLEG): container finished" podID="9f6772b6-fcad-42e2-8473-416359809312" containerID="5ae1e293967b2bbaec5f40976b4102b2ed76c5bb94b93605d844f572c77e40d4" exitCode=0 Dec 01 13:01:15 crc kubenswrapper[4958]: I1201 13:01:15.363902 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc8zs" event={"ID":"9f6772b6-fcad-42e2-8473-416359809312","Type":"ContainerDied","Data":"5ae1e293967b2bbaec5f40976b4102b2ed76c5bb94b93605d844f572c77e40d4"} Dec 01 13:01:16 crc kubenswrapper[4958]: I1201 13:01:16.379642 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc8zs" event={"ID":"9f6772b6-fcad-42e2-8473-416359809312","Type":"ContainerStarted","Data":"c2cfb39c7e9bdcba3bba6e45b7f1d890f7f6545dfeddd657b9a80b6dd53ccb10"} Dec 01 13:01:16 crc kubenswrapper[4958]: I1201 13:01:16.403598 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dc8zs" podStartSLOduration=2.683630053 podStartE2EDuration="6.40357798s" podCreationTimestamp="2025-12-01 13:01:10 +0000 UTC" firstStartedPulling="2025-12-01 13:01:12.135647032 +0000 UTC m=+10919.644436079" lastFinishedPulling="2025-12-01 13:01:15.855594979 +0000 UTC m=+10923.364384006" observedRunningTime="2025-12-01 13:01:16.403486098 +0000 UTC m=+10923.912275145" watchObservedRunningTime="2025-12-01 13:01:16.40357798 +0000 UTC m=+10923.912367027" Dec 01 13:01:20 crc kubenswrapper[4958]: I1201 13:01:20.891962 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-dc8zs" Dec 01 13:01:20 crc kubenswrapper[4958]: I1201 13:01:20.892543 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dc8zs" Dec 01 13:01:20 crc kubenswrapper[4958]: I1201 13:01:20.963748 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dc8zs" Dec 01 13:01:21 crc kubenswrapper[4958]: I1201 13:01:21.514178 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dc8zs" Dec 01 13:01:21 crc kubenswrapper[4958]: I1201 13:01:21.580971 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dc8zs"] Dec 01 13:01:23 crc kubenswrapper[4958]: I1201 13:01:23.467827 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dc8zs" podUID="9f6772b6-fcad-42e2-8473-416359809312" containerName="registry-server" containerID="cri-o://c2cfb39c7e9bdcba3bba6e45b7f1d890f7f6545dfeddd657b9a80b6dd53ccb10" gracePeriod=2 Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.034742 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dc8zs" Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.108464 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f6772b6-fcad-42e2-8473-416359809312-catalog-content\") pod \"9f6772b6-fcad-42e2-8473-416359809312\" (UID: \"9f6772b6-fcad-42e2-8473-416359809312\") " Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.108820 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzlm9\" (UniqueName: \"kubernetes.io/projected/9f6772b6-fcad-42e2-8473-416359809312-kube-api-access-mzlm9\") pod \"9f6772b6-fcad-42e2-8473-416359809312\" (UID: \"9f6772b6-fcad-42e2-8473-416359809312\") " Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.108875 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f6772b6-fcad-42e2-8473-416359809312-utilities\") pod \"9f6772b6-fcad-42e2-8473-416359809312\" (UID: \"9f6772b6-fcad-42e2-8473-416359809312\") " Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.109921 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f6772b6-fcad-42e2-8473-416359809312-utilities" (OuterVolumeSpecName: "utilities") pod "9f6772b6-fcad-42e2-8473-416359809312" (UID: "9f6772b6-fcad-42e2-8473-416359809312"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.115918 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f6772b6-fcad-42e2-8473-416359809312-kube-api-access-mzlm9" (OuterVolumeSpecName: "kube-api-access-mzlm9") pod "9f6772b6-fcad-42e2-8473-416359809312" (UID: "9f6772b6-fcad-42e2-8473-416359809312"). InnerVolumeSpecName "kube-api-access-mzlm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.211169 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzlm9\" (UniqueName: \"kubernetes.io/projected/9f6772b6-fcad-42e2-8473-416359809312-kube-api-access-mzlm9\") on node \"crc\" DevicePath \"\"" Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.211204 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f6772b6-fcad-42e2-8473-416359809312-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.309527 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f6772b6-fcad-42e2-8473-416359809312-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f6772b6-fcad-42e2-8473-416359809312" (UID: "9f6772b6-fcad-42e2-8473-416359809312"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.312392 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f6772b6-fcad-42e2-8473-416359809312-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.485428 4958 generic.go:334] "Generic (PLEG): container finished" podID="9f6772b6-fcad-42e2-8473-416359809312" containerID="c2cfb39c7e9bdcba3bba6e45b7f1d890f7f6545dfeddd657b9a80b6dd53ccb10" exitCode=0 Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.485487 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc8zs" event={"ID":"9f6772b6-fcad-42e2-8473-416359809312","Type":"ContainerDied","Data":"c2cfb39c7e9bdcba3bba6e45b7f1d890f7f6545dfeddd657b9a80b6dd53ccb10"} Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.485525 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc8zs" event={"ID":"9f6772b6-fcad-42e2-8473-416359809312","Type":"ContainerDied","Data":"1baf49babca88319135c37170f5c0f42902a49271417406e2e95c93c02484452"} Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.485547 4958 scope.go:117] "RemoveContainer" containerID="c2cfb39c7e9bdcba3bba6e45b7f1d890f7f6545dfeddd657b9a80b6dd53ccb10" Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.485736 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dc8zs" Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.522090 4958 scope.go:117] "RemoveContainer" containerID="5ae1e293967b2bbaec5f40976b4102b2ed76c5bb94b93605d844f572c77e40d4" Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.577917 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dc8zs"] Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.605251 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dc8zs"] Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.628062 4958 scope.go:117] "RemoveContainer" containerID="a73fc14ca9485272d28ff560c5ce00cad9e7bee05ab2e6bb57e5f83e2970ebc9" Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.653812 4958 scope.go:117] "RemoveContainer" containerID="c2cfb39c7e9bdcba3bba6e45b7f1d890f7f6545dfeddd657b9a80b6dd53ccb10" Dec 01 13:01:24 crc kubenswrapper[4958]: E1201 13:01:24.654277 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2cfb39c7e9bdcba3bba6e45b7f1d890f7f6545dfeddd657b9a80b6dd53ccb10\": container with ID starting with c2cfb39c7e9bdcba3bba6e45b7f1d890f7f6545dfeddd657b9a80b6dd53ccb10 not found: ID does not exist" containerID="c2cfb39c7e9bdcba3bba6e45b7f1d890f7f6545dfeddd657b9a80b6dd53ccb10" Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.654309 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2cfb39c7e9bdcba3bba6e45b7f1d890f7f6545dfeddd657b9a80b6dd53ccb10"} err="failed to get container status \"c2cfb39c7e9bdcba3bba6e45b7f1d890f7f6545dfeddd657b9a80b6dd53ccb10\": rpc error: code = NotFound desc = could not find container \"c2cfb39c7e9bdcba3bba6e45b7f1d890f7f6545dfeddd657b9a80b6dd53ccb10\": container with ID starting with c2cfb39c7e9bdcba3bba6e45b7f1d890f7f6545dfeddd657b9a80b6dd53ccb10 not found: ID does not exist" Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.654328 4958 scope.go:117] "RemoveContainer" containerID="5ae1e293967b2bbaec5f40976b4102b2ed76c5bb94b93605d844f572c77e40d4" Dec 01 13:01:24 crc kubenswrapper[4958]: E1201 13:01:24.654634 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ae1e293967b2bbaec5f40976b4102b2ed76c5bb94b93605d844f572c77e40d4\": container with ID starting with 5ae1e293967b2bbaec5f40976b4102b2ed76c5bb94b93605d844f572c77e40d4 not found: ID does not exist" containerID="5ae1e293967b2bbaec5f40976b4102b2ed76c5bb94b93605d844f572c77e40d4" Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.654654 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ae1e293967b2bbaec5f40976b4102b2ed76c5bb94b93605d844f572c77e40d4"} err="failed to get container status \"5ae1e293967b2bbaec5f40976b4102b2ed76c5bb94b93605d844f572c77e40d4\": rpc error: code = NotFound desc = could not find container \"5ae1e293967b2bbaec5f40976b4102b2ed76c5bb94b93605d844f572c77e40d4\": container with ID starting with 5ae1e293967b2bbaec5f40976b4102b2ed76c5bb94b93605d844f572c77e40d4 not found: ID does not exist" Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.654666 4958 scope.go:117] "RemoveContainer" containerID="a73fc14ca9485272d28ff560c5ce00cad9e7bee05ab2e6bb57e5f83e2970ebc9" Dec 01 13:01:24 crc kubenswrapper[4958]: E1201 13:01:24.654974 4958 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a73fc14ca9485272d28ff560c5ce00cad9e7bee05ab2e6bb57e5f83e2970ebc9\": container with ID starting with a73fc14ca9485272d28ff560c5ce00cad9e7bee05ab2e6bb57e5f83e2970ebc9 not found: ID does not exist" containerID="a73fc14ca9485272d28ff560c5ce00cad9e7bee05ab2e6bb57e5f83e2970ebc9" Dec 01 13:01:24 crc kubenswrapper[4958]: I1201 13:01:24.654996 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a73fc14ca9485272d28ff560c5ce00cad9e7bee05ab2e6bb57e5f83e2970ebc9"} err="failed to get container status \"a73fc14ca9485272d28ff560c5ce00cad9e7bee05ab2e6bb57e5f83e2970ebc9\": rpc error: code = NotFound desc = could not find container \"a73fc14ca9485272d28ff560c5ce00cad9e7bee05ab2e6bb57e5f83e2970ebc9\": container with ID starting with a73fc14ca9485272d28ff560c5ce00cad9e7bee05ab2e6bb57e5f83e2970ebc9 not found: ID does not exist" Dec 01 13:01:25 crc kubenswrapper[4958]: I1201 13:01:25.809066 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f6772b6-fcad-42e2-8473-416359809312" path="/var/lib/kubelet/pods/9f6772b6-fcad-42e2-8473-416359809312/volumes" Dec 01 13:02:27 crc kubenswrapper[4958]: I1201 13:02:27.885297 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6kjmt"] Dec 01 13:02:27 crc kubenswrapper[4958]: E1201 13:02:27.886898 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6772b6-fcad-42e2-8473-416359809312" containerName="extract-utilities" Dec 01 13:02:27 crc kubenswrapper[4958]: I1201 13:02:27.886929 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6772b6-fcad-42e2-8473-416359809312" containerName="extract-utilities" Dec 01 13:02:27 crc kubenswrapper[4958]: E1201 13:02:27.886967 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6772b6-fcad-42e2-8473-416359809312" containerName="registry-server" Dec 01 13:02:27 crc kubenswrapper[4958]: I1201 13:02:27.886980 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6772b6-fcad-42e2-8473-416359809312" containerName="registry-server" Dec 01 13:02:27 crc kubenswrapper[4958]: E1201 13:02:27.887009 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6772b6-fcad-42e2-8473-416359809312" containerName="extract-content" Dec 01 13:02:27 crc kubenswrapper[4958]: I1201 13:02:27.887024 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6772b6-fcad-42e2-8473-416359809312" containerName="extract-content" Dec 01 13:02:27 crc kubenswrapper[4958]: I1201 13:02:27.887517 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f6772b6-fcad-42e2-8473-416359809312" containerName="registry-server" Dec 01 13:02:27 crc kubenswrapper[4958]: I1201 13:02:27.891337 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kjmt" Dec 01 13:02:27 crc kubenswrapper[4958]: I1201 13:02:27.914021 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kjmt"] Dec 01 13:02:27 crc kubenswrapper[4958]: I1201 13:02:27.953476 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9z67\" (UniqueName: \"kubernetes.io/projected/c66018a7-d3e1-458b-b0ac-8b3ff64b8912-kube-api-access-x9z67\") pod \"redhat-marketplace-6kjmt\" (UID: \"c66018a7-d3e1-458b-b0ac-8b3ff64b8912\") " pod="openshift-marketplace/redhat-marketplace-6kjmt" Dec 01 13:02:27 crc kubenswrapper[4958]: I1201 13:02:27.953561 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c66018a7-d3e1-458b-b0ac-8b3ff64b8912-catalog-content\") pod \"redhat-marketplace-6kjmt\" (UID: \"c66018a7-d3e1-458b-b0ac-8b3ff64b8912\") " pod="openshift-marketplace/redhat-marketplace-6kjmt" Dec 01 13:02:27 crc kubenswrapper[4958]: I1201 13:02:27.953611 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c66018a7-d3e1-458b-b0ac-8b3ff64b8912-utilities\") pod \"redhat-marketplace-6kjmt\" (UID: \"c66018a7-d3e1-458b-b0ac-8b3ff64b8912\") " pod="openshift-marketplace/redhat-marketplace-6kjmt" Dec 01 13:02:28 crc kubenswrapper[4958]: I1201 13:02:28.056112 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c66018a7-d3e1-458b-b0ac-8b3ff64b8912-utilities\") pod \"redhat-marketplace-6kjmt\" (UID: \"c66018a7-d3e1-458b-b0ac-8b3ff64b8912\") " pod="openshift-marketplace/redhat-marketplace-6kjmt" Dec 01 13:02:28 crc kubenswrapper[4958]: I1201 13:02:28.056365 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9z67\" (UniqueName: \"kubernetes.io/projected/c66018a7-d3e1-458b-b0ac-8b3ff64b8912-kube-api-access-x9z67\") pod \"redhat-marketplace-6kjmt\" (UID: \"c66018a7-d3e1-458b-b0ac-8b3ff64b8912\") " pod="openshift-marketplace/redhat-marketplace-6kjmt" Dec 01 13:02:28 crc kubenswrapper[4958]: I1201 13:02:28.056443 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c66018a7-d3e1-458b-b0ac-8b3ff64b8912-catalog-content\") pod \"redhat-marketplace-6kjmt\" (UID: \"c66018a7-d3e1-458b-b0ac-8b3ff64b8912\") " pod="openshift-marketplace/redhat-marketplace-6kjmt" Dec 01 13:02:28 crc kubenswrapper[4958]: I1201 13:02:28.056663 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c66018a7-d3e1-458b-b0ac-8b3ff64b8912-utilities\") pod \"redhat-marketplace-6kjmt\" (UID: \"c66018a7-d3e1-458b-b0ac-8b3ff64b8912\") " pod="openshift-marketplace/redhat-marketplace-6kjmt" Dec 01 13:02:28 crc kubenswrapper[4958]: I1201 13:02:28.057058 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c66018a7-d3e1-458b-b0ac-8b3ff64b8912-catalog-content\") pod \"redhat-marketplace-6kjmt\" (UID: \"c66018a7-d3e1-458b-b0ac-8b3ff64b8912\") " pod="openshift-marketplace/redhat-marketplace-6kjmt" Dec 01 13:02:28 crc kubenswrapper[4958]: I1201 13:02:28.094937 4958 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-x9z67\" (UniqueName: \"kubernetes.io/projected/c66018a7-d3e1-458b-b0ac-8b3ff64b8912-kube-api-access-x9z67\") pod \"redhat-marketplace-6kjmt\" (UID: \"c66018a7-d3e1-458b-b0ac-8b3ff64b8912\") " pod="openshift-marketplace/redhat-marketplace-6kjmt" Dec 01 13:02:28 crc kubenswrapper[4958]: I1201 13:02:28.210933 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 13:02:28 crc kubenswrapper[4958]: I1201 13:02:28.211328 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 13:02:28 crc kubenswrapper[4958]: I1201 13:02:28.222912 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kjmt" Dec 01 13:02:28 crc kubenswrapper[4958]: I1201 13:02:28.848517 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kjmt"] Dec 01 13:02:29 crc kubenswrapper[4958]: I1201 13:02:29.518898 4958 generic.go:334] "Generic (PLEG): container finished" podID="c66018a7-d3e1-458b-b0ac-8b3ff64b8912" containerID="816a991f85354abfb75399a517e7fa7873a531306174ef34400048acf936f249" exitCode=0 Dec 01 13:02:29 crc kubenswrapper[4958]: I1201 13:02:29.519026 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kjmt" event={"ID":"c66018a7-d3e1-458b-b0ac-8b3ff64b8912","Type":"ContainerDied","Data":"816a991f85354abfb75399a517e7fa7873a531306174ef34400048acf936f249"} Dec 01 13:02:29 crc kubenswrapper[4958]: I1201 13:02:29.519193 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kjmt" event={"ID":"c66018a7-d3e1-458b-b0ac-8b3ff64b8912","Type":"ContainerStarted","Data":"8703d598e5ae5cd90f609df152f65a4427954c00daa5db5aac54326c42571aa6"} Dec 01 13:02:30 crc kubenswrapper[4958]: I1201 13:02:30.699054 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_59f9a43a-9d58-4efb-ad19-9d1e4b632fdc/init-config-reloader/0.log" Dec 01 13:02:30 crc kubenswrapper[4958]: I1201 13:02:30.929597 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_59f9a43a-9d58-4efb-ad19-9d1e4b632fdc/init-config-reloader/0.log" Dec 01 13:02:30 crc kubenswrapper[4958]: I1201 13:02:30.955139 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_59f9a43a-9d58-4efb-ad19-9d1e4b632fdc/alertmanager/0.log" Dec 01 13:02:31 crc kubenswrapper[4958]: I1201 13:02:31.000290 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_59f9a43a-9d58-4efb-ad19-9d1e4b632fdc/config-reloader/0.log" Dec 01 13:02:31 crc kubenswrapper[4958]: I1201 13:02:31.152320 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_920335c0-76db-45f4-abb1-4bbcdff5e47d/aodh-api/0.log" Dec 01 13:02:31 crc kubenswrapper[4958]: I1201 13:02:31.203505 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_aodh-0_920335c0-76db-45f4-abb1-4bbcdff5e47d/aodh-listener/0.log" Dec 01 13:02:31 crc kubenswrapper[4958]: I1201 13:02:31.339803 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_920335c0-76db-45f4-abb1-4bbcdff5e47d/aodh-evaluator/0.log" Dec 01 13:02:31 crc kubenswrapper[4958]: I1201 13:02:31.362816 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_920335c0-76db-45f4-abb1-4bbcdff5e47d/aodh-notifier/0.log" Dec 01 13:02:31 crc kubenswrapper[4958]: I1201 13:02:31.430053 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7785bcc974-dndp9_a261ecd4-5929-49b5-b2b5-e2d3f7e4683e/barbican-api/0.log" Dec 01 13:02:31 crc kubenswrapper[4958]: I1201 13:02:31.540160 4958 generic.go:334] "Generic (PLEG): container finished" podID="c66018a7-d3e1-458b-b0ac-8b3ff64b8912" containerID="c654bc0baabaf185267b46856eaeda4e7574844f9c295c2cea3c1ff9945cb0b3" exitCode=0 Dec 01 13:02:31 crc kubenswrapper[4958]: I1201 13:02:31.540201 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kjmt" event={"ID":"c66018a7-d3e1-458b-b0ac-8b3ff64b8912","Type":"ContainerDied","Data":"c654bc0baabaf185267b46856eaeda4e7574844f9c295c2cea3c1ff9945cb0b3"} Dec 01 13:02:31 crc kubenswrapper[4958]: I1201 13:02:31.581585 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7785bcc974-dndp9_a261ecd4-5929-49b5-b2b5-e2d3f7e4683e/barbican-api-log/0.log" Dec 01 13:02:31 crc kubenswrapper[4958]: I1201 13:02:31.672653 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-644f5cb446-jk7bl_f2d1f043-186b-408d-af6c-f22c0b4e5171/barbican-keystone-listener/0.log" Dec 01 13:02:31 crc kubenswrapper[4958]: I1201 13:02:31.734908 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-644f5cb446-jk7bl_f2d1f043-186b-408d-af6c-f22c0b4e5171/barbican-keystone-listener-log/0.log" Dec 01 13:02:31 crc kubenswrapper[4958]: I1201 13:02:31.866376 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-c5d7654b7-786gr_4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f/barbican-worker/0.log" Dec 01 13:02:31 crc kubenswrapper[4958]: I1201 13:02:31.916878 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-c5d7654b7-786gr_4965b22a-8fb0-4ffc-8099-eadc0dfbfb1f/barbican-worker-log/0.log" Dec 01 13:02:32 crc kubenswrapper[4958]: I1201 13:02:32.109793 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-pw7rf_f6d924a1-e8ba-427c-bcd0-7f171676c2d2/bootstrap-openstack-openstack-cell1/0.log" Dec 01 13:02:32 crc kubenswrapper[4958]: I1201 13:02:32.183357 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a06735d0-753b-40ab-96cb-130eb67f9225/ceilometer-central-agent/0.log" Dec 01 13:02:32 crc kubenswrapper[4958]: I1201 13:02:32.223419 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a06735d0-753b-40ab-96cb-130eb67f9225/ceilometer-notification-agent/0.log" Dec 01 13:02:32 crc kubenswrapper[4958]: I1201 13:02:32.361074 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a06735d0-753b-40ab-96cb-130eb67f9225/proxy-httpd/0.log" Dec 01 13:02:32 crc kubenswrapper[4958]: I1201 13:02:32.412271 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_a06735d0-753b-40ab-96cb-130eb67f9225/sg-core/0.log" Dec 01 13:02:32 crc kubenswrapper[4958]: I1201 13:02:32.481644 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-zmmvc_2cd94035-7ae3-4f86-995f-45f75856c2bd/ceph-client-openstack-openstack-cell1/0.log" Dec 01 13:02:32 crc kubenswrapper[4958]: I1201 13:02:32.773632 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_57707c75-4120-483e-bd16-ef587392f6b7/cinder-api/0.log" Dec 01 13:02:32 crc kubenswrapper[4958]: I1201 13:02:32.803621 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_57707c75-4120-483e-bd16-ef587392f6b7/cinder-api-log/0.log" Dec 01 13:02:33 crc kubenswrapper[4958]: I1201 13:02:33.065430 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_9f555a33-3448-4f8a-be75-3bbfc99cbf02/cinder-backup/0.log" Dec 01 13:02:33 crc kubenswrapper[4958]: I1201 13:02:33.072217 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_9f555a33-3448-4f8a-be75-3bbfc99cbf02/probe/0.log" Dec 01 13:02:33 crc kubenswrapper[4958]: I1201 13:02:33.148860 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_116efa6e-e426-4550-a0e5-c36fbc3f4198/cinder-scheduler/0.log" Dec 01 13:02:33 crc kubenswrapper[4958]: I1201 13:02:33.544952 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_98090413-9736-47f9-9192-06174c47c3b0/cinder-volume/0.log" Dec 01 13:02:33 crc kubenswrapper[4958]: I1201 13:02:33.570862 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kjmt" event={"ID":"c66018a7-d3e1-458b-b0ac-8b3ff64b8912","Type":"ContainerStarted","Data":"7ded436f55dd016aedfdf8f789eb3e859c5f98ec57087bc2873a6a4962ef9742"} Dec 01 13:02:33 crc kubenswrapper[4958]: I1201 13:02:33.572379 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_116efa6e-e426-4550-a0e5-c36fbc3f4198/probe/0.log" Dec 01 13:02:33 crc kubenswrapper[4958]: I1201 13:02:33.597636 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6kjmt" podStartSLOduration=3.13845809 podStartE2EDuration="6.597612263s" podCreationTimestamp="2025-12-01 13:02:27 +0000 UTC" firstStartedPulling="2025-12-01 13:02:29.520632195 +0000 UTC m=+10997.029421242" lastFinishedPulling="2025-12-01 13:02:32.979786378 +0000 UTC m=+11000.488575415" observedRunningTime="2025-12-01 13:02:33.588195378 +0000 UTC m=+11001.096984415" watchObservedRunningTime="2025-12-01 13:02:33.597612263 +0000 UTC m=+11001.106401300" Dec 01 13:02:33 crc kubenswrapper[4958]: I1201 13:02:33.599740 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_98090413-9736-47f9-9192-06174c47c3b0/probe/0.log" Dec 01 13:02:33 crc kubenswrapper[4958]: I1201 13:02:33.938015 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-7gfz2_d2fa2a04-6d00-4af5-b501-4e3cc1d47544/configure-os-openstack-openstack-cell1/0.log" Dec 01 13:02:33 crc kubenswrapper[4958]: I1201 13:02:33.956215 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-gn87g_a9539b9c-0548-46ae-9c62-234f5bf45564/configure-network-openstack-openstack-cell1/0.log" Dec 01 13:02:34 
crc kubenswrapper[4958]: I1201 13:02:34.208521 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69944d679-t6rxn_e8ee90ac-8bc9-4ec5-a679-c12313b61f96/init/0.log" Dec 01 13:02:34 crc kubenswrapper[4958]: I1201 13:02:34.433762 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69944d679-t6rxn_e8ee90ac-8bc9-4ec5-a679-c12313b61f96/init/0.log" Dec 01 13:02:34 crc kubenswrapper[4958]: I1201 13:02:34.504395 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69944d679-t6rxn_e8ee90ac-8bc9-4ec5-a679-c12313b61f96/dnsmasq-dns/0.log" Dec 01 13:02:34 crc kubenswrapper[4958]: I1201 13:02:34.550020 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-wr775_2785029f-0dd7-4be8-8e87-e362c8cc7b09/download-cache-openstack-openstack-cell1/0.log" Dec 01 13:02:34 crc kubenswrapper[4958]: I1201 13:02:34.747367 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fe8e1679-6c0a-43a4-9414-8232112e7612/glance-httpd/0.log" Dec 01 13:02:34 crc kubenswrapper[4958]: I1201 13:02:34.773504 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fe8e1679-6c0a-43a4-9414-8232112e7612/glance-log/0.log" Dec 01 13:02:34 crc kubenswrapper[4958]: I1201 13:02:34.848574 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3092940e-3b67-4013-834a-dcdb2865678f/glance-httpd/0.log" Dec 01 13:02:34 crc kubenswrapper[4958]: I1201 13:02:34.913378 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3092940e-3b67-4013-834a-dcdb2865678f/glance-log/0.log" Dec 01 13:02:35 crc kubenswrapper[4958]: I1201 13:02:35.094614 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-5b6459cc5d-4vd4c_5293b4ea-1edc-408f-b345-1e342d98f9cb/heat-api/0.log" Dec 01 13:02:35 crc kubenswrapper[4958]: I1201 13:02:35.256879 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-78fb45d99c-fzz4f_c402ab81-b8b7-46e3-8279-0cb38d1a1cb6/heat-cfnapi/0.log" Dec 01 13:02:35 crc kubenswrapper[4958]: I1201 13:02:35.269568 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-bcdcd649d-27jl2_6e7fe406-6242-473c-b7c5-cc1a32a276dd/heat-engine/0.log" Dec 01 13:02:36 crc kubenswrapper[4958]: I1201 13:02:36.398655 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-rjxb4_13460e8d-e2a5-48ef-8d39-da6e900da0e5/install-certs-openstack-openstack-cell1/0.log" Dec 01 13:02:36 crc kubenswrapper[4958]: I1201 13:02:36.458693 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-65dfcfb885-6c2cp_ca004e01-200e-461c-ad14-68752150d940/horizon/0.log" Dec 01 13:02:36 crc kubenswrapper[4958]: I1201 13:02:36.568033 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-65dfcfb885-6c2cp_ca004e01-200e-461c-ad14-68752150d940/horizon-log/0.log" Dec 01 13:02:36 crc kubenswrapper[4958]: I1201 13:02:36.680585 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-tq6pc_752669c4-9b31-4db0-9e8b-27252be3fda5/install-os-openstack-openstack-cell1/0.log" Dec 01 13:02:36 crc kubenswrapper[4958]: I1201 13:02:36.925204 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29409841-b7glg_e88a19a5-40db-4ed9-a8e7-73e3ca8d797f/keystone-cron/0.log" Dec 01 13:02:36 crc kubenswrapper[4958]: I1201 13:02:36.931862 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7dc7b798fb-n8x4t_30f22ff2-e8d7-42de-beb4-1e598d294527/keystone-api/0.log" Dec 01 13:02:37 crc kubenswrapper[4958]: I1201 13:02:37.054983 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29409901-rf82w_2f2a8511-37e1-4f3a-9eba-b950147efe75/keystone-cron/0.log" Dec 01 13:02:37 crc kubenswrapper[4958]: I1201 13:02:37.159874 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_bf17853a-4517-4e36-98b1-8d69f5c94af3/kube-state-metrics/0.log" Dec 01 13:02:37 crc kubenswrapper[4958]: I1201 13:02:37.367226 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-lp72z_5411c83e-afd4-4c5b-a751-13fb79850c06/libvirt-openstack-openstack-cell1/0.log" Dec 01 13:02:37 crc kubenswrapper[4958]: I1201 13:02:37.515054 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7/manila-api-log/0.log" Dec 01 13:02:37 crc kubenswrapper[4958]: I1201 13:02:37.621578 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_a0241e7b-f0bd-4b9e-9d82-edac7f5f70e7/manila-api/0.log" Dec 01 13:02:37 crc kubenswrapper[4958]: I1201 13:02:37.665383 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_94742f94-808d-4028-b0b9-8ee30e690e1f/manila-scheduler/0.log" Dec 01 13:02:38 crc kubenswrapper[4958]: I1201 13:02:38.223324 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6kjmt" Dec 01 13:02:38 crc kubenswrapper[4958]: I1201 13:02:38.223438 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6kjmt" Dec 01 13:02:38 crc kubenswrapper[4958]: I1201 13:02:38.339694 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6kjmt" Dec 01 13:02:38 crc kubenswrapper[4958]: I1201 13:02:38.468088 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_94742f94-808d-4028-b0b9-8ee30e690e1f/probe/0.log" Dec 01 13:02:38 crc kubenswrapper[4958]: I1201 13:02:38.582521 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_2fb5bcc8-e1bd-4988-919d-d2dc05ed006b/probe/0.log" Dec 01 13:02:38 crc kubenswrapper[4958]: I1201 13:02:38.598294 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_2fb5bcc8-e1bd-4988-919d-d2dc05ed006b/manila-share/0.log" Dec 01 13:02:38 crc kubenswrapper[4958]: I1201 13:02:38.691471 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6kjmt" Dec 01 13:02:38 crc kubenswrapper[4958]: I1201 13:02:38.748393 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kjmt"] Dec 01 13:02:38 crc kubenswrapper[4958]: I1201 13:02:38.899545 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-645fd5bd8c-6tbc4_a111ae96-b43a-475d-9364-31049f6ab7fc/neutron-api/0.log" Dec 01 13:02:39 crc kubenswrapper[4958]: I1201 13:02:39.022645 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-645fd5bd8c-6tbc4_a111ae96-b43a-475d-9364-31049f6ab7fc/neutron-httpd/0.log" Dec 01 13:02:39 crc kubenswrapper[4958]: I1201 13:02:39.420107 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-d79c5_979164fd-9f74-4647-b801-5134de28d7f4/neutron-dhcp-openstack-openstack-cell1/0.log" Dec 01 13:02:39 crc kubenswrapper[4958]: I1201 13:02:39.568365 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-9g5fl_a69fcd96-bb79-447e-aafb-40a30d9c0f1c/neutron-metadata-openstack-openstack-cell1/0.log" Dec 01 13:02:39 crc kubenswrapper[4958]: I1201 13:02:39.728522 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-d49jb_a688b48f-67dc-4fce-a92f-f7bf69d1ad29/neutron-sriov-openstack-openstack-cell1/0.log" Dec 01 13:02:39 crc kubenswrapper[4958]: I1201 13:02:39.846141 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_586e9523-0e51-4514-8e52-5a9877ed0c21/nova-api-api/0.log" Dec 01 13:02:39 crc kubenswrapper[4958]: I1201 13:02:39.962615 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_586e9523-0e51-4514-8e52-5a9877ed0c21/nova-api-log/0.log" Dec 01 13:02:40 crc kubenswrapper[4958]: I1201 13:02:40.172031 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_acb1633c-4acc-4286-8bdb-465623aea592/nova-cell0-conductor-conductor/0.log" Dec 01 13:02:40 crc kubenswrapper[4958]: I1201 13:02:40.338631 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_038077d4-978f-40be-8216-70603ae81f5d/nova-cell1-conductor-conductor/0.log" Dec 01 13:02:40 crc kubenswrapper[4958]: I1201 13:02:40.517209 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_8a49ee43-f7d8-41b0-b8e6-273f3acdba79/nova-cell1-novncproxy-novncproxy/0.log" Dec 01 13:02:40 crc kubenswrapper[4958]: I1201 13:02:40.671358 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6kjmt" podUID="c66018a7-d3e1-458b-b0ac-8b3ff64b8912" containerName="registry-server" containerID="cri-o://7ded436f55dd016aedfdf8f789eb3e859c5f98ec57087bc2873a6a4962ef9742" gracePeriod=2 Dec 01 13:02:40 crc kubenswrapper[4958]: I1201 13:02:40.834319 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellk5gth_dd221c10-e588-4b0e-9b5e-94735a063bac/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Dec 01 13:02:41 crc kubenswrapper[4958]: I1201 13:02:41.104666 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-vqmp4_d53c43c2-4919-4a0c-a4a9-477cd52cc3fb/nova-cell1-openstack-openstack-cell1/0.log" Dec 01 13:02:41 crc kubenswrapper[4958]: I1201 13:02:41.178056 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e9d41cd6-c1b4-4b68-93a8-03fc7f446adb/nova-metadata-log/0.log" Dec 01 13:02:41 crc kubenswrapper[4958]: I1201 13:02:41.242279 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e9d41cd6-c1b4-4b68-93a8-03fc7f446adb/nova-metadata-metadata/0.log" Dec 01 13:02:41 crc kubenswrapper[4958]: I1201 13:02:41.523683 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-api-f54b64c4-kbt49_feeff732-b940-4c31-963e-08785067df77/init/0.log" Dec 01 13:02:41 crc kubenswrapper[4958]: I1201 13:02:41.637080 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_333c9290-821d-464e-b587-8c4253eeb732/nova-scheduler-scheduler/0.log" Dec 01 13:02:41 crc kubenswrapper[4958]: I1201 13:02:41.651122 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-f54b64c4-kbt49_feeff732-b940-4c31-963e-08785067df77/init/0.log" Dec 01 13:02:41 crc kubenswrapper[4958]: I1201 13:02:41.700026 4958 generic.go:334] "Generic (PLEG): container finished" podID="c66018a7-d3e1-458b-b0ac-8b3ff64b8912" containerID="7ded436f55dd016aedfdf8f789eb3e859c5f98ec57087bc2873a6a4962ef9742" exitCode=0 Dec 01 13:02:41 crc kubenswrapper[4958]: I1201 13:02:41.700060 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kjmt" event={"ID":"c66018a7-d3e1-458b-b0ac-8b3ff64b8912","Type":"ContainerDied","Data":"7ded436f55dd016aedfdf8f789eb3e859c5f98ec57087bc2873a6a4962ef9742"} Dec 01 13:02:41 crc kubenswrapper[4958]: I1201 13:02:41.840569 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-f54b64c4-kbt49_feeff732-b940-4c31-963e-08785067df77/octavia-api-provider-agent/0.log" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.040104 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-dm5mf_6f968b15-117f-4035-99ce-d60cb09d1221/init/0.log" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.062824 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kjmt" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.073296 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-f54b64c4-kbt49_feeff732-b940-4c31-963e-08785067df77/octavia-api/0.log" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.216252 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-dm5mf_6f968b15-117f-4035-99ce-d60cb09d1221/init/0.log" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.221307 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c66018a7-d3e1-458b-b0ac-8b3ff64b8912-catalog-content\") pod \"c66018a7-d3e1-458b-b0ac-8b3ff64b8912\" (UID: \"c66018a7-d3e1-458b-b0ac-8b3ff64b8912\") " Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.221442 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9z67\" (UniqueName: \"kubernetes.io/projected/c66018a7-d3e1-458b-b0ac-8b3ff64b8912-kube-api-access-x9z67\") pod \"c66018a7-d3e1-458b-b0ac-8b3ff64b8912\" (UID: \"c66018a7-d3e1-458b-b0ac-8b3ff64b8912\") " Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.221485 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c66018a7-d3e1-458b-b0ac-8b3ff64b8912-utilities\") pod \"c66018a7-d3e1-458b-b0ac-8b3ff64b8912\" (UID: \"c66018a7-d3e1-458b-b0ac-8b3ff64b8912\") " Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.223178 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c66018a7-d3e1-458b-b0ac-8b3ff64b8912-utilities" (OuterVolumeSpecName: "utilities") pod 
"c66018a7-d3e1-458b-b0ac-8b3ff64b8912" (UID: "c66018a7-d3e1-458b-b0ac-8b3ff64b8912"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.243767 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c66018a7-d3e1-458b-b0ac-8b3ff64b8912-kube-api-access-x9z67" (OuterVolumeSpecName: "kube-api-access-x9z67") pod "c66018a7-d3e1-458b-b0ac-8b3ff64b8912" (UID: "c66018a7-d3e1-458b-b0ac-8b3ff64b8912"). InnerVolumeSpecName "kube-api-access-x9z67". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.276221 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c66018a7-d3e1-458b-b0ac-8b3ff64b8912-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c66018a7-d3e1-458b-b0ac-8b3ff64b8912" (UID: "c66018a7-d3e1-458b-b0ac-8b3ff64b8912"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.323786 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9z67\" (UniqueName: \"kubernetes.io/projected/c66018a7-d3e1-458b-b0ac-8b3ff64b8912-kube-api-access-x9z67\") on node \"crc\" DevicePath \"\"" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.323823 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c66018a7-d3e1-458b-b0ac-8b3ff64b8912-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.323834 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c66018a7-d3e1-458b-b0ac-8b3ff64b8912-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.336954 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-ppg8n_21133df5-8822-4d92-9f6f-a22b455f5288/init/0.log" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.363760 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-dm5mf_6f968b15-117f-4035-99ce-d60cb09d1221/octavia-healthmanager/0.log" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.511318 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-ppg8n_21133df5-8822-4d92-9f6f-a22b455f5288/init/0.log" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.591405 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-bldjw_5f3052f9-acb3-4c7e-b98d-773b0509ccb7/init/0.log" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.671883 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-ppg8n_21133df5-8822-4d92-9f6f-a22b455f5288/octavia-housekeeping/0.log" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.714887 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kjmt" event={"ID":"c66018a7-d3e1-458b-b0ac-8b3ff64b8912","Type":"ContainerDied","Data":"8703d598e5ae5cd90f609df152f65a4427954c00daa5db5aac54326c42571aa6"} Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.714966 4958 scope.go:117] "RemoveContainer" containerID="7ded436f55dd016aedfdf8f789eb3e859c5f98ec57087bc2873a6a4962ef9742" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.715131 4958 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kjmt" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.751451 4958 scope.go:117] "RemoveContainer" containerID="c654bc0baabaf185267b46856eaeda4e7574844f9c295c2cea3c1ff9945cb0b3" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.762492 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kjmt"] Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.775638 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kjmt"] Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.779110 4958 scope.go:117] "RemoveContainer" containerID="816a991f85354abfb75399a517e7fa7873a531306174ef34400048acf936f249" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.894312 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-bldjw_5f3052f9-acb3-4c7e-b98d-773b0509ccb7/init/0.log" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.916052 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-qcdqx_502d038a-83a5-42c6-b031-ec7f8f29de9d/init/0.log" Dec 01 13:02:42 crc kubenswrapper[4958]: I1201 13:02:42.962328 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-bldjw_5f3052f9-acb3-4c7e-b98d-773b0509ccb7/octavia-rsyslog/0.log" Dec 01 13:02:43 crc kubenswrapper[4958]: I1201 13:02:43.175609 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-qcdqx_502d038a-83a5-42c6-b031-ec7f8f29de9d/init/0.log" Dec 01 13:02:43 crc kubenswrapper[4958]: I1201 13:02:43.297122 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-qcdqx_502d038a-83a5-42c6-b031-ec7f8f29de9d/octavia-worker/0.log" Dec 01 13:02:43 crc kubenswrapper[4958]: I1201 13:02:43.342286 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_78ca54d7-1a20-4f31-9f23-a1febc785c3d/mysql-bootstrap/0.log" Dec 01 13:02:43 crc kubenswrapper[4958]: I1201 13:02:43.506184 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_78ca54d7-1a20-4f31-9f23-a1febc785c3d/mysql-bootstrap/0.log" Dec 01 13:02:43 crc kubenswrapper[4958]: I1201 13:02:43.540545 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_78ca54d7-1a20-4f31-9f23-a1febc785c3d/galera/0.log" Dec 01 13:02:43 crc kubenswrapper[4958]: I1201 13:02:43.663855 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_05601e59-50a7-45b6-a41d-d872a023490b/mysql-bootstrap/0.log" Dec 01 13:02:43 crc kubenswrapper[4958]: I1201 13:02:43.823609 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c66018a7-d3e1-458b-b0ac-8b3ff64b8912" path="/var/lib/kubelet/pods/c66018a7-d3e1-458b-b0ac-8b3ff64b8912/volumes" Dec 01 13:02:43 crc kubenswrapper[4958]: I1201 13:02:43.845491 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_05601e59-50a7-45b6-a41d-d872a023490b/mysql-bootstrap/0.log" Dec 01 13:02:43 crc kubenswrapper[4958]: I1201 13:02:43.947807 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2d7ca993-e5bd-450d-aa84-09be723e1764/openstackclient/0.log" Dec 01 13:02:43 crc kubenswrapper[4958]: I1201 13:02:43.965984 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_05601e59-50a7-45b6-a41d-d872a023490b/galera/0.log" Dec 01 13:02:44 crc kubenswrapper[4958]: I1201 13:02:44.187452 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-fnvkx_2fca0010-c346-4525-8552-489b7bfc0942/ovn-controller/0.log" Dec 01 13:02:44 crc kubenswrapper[4958]: I1201 13:02:44.267833 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rzhwg_d9ded0be-0f8e-4c43-9a78-25bc81069adb/openstack-network-exporter/0.log" Dec 01 13:02:44 crc kubenswrapper[4958]: I1201 13:02:44.396959 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzs42_68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce/ovsdb-server-init/0.log" Dec 01 13:02:44 crc kubenswrapper[4958]: I1201 13:02:44.572867 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzs42_68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce/ovsdb-server/0.log" Dec 01 13:02:44 crc kubenswrapper[4958]: I1201 13:02:44.664522 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzs42_68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce/ovs-vswitchd/0.log" Dec 01 13:02:44 crc kubenswrapper[4958]: I1201 13:02:44.692156 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dzs42_68a40eb9-b08b-46ae-b1ec-ae7f1a1801ce/ovsdb-server-init/0.log" Dec 01 13:02:44 crc kubenswrapper[4958]: I1201 13:02:44.794469 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_644cabd8-e449-4d33-96ea-8d76b59b0157/openstack-network-exporter/0.log" Dec 01 13:02:44 crc kubenswrapper[4958]: I1201 13:02:44.896443 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_644cabd8-e449-4d33-96ea-8d76b59b0157/ovn-northd/0.log" Dec 01 13:02:45 crc kubenswrapper[4958]: I1201 13:02:45.063291 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-s5btl_b4388057-4ba6-4c0b-a333-e9e95768cea6/ovn-openstack-openstack-cell1/0.log" Dec 01 13:02:45 crc kubenswrapper[4958]: I1201 13:02:45.317735 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c5035735-49bb-48eb-b42c-6f51d31d98d0/openstack-network-exporter/0.log" Dec 01 13:02:45 crc kubenswrapper[4958]: I1201 13:02:45.393239 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c5035735-49bb-48eb-b42c-6f51d31d98d0/ovsdbserver-nb/0.log" Dec 01 13:02:45 crc kubenswrapper[4958]: I1201 13:02:45.569563 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_28a9e245-2ba8-4e4a-bf85-a969c65b6556/ovsdbserver-nb/0.log" Dec 01 13:02:45 crc kubenswrapper[4958]: I1201 13:02:45.572081 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_28a9e245-2ba8-4e4a-bf85-a969c65b6556/openstack-network-exporter/0.log" Dec 01 13:02:45 crc kubenswrapper[4958]: I1201 13:02:45.859458 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_2f235039-02d7-440f-97a5-6c5d8a68089e/ovsdbserver-nb/0.log" Dec 01 13:02:45 crc kubenswrapper[4958]: I1201 13:02:45.887419 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_2f235039-02d7-440f-97a5-6c5d8a68089e/openstack-network-exporter/0.log" Dec 01 13:02:46 crc kubenswrapper[4958]: I1201 13:02:46.012064 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc/openstack-network-exporter/0.log" Dec 01 13:02:46 crc kubenswrapper[4958]: I1201 13:02:46.107579 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_579c7d02-ce33-4b91-b42a-a5d6d8bdf3cc/ovsdbserver-sb/0.log" Dec 01 13:02:46 crc kubenswrapper[4958]: I1201 13:02:46.192566 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_4522d0c9-4a90-4d92-8860-cd95a158c3a6/openstack-network-exporter/0.log" Dec 01 13:02:46 crc kubenswrapper[4958]: I1201 13:02:46.796540 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_4522d0c9-4a90-4d92-8860-cd95a158c3a6/ovsdbserver-sb/0.log" Dec 01 13:02:46 crc kubenswrapper[4958]: I1201 13:02:46.818005 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_66f03825-e8ba-4c46-bd51-f84d32938fa9/ovsdbserver-sb/0.log" Dec 01 13:02:46 crc kubenswrapper[4958]: I1201 13:02:46.854944 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_66f03825-e8ba-4c46-bd51-f84d32938fa9/openstack-network-exporter/0.log" Dec 01 13:02:47 crc kubenswrapper[4958]: I1201 13:02:47.159859 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-599f975964-fzz2m_077b5544-ef34-459b-9323-85996b4dbb12/placement-api/0.log" Dec 01 13:02:47 crc kubenswrapper[4958]: I1201 13:02:47.174758 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-599f975964-fzz2m_077b5544-ef34-459b-9323-85996b4dbb12/placement-log/0.log" Dec 01 13:02:47 crc kubenswrapper[4958]: I1201 13:02:47.286128 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cchb6f_62cd4241-dbd3-4dfd-a141-78fc72e8b7a4/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Dec 01 13:02:47 crc kubenswrapper[4958]: I1201 13:02:47.392199 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_774c8c4e-7ee4-4fb0-ad55-0894fa086b49/init-config-reloader/0.log" Dec 01 13:02:47 crc kubenswrapper[4958]: I1201 13:02:47.649071 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_774c8c4e-7ee4-4fb0-ad55-0894fa086b49/config-reloader/0.log" Dec 01 13:02:47 crc kubenswrapper[4958]: I1201 13:02:47.649826 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_774c8c4e-7ee4-4fb0-ad55-0894fa086b49/prometheus/0.log" Dec 01 13:02:47 crc kubenswrapper[4958]: I1201 13:02:47.674061 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_774c8c4e-7ee4-4fb0-ad55-0894fa086b49/init-config-reloader/0.log" Dec 01 13:02:47 crc kubenswrapper[4958]: I1201 13:02:47.723359 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_774c8c4e-7ee4-4fb0-ad55-0894fa086b49/thanos-sidecar/0.log" Dec 01 13:02:47 crc kubenswrapper[4958]: I1201 13:02:47.861824 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6a935748-46fe-4b48-b29e-6ba5adb44822/setup-container/0.log" Dec 01 13:02:48 crc kubenswrapper[4958]: I1201 13:02:48.802722 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6d74180b-48bb-4694-bcb0-4ca03cf9d396/setup-container/0.log" Dec 01 13:02:48 crc 
kubenswrapper[4958]: I1201 13:02:48.817567 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6a935748-46fe-4b48-b29e-6ba5adb44822/setup-container/0.log" Dec 01 13:02:48 crc kubenswrapper[4958]: I1201 13:02:48.828235 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6a935748-46fe-4b48-b29e-6ba5adb44822/rabbitmq/0.log" Dec 01 13:02:49 crc kubenswrapper[4958]: I1201 13:02:49.100003 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6d74180b-48bb-4694-bcb0-4ca03cf9d396/setup-container/0.log" Dec 01 13:02:49 crc kubenswrapper[4958]: I1201 13:02:49.135206 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6d74180b-48bb-4694-bcb0-4ca03cf9d396/rabbitmq/0.log" Dec 01 13:02:49 crc kubenswrapper[4958]: I1201 13:02:49.243676 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-zsrwt_9b84f1ca-719f-41ac-a6ba-3e88a52a5402/reboot-os-openstack-openstack-cell1/0.log" Dec 01 13:02:49 crc kubenswrapper[4958]: I1201 13:02:49.349975 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-8ttg8_cd3bfb4b-6143-47b7-8415-610c1fbeaeaf/run-os-openstack-openstack-cell1/0.log" Dec 01 13:02:49 crc kubenswrapper[4958]: I1201 13:02:49.513411 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-ddsrh_2d92337c-ea91-499a-8876-c9bc25e3bb0d/ssh-known-hosts-openstack/0.log" Dec 01 13:02:49 crc kubenswrapper[4958]: I1201 13:02:49.643194 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-9zz6g_bb66c499-dbc3-406f-984b-9c0a6d8c94a5/telemetry-openstack-openstack-cell1/0.log" Dec 01 13:02:49 crc kubenswrapper[4958]: I1201 13:02:49.824739 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-m577p_47a028df-7124-41b2-b3ed-0f25905f265b/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Dec 01 13:02:49 crc kubenswrapper[4958]: I1201 13:02:49.924073 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-t2zft_ceb759ad-1f62-4f94-8006-7cd251b3e36a/validate-network-openstack-openstack-cell1/0.log" Dec 01 13:02:50 crc kubenswrapper[4958]: I1201 13:02:50.550976 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_a058a864-a8ee-43b1-8662-688c39309094/memcached/0.log" Dec 01 13:02:58 crc kubenswrapper[4958]: I1201 13:02:58.210508 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 13:02:58 crc kubenswrapper[4958]: I1201 13:02:58.211228 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 13:03:13 crc kubenswrapper[4958]: I1201 13:03:13.298733 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5bfbbb859d-6ksnd_a3144616-c667-4adf-a2b7-44be4eaa5e59/kube-rbac-proxy/0.log" Dec 01 13:03:13 crc kubenswrapper[4958]: I1201 13:03:13.452494 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5bfbbb859d-6ksnd_a3144616-c667-4adf-a2b7-44be4eaa5e59/manager/0.log" Dec 01 13:03:13 crc kubenswrapper[4958]: I1201 13:03:13.485430 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-748967c98-hjrzq_a19ee5ba-4286-44c9-94fe-bf1e2b7ff03a/kube-rbac-proxy/0.log" Dec 01 13:03:13 crc kubenswrapper[4958]: I1201 13:03:13.690358 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-748967c98-hjrzq_a19ee5ba-4286-44c9-94fe-bf1e2b7ff03a/manager/0.log" Dec 01 13:03:13 crc kubenswrapper[4958]: I1201 13:03:13.715549 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd_059f6f09-716d-4f5e-9afd-bb9328f783cc/util/0.log" Dec 01 13:03:13 crc kubenswrapper[4958]: I1201 13:03:13.902321 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd_059f6f09-716d-4f5e-9afd-bb9328f783cc/util/0.log" Dec 01 13:03:13 crc kubenswrapper[4958]: I1201 13:03:13.932942 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd_059f6f09-716d-4f5e-9afd-bb9328f783cc/pull/0.log" Dec 01 13:03:13 crc kubenswrapper[4958]: I1201 13:03:13.939497 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd_059f6f09-716d-4f5e-9afd-bb9328f783cc/pull/0.log" Dec 01 13:03:14 crc kubenswrapper[4958]: I1201 13:03:14.099934 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd_059f6f09-716d-4f5e-9afd-bb9328f783cc/util/0.log" Dec 01 13:03:14 crc kubenswrapper[4958]: I1201 13:03:14.121777 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd_059f6f09-716d-4f5e-9afd-bb9328f783cc/pull/0.log" Dec 01 13:03:14 crc kubenswrapper[4958]: I1201 13:03:14.147342 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d25a1f06da075e746f3a02533a9b54615491a1bf06d8b0f746ebf1c40fs8bhd_059f6f09-716d-4f5e-9afd-bb9328f783cc/extract/0.log" Dec 01 13:03:14 crc kubenswrapper[4958]: I1201 13:03:14.336878 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6788cc6d75-4bgbw_398be6e9-f887-4123-aca7-9c8a5bc68a04/kube-rbac-proxy/0.log" Dec 01 13:03:14 crc kubenswrapper[4958]: I1201 13:03:14.395415 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6788cc6d75-4bgbw_398be6e9-f887-4123-aca7-9c8a5bc68a04/manager/0.log" Dec 01 13:03:14 crc kubenswrapper[4958]: I1201 13:03:14.502601 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-6bd966bbd4-plgqm_dd689dc1-fc2e-4285-a863-030b9f0cc647/kube-rbac-proxy/0.log" Dec 01 13:03:14 crc kubenswrapper[4958]: 
I1201 13:03:14.637474 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-698d6fd7d6-zljjp_84142ebc-764a-4000-90e2-f9c6588d9b43/kube-rbac-proxy/0.log" Dec 01 13:03:14 crc kubenswrapper[4958]: I1201 13:03:14.675957 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-6bd966bbd4-plgqm_dd689dc1-fc2e-4285-a863-030b9f0cc647/manager/0.log" Dec 01 13:03:14 crc kubenswrapper[4958]: I1201 13:03:14.785168 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-698d6fd7d6-zljjp_84142ebc-764a-4000-90e2-f9c6588d9b43/manager/0.log" Dec 01 13:03:14 crc kubenswrapper[4958]: I1201 13:03:14.856970 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7d5d9fd47f-v5f8w_47b8e3cf-440c-4d73-b791-2d94c06820f2/kube-rbac-proxy/0.log" Dec 01 13:03:14 crc kubenswrapper[4958]: I1201 13:03:14.906507 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7d5d9fd47f-v5f8w_47b8e3cf-440c-4d73-b791-2d94c06820f2/manager/0.log" Dec 01 13:03:15 crc kubenswrapper[4958]: I1201 13:03:15.050141 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-577c5f6d94-cfww7_6c4b65c1-0872-4859-903b-46556eee593e/kube-rbac-proxy/0.log" Dec 01 13:03:15 crc kubenswrapper[4958]: I1201 13:03:15.192485 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-54485f899-5ns97_5bf0bfe3-25ac-42de-9d86-afc6772c012b/kube-rbac-proxy/0.log" Dec 01 13:03:15 crc kubenswrapper[4958]: I1201 13:03:15.333052 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-54485f899-5ns97_5bf0bfe3-25ac-42de-9d86-afc6772c012b/manager/0.log" Dec 01 13:03:15 crc kubenswrapper[4958]: I1201 13:03:15.371718 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-577c5f6d94-cfww7_6c4b65c1-0872-4859-903b-46556eee593e/manager/0.log" Dec 01 13:03:15 crc kubenswrapper[4958]: I1201 13:03:15.449176 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7d6f5d799-9ln88_7b55e6b5-2b3c-4d13-b998-6c3c56210483/kube-rbac-proxy/0.log" Dec 01 13:03:15 crc kubenswrapper[4958]: I1201 13:03:15.672492 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7d6f5d799-9ln88_7b55e6b5-2b3c-4d13-b998-6c3c56210483/manager/0.log" Dec 01 13:03:15 crc kubenswrapper[4958]: I1201 13:03:15.693923 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-646fd589f9-csmzl_9dad08eb-34b4-4ea4-94bb-467e5c2df8df/kube-rbac-proxy/0.log" Dec 01 13:03:15 crc kubenswrapper[4958]: I1201 13:03:15.750159 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-646fd589f9-csmzl_9dad08eb-34b4-4ea4-94bb-467e5c2df8df/manager/0.log" Dec 01 13:03:15 crc kubenswrapper[4958]: I1201 13:03:15.898947 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-64d7c556cd-xmcvh_d0c5b0b0-db70-485c-95b1-a63f980af637/kube-rbac-proxy/0.log" Dec 01 13:03:15 crc 
kubenswrapper[4958]: I1201 13:03:15.968806 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-64d7c556cd-xmcvh_d0c5b0b0-db70-485c-95b1-a63f980af637/manager/0.log" Dec 01 13:03:16 crc kubenswrapper[4958]: I1201 13:03:16.071492 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6b6c55ffd5-629vw_a4b2002a-8ff1-4072-be81-9e4fd5bb2f1c/kube-rbac-proxy/0.log" Dec 01 13:03:16 crc kubenswrapper[4958]: I1201 13:03:16.161517 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6b6c55ffd5-629vw_a4b2002a-8ff1-4072-be81-9e4fd5bb2f1c/manager/0.log" Dec 01 13:03:16 crc kubenswrapper[4958]: I1201 13:03:16.224636 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79d658b66d-8wwrb_759e8424-5b95-49fd-a80b-cb311b441b54/kube-rbac-proxy/0.log" Dec 01 13:03:16 crc kubenswrapper[4958]: I1201 13:03:16.373365 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79d658b66d-8wwrb_759e8424-5b95-49fd-a80b-cb311b441b54/manager/0.log" Dec 01 13:03:16 crc kubenswrapper[4958]: I1201 13:03:16.398149 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7979c68bc7-dmmb7_de5b3e9c-1e86-4fd1-9342-df0ebcb684cb/kube-rbac-proxy/0.log" Dec 01 13:03:16 crc kubenswrapper[4958]: I1201 13:03:16.429770 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7979c68bc7-dmmb7_de5b3e9c-1e86-4fd1-9342-df0ebcb684cb/manager/0.log" Dec 01 13:03:16 crc kubenswrapper[4958]: I1201 13:03:16.496798 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77868f484-cv8pp_66f0dd1b-3fde-4e90-bfbe-55e5f67b197b/kube-rbac-proxy/0.log" Dec 01 13:03:16 crc kubenswrapper[4958]: I1201 13:03:16.599702 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77868f484-cv8pp_66f0dd1b-3fde-4e90-bfbe-55e5f67b197b/manager/0.log" Dec 01 13:03:16 crc kubenswrapper[4958]: I1201 13:03:16.641728 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6bb499f6df-kl28c_227e59f0-e574-4c6e-8349-8e1d648eb5f1/kube-rbac-proxy/0.log" Dec 01 13:03:16 crc kubenswrapper[4958]: I1201 13:03:16.859167 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5b9cbd897-lz77k_6163c2c1-4c3c-4a5b-8b6e-c09ad4b6095b/kube-rbac-proxy/0.log" Dec 01 13:03:16 crc kubenswrapper[4958]: I1201 13:03:16.989106 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-p7jbk_3433758a-1a2d-48eb-9d78-f21c8b1d9bbf/registry-server/0.log" Dec 01 13:03:17 crc kubenswrapper[4958]: I1201 13:03:17.044073 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5b9cbd897-lz77k_6163c2c1-4c3c-4a5b-8b6e-c09ad4b6095b/operator/0.log" Dec 01 13:03:17 crc kubenswrapper[4958]: I1201 13:03:17.132384 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5b67cfc8fb-pkd25_3365704e-370d-4fe9-9d23-890ae1a593cc/kube-rbac-proxy/0.log" Dec 01 13:03:17 crc kubenswrapper[4958]: I1201 13:03:17.467673 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5b67cfc8fb-pkd25_3365704e-370d-4fe9-9d23-890ae1a593cc/manager/0.log" Dec 01 13:03:17 crc kubenswrapper[4958]: I1201 13:03:17.476783 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-867d87977b-vthqj_998f917b-9023-454a-b97c-3ff7948dda3a/kube-rbac-proxy/0.log" Dec 01 13:03:17 crc kubenswrapper[4958]: I1201 13:03:17.592823 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-867d87977b-vthqj_998f917b-9023-454a-b97c-3ff7948dda3a/manager/0.log" Dec 01 13:03:17 crc kubenswrapper[4958]: I1201 13:03:17.757565 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-4dpvl_92fdc1a0-10c5-4ff4-860c-3faef0cc6c16/operator/0.log" Dec 01 13:03:17 crc kubenswrapper[4958]: I1201 13:03:17.835792 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-cc9f5bc5c-xtrpg_12acdf72-847b-4b01-a76a-29e0ba1958c3/kube-rbac-proxy/0.log" Dec 01 13:03:17 crc kubenswrapper[4958]: I1201 13:03:17.967798 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58487d9bf4-q928n_7513d038-3642-4f27-a2df-c932dc6c9eaa/kube-rbac-proxy/0.log" Dec 01 13:03:17 crc kubenswrapper[4958]: I1201 13:03:17.990003 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-cc9f5bc5c-xtrpg_12acdf72-847b-4b01-a76a-29e0ba1958c3/manager/0.log" Dec 01 13:03:18 crc kubenswrapper[4958]: I1201 13:03:18.251138 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-77db6bf9c-cqms2_2c2c6b14-779c-4e4f-9b83-542ccf77286c/kube-rbac-proxy/0.log" Dec 01 13:03:18 crc kubenswrapper[4958]: I1201 13:03:18.262623 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-77db6bf9c-cqms2_2c2c6b14-779c-4e4f-9b83-542ccf77286c/manager/0.log" Dec 01 13:03:18 crc kubenswrapper[4958]: I1201 13:03:18.307370 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58487d9bf4-q928n_7513d038-3642-4f27-a2df-c932dc6c9eaa/manager/0.log" Dec 01 13:03:18 crc kubenswrapper[4958]: I1201 13:03:18.486142 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b56b8849f-mjksk_1c6ee3df-7b20-496f-9005-12a455dd54b9/manager/0.log" Dec 01 13:03:18 crc kubenswrapper[4958]: I1201 13:03:18.521595 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b56b8849f-mjksk_1c6ee3df-7b20-496f-9005-12a455dd54b9/kube-rbac-proxy/0.log" Dec 01 13:03:19 crc kubenswrapper[4958]: I1201 13:03:19.063014 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6bb499f6df-kl28c_227e59f0-e574-4c6e-8349-8e1d648eb5f1/manager/0.log" Dec 01 13:03:28 crc kubenswrapper[4958]: I1201 13:03:28.271545 4958 patch_prober.go:28] 
interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 13:03:28 crc kubenswrapper[4958]: I1201 13:03:28.272012 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 13:03:28 crc kubenswrapper[4958]: I1201 13:03:28.272098 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 13:03:28 crc kubenswrapper[4958]: I1201 13:03:28.272624 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2cbdb8c5e4aa7ce2d4300154efb15fbdf8680bee6c2d6407e3cd25b114c4e0dd"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 13:03:28 crc kubenswrapper[4958]: I1201 13:03:28.272669 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://2cbdb8c5e4aa7ce2d4300154efb15fbdf8680bee6c2d6407e3cd25b114c4e0dd" gracePeriod=600 Dec 01 13:03:29 crc kubenswrapper[4958]: I1201 13:03:29.297083 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="2cbdb8c5e4aa7ce2d4300154efb15fbdf8680bee6c2d6407e3cd25b114c4e0dd" exitCode=0 Dec 01 13:03:29 crc kubenswrapper[4958]: I1201 13:03:29.297175 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"2cbdb8c5e4aa7ce2d4300154efb15fbdf8680bee6c2d6407e3cd25b114c4e0dd"} Dec 01 13:03:29 crc kubenswrapper[4958]: I1201 13:03:29.303207 4958 scope.go:117] "RemoveContainer" containerID="575a168731c23a10701b0d7071b4ea54b8e0e17564b43bf3eaebda4dc5b76ca0" Dec 01 13:03:29 crc kubenswrapper[4958]: I1201 13:03:29.302084 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerStarted","Data":"da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6"} Dec 01 13:03:38 crc kubenswrapper[4958]: I1201 13:03:38.237991 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-sv8bf_c7fc63ed-b5b8-4854-8d5a-1cab817a3581/control-plane-machine-set-operator/0.log" Dec 01 13:03:38 crc kubenswrapper[4958]: I1201 13:03:38.370557 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-64ctn_142b2c22-6df3-4540-8ad7-748e000f3ec9/kube-rbac-proxy/0.log" Dec 01 13:03:38 crc kubenswrapper[4958]: I1201 13:03:38.416622 4958 log.go:25] "Finished parsing log file" 
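
The 13:03:28 records above show one full liveness cycle: the prober gets connection refused from http://127.0.0.1:8798/health, the SyncLoop marks machine-config-daemon unhealthy, and the runtime kills the container (gracePeriod=600) and starts a replacement. A minimal sketch of the HTTP check itself, assuming a plain GET with a short timeout and the usual 200-399 success range; this is illustrative, not the kubelet prober code:

package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get("http://127.0.0.1:8798/health") // endpoint from the records above
	if err != nil {
		// With nothing listening, this reproduces the logged failure:
		// "dial tcp 127.0.0.1:8798: connect: connection refused"
		fmt.Println("probe failure:", err)
		return
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		fmt.Println("probe success:", resp.Status)
	} else {
		fmt.Println("probe failure:", resp.Status)
	}
}
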
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-64ctn_142b2c22-6df3-4540-8ad7-748e000f3ec9/machine-api-operator/0.log" Dec 01 13:03:52 crc kubenswrapper[4958]: I1201 13:03:52.181057 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-2z2hp_431d1004-c19f-4715-aa26-98a7c8e1e63d/cert-manager-controller/0.log" Dec 01 13:03:52 crc kubenswrapper[4958]: I1201 13:03:52.315704 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-88hgg_d8cde77f-dd6f-47cf-9978-dd91ea966f6a/cert-manager-cainjector/0.log" Dec 01 13:03:52 crc kubenswrapper[4958]: I1201 13:03:52.358534 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-26gvw_86303ba1-0515-43b8-92ec-14f433ad784c/cert-manager-webhook/0.log" Dec 01 13:04:06 crc kubenswrapper[4958]: I1201 13:04:06.462520 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-j4v58_34a50f5b-9514-4292-bbc6-ed76bafccf22/nmstate-console-plugin/0.log" Dec 01 13:04:06 crc kubenswrapper[4958]: I1201 13:04:06.637249 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-csd4v_dc34ddf1-ab69-497f-a2e9-8ba0a328bc4a/nmstate-handler/0.log" Dec 01 13:04:06 crc kubenswrapper[4958]: I1201 13:04:06.687997 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-bww98_a721f6d8-7d75-4fc8-9cc5-930e5764c7f6/kube-rbac-proxy/0.log" Dec 01 13:04:06 crc kubenswrapper[4958]: I1201 13:04:06.747318 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-bww98_a721f6d8-7d75-4fc8-9cc5-930e5764c7f6/nmstate-metrics/0.log" Dec 01 13:04:06 crc kubenswrapper[4958]: I1201 13:04:06.897655 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-59st7_a08d88ea-e101-479e-bfc2-9b4933ef0319/nmstate-operator/0.log" Dec 01 13:04:06 crc kubenswrapper[4958]: I1201 13:04:06.956305 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-78pgf_8ffef51c-a736-4393-8e14-ba5441744873/nmstate-webhook/0.log" Dec 01 13:04:24 crc kubenswrapper[4958]: I1201 13:04:24.275165 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-vmp79_7996247d-b6a6-4433-95e8-21490189c1dd/kube-rbac-proxy/0.log" Dec 01 13:04:24 crc kubenswrapper[4958]: I1201 13:04:24.531002 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5gghz_4deb5735-ce1d-4302-9dad-783a64a13380/cp-frr-files/0.log" Dec 01 13:04:24 crc kubenswrapper[4958]: I1201 13:04:24.675966 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-vmp79_7996247d-b6a6-4433-95e8-21490189c1dd/controller/0.log" Dec 01 13:04:24 crc kubenswrapper[4958]: I1201 13:04:24.733147 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5gghz_4deb5735-ce1d-4302-9dad-783a64a13380/cp-frr-files/0.log" Dec 01 13:04:24 crc kubenswrapper[4958]: I1201 13:04:24.755790 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5gghz_4deb5735-ce1d-4302-9dad-783a64a13380/cp-reloader/0.log" Dec 01 13:04:24 crc kubenswrapper[4958]: I1201 13:04:24.764776 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5gghz_4deb5735-ce1d-4302-9dad-783a64a13380/cp-metrics/0.log" Dec 01 13:04:24 crc kubenswrapper[4958]: I1201 13:04:24.863915 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5gghz_4deb5735-ce1d-4302-9dad-783a64a13380/cp-reloader/0.log" Dec 01 13:04:25 crc kubenswrapper[4958]: I1201 13:04:25.047806 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5gghz_4deb5735-ce1d-4302-9dad-783a64a13380/cp-frr-files/0.log" Dec 01 13:04:25 crc kubenswrapper[4958]: I1201 13:04:25.058598 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5gghz_4deb5735-ce1d-4302-9dad-783a64a13380/cp-reloader/0.log" Dec 01 13:04:25 crc kubenswrapper[4958]: I1201 13:04:25.094664 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5gghz_4deb5735-ce1d-4302-9dad-783a64a13380/cp-metrics/0.log" Dec 01 13:04:25 crc kubenswrapper[4958]: I1201 13:04:25.120395 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5gghz_4deb5735-ce1d-4302-9dad-783a64a13380/cp-metrics/0.log" Dec 01 13:04:25 crc kubenswrapper[4958]: I1201 13:04:25.349668 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5gghz_4deb5735-ce1d-4302-9dad-783a64a13380/cp-frr-files/0.log" Dec 01 13:04:25 crc kubenswrapper[4958]: I1201 13:04:25.357509 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5gghz_4deb5735-ce1d-4302-9dad-783a64a13380/cp-reloader/0.log" Dec 01 13:04:25 crc kubenswrapper[4958]: I1201 13:04:25.379956 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5gghz_4deb5735-ce1d-4302-9dad-783a64a13380/cp-metrics/0.log" Dec 01 13:04:25 crc kubenswrapper[4958]: I1201 13:04:25.447939 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5gghz_4deb5735-ce1d-4302-9dad-783a64a13380/controller/0.log" Dec 01 13:04:25 crc kubenswrapper[4958]: I1201 13:04:25.548797 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5gghz_4deb5735-ce1d-4302-9dad-783a64a13380/frr-metrics/0.log" Dec 01 13:04:25 crc kubenswrapper[4958]: I1201 13:04:25.652609 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5gghz_4deb5735-ce1d-4302-9dad-783a64a13380/kube-rbac-proxy/0.log" Dec 01 13:04:25 crc kubenswrapper[4958]: I1201 13:04:25.729832 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5gghz_4deb5735-ce1d-4302-9dad-783a64a13380/kube-rbac-proxy-frr/0.log" Dec 01 13:04:25 crc kubenswrapper[4958]: I1201 13:04:25.903952 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5gghz_4deb5735-ce1d-4302-9dad-783a64a13380/reloader/0.log" Dec 01 13:04:26 crc kubenswrapper[4958]: I1201 13:04:26.055963 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-clzbq_925d86c1-9ff9-4290-a32b-142cdd162300/frr-k8s-webhook-server/0.log" Dec 01 13:04:26 crc kubenswrapper[4958]: I1201 13:04:26.266301 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-68d799f659-7xjvd_1dc14960-b720-4b96-9635-b0ee7f2d10c4/manager/0.log" Dec 01 13:04:26 crc kubenswrapper[4958]: I1201 13:04:26.393613 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-67ddcd596b-s74bc_a36305cc-8703-4b92-8632-bf39b869d4e1/webhook-server/0.log" Dec 01 13:04:26 crc kubenswrapper[4958]: I1201 13:04:26.526503 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cd79p_48822f3f-9799-45f5-9551-e2ac6966c560/kube-rbac-proxy/0.log" Dec 01 13:04:27 crc kubenswrapper[4958]: I1201 13:04:27.590612 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cd79p_48822f3f-9799-45f5-9551-e2ac6966c560/speaker/0.log" Dec 01 13:04:29 crc kubenswrapper[4958]: I1201 13:04:29.265503 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5gghz_4deb5735-ce1d-4302-9dad-783a64a13380/frr/0.log" Dec 01 13:04:42 crc kubenswrapper[4958]: I1201 13:04:42.591250 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q_9e520358-0182-4324-865d-7bc6a261c2e4/util/0.log" Dec 01 13:04:42 crc kubenswrapper[4958]: I1201 13:04:42.803058 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q_9e520358-0182-4324-865d-7bc6a261c2e4/util/0.log" Dec 01 13:04:42 crc kubenswrapper[4958]: I1201 13:04:42.966446 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q_9e520358-0182-4324-865d-7bc6a261c2e4/pull/0.log" Dec 01 13:04:42 crc kubenswrapper[4958]: I1201 13:04:42.968083 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q_9e520358-0182-4324-865d-7bc6a261c2e4/pull/0.log" Dec 01 13:04:43 crc kubenswrapper[4958]: I1201 13:04:43.024425 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q_9e520358-0182-4324-865d-7bc6a261c2e4/pull/0.log" Dec 01 13:04:43 crc kubenswrapper[4958]: I1201 13:04:43.125084 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q_9e520358-0182-4324-865d-7bc6a261c2e4/util/0.log" Dec 01 13:04:43 crc kubenswrapper[4958]: I1201 13:04:43.197329 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anjj4q_9e520358-0182-4324-865d-7bc6a261c2e4/extract/0.log" Dec 01 13:04:43 crc kubenswrapper[4958]: I1201 13:04:43.240477 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_d21f6ab4-e618-4de3-b662-bad657e0dd96/util/0.log" Dec 01 13:04:43 crc kubenswrapper[4958]: I1201 13:04:43.412396 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_d21f6ab4-e618-4de3-b662-bad657e0dd96/pull/0.log" Dec 01 13:04:43 crc kubenswrapper[4958]: I1201 13:04:43.412642 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_d21f6ab4-e618-4de3-b662-bad657e0dd96/pull/0.log" Dec 01 13:04:43 crc kubenswrapper[4958]: I1201 13:04:43.437703 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_d21f6ab4-e618-4de3-b662-bad657e0dd96/util/0.log" Dec 01 13:04:43 crc kubenswrapper[4958]: I1201 13:04:43.605303 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_d21f6ab4-e618-4de3-b662-bad657e0dd96/util/0.log" Dec 01 13:04:43 crc kubenswrapper[4958]: I1201 13:04:43.620143 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_d21f6ab4-e618-4de3-b662-bad657e0dd96/pull/0.log" Dec 01 13:04:43 crc kubenswrapper[4958]: I1201 13:04:43.656807 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftnvr5_d21f6ab4-e618-4de3-b662-bad657e0dd96/extract/0.log" Dec 01 13:04:43 crc kubenswrapper[4958]: I1201 13:04:43.827965 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz_b3c490fa-b355-4f5c-82e4-5df3975a736c/util/0.log" Dec 01 13:04:43 crc kubenswrapper[4958]: I1201 13:04:43.929366 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz_b3c490fa-b355-4f5c-82e4-5df3975a736c/util/0.log" Dec 01 13:04:43 crc kubenswrapper[4958]: I1201 13:04:43.971724 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz_b3c490fa-b355-4f5c-82e4-5df3975a736c/pull/0.log" Dec 01 13:04:44 crc kubenswrapper[4958]: I1201 13:04:44.008973 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz_b3c490fa-b355-4f5c-82e4-5df3975a736c/pull/0.log" Dec 01 13:04:44 crc kubenswrapper[4958]: I1201 13:04:44.810333 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz_b3c490fa-b355-4f5c-82e4-5df3975a736c/util/0.log" Dec 01 13:04:44 crc kubenswrapper[4958]: I1201 13:04:44.844978 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz_b3c490fa-b355-4f5c-82e4-5df3975a736c/extract/0.log" Dec 01 13:04:45 crc kubenswrapper[4958]: I1201 13:04:45.163531 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ssfz_b3c490fa-b355-4f5c-82e4-5df3975a736c/pull/0.log" Dec 01 13:04:45 crc kubenswrapper[4958]: I1201 13:04:45.218049 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7_6fe5b1e0-0e6a-407c-aa29-ca19874c86fd/util/0.log" Dec 01 13:04:45 crc kubenswrapper[4958]: I1201 13:04:45.460829 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7_6fe5b1e0-0e6a-407c-aa29-ca19874c86fd/util/0.log" Dec 01 13:04:45 crc kubenswrapper[4958]: I1201 13:04:45.489734 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7_6fe5b1e0-0e6a-407c-aa29-ca19874c86fd/pull/0.log" Dec 01 13:04:45 crc kubenswrapper[4958]: I1201 13:04:45.511478 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7_6fe5b1e0-0e6a-407c-aa29-ca19874c86fd/pull/0.log" Dec 01 13:04:45 crc kubenswrapper[4958]: I1201 13:04:45.767615 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7_6fe5b1e0-0e6a-407c-aa29-ca19874c86fd/pull/0.log" Dec 01 13:04:45 crc kubenswrapper[4958]: I1201 13:04:45.789877 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7_6fe5b1e0-0e6a-407c-aa29-ca19874c86fd/util/0.log" Dec 01 13:04:45 crc kubenswrapper[4958]: I1201 13:04:45.839616 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83q6qg7_6fe5b1e0-0e6a-407c-aa29-ca19874c86fd/extract/0.log" Dec 01 13:04:45 crc kubenswrapper[4958]: I1201 13:04:45.939204 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dr9h9_888fe73a-98b3-4b4e-891e-9d75f26c64af/extract-utilities/0.log" Dec 01 13:04:46 crc kubenswrapper[4958]: I1201 13:04:46.115250 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dr9h9_888fe73a-98b3-4b4e-891e-9d75f26c64af/extract-utilities/0.log" Dec 01 13:04:46 crc kubenswrapper[4958]: I1201 13:04:46.169743 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dr9h9_888fe73a-98b3-4b4e-891e-9d75f26c64af/extract-content/0.log" Dec 01 13:04:46 crc kubenswrapper[4958]: I1201 13:04:46.183461 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dr9h9_888fe73a-98b3-4b4e-891e-9d75f26c64af/extract-content/0.log" Dec 01 13:04:46 crc kubenswrapper[4958]: I1201 13:04:46.337762 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dr9h9_888fe73a-98b3-4b4e-891e-9d75f26c64af/extract-utilities/0.log" Dec 01 13:04:46 crc kubenswrapper[4958]: I1201 13:04:46.357308 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dr9h9_888fe73a-98b3-4b4e-891e-9d75f26c64af/extract-content/0.log" Dec 01 13:04:46 crc kubenswrapper[4958]: I1201 13:04:46.447599 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lcsxv_cd34755b-2dcb-4ca5-a680-0fffbf319417/extract-utilities/0.log" Dec 01 13:04:46 crc kubenswrapper[4958]: I1201 13:04:46.636620 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lcsxv_cd34755b-2dcb-4ca5-a680-0fffbf319417/extract-utilities/0.log" Dec 01 13:04:46 crc kubenswrapper[4958]: I1201 13:04:46.711480 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lcsxv_cd34755b-2dcb-4ca5-a680-0fffbf319417/extract-content/0.log" Dec 01 13:04:46 crc kubenswrapper[4958]: I1201 13:04:46.713286 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-lcsxv_cd34755b-2dcb-4ca5-a680-0fffbf319417/extract-content/0.log" Dec 01 13:04:46 crc kubenswrapper[4958]: I1201 13:04:46.976463 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lcsxv_cd34755b-2dcb-4ca5-a680-0fffbf319417/extract-content/0.log" Dec 01 13:04:46 crc kubenswrapper[4958]: I1201 13:04:46.979210 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lcsxv_cd34755b-2dcb-4ca5-a680-0fffbf319417/extract-utilities/0.log" Dec 01 13:04:47 crc kubenswrapper[4958]: I1201 13:04:47.325271 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-c99nd_905b7c43-8ebe-4cd3-a076-ff77e791de7b/marketplace-operator/0.log" Dec 01 13:04:47 crc kubenswrapper[4958]: I1201 13:04:47.458353 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mmqlv_525654b6-b62b-46d0-88d0-2a94cc23278c/extract-utilities/0.log" Dec 01 13:04:47 crc kubenswrapper[4958]: I1201 13:04:47.690364 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lcsxv_cd34755b-2dcb-4ca5-a680-0fffbf319417/registry-server/0.log" Dec 01 13:04:47 crc kubenswrapper[4958]: I1201 13:04:47.699140 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mmqlv_525654b6-b62b-46d0-88d0-2a94cc23278c/extract-content/0.log" Dec 01 13:04:47 crc kubenswrapper[4958]: I1201 13:04:47.728542 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mmqlv_525654b6-b62b-46d0-88d0-2a94cc23278c/extract-content/0.log" Dec 01 13:04:47 crc kubenswrapper[4958]: I1201 13:04:47.750495 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mmqlv_525654b6-b62b-46d0-88d0-2a94cc23278c/extract-utilities/0.log" Dec 01 13:04:47 crc kubenswrapper[4958]: I1201 13:04:47.983785 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mmqlv_525654b6-b62b-46d0-88d0-2a94cc23278c/extract-utilities/0.log" Dec 01 13:04:48 crc kubenswrapper[4958]: I1201 13:04:48.008835 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mmqlv_525654b6-b62b-46d0-88d0-2a94cc23278c/extract-content/0.log" Dec 01 13:04:48 crc kubenswrapper[4958]: I1201 13:04:48.205186 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c6khx_867dd6fb-7abb-49cd-8c70-123705dd751b/extract-utilities/0.log" Dec 01 13:04:48 crc kubenswrapper[4958]: I1201 13:04:48.417744 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c6khx_867dd6fb-7abb-49cd-8c70-123705dd751b/extract-content/0.log" Dec 01 13:04:48 crc kubenswrapper[4958]: I1201 13:04:48.436765 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c6khx_867dd6fb-7abb-49cd-8c70-123705dd751b/extract-utilities/0.log" Dec 01 13:04:48 crc kubenswrapper[4958]: I1201 13:04:48.493730 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c6khx_867dd6fb-7abb-49cd-8c70-123705dd751b/extract-content/0.log" Dec 01 13:04:48 crc kubenswrapper[4958]: I1201 13:04:48.636247 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-mmqlv_525654b6-b62b-46d0-88d0-2a94cc23278c/registry-server/0.log" Dec 01 13:04:48 crc kubenswrapper[4958]: I1201 13:04:48.682295 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dr9h9_888fe73a-98b3-4b4e-891e-9d75f26c64af/registry-server/0.log" Dec 01 13:04:48 crc kubenswrapper[4958]: I1201 13:04:48.751833 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c6khx_867dd6fb-7abb-49cd-8c70-123705dd751b/extract-content/0.log" Dec 01 13:04:48 crc kubenswrapper[4958]: I1201 13:04:48.755963 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c6khx_867dd6fb-7abb-49cd-8c70-123705dd751b/extract-utilities/0.log" Dec 01 13:04:49 crc kubenswrapper[4958]: I1201 13:04:49.884010 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c6khx_867dd6fb-7abb-49cd-8c70-123705dd751b/registry-server/0.log" Dec 01 13:05:03 crc kubenswrapper[4958]: I1201 13:05:03.679888 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-bbtwb_56cf7323-f1e0-4c22-9890-aa06bf839456/prometheus-operator/0.log" Dec 01 13:05:03 crc kubenswrapper[4958]: I1201 13:05:03.823605 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7749c4d558-jh8qb_3ecfd6a8-cdee-4d7a-8a4b-9158b7d91b05/prometheus-operator-admission-webhook/0.log" Dec 01 13:05:04 crc kubenswrapper[4958]: I1201 13:05:04.552056 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7749c4d558-rmt4r_4a3a2039-e66b-4c1f-b95e-0665f82fdd0b/prometheus-operator-admission-webhook/0.log" Dec 01 13:05:04 crc kubenswrapper[4958]: I1201 13:05:04.652527 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-qqt7d_84673714-0a83-4e97-bbfb-bd5bf7716f71/operator/0.log" Dec 01 13:05:04 crc kubenswrapper[4958]: I1201 13:05:04.909909 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-wddt7_e1214a3e-b12c-4cf4-8230-8fb09334635e/perses-operator/0.log" Dec 01 13:05:34 crc kubenswrapper[4958]: I1201 13:05:34.270412 4958 scope.go:117] "RemoveContainer" containerID="b936085f60b23773d8dca94b49ebfdbaa902ba5e5ecfc96dfc8960a529ad97b8" Dec 01 13:05:58 crc kubenswrapper[4958]: I1201 13:05:58.210645 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 13:05:58 crc kubenswrapper[4958]: I1201 13:05:58.211215 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 13:06:28 crc kubenswrapper[4958]: I1201 13:06:28.210968 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 13:06:28 crc kubenswrapper[4958]: I1201 13:06:28.211562 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 13:06:58 crc kubenswrapper[4958]: I1201 13:06:58.210272 4958 patch_prober.go:28] interesting pod/machine-config-daemon-prmw7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 13:06:58 crc kubenswrapper[4958]: I1201 13:06:58.210796 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 13:06:58 crc kubenswrapper[4958]: I1201 13:06:58.210870 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" Dec 01 13:06:58 crc kubenswrapper[4958]: I1201 13:06:58.211750 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6"} pod="openshift-machine-config-operator/machine-config-daemon-prmw7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 13:06:58 crc kubenswrapper[4958]: I1201 13:06:58.211821 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerName="machine-config-daemon" containerID="cri-o://da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" gracePeriod=600 Dec 01 13:06:58 crc kubenswrapper[4958]: E1201 13:06:58.938755 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:06:59 crc kubenswrapper[4958]: I1201 13:06:59.030531 4958 generic.go:334] "Generic (PLEG): container finished" podID="09a41414-b5bf-481a-afdc-b0042f4c78b0" containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" exitCode=0 Dec 01 13:06:59 crc kubenswrapper[4958]: I1201 13:06:59.030586 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" event={"ID":"09a41414-b5bf-481a-afdc-b0042f4c78b0","Type":"ContainerDied","Data":"da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6"} Dec 01 13:06:59 crc kubenswrapper[4958]: I1201 13:06:59.030622 4958 scope.go:117] "RemoveContainer" containerID="2cbdb8c5e4aa7ce2d4300154efb15fbdf8680bee6c2d6407e3cd25b114c4e0dd" Dec 01 
13:06:59 crc kubenswrapper[4958]: I1201 13:06:59.031673 4958 scope.go:117] "RemoveContainer" containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" Dec 01 13:06:59 crc kubenswrapper[4958]: E1201 13:06:59.032080 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:07:11 crc kubenswrapper[4958]: I1201 13:07:11.798285 4958 scope.go:117] "RemoveContainer" containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" Dec 01 13:07:11 crc kubenswrapper[4958]: E1201 13:07:11.798993 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:07:25 crc kubenswrapper[4958]: I1201 13:07:25.798554 4958 scope.go:117] "RemoveContainer" containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" Dec 01 13:07:25 crc kubenswrapper[4958]: E1201 13:07:25.799332 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:07:28 crc kubenswrapper[4958]: I1201 13:07:28.394782 4958 generic.go:334] "Generic (PLEG): container finished" podID="952cd42a-be46-45c1-93d6-f5dc8469cf34" containerID="586e5f3ecba9851f074851b86405cc99c828f89689625426fc5d707449707117" exitCode=0 Dec 01 13:07:28 crc kubenswrapper[4958]: I1201 13:07:28.394835 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-st52w/must-gather-99vcj" event={"ID":"952cd42a-be46-45c1-93d6-f5dc8469cf34","Type":"ContainerDied","Data":"586e5f3ecba9851f074851b86405cc99c828f89689625426fc5d707449707117"} Dec 01 13:07:28 crc kubenswrapper[4958]: I1201 13:07:28.396493 4958 scope.go:117] "RemoveContainer" containerID="586e5f3ecba9851f074851b86405cc99c828f89689625426fc5d707449707117" Dec 01 13:07:29 crc kubenswrapper[4958]: I1201 13:07:29.969405 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-st52w_must-gather-99vcj_952cd42a-be46-45c1-93d6-f5dc8469cf34/gather/0.log" Dec 01 13:07:37 crc kubenswrapper[4958]: I1201 13:07:37.798042 4958 scope.go:117] "RemoveContainer" containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" Dec 01 13:07:37 crc kubenswrapper[4958]: E1201 13:07:37.798941 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" 
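
From 13:06:58 onward every restart attempt is rejected with "back-off 5m0s restarting failed container": the CrashLoopBackOff delay for machine-config-daemon has reached its ceiling. A sketch of the delay schedule, assuming the default kubelet parameters (10s initial delay, doubling per failed restart, capped at 5m); the constants are assumptions, not values read from this node:

package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 10 * time.Second
	maxDelay := 5 * time.Minute
	for restart := 1; restart <= 7; restart++ {
		fmt.Printf("failed restart %d: back-off %v\n", restart, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay // the "back-off 5m0s" in the records above is this cap
		}
	}
}
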
pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:07:38 crc kubenswrapper[4958]: I1201 13:07:38.667917 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-st52w/must-gather-99vcj"] Dec 01 13:07:38 crc kubenswrapper[4958]: I1201 13:07:38.668290 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-st52w/must-gather-99vcj" podUID="952cd42a-be46-45c1-93d6-f5dc8469cf34" containerName="copy" containerID="cri-o://719a64633842a588c867d8e512ad32bb91fb23f058cb79134ff949b8780f0ebf" gracePeriod=2 Dec 01 13:07:38 crc kubenswrapper[4958]: I1201 13:07:38.678614 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-st52w/must-gather-99vcj"] Dec 01 13:07:39 crc kubenswrapper[4958]: I1201 13:07:39.009769 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-st52w_must-gather-99vcj_952cd42a-be46-45c1-93d6-f5dc8469cf34/copy/0.log" Dec 01 13:07:39 crc kubenswrapper[4958]: I1201 13:07:39.010209 4958 generic.go:334] "Generic (PLEG): container finished" podID="952cd42a-be46-45c1-93d6-f5dc8469cf34" containerID="719a64633842a588c867d8e512ad32bb91fb23f058cb79134ff949b8780f0ebf" exitCode=143 Dec 01 13:07:40 crc kubenswrapper[4958]: I1201 13:07:39.701217 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-st52w_must-gather-99vcj_952cd42a-be46-45c1-93d6-f5dc8469cf34/copy/0.log" Dec 01 13:07:40 crc kubenswrapper[4958]: I1201 13:07:39.702235 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-st52w/must-gather-99vcj" Dec 01 13:07:40 crc kubenswrapper[4958]: I1201 13:07:39.826716 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrgn2\" (UniqueName: \"kubernetes.io/projected/952cd42a-be46-45c1-93d6-f5dc8469cf34-kube-api-access-xrgn2\") pod \"952cd42a-be46-45c1-93d6-f5dc8469cf34\" (UID: \"952cd42a-be46-45c1-93d6-f5dc8469cf34\") " Dec 01 13:07:40 crc kubenswrapper[4958]: I1201 13:07:39.826929 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/952cd42a-be46-45c1-93d6-f5dc8469cf34-must-gather-output\") pod \"952cd42a-be46-45c1-93d6-f5dc8469cf34\" (UID: \"952cd42a-be46-45c1-93d6-f5dc8469cf34\") " Dec 01 13:07:40 crc kubenswrapper[4958]: I1201 13:07:39.834328 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/952cd42a-be46-45c1-93d6-f5dc8469cf34-kube-api-access-xrgn2" (OuterVolumeSpecName: "kube-api-access-xrgn2") pod "952cd42a-be46-45c1-93d6-f5dc8469cf34" (UID: "952cd42a-be46-45c1-93d6-f5dc8469cf34"). InnerVolumeSpecName "kube-api-access-xrgn2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:07:40 crc kubenswrapper[4958]: I1201 13:07:39.931761 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrgn2\" (UniqueName: \"kubernetes.io/projected/952cd42a-be46-45c1-93d6-f5dc8469cf34-kube-api-access-xrgn2\") on node \"crc\" DevicePath \"\"" Dec 01 13:07:40 crc kubenswrapper[4958]: I1201 13:07:40.026701 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-st52w_must-gather-99vcj_952cd42a-be46-45c1-93d6-f5dc8469cf34/copy/0.log" Dec 01 13:07:40 crc kubenswrapper[4958]: I1201 13:07:40.027188 4958 scope.go:117] "RemoveContainer" containerID="719a64633842a588c867d8e512ad32bb91fb23f058cb79134ff949b8780f0ebf" Dec 01 13:07:40 crc kubenswrapper[4958]: I1201 13:07:40.027355 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-st52w/must-gather-99vcj" Dec 01 13:07:40 crc kubenswrapper[4958]: I1201 13:07:40.057062 4958 scope.go:117] "RemoveContainer" containerID="586e5f3ecba9851f074851b86405cc99c828f89689625426fc5d707449707117" Dec 01 13:07:40 crc kubenswrapper[4958]: I1201 13:07:40.125930 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/952cd42a-be46-45c1-93d6-f5dc8469cf34-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "952cd42a-be46-45c1-93d6-f5dc8469cf34" (UID: "952cd42a-be46-45c1-93d6-f5dc8469cf34"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:07:40 crc kubenswrapper[4958]: I1201 13:07:40.139661 4958 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/952cd42a-be46-45c1-93d6-f5dc8469cf34-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 01 13:07:41 crc kubenswrapper[4958]: I1201 13:07:41.812829 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="952cd42a-be46-45c1-93d6-f5dc8469cf34" path="/var/lib/kubelet/pods/952cd42a-be46-45c1-93d6-f5dc8469cf34/volumes" Dec 01 13:07:48 crc kubenswrapper[4958]: I1201 13:07:48.797918 4958 scope.go:117] "RemoveContainer" containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" Dec 01 13:07:48 crc kubenswrapper[4958]: E1201 13:07:48.798880 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:08:03 crc kubenswrapper[4958]: I1201 13:08:03.811138 4958 scope.go:117] "RemoveContainer" containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" Dec 01 13:08:03 crc kubenswrapper[4958]: E1201 13:08:03.812015 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:08:14 crc kubenswrapper[4958]: I1201 13:08:14.797834 4958 scope.go:117] "RemoveContainer" 
containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" Dec 01 13:08:14 crc kubenswrapper[4958]: E1201 13:08:14.798698 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:08:27 crc kubenswrapper[4958]: I1201 13:08:27.798014 4958 scope.go:117] "RemoveContainer" containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" Dec 01 13:08:27 crc kubenswrapper[4958]: E1201 13:08:27.799225 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:08:41 crc kubenswrapper[4958]: I1201 13:08:41.798234 4958 scope.go:117] "RemoveContainer" containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" Dec 01 13:08:41 crc kubenswrapper[4958]: E1201 13:08:41.799005 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:08:54 crc kubenswrapper[4958]: I1201 13:08:54.797437 4958 scope.go:117] "RemoveContainer" containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" Dec 01 13:08:54 crc kubenswrapper[4958]: E1201 13:08:54.798244 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:09:09 crc kubenswrapper[4958]: I1201 13:09:09.798625 4958 scope.go:117] "RemoveContainer" containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" Dec 01 13:09:09 crc kubenswrapper[4958]: E1201 13:09:09.799582 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:09:21 crc kubenswrapper[4958]: I1201 13:09:21.798244 4958 scope.go:117] "RemoveContainer" containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" Dec 01 13:09:21 crc kubenswrapper[4958]: E1201 13:09:21.799341 4958 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:09:33 crc kubenswrapper[4958]: I1201 13:09:33.818094 4958 scope.go:117] "RemoveContainer" containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" Dec 01 13:09:33 crc kubenswrapper[4958]: E1201 13:09:33.819189 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:09:48 crc kubenswrapper[4958]: I1201 13:09:48.797963 4958 scope.go:117] "RemoveContainer" containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" Dec 01 13:09:48 crc kubenswrapper[4958]: E1201 13:09:48.799680 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:09:49 crc kubenswrapper[4958]: I1201 13:09:49.908830 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nkbpp"] Dec 01 13:09:49 crc kubenswrapper[4958]: E1201 13:09:49.909841 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="952cd42a-be46-45c1-93d6-f5dc8469cf34" containerName="copy" Dec 01 13:09:49 crc kubenswrapper[4958]: I1201 13:09:49.909957 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="952cd42a-be46-45c1-93d6-f5dc8469cf34" containerName="copy" Dec 01 13:09:49 crc kubenswrapper[4958]: E1201 13:09:49.909976 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c66018a7-d3e1-458b-b0ac-8b3ff64b8912" containerName="extract-utilities" Dec 01 13:09:49 crc kubenswrapper[4958]: I1201 13:09:49.909989 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c66018a7-d3e1-458b-b0ac-8b3ff64b8912" containerName="extract-utilities" Dec 01 13:09:49 crc kubenswrapper[4958]: E1201 13:09:49.910006 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c66018a7-d3e1-458b-b0ac-8b3ff64b8912" containerName="extract-content" Dec 01 13:09:49 crc kubenswrapper[4958]: I1201 13:09:49.910020 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c66018a7-d3e1-458b-b0ac-8b3ff64b8912" containerName="extract-content" Dec 01 13:09:49 crc kubenswrapper[4958]: E1201 13:09:49.910041 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c66018a7-d3e1-458b-b0ac-8b3ff64b8912" containerName="registry-server" Dec 01 13:09:49 crc kubenswrapper[4958]: I1201 13:09:49.910049 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c66018a7-d3e1-458b-b0ac-8b3ff64b8912" containerName="registry-server" Dec 01 13:09:49 crc kubenswrapper[4958]: E1201 13:09:49.910106 4958 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="952cd42a-be46-45c1-93d6-f5dc8469cf34" containerName="gather" Dec 01 13:09:49 crc kubenswrapper[4958]: I1201 13:09:49.910114 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="952cd42a-be46-45c1-93d6-f5dc8469cf34" containerName="gather" Dec 01 13:09:49 crc kubenswrapper[4958]: I1201 13:09:49.910388 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="952cd42a-be46-45c1-93d6-f5dc8469cf34" containerName="gather" Dec 01 13:09:49 crc kubenswrapper[4958]: I1201 13:09:49.910410 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c66018a7-d3e1-458b-b0ac-8b3ff64b8912" containerName="registry-server" Dec 01 13:09:49 crc kubenswrapper[4958]: I1201 13:09:49.910429 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="952cd42a-be46-45c1-93d6-f5dc8469cf34" containerName="copy" Dec 01 13:09:49 crc kubenswrapper[4958]: I1201 13:09:49.912537 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nkbpp" Dec 01 13:09:49 crc kubenswrapper[4958]: I1201 13:09:49.937616 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nkbpp"] Dec 01 13:09:50 crc kubenswrapper[4958]: I1201 13:09:50.069454 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv5gw\" (UniqueName: \"kubernetes.io/projected/e72758a0-277b-4a06-b19e-99ca611f3819-kube-api-access-wv5gw\") pod \"community-operators-nkbpp\" (UID: \"e72758a0-277b-4a06-b19e-99ca611f3819\") " pod="openshift-marketplace/community-operators-nkbpp" Dec 01 13:09:50 crc kubenswrapper[4958]: I1201 13:09:50.069528 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e72758a0-277b-4a06-b19e-99ca611f3819-catalog-content\") pod \"community-operators-nkbpp\" (UID: \"e72758a0-277b-4a06-b19e-99ca611f3819\") " pod="openshift-marketplace/community-operators-nkbpp" Dec 01 13:09:50 crc kubenswrapper[4958]: I1201 13:09:50.069888 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e72758a0-277b-4a06-b19e-99ca611f3819-utilities\") pod \"community-operators-nkbpp\" (UID: \"e72758a0-277b-4a06-b19e-99ca611f3819\") " pod="openshift-marketplace/community-operators-nkbpp" Dec 01 13:09:50 crc kubenswrapper[4958]: I1201 13:09:50.171814 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e72758a0-277b-4a06-b19e-99ca611f3819-utilities\") pod \"community-operators-nkbpp\" (UID: \"e72758a0-277b-4a06-b19e-99ca611f3819\") " pod="openshift-marketplace/community-operators-nkbpp" Dec 01 13:09:50 crc kubenswrapper[4958]: I1201 13:09:50.172467 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e72758a0-277b-4a06-b19e-99ca611f3819-utilities\") pod \"community-operators-nkbpp\" (UID: \"e72758a0-277b-4a06-b19e-99ca611f3819\") " pod="openshift-marketplace/community-operators-nkbpp" Dec 01 13:09:50 crc kubenswrapper[4958]: I1201 13:09:50.173330 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv5gw\" (UniqueName: \"kubernetes.io/projected/e72758a0-277b-4a06-b19e-99ca611f3819-kube-api-access-wv5gw\") pod 
\"community-operators-nkbpp\" (UID: \"e72758a0-277b-4a06-b19e-99ca611f3819\") " pod="openshift-marketplace/community-operators-nkbpp" Dec 01 13:09:50 crc kubenswrapper[4958]: I1201 13:09:50.173354 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e72758a0-277b-4a06-b19e-99ca611f3819-catalog-content\") pod \"community-operators-nkbpp\" (UID: \"e72758a0-277b-4a06-b19e-99ca611f3819\") " pod="openshift-marketplace/community-operators-nkbpp" Dec 01 13:09:50 crc kubenswrapper[4958]: I1201 13:09:50.173729 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e72758a0-277b-4a06-b19e-99ca611f3819-catalog-content\") pod \"community-operators-nkbpp\" (UID: \"e72758a0-277b-4a06-b19e-99ca611f3819\") " pod="openshift-marketplace/community-operators-nkbpp" Dec 01 13:09:50 crc kubenswrapper[4958]: I1201 13:09:50.373416 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv5gw\" (UniqueName: \"kubernetes.io/projected/e72758a0-277b-4a06-b19e-99ca611f3819-kube-api-access-wv5gw\") pod \"community-operators-nkbpp\" (UID: \"e72758a0-277b-4a06-b19e-99ca611f3819\") " pod="openshift-marketplace/community-operators-nkbpp" Dec 01 13:09:50 crc kubenswrapper[4958]: I1201 13:09:50.401730 4958 trace.go:236] Trace[662149493]: "Calculate volume metrics of mysql-db for pod openstack/openstack-cell1-galera-0" (01-Dec-2025 13:09:46.951) (total time: 3450ms): Dec 01 13:09:50 crc kubenswrapper[4958]: Trace[662149493]: [3.450212666s] [3.450212666s] END Dec 01 13:09:50 crc kubenswrapper[4958]: I1201 13:09:50.482418 4958 trace.go:236] Trace[1905712635]: "Calculate volume metrics of ovndbcluster-nb-etc-ovn for pod openstack/ovsdbserver-nb-0" (01-Dec-2025 13:09:44.604) (total time: 5877ms): Dec 01 13:09:50 crc kubenswrapper[4958]: Trace[1905712635]: [5.877806288s] [5.877806288s] END Dec 01 13:09:50 crc kubenswrapper[4958]: I1201 13:09:50.564810 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nkbpp" Dec 01 13:09:51 crc kubenswrapper[4958]: I1201 13:09:51.156791 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nkbpp"] Dec 01 13:09:51 crc kubenswrapper[4958]: I1201 13:09:51.877955 4958 generic.go:334] "Generic (PLEG): container finished" podID="e72758a0-277b-4a06-b19e-99ca611f3819" containerID="86863cf496eced3bd11fada19d5c41fcda0be570c1165f01747ccf988ae6c7f4" exitCode=0 Dec 01 13:09:51 crc kubenswrapper[4958]: I1201 13:09:51.878099 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkbpp" event={"ID":"e72758a0-277b-4a06-b19e-99ca611f3819","Type":"ContainerDied","Data":"86863cf496eced3bd11fada19d5c41fcda0be570c1165f01747ccf988ae6c7f4"} Dec 01 13:09:51 crc kubenswrapper[4958]: I1201 13:09:51.878542 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkbpp" event={"ID":"e72758a0-277b-4a06-b19e-99ca611f3819","Type":"ContainerStarted","Data":"24571bd7c4260ddbcb2ea8bdb8960e9df6f91e6e216fdb59f7a5e9908a965f4e"} Dec 01 13:09:51 crc kubenswrapper[4958]: I1201 13:09:51.881215 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 13:09:57 crc kubenswrapper[4958]: I1201 13:09:57.950697 4958 generic.go:334] "Generic (PLEG): container finished" podID="e72758a0-277b-4a06-b19e-99ca611f3819" containerID="8c699b7007573b722809fcebd9df4ffdf35cc34dd7b8c0ccf451a60c2c4d9799" exitCode=0 Dec 01 13:09:57 crc kubenswrapper[4958]: I1201 13:09:57.950770 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkbpp" event={"ID":"e72758a0-277b-4a06-b19e-99ca611f3819","Type":"ContainerDied","Data":"8c699b7007573b722809fcebd9df4ffdf35cc34dd7b8c0ccf451a60c2c4d9799"} Dec 01 13:09:58 crc kubenswrapper[4958]: I1201 13:09:58.965394 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkbpp" event={"ID":"e72758a0-277b-4a06-b19e-99ca611f3819","Type":"ContainerStarted","Data":"a7637a8073bfb36e34730453c9aa543ebcc84de0605f918b360209c621e8b587"} Dec 01 13:09:58 crc kubenswrapper[4958]: I1201 13:09:58.991288 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nkbpp" podStartSLOduration=3.432530198 podStartE2EDuration="9.99125897s" podCreationTimestamp="2025-12-01 13:09:49 +0000 UTC" firstStartedPulling="2025-12-01 13:09:51.880978341 +0000 UTC m=+11439.389767378" lastFinishedPulling="2025-12-01 13:09:58.439707113 +0000 UTC m=+11445.948496150" observedRunningTime="2025-12-01 13:09:58.985488747 +0000 UTC m=+11446.494277794" watchObservedRunningTime="2025-12-01 13:09:58.99125897 +0000 UTC m=+11446.500048007" Dec 01 13:10:00 crc kubenswrapper[4958]: I1201 13:10:00.565503 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nkbpp" Dec 01 13:10:00 crc kubenswrapper[4958]: I1201 13:10:00.565827 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nkbpp" Dec 01 13:10:00 crc kubenswrapper[4958]: I1201 13:10:00.626781 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nkbpp" Dec 01 13:10:03 crc kubenswrapper[4958]: I1201 13:10:03.811988 4958 scope.go:117] "RemoveContainer" 
containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" Dec 01 13:10:03 crc kubenswrapper[4958]: E1201 13:10:03.813287 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:10:10 crc kubenswrapper[4958]: I1201 13:10:10.651194 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nkbpp" Dec 01 13:10:13 crc kubenswrapper[4958]: I1201 13:10:13.133820 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nkbpp"] Dec 01 13:10:13 crc kubenswrapper[4958]: I1201 13:10:13.134663 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nkbpp" podUID="e72758a0-277b-4a06-b19e-99ca611f3819" containerName="registry-server" containerID="cri-o://a7637a8073bfb36e34730453c9aa543ebcc84de0605f918b360209c621e8b587" gracePeriod=2 Dec 01 13:10:14 crc kubenswrapper[4958]: I1201 13:10:14.235812 4958 generic.go:334] "Generic (PLEG): container finished" podID="e72758a0-277b-4a06-b19e-99ca611f3819" containerID="a7637a8073bfb36e34730453c9aa543ebcc84de0605f918b360209c621e8b587" exitCode=0 Dec 01 13:10:14 crc kubenswrapper[4958]: I1201 13:10:14.235969 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkbpp" event={"ID":"e72758a0-277b-4a06-b19e-99ca611f3819","Type":"ContainerDied","Data":"a7637a8073bfb36e34730453c9aa543ebcc84de0605f918b360209c621e8b587"} Dec 01 13:10:14 crc kubenswrapper[4958]: I1201 13:10:14.236463 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkbpp" event={"ID":"e72758a0-277b-4a06-b19e-99ca611f3819","Type":"ContainerDied","Data":"24571bd7c4260ddbcb2ea8bdb8960e9df6f91e6e216fdb59f7a5e9908a965f4e"} Dec 01 13:10:14 crc kubenswrapper[4958]: I1201 13:10:14.236480 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24571bd7c4260ddbcb2ea8bdb8960e9df6f91e6e216fdb59f7a5e9908a965f4e" Dec 01 13:10:14 crc kubenswrapper[4958]: I1201 13:10:14.272933 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nkbpp" Dec 01 13:10:14 crc kubenswrapper[4958]: I1201 13:10:14.413864 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv5gw\" (UniqueName: \"kubernetes.io/projected/e72758a0-277b-4a06-b19e-99ca611f3819-kube-api-access-wv5gw\") pod \"e72758a0-277b-4a06-b19e-99ca611f3819\" (UID: \"e72758a0-277b-4a06-b19e-99ca611f3819\") " Dec 01 13:10:14 crc kubenswrapper[4958]: I1201 13:10:14.414027 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e72758a0-277b-4a06-b19e-99ca611f3819-utilities\") pod \"e72758a0-277b-4a06-b19e-99ca611f3819\" (UID: \"e72758a0-277b-4a06-b19e-99ca611f3819\") " Dec 01 13:10:14 crc kubenswrapper[4958]: I1201 13:10:14.414197 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e72758a0-277b-4a06-b19e-99ca611f3819-catalog-content\") pod \"e72758a0-277b-4a06-b19e-99ca611f3819\" (UID: \"e72758a0-277b-4a06-b19e-99ca611f3819\") " Dec 01 13:10:14 crc kubenswrapper[4958]: I1201 13:10:14.415200 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e72758a0-277b-4a06-b19e-99ca611f3819-utilities" (OuterVolumeSpecName: "utilities") pod "e72758a0-277b-4a06-b19e-99ca611f3819" (UID: "e72758a0-277b-4a06-b19e-99ca611f3819"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:10:14 crc kubenswrapper[4958]: I1201 13:10:14.421100 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e72758a0-277b-4a06-b19e-99ca611f3819-kube-api-access-wv5gw" (OuterVolumeSpecName: "kube-api-access-wv5gw") pod "e72758a0-277b-4a06-b19e-99ca611f3819" (UID: "e72758a0-277b-4a06-b19e-99ca611f3819"). InnerVolumeSpecName "kube-api-access-wv5gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:10:14 crc kubenswrapper[4958]: I1201 13:10:14.476887 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e72758a0-277b-4a06-b19e-99ca611f3819-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e72758a0-277b-4a06-b19e-99ca611f3819" (UID: "e72758a0-277b-4a06-b19e-99ca611f3819"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:10:14 crc kubenswrapper[4958]: I1201 13:10:14.516356 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e72758a0-277b-4a06-b19e-99ca611f3819-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 13:10:14 crc kubenswrapper[4958]: I1201 13:10:14.516395 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e72758a0-277b-4a06-b19e-99ca611f3819-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 13:10:14 crc kubenswrapper[4958]: I1201 13:10:14.516412 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv5gw\" (UniqueName: \"kubernetes.io/projected/e72758a0-277b-4a06-b19e-99ca611f3819-kube-api-access-wv5gw\") on node \"crc\" DevicePath \"\"" Dec 01 13:10:14 crc kubenswrapper[4958]: I1201 13:10:14.797168 4958 scope.go:117] "RemoveContainer" containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" Dec 01 13:10:14 crc kubenswrapper[4958]: E1201 13:10:14.797806 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:10:15 crc kubenswrapper[4958]: I1201 13:10:15.247009 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nkbpp" Dec 01 13:10:15 crc kubenswrapper[4958]: I1201 13:10:15.300379 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nkbpp"] Dec 01 13:10:15 crc kubenswrapper[4958]: I1201 13:10:15.310911 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nkbpp"] Dec 01 13:10:15 crc kubenswrapper[4958]: I1201 13:10:15.811944 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e72758a0-277b-4a06-b19e-99ca611f3819" path="/var/lib/kubelet/pods/e72758a0-277b-4a06-b19e-99ca611f3819/volumes" Dec 01 13:10:28 crc kubenswrapper[4958]: I1201 13:10:28.798040 4958 scope.go:117] "RemoveContainer" containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" Dec 01 13:10:28 crc kubenswrapper[4958]: E1201 13:10:28.798987 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0" Dec 01 13:10:41 crc kubenswrapper[4958]: I1201 13:10:41.797730 4958 scope.go:117] "RemoveContainer" containerID="da526255b89e92dd1842ccd0522bb91e47a6d42604868031c669410152fae7d6" Dec 01 13:10:41 crc kubenswrapper[4958]: E1201 13:10:41.798495 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-prmw7_openshift-machine-config-operator(09a41414-b5bf-481a-afdc-b0042f4c78b0)\"" pod="openshift-machine-config-operator/machine-config-daemon-prmw7" podUID="09a41414-b5bf-481a-afdc-b0042f4c78b0"